[ACCEPTED] How to visualize bytes with C/C++
You can use a function such as this to print the bytes:
#include <stdio.h>

static void print_bytes(const void *object, size_t size)
{
#ifdef __cplusplus
    const unsigned char * const bytes = static_cast<const unsigned char *>(object);
#else // __cplusplus
    const unsigned char * const bytes = object;
#endif // __cplusplus
    size_t i;

    printf("[ ");
    for(i = 0; i < size; i++)
    {
        printf("%02x ", bytes[i]);
    }
    printf("]\n");
}
Usage would look like this, for instance:
int x = 37;
float y = 3.14;
print_bytes(&x, sizeof x);
print_bytes(&y, sizeof y);
This shows the bytes as raw numerical values in hexadecimal, which is commonly used for "memory dumps" like these.
On a random (might even be virtual, for all I know) Linux machine running an "Intel(R) Xeon(R)" CPU, this prints:
[ 25 00 00 00 ]
[ c3 f5 48 40 ]
This also handily demonstrates that the Intel family of CPUs really is little endian.
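If you want to check the byte order programmatically instead of eyeballing a dump, a small sketch along these lines works (my addition, not part of the answer above):
#include <stdio.h>

int main(void)
{
    /* Look at the first byte in memory of a value with a known pattern:
       0x78 first means little endian, 0x12 first means big endian. */
    unsigned int value = 0x12345678u;
    const unsigned char *first = (const unsigned char *)&value;

    if (*first == 0x78)
        printf("little endian\n");
    else if (*first == 0x12)
        printf("big endian\n");
    else
        printf("something more exotic\n");
    return 0;
}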
If you are using gcc and X, you can use the DDD debugger to draw pretty pictures of your data structures for you.
Just for completeness, a C++ example:
#include <iostream>

template <typename T>
void print_bytes(const T& input, std::ostream& os = std::cout)
{
    const unsigned char* p = reinterpret_cast<const unsigned char*>(&input);

    os << std::hex << std::showbase;
    os << "[";
    for (unsigned int i = 0; i < sizeof(T); ++i)
        os << static_cast<int>(*(p++)) << " ";
    os << "]" << std::endl;
}

int main()
{
    int i = 12345678;
    print_bytes(i);

    float x = 3.14f;
    print_bytes(x);
}
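One thing to be aware of with the template above: it leaves std::cout switched to hex with showbase set, which affects anything printed afterwards. A variant that saves and restores the stream's formatting flags might look like this (my tweak, not part of the original answer):
#include <iostream>

template <typename T>
void print_bytes_restoring(const T& input, std::ostream& os = std::cout)
{
    const std::ios_base::fmtflags saved = os.flags();  // remember current formatting
    const unsigned char* p = reinterpret_cast<const unsigned char*>(&input);

    os << std::hex << std::showbase << "[";
    for (unsigned int i = 0; i < sizeof(T); ++i)
        os << static_cast<int>(*(p++)) << " ";
    os << "]\n";

    os.flags(saved);                                    // put the stream back as it was
}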
Or if you have the Boost library and want to use lambda evaluations, you can do it this way...
#include <algorithm>
#include <iostream>
#include <typeinfo>
// Boost.Lambda headers for constant(), _1 and ll_static_cast
#include <boost/lambda/lambda.hpp>
#include <boost/lambda/casts.hpp>

using namespace boost::lambda;

template<class T>
void bytePattern( const T& object )
{
    typedef unsigned char byte_type;
    typedef const byte_type* iterator;

    std::cout << "Object type:" << typeid( T ).name() << std::hex;
    std::for_each(
        reinterpret_cast<iterator>(&object),
        reinterpret_cast<iterator>(&object) + sizeof(T),
        std::cout << constant(' ') << ll_static_cast<int>(_1) && 0xFF );
    std::cout << "\n";
}
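Usage would be the same as the other examples; a minimal main() (my addition, assuming the Boost.Lambda headers shown above) could be:
int main()
{
    int   i = 12345678;
    float x = 3.14f;

    bytePattern(i);   // type name, then each byte as hex
    bytePattern(x);
    return 0;
}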
Most (visual) debuggers have a "View Memory" option. IIRC the one in Xcode is pretty basic, just showing bytes in hex and ASCII, with a variable line length. Visual Studio (Debug->Windows->Memory in VS2008) can format the hex portion as different integer lengths, or floating point, change the endianness, and display ANSI or Unicode text. You can also set just about any number for the width of the window (I think Xcode only lets you go to 64 bytes wide). The other IDE I have here at work has a lot of options, though not quite as many as VS.
A little bit-by-bit console program I whipped up; hope it helps somebody.
#include <cstdio>
#include <iostream>
#include <inttypes.h>
#include <vector>

using namespace std;

typedef vector<uint8_t> ByteVector;
///////////////////////////////////////////////////////////////
// Masks for each bit position, least significant bit first.
uint8_t Flags[8] = { 0x01, 0x02, 0x04, 0x08, 0x10, 0x20, 0x40, 0x80 };

void print_bytes(ByteVector Bv){
    for (unsigned i = 0; i < Bv.size(); i++){
        printf("Byte %u [ ", i);
        for (int j = 0; j < 8; ++j){
            Bv[i] & Flags[j] ? printf("1") : printf("0");
        }
        printf("]\n");
    }
}

int main(){
    ByteVector Bv;
    for (int i = 0; i < 4; ++i) { Bv.push_back(i); }
    print_bytes(Bv);
}
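For comparison, std::bitset can do the per-bit formatting for you; note that it prints each byte most-significant-bit first, whereas the Flags[] loop above starts with the least significant bit. This is a sketch of mine, not part of the answer:
#include <bitset>
#include <cstdint>
#include <iostream>
#include <vector>

int main()
{
    std::vector<std::uint8_t> bytes = { 0, 1, 2, 3 };
    for (std::size_t i = 0; i < bytes.size(); ++i)
    {
        // std::bitset<8> renders the byte as eight '0'/'1' characters, MSB first.
        std::cout << "Byte " << i << " [ " << std::bitset<8>(bytes[i]) << " ]\n";
    }
    return 0;
}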