pixel_data is a char. When I do

printf(" 0x%1x ", pixel_data);

I'm expecting to see 0xf5, but I get

0xfffffff5

as though I was printing out a 4-byte integer instead of 1 byte. Why is this? I have given printf a char to print out – it's only 1 byte, so why is printf printing 4?

The printf implementation is wrapped up inside a third-party API, but I'm just wondering if this is a feature of standard printf?
You're probably getting a benign form of undefined behaviour because the %x conversion expects an unsigned int argument, and a char will usually be promoted to an int when passed to a variadic function such as printf.

You should explicitly cast the char to an unsigned int to get predictable results:

printf(" 0x%1x ", (unsigned)pixel_data);

Note that a field width of one is not very useful: it merely specifies the minimum number of digits to display, and at least one digit will be printed in any case.
If char is signed on your platform, then this conversion will convert negative char values to large unsigned int values (e.g. fffffff5). If you want to treat byte values as unsigned and just zero-extend when converting to unsigned int, you should use unsigned char for pixel_data, or cast via unsigned char, or use a masking operation after promotion:

printf(" 0x%x ", (unsigned)(unsigned char)pixel_data);

or

printf(" 0x%x ", (unsigned)pixel_data & 0xffU);
Better, use the standard format flags:

printf(" %#1x ", pixel_data);

Then printf puts the 0x hex prefix in for you.
printf("%#04hhx ", foo);

The # flag adds the 0x prefix, the hh length modifier tells printf to convert the argument to unsigned char before printing, and the field width of 4 (zero-padded here) is a minimum width, not a maximum.
The field width in printf is actually a minimum width. You can do

printf(" 0x%2x ", pixel_data & 0xff);

to print the lowest byte (notice the 2, so that two characters are still printed when pixel_data has only one significant hex digit).