Fabricio Fabricio - 1 month ago 18
C Question

The real difference between "int" and "unsigned int"

int:

The 32-bit int data type can hold integer values in the range of
−2,147,483,648 to 2,147,483,647. You may also refer to this data type
as signed int or signed.

unsigned int:

The 32-bit unsigned int data
type can hold integer values in the range of 0 to 4,294,967,295. You
may also refer to this data type simply as unsigned.

OK, but in practice:

int x = 0xFFFFFFFF;
unsigned int y = 0xFFFFFFFF;
printf("%d, %d, %u, %u", x, y, x, y);
// -1, -1, 4294967295, 4294967295


no difference, O.o. I'm a bit confused.

Answer

Hehe. There's no real difference visible here because printf is variadic: it never sees the declared types of x and y. The conversion specifier tells it how to reinterpret the raw bits, so %d reads them as signed and %u as unsigned, whatever the variable was declared as. (Strictly speaking, mismatching the specifier and the argument type is undefined behavior; it only "works" because int and unsigned int share a representation on common platforms.)

Try this on for size instead:

#include <stdio.h>

int main(void)
{
    unsigned int x = 0xFFFFFFFF;
    int y = 0xFFFFFFFF;

    if (x < 0)              /* always false: x is unsigned */
        printf("one\n");
    else
        printf("two\n");

    if (y < 0)              /* true: the bits of y read as -1 */
        printf("three\n");
    else
        printf("four\n");

    return 0;
}