C Question

Why does this uint32_t cast behave differently on iPhone simulator and iPhone device?

Why does the following code behave differently on the iPhone simulator and on a device? I'm running the simulator on an Intel MacBook Pro, and the device is an iPhone 5 (model MD297KS/A).


uint8_t original = 23;
uint8_t * pointerToOriginal = &original;
uint32_t * casted = (uint32_t *)pointerToOriginal;
printf("original: %u\ncasted: %u\n", original, *casted);

Output when run on the simulator:

original: 23
casted: 23

Output when run on the device:

original: 23
casted: 2755278871

I assumed the cast would pull garbage data into the resulting integer, so the device output makes sense to me. But why is the integer unaffected by the extra bytes introduced by the cast on the simulator?


First of all, your code leads to undefined behavior: reading a uint8_t object through a uint32_t * violates the strict aliasing rule, reads past the end of the object, and may also be a misaligned access. But to make things clear, I will try to explain what is going on.
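If the goal is simply to widen the value, the well-defined route is an ordinary value conversion; if you really need the byte representation, memcpy is the defined tool. Below is a minimal sketch (the variable names other than original are mine, not from the question):

#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    uint8_t original = 23;

    /* Well-defined: an ordinary value conversion widens 23 to 32 bits. */
    uint32_t widened = original;

    /* Well-defined byte reinterpretation: copy the single byte into a
       zero-initialized uint32_t. The other three bytes stay 0, and which
       byte the 23 lands in depends on endianness. */
    uint32_t reinterpreted = 0;
    memcpy(&reinterpreted, &original, sizeof original);

    printf("original: %u\nwidened: %u\nreinterpreted: %u\n",
           original, widened, reinterpreted);
    return 0;
}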

original is stored on the stack. So when you take a pointer to original, you get a pointer to an 8-bit region of stack memory (its size is known only to the compiler). Like so:

  byte 0     byte 1    byte 2    byte 3
 [  23  ]   [ junk ]  [ junk ]  [ junk ]

Let's say the stack starts at address 0, so pointerToOriginal points to the byte at address 0. The compiler knows that pointerToOriginal points to an 8-bit value (because of its type), so when dereferencing it, it reads exactly 1 byte starting at address 0. But by converting the uint8_t * to a uint32_t * you force the compiler to read 4 bytes instead of 1. So you end up reading 4 bytes, 3 of which are junk. On the simulator that memory region apparently happens to be filled with zeros, so the stack looks like this:

  byte 0     byte 1    byte 2    byte 3
 [  23  ]   [  0   ]  [  0   ]  [  0   ]

and when you dereference casted you get 23 back. But on the real device those extra bytes contain junk, so you get an arbitrary value such as 2755278871.
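To see what the 4-byte read is doing without invoking undefined behavior, you can model the memory with a union, which is a permitted way to reinterpret bytes in C. This is a sketch of the situation, not the questioner's code; the zero-filled bytes stand in for whatever the simulator happened to leave on the stack:

#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* A model of the 4 bytes the uint32_t read covers: only the first
       byte was ever meaningfully assigned. The whole buffer is
       zero-initialized here so the demonstration stays defined; on the
       device those extra bytes were whatever the stack held. */
    union {
        uint32_t as_u32;
        uint8_t  bytes[4];
    } slot = { .as_u32 = 0 };

    slot.bytes[0] = 23;   /* the byte that corresponds to original */

    printf("bytes: %u %u %u %u\n",
           slot.bytes[0], slot.bytes[1], slot.bytes[2], slot.bytes[3]);
    printf("read as uint32_t: %u\n", slot.as_u32);
    return 0;
}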

The illustration above also glosses over one more advanced topic: byte order (big endian vs. little endian). Both the x86 simulator and the iPhone's ARM CPU are little-endian, so byte 0 is the least significant byte of the 32-bit value; that is why the 4-byte read can still come out as 23 when the three extra bytes happen to be zero. On a big-endian machine, even zeroed extra bytes would give you 23 shifted into the most significant byte instead.
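A quick way to see the byte order on a given machine is to store a known 32-bit constant and inspect its bytes via memcpy. A small sketch:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    uint32_t value = 0x01020304;
    uint8_t bytes[4];

    memcpy(bytes, &value, sizeof value);  /* look at the stored byte order */

    /* Little endian (x86 simulator, the iPhone's ARM): prints 4 3 2 1
       Big endian: prints 1 2 3 4 */
    printf("%u %u %u %u\n", bytes[0], bytes[1], bytes[2], bytes[3]);
    return 0;
}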