Why does the following code behave differently on the iPhone simulator and on a device? I'm running the simulator on an Intel MacBook Pro, and the device is an iPhone 5 (model MD297KS/A).
uint8_t original = 23;
uint8_t * pointerToOriginal = &original;
uint32_t * casted = (uint32_t *)pointerToOriginal;
printf("original: %u\ncasted: %u\n", original, *casted);
First of all, your code leads to undefined behavior. But to make things clear, I will try to explain what is going on.
original is stored on the stack. So when you take a pointer to
original, you get a pointer to an 8-bit region of stack memory (this size information is available only to the compiler). Like so:
byte 0     byte 1     byte 2     byte 3
[00010111] [????????] [????????] [????????]
Let's say the stack starts at address 0.
pointerToOriginal will point to the byte at address 0. The compiler knows that
pointerToOriginal points to an 8-bit value (because of its type), so when dereferencing it, it reads exactly 1 byte starting from address 0.
But by casting to
uint32_t * you force the compiler to read 4 bytes instead of 1. So you end up reading 4 bytes, 3 of which are junk.
On the simulator, it looks like that memory region happens to be zero-filled, so the stack looks like this:
byte 0     byte 1     byte 2     byte 3
[00010111] [00000000] [00000000] [00000000]
and when you dereference
casted you get 23 back. But on the real device the neighboring bytes contain junk, so the result is garbage.
The illustration above also ignores a more advanced topic: byte order (big vs. little endian).