I initialize a uint32_t variable like this:

uint32_t test = ((first_uint8 << 16) | second_uint8);
1. For a uint32_t variable, why does 0x170001 not equal 0x00170001?
It does. 0x170001, 0x0170001 and 0x00170001 are all equal.
2. If it is caused by my not zeroing test with memset, then test should not equal 0x170001 either; it should be 0x11170001 or something else with a garbage high byte, shouldn't it?
It is not caused by a missing memset. Any assignment test = X sets all 32 bits of test, regardless of what was stored there before.
3. Is it caused by the compiler ignoring the leading zeros of the hex value? I'm using the Android NDK to compile my C code.
That could explain it, but I doubt it. More likely you are being tricked by something else; perhaps you are not actually running the code you think you are. If it really turns out to make a difference whether you write 0x170001 or 0x00170001, you should report it as a compiler bug - but be really sure before doing that.
See here for a working example compiled with gcc: http://ideone.com/UR566Q