The difference between the two functions:
- `RTIMER_NOW()` returns the uptime in real-time timer (rtimer) ticks, modulo the rtimer overflow value. For example, on a platform where `RTIMER_ARCH_SECOND` is 32768 and the overflow happens every 2 seconds, the value will always be between 0 and 65535, inclusive.
- `clock_time()` returns the uptime in clock ticks. Unlike rtimer ticks, these are monotonic (that is, always nondecreasing). By default there are 128 ticks per second (the constant `CLOCK_SECOND` defines this).
Your other questions:
If `clock_time_t` is defined as `uint32_t`, then values will always fit in a 32-bit integer. Otherwise, on platforms where `int` is 16 bits, values of type `clock_time_t` might not fit in an `int`.
How would that work? What would the value of "simulation time" be when the code is executing on real nodes? The answer is no: the emulated nodes in Cooja have no notion of the "simulation time".