This should be a common task, but I couldn't find any information in the forum (quick search only).
In my C code I declare a timer overflow frequency using:

#define TMR0_OVF_FREQ (F_CPU / (256.0 * 256.0))

...

const uint8_t TMR0_TICKS_PER_SECOND = (uint8_t) TMR0_OVF_FREQ; // Loss of precision here
Since this constant is an integer, I want to use a preprocessor directive to check that the loss of precision is acceptable:
const uint8_t TMR0_ERROR = (uint8_t) (100.0 - 100.0 * (((uint8_t) TMR0_OVF_FREQ) / TMR0_OVF_FREQ));

#if (TMR0_ERROR > 5)
#warning "Timer0 error is larger than 5%!"
#endif
But it doesn't seem to work: it compiles, but it never generates the warning, even if I relax the condition to > 0 or initialize TMR0_ERROR to 1.