So this is probably a really basic C question, but it's been a few decades since my last C class.
I have some values used in my code that are the results of formulas. All of the inputs are known at compile time, so I'd like them to be evaluated by the preprocessor, to avoid computing them in the running code (for both size and speed reasons). For example, I have a bit of code that uses a timer in CTC mode, so I need to set the OCR2A register to the CTC value. The formula is (1/freq) / (prescaler/F_CPU) - 1. Now, in my particular case, it's easy enough to compute it by hand and just say #define CTCVAL 25. That works fine, but it's not very portable across different chips.
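To make that concrete with hypothetical numbers (say a 16 MHz clock, a /64 prescaler, and a 9615 Hz target): (1/9615) / (64/16000000) - 1 = 16000000/(64*9615) - 1 ≈ 26.0 - 1 = 25.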
I'd like to do something like
#define CTCVAL = (1/FREQ) / (PRESCALER/F_CPU) - 1
but that doesn't compile. Doing something like
uint8_t ctcval = (1/FREQ) / (PRESCALER/F_CPU) - 1;
OCR2A = ctcval;
would work (with some casts added to force floating-point division), but that means evaluating the expensive divisions at run time, and since this code runs often (I'm changing the CTC value on the fly based on another timer), that seems very inefficient.
The other solution is something like
#if F_CPU == 8000000
#define CTCVAL 25
#elif F_CPU == 16000000
#define CTCVAL 13
#endif
and so on for each clock speed,
but that's not easy to maintain.
So is there a way to do the math in the preprocessor? Ideally without ending up linking in the floating point libraries? Not just for this computation, either, but more generally.