Hi All,
I've inherited some code that uses _delay_ms() and _delay_us() calls, but the author chose floating-point constants as arguments rather than integer constants. I've read that this can bloat the code, but does it also adversely affect the run time of the delays?
Usually I write _delay_us(10) for a 10 us delay and these work fine (verified with a counter/timer), but what would you expect from _delay_us(10.0)? And does the same apply to the _delay_ms() calls, to the same degree?
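For reference, here's a minimal sketch of the two call styles I'm comparing (the F_CPU value and the loop are just placeholders, not from the actual project):

/* F_CPU of 8 MHz is only an example value; define it before util/delay.h */
#define F_CPU 8000000UL
#include <util/delay.h>

int main(void)
{
    for (;;)
    {
        _delay_us(10);    /* my usual style: integer constant argument */
        _delay_us(10.0);  /* inherited style: floating-point constant */
        _delay_ms(5.0);   /* same question for the millisecond variant */
    }
}

Both versions compile cleanly with avr-gcc, so I'm asking about any difference in the generated delay timing, not about getting it to build.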
Thanks,
Mark.