I am using an ATmega16 at 16 MHz.
The microcontroller talks over SPI to a DAC in order to produce a specific voltage waveform. For the timing I use Timer2 in CTC mode with a 1 millisecond tick.
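My Timer2 setup looks roughly like this (a sketch, not my exact code; I am assuming avr-gcc, F_CPU = 16 MHz, and a /64 prescaler, which gives 250 counts per millisecond):

```c
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t ms_ticks = 0;  /* incremented once per millisecond */

void timer2_init(void)
{
    /* CTC mode, clk/64: 16 MHz / 64 = 250 kHz, so 250 counts = 1 ms */
    TCCR2 = (1 << WGM21) | (1 << CS22);  /* on Timer2, CS22 alone selects clk/64 */
    OCR2  = 249;                         /* compare match every 250 counts = 1 ms */
    TIMSK |= (1 << OCIE2);               /* enable Timer2 compare-match interrupt */
    sei();
}

ISR(TIMER2_COMP_vect)
{
    ms_ticks++;
}
```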
I also need to toggle two pins (the enable/disable inputs of two switches) with an accuracy of 1 microsecond.
Instead of using a software delay such as
_delay_us(10); // wait for 10 us
I decided to use Timer0 of the ATmega16 with a 1 microsecond tick.
I have the interrupts set up: each ISR increments a volatile counter variable, and in the main loop I poll these counters to see whether it is time to change something according to the schedule.
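The pattern I use looks roughly like this (a sketch; PA0 and the 10 microsecond interval are placeholders for my real pins and timing, and the interrupt-disabled reads are there because 16-bit reads are not atomic on AVR):

```c
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t us_ticks = 0;  /* incremented in the Timer0 compare ISR */

ISR(TIMER0_COMP_vect)
{
    us_ticks++;  /* one tick per microsecond, in theory */
}

int main(void)
{
    DDRA |= (1 << PA0) | (1 << PA1);  /* switch-control pins as outputs */
    /* ... timer initialisation, sei() ... */

    uint16_t start;
    cli();
    start = us_ticks;               /* atomic snapshot of the 16-bit counter */
    sei();

    for (;;) {
        uint16_t now;
        cli();
        now = us_ticks;             /* read with interrupts briefly disabled */
        sei();

        if ((uint16_t)(now - start) >= 10) {  /* 10 us elapsed? */
            PORTA ^= (1 << PA0);              /* toggle the first switch */
            start = now;
        }
    }
}
```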
The problem is that the timing gets really bad, seemingly out of control, and I cannot understand why.
Thank you for your help.