I created a program that toggles an LED every second. To achieve high accuracy, I used Timer2 in CTC mode, set up like this:
16 MHz crystal (the MCU's main crystal oscillator), prescaler = 256, OCR2A = 249. On every compare match, a variable timer2_cmc increases by 1, and the MCU sleeps between timer ticks.
When timer2_cmc is >= 250, the MCU toggles the LED and resets the counter.
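The intended arithmetic: 16 000 000 Hz / 256 = 62 500 timer ticks per second; with OCR2A = 249, a compare match fires every 250 ticks, i.e. 250 interrupts per second (one every 4 ms); counting 250 interrupts therefore gives exactly 1 s.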
When I simulate the program, the Stop Watch increases by exactly 1 000 000 µs each time the simulator hits the LED-toggle breakpoint. So in theory, my code should be very accurate.
But when I compare the result on real hardware against the time.is website, after several minutes (~20) it becomes noticeable that the LED toggles are not fully synchronous with time.is. It looks like the LED toggle period is slightly less than 1 second.
What is the problem? Is it caused by crystal oscillator tolerance, or did I make a mistake in my code? Please don't suggest other crystal frequencies as a solution.
Here are some parts of my code:
```asm
    ...                              ; Stack configuration

    ; Initialize Timer 2
    ldi  buffer,(1<<wgm21)           ; CTC mode
    sts  tccr2a,buffer
    ldi  buffer,(1<<cs21)|(1<<cs22)  ; Prescaler = 256
    sts  tccr2b,buffer
    ldi  buffer,249                  ; Compare match every 250 timer ticks
    sts  ocr2a,buffer
    ldi  buffer,(1<<ocie2a)          ; Enable compare match A interrupt
    sts  timsk2,buffer
    sei

    ...                              ; Set PORTB as output, set sleep mode

loop:
    com  status                      ; Invert state
    out  portb,status                ; Here is the LED-toggle breakpoint
wait:
    in   buffer,mcucr
    ori  buffer,(1<<bods)|(1<<bodse) ; Disable Brown-Out Detection while sleeping
    out  mcucr,buffer
    andi buffer,~(1<<bodse)          ; Clear BODSE, keep BODS set
    out  mcucr,buffer
    sleep
    cpi  timer2_cmc,250              ; Carry is cleared if timer2_cmc >= 250
    brcs wait                        ; Go back to wait if carry is set
    clr  timer2_cmc                  ; Reset tick counter
    rjmp loop

timer2_cm:                           ; Timer2 compare match A ISR
    push buffer
    in   buffer,sreg                 ; Save SREG
    push buffer
    inc  timer2_cmc                  ; One tick per compare match (every 4 ms)
    pop  buffer
    out  sreg,buffer                 ; Restore SREG
    pop  buffer
    reti
```
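For clarity, here is a minimal C sketch of the same logic. The MCU isn't named above, so the register names assume an ATmega328P with avr-libc; the sleep mode (IDLE, so Timer2 keeps its clock) and the PORTB direction setup are assumptions, since those parts are elided in my assembly.

```c
#include <avr/io.h>
#include <avr/interrupt.h>
#include <avr/sleep.h>

volatile uint8_t timer2_cmc = 0;

/* Timer2 compare match A: one tick every 250 timer counts (4 ms) */
ISR(TIMER2_COMPA_vect)
{
    timer2_cmc++;
}

int main(void)
{
    DDRB = 0xFF;                        /* assumption: all of PORTB as output */

    TCCR2A = (1 << WGM21);              /* CTC mode */
    TCCR2B = (1 << CS22) | (1 << CS21); /* prescaler = 256 */
    OCR2A  = 249;                       /* compare match every 250 ticks */
    TIMSK2 = (1 << OCIE2A);             /* enable compare match A interrupt */
    sei();

    set_sleep_mode(SLEEP_MODE_IDLE);    /* assumption: idle keeps Timer2 clocked */

    for (;;) {
        PORTB ^= 0xFF;                  /* toggle LED */
        do {
            sleep_enable();
            sleep_bod_disable();        /* BODS/BODSE timed sequence */
            sleep_cpu();
            sleep_disable();
        } while (timer2_cmc < 250);
        timer2_cmc = 0;                 /* reset tick counter */
    }
}
```

Reading the 8-bit timer2_cmc outside the ISR is a single-byte access and therefore atomic on AVR, so the check-and-clear works the same way as in the assembly version.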