TCC0_CNT and _delay_() don't match

#1

Hi everyone,

I am having an issue measuring the program's running time on my XMEGA128A1U. The chip is programmed to run at 32 MHz.

#include <avr/io.h>

void timerINI(void);
void init_USART(void);
void receive_data(void);
void data_calibration(void);
void SEND_TO_USART(void);

uint16_t previousCycles = 0;
float dt = 0;

int main(void)
{
	timerINI();
	init_USART();

	while (1)
	{
		uint16_t currentCycles = TCC0_CNT;               /* free-running 16-bit count */
		dt = (uint16_t)(currentCycles - previousCycles); /* unsigned math handles wrap-around */
		dt = dt * 0.032;                                 /* convert ticks into ms */
		previousCycles = currentCycles;
		receive_data();
		data_calibration();
		SEND_TO_USART();
	}
}

void timerINI(void)
{
	TCC0.CTRLA = TC_CLKSEL_DIV1024_gc;   /* timer clock = CLK_PER / 1024 */
	TCC0.CTRLB = TC_BYTEM_NORMAL_gc;     /* normal 16-bit mode */
}
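
(For reference: with CLK_PER at 32 MHz and the DIV1024 prescaler, one timer tick should be 1024 / 32 MHz = 32 µs, which is where the 0.032 ms-per-tick factor comes from.)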

The result for dt is 35 ms. However, when I add a 100 ms delay, like so:

#define F_CPU 32000000UL     /* must match the real CPU clock for _delay_ms() */
#include <avr/io.h>
#include <util/delay.h>

void timerINI(void);
void init_USART(void);
void receive_data(void);
void data_calibration(void);
void SEND_TO_USART(void);

uint16_t previousCycles = 0;
float dt = 0;

int main(void)
{
	timerINI();
	init_USART();

	while (1)
	{
		uint16_t currentCycles = TCC0_CNT;
		dt = (uint16_t)(currentCycles - previousCycles);
		dt = dt * 0.032;                 /* convert ticks into ms */
		previousCycles = currentCycles;
		receive_data();
		_delay_ms(100);                  /* the added 100 ms delay */
		data_calibration();
		SEND_TO_USART();
	}
}

void timerINI(void)
{
	TCC0.CTRLA = TC_CLKSEL_DIV1024_gc;
	TCC0.CTRLB = TC_BYTEM_NORMAL_gc;
}

The result is now 41 ms. Shouldn't it be 135 ms instead? Does anyone have an explanation for this?
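
My expectation was: the loop body alone takes 35 ms, so with the extra delay each iteration should take 35 + 100 = 135 ms, i.e. about 135 / 0.032 ≈ 4219 timer ticks per iteration instead of the roughly 41 / 0.032 ≈ 1281 I actually see.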

The USART baud rate is set to 115200, in case that could affect the timing.

Need help! Thanks a lot.

#2

I believe the timer is actually running from a 2 MHz clock while the delay function is calibrated for 32 MHz. That's why I'm not getting a correct value.
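
A quick consistency check under that hypothesis: if CLK_PER is really the 2 MHz default, one DIV1024 tick is 1024 / 2 MHz = 0.512 ms, not 0.032 ms. Assuming _delay_ms() still waits close to a real 100 ms, the delay adds about 100 / 0.512 ≈ 195 ticks, which my ×0.032 conversion reports as ≈ 6.25 ms — almost exactly the jump from 35 ms to 41 ms. The fix would be to actually switch the system clock to 32 MHz before setting up the timer. A minimal sketch, assuming the internal 32 MHz RC oscillator with no PLL (register and bit names from the stock XMEGA headers):

#include <avr/io.h>

/* Switch the XMEGA system clock from the 2 MHz default to the
   internal 32 MHz RC oscillator. */
void clock_init_32MHz(void)
{
	OSC.CTRL |= OSC_RC32MEN_bm;             /* enable the 32 MHz RC oscillator */
	while (!(OSC.STATUS & OSC_RC32MRDY_bm)) /* wait until it is stable */
		;
	CCP = CCP_IOREG_gc;                     /* unlock the protected clock register */
	CLK.CTRL = CLK_SCLKSEL_RC32M_gc;        /* select it as the system clock */
}

With that in place (and F_CPU defined as 32000000UL so _delay_ms() stays accurate), the 0.032 ms-per-tick conversion should be correct.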