Setting delay using timer: Clock speed


Hi,

 

     I was learning about the Timer/Counter peripheral in the ATmega168. I want to toggle an LED at a given interval, so I configured the timer to toggle the OC0A pin.

 

If I understand correctly, to get a delay the math would be the following:

 

My F_CPU is 1MHz: 1000000 ticks per second

I set the prescaler to 1024, so 1000000/1024 = 976.56 ticks per second, i.e., 0.9765 ticks per ms.

 

So if I wanted a delay of 200 ms between toggles, I should set OCR0A = 195 (200 * 0.9765).

Since an 8-bit timer only counts up to 255, the maximum delay I could get is 261 ms,

and with a 16-bit timer, a 67112 ms delay.

 

Is everything up to this point correct?

 

#include <avr/io.h>                        /* Defines pins, ports, etc. */

#define LED      PB0
#define LED_PORT PORTB
#define LED_DDR  DDRB

static inline void initTimer(void) {
	TCCR0A |= (1 << WGM01);                 /* CTC mode */
	TCCR0A |= (1 << COM0A0);                /* Toggle OC0A on each compare match */
	TCCR0B |= (1 << CS00) | (1 << CS02);    /* CPU clock / 1024: one tick every ~1.024 ms */
}

int main(void) {
	// -------- Inits --------- //
	initTimer();
	LED_DDR |= (1 << LED);
	OCR0A = 195;                            /* around 200 ms between toggles */

	// ------ Event loop ------ //
	while (1) {
	}                                       /* End event loop */
	return 0;                               /* This line is never reached */
}

 

How can I get a delay of 1000 ms with an 8-bit timer? Is it only possible using interrupts? In the LPC2148, I could get the desired delay without using any interrupt. Is it possible with the ATmega168?

My objective is to create a function similar to _delay_ms()

 

Thanks


Beware. Your 1000ms clock interval will give you 1000ms LED on, and 1000ms LED off, for a period of 2 seconds (0.5Hz).

 

To get longer than 255ms, count down in software. You would no longer be able to use the OCR mechanism to directly drive an LED. But, an OCR interrupt can be counted to make the interval you want.

 

The standard _delay_ms() uses a software timing loop, not a counter. As such, it blocks (that is, it totally occupies the MCU while delaying).

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

Last Edited: Tue. Nov 6, 2018 - 07:34 PM

You can also slow the CPU clock using the CLKPR register; this will slow the clock to the timer as well, giving a longer (slower) toggle.

 

Jim

 



I think a much better and more flexible approach would be to configure the timer to e.g. 1ms ticks, and then count those ticks in software.  For 1ms you could use a prescaler of 4 and an OCR0A of 249 (250 - 1).  Then either set up an interrupt and count ticks in the ISR, or sit in a polling loop watching for ticks and counting them in the loop.

Last Edited: Wed. Nov 7, 2018 - 05:12 PM

As noted, the achievable delay is not restricted by the size of the timer.

 

For something like _delay_ms, one can run the timer at full speed and be accurate to a few CPU cycles. Just keep reading the timer and doing the right thing. An overflow has occurred when the current value is less than the previous value. Don't use the overflow flag; at best it complicates the logic.

If you want to have the 168 toggle an OCR bit with single-cycle accuracy while the CPU mostly does other things, that can be done, but it's trickier.

Absent another timer, you would need to read the timer you have at least once every 256 timer ticks. I'd recommend /1024 or /256 until close, then shift to full speed and set the toggle-enable bit. There are subtleties I will leave as an exercise for the reader.

"SCSI is NOT magic. There are *fundamental technical
reasons* why it is necessary to sacrifice a young
goat to your SCSI chain now and then." -- John Woods


kk6gm wrote:

I think a much better and more flexible approach would be to configure the timer to e.g. 1ms ticks, and then count those ticks in software.  For 1ms you could use a prescaler of 4 and an OCR0A of 249 (250 - 1).

 

For F_CPU = 1MHz and prescaler at 4

Number of ticks per second = 1000000/4 = 250000

i.e., 250 ticks per ms

But you used OCR0A = 249 (250 - 1); why subtract 1?

Then either set up an interrupt and count ticks in the ISR, or sit in a polling loop watching for ticks and counting them in the loop.

Like this:

uint16_t i = 0;

while (i <= 1000)    // for 1000 ms delay
{
    OCR0A = 249;     // 1 ms
    while (OCR0A != TCNT0);
    TCNT0 = 0;       // reset counter
    i++;
}

If this is incorrect, can you show an example?

 

EDIT: There's no prescaler of 4; they start at 8.

 

ki0bk wrote:
You can also slow the cpu clock using the clkpr reg, this will also slow the clock to the timer as well giving a longer (slower) toggle.
 

I was reading that section today; when done I will give it a try.

 

skeeve wrote:
Absent another timer, you would need to read the timer you have at least once every 256 timer ticks.
 

Why 256 timer ticks?

Last Edited: Thu. Nov 8, 2018 - 06:45 PM

OK, my mistake in assuming a /4 prescaler.  Just use /8 and OCR0A of 124 (125-1).  OCR is always set to CNT-1, that's just the way synchronous counters tend to work.

 

Using your code as a starting point:

uint16_t i = 0;

// start timer here in CTC mode, generating 1ms ticks

while (i < 1000)    // for 1000 ms delay
{
    while (!(TIFR0 & (1 << OCF0A)))  // wait for OCF0A to be set
        ;
    TIFR0 = 1 << OCF0A;              // clear (by writing 1, yes!) OCF0A
    i++;
}

 

Last Edited: Thu. Nov 8, 2018 - 07:33 PM

kk6gm wrote:

OK, my mistake in assuming a /4 prescaler.  Just use /8 and OCR0A of 124 (125-1).  OCR is always set to CNT-1, that's just the way synchronous counters tend to work.

 

Using your code as a starting point:

uint16_t i = 0;

// start timer here in CTC mode, generating 1ms ticks

while(i < 1000)    //for 1000 ms delay
{
    while (!(TIFR0 & (1 << OCF0A)))  // wait for OCF0A to be set
        ;
    TIFR0 = 1 << OCF0A;              // clear (by writing 1, yes!) OCF0A
    i++;
}

 

 

Do I have to enable interrupts with sei() to use the TIFR0 register bits as you did in your code? Or is sei() required only when an ISR needs to be executed?

 

uint16_t i = 0;

while (i <= 1000)    // for 1000 ms delay
{
    OCR0A = 249;     // 1 ms
    while (OCR0A != TCNT0);
    TCNT0 = 0;       // reset counter
    i++;
}

Wouldn't this work too?


athul wrote:
In LPC2148, I could get desired delay without using any interrupt.

Then perhaps you should be using that microcontroller. Or at least explain the mechanism. Name any real microcontroller application that will sit in place and do nothing else but mark time for 1000 milliseconds. Or even >>1<< millisecond. [edit] lol -- with the ARM Cortex, you will be able to sit in one place and do nothing much faster and with more powerful 32-bit operations.

 

So if you have such a trivial application, what difference does it make?

athul wrote:
Wouldn't this work too?

Load Atmel Studio onto your Windows PC.  Use the simulator to test your hypothesis.  What do you find out?

 

You need to define "work".  Neither approach will be guaranteed to be accurate if other interrupt sources are enabled.  If the only thing the app is doing is to count cycles till one second has elapsed, then why not use the 16-bit timer?  What accuracy do you need?  What accuracy is your AVR's clock source?  Your second approach will have a bit of jitter, and some delay in getting the counter reset.   A trick question, with a fast timer clock:  If the testing loop "hits" just when the TCNT rolls over, then it may never match this go-round and will have another 256 timer counts before you have another chance.

You can put lipstick on a pig, but it is still a pig.

I've never met a pig I didn't like, as long as you have some salt and pepper.

Last Edited: Sat. Nov 24, 2018 - 02:12 PM

athul wrote:

kk6gm wrote:

OK, my mistake in assuming a /4 prescaler.  Just use /8 and OCR0A of 124 (125-1).  OCR is always set to CNT-1, that's just the way synchronous counters tend to work.

 

Using your code as a starting point:

uint16_t i = 0;

// start timer here in CTC mode, generating 1ms ticks

while(i < 1000)    //for 1000 ms delay
{
    while (!(TIFR0 & (1 << OCF0A)))  // wait for OCF0A to be set
        ;
    TIFR0 = 1 << OCF0A;              // clear (by writing 1, yes!) OCF0A
    i++;
}

 

 

Do I have to enable interrupts with sei() to use the TIFR0 register bits as you did in your code? Or is sei() required only when an ISR needs to be executed?

No, interrupts do not need to be enabled (and if you did enable them, you better have an associated ISR!)

Quote:

uint16_t i = 0;

while (i <= 1000)    // for 1000 ms delay
{
    OCR0A = 249;     // 1 ms
    while (OCR0A != TCNT0);
    TCNT0 = 0;       // reset counter
    i++;
}

Wouldn't this work too?

First, OCR0A is redundant here; why not just compare TCNT0 to a constant directly?

 

Second, the trick of setting TCNT0 back to 0 can always result in slipping a count or two, depending on how often your code gets around to checking the value of TCNT0.  The point of giving all modern timers built-in hardware to reset the counter is to eliminate this source of timing error.  So no, I would never use this approach that you have asked about.


Okay,

 

I wrote code using an interrupt that toggles a pin at a 1 second interval:

#include <avr/io.h>
#include <avr/interrupt.h>

// initialize timer, interrupt and variable
void timer1_init()
{
	// set up timer with prescaler = 64 and CTC mode
	TCCR1B |= (1 << WGM12)|(1 << CS11)|(1 << CS10);
	
	TIMSK1 |= (1 << OCIE1A);		// Output Compare A Match Interrupt Enable
	// initialize counter
	TCNT1 = 0;
	
	// initialize compare value
	OCR1A = 15625;
	sei();
}

ISR(TIMER1_COMPA_vect) {
	PORTC ^= (1 << 1);
}


int main(void)
{
	// LED on pin PC1; set all of PORTC as output
	DDRC = 0xFF;
	
	// initialize timer
	timer1_init();
	
	// loop forever
	while(1)
	{
		
	}
}

It works

 

But when I change the interrupt to trigger on Output Compare B Match, it isn't working. Why?

 

#include <avr/io.h>
#include <avr/interrupt.h>

// initialize timer, interrupt and variable
void timer1_init()
{
	// set up timer with prescaler = 64 and CTC mode
	TCCR1B |= (1 << WGM12)|(1 << CS11)|(1 << CS10);
	
	TIMSK1 |= (1 << OCIE1B);		// Output Compare B Match Interrupt Enable
	// initialize counter
	TCNT1 = 0;
	
	// initialize compare value
	OCR1B = 15625;
	sei();
}


ISR(TIMER1_COMPB_vect) {
	PORTC ^= (1 << 0);
}

int main(void)
{
	// LED on pin PC0; set all of PORTC as output
	DDRC = 0xFF;
	
	// initialize timer
	timer1_init();
	
	// loop forever
	while(1)
	{
		
	}
}

 


Since you didn't set OCR1A, it is still at its reset value of 0.  In CTC mode (WGM12), OCR1A is TOP, so your counter is just counting 0-0-0-0-0-0-0-0.......