Setting delay using timer: Clock speed


Hi,

 

I am learning about the Timer/Counter peripheral on the ATmega168. I want to toggle an LED at a given interval, so I configured the timer to toggle the OC0A pin.

 

If I understand correctly, the math to get a delay would be as follows:

 

My F_CPU is 1 MHz: 1,000,000 ticks per second.

I set the prescaler to 1024, so 1000000/1024 = 976.56 ticks per second, i.e., 0.9765 ticks per ms.

 

So if I wanted a delay of 200 ms between toggles, I should set OCR0A = 195 (200 × 0.9765 ≈ 195).

Since an 8-bit timer only counts up to 255, the maximum delay I could get is about 261 ms,

and with a 16-bit timer, about 67,112 ms.
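
Putting that math in code form (just the arithmetic above, assuming F_CPU = 1 MHz and the /1024 prescaler; the macro name is mine):

#define F_CPU      1000000UL   /* 1 MHz */
#define PRESCALER  1024UL

/* Timer ticks needed for a delay in milliseconds (integer math). */
#define MS_TO_TICKS(ms)  (((ms) * (F_CPU / PRESCALER)) / 1000UL)

/* MS_TO_TICKS(200) = (200 * 976) / 1000 = 195, the OCR0A value below. */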

 

Is everything up to this point correct?

 

#include <avr/io.h>                        /* Defines pins, ports, etc. */

#define LED      PB0
#define LED_PORT PORTB
#define LED_DDR  DDRB

static inline void initTimer(void) {
	TCCR0A |= (1 << WGM01);                /* CTC mode */
	TCCR0A |= (1 << COM0A0);               /* Toggle OC0A on each compare match */
	TCCR0B |= (1 << CS00) | (1 << CS02);   /* CPU clock / 1024: ~1.024 ms per tick */
}

int main(void) {
	// -------- Inits --------- //
	initTimer();
	LED_DDR |= (1 << LED);
	OCR0A = 195;                           /* ~200 ms between toggles */

	// ------ Event loop ------ //
	while (1) {
	}                                      /* End event loop */
	return 0;                              /* This line is never reached */
}

 

How can I get a delay of 1000 ms with an 8-bit timer? Is it only possible using interrupts? On the LPC2148 I could get the desired delay without using any interrupt. Is the same possible with the ATmega168?

My objective is to create a function similar to _delay_ms().

 

Thanks


Beware. Your 1000ms clock interval will give you 1000ms LED on, and 1000ms LED off, for a period of 2 seconds (0.5Hz).

 

To get longer than 255 ms, count down in software. You would no longer be able to use the OCR mechanism to drive an LED directly, but an OCR interrupt can be counted to build up the interval you want.
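
A minimal sketch of that idea (assuming Timer0 is already configured in CTC mode with a 1 ms period and the compare interrupt enabled; names are mine, untested):

#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/atomic.h>

volatile uint16_t ms_remaining;          /* counted down by the ISR */

ISR(TIMER0_COMPA_vect) {                 /* one interrupt per 1 ms period */
    if (ms_remaining)
        ms_remaining--;
}

/* Block for 'ms' milliseconds.  Assumes OCIE0A is set in TIMSK0
 * and sei() has been called. */
static void timer_delay_ms(uint16_t ms) {
    ATOMIC_BLOCK(ATOMIC_RESTORESTATE) {  /* 16-bit write is not atomic */
        ms_remaining = ms;
    }
    while (ms_remaining)                 /* ISR decrements in the background */
        ;
}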

 

The standard _delay_ms() uses a software timing loop, not a counter. As such, it blocks (that is, it totally occupies the MCU while delaying).
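
Illustrative only; this is not avr-libc's actual implementation, just the shape of a software timing loop:

/* A crude calibrated busy loop: burns roughly 'ms' milliseconds of
 * CPU time.  The real _delay_ms() is exact cycle-counted code, but
 * the idea is the same: the MCU does nothing else while it runs. */
static void crude_delay_ms(uint16_t ms) {
    while (ms--) {
        for (volatile uint16_t n = 0; n < 100; n++)
            ;   /* tune the count for ~1 ms at the target clock */
    }
}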

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

Last Edited: Tue. Nov 6, 2018 - 07:34 PM

You can also slow the CPU clock using the CLKPR register; this slows the clock to the timer as well, giving a longer (slower) toggle period.
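
Something like this for the timed CLKPR sequence (a sketch following the datasheet procedure; the function name is mine):

#include <avr/io.h>
#include <avr/interrupt.h>

/* Drop the CPU clock by /8 (timed sequence: the new value must be
 * written within 4 cycles of setting CLKPCE). */
void slow_clock_div8(void) {
    cli();                                   /* interrupts would break the timing */
    CLKPR = (1 << CLKPCE);                   /* unlock prescaler change */
    CLKPR = (1 << CLKPS1) | (1 << CLKPS0);   /* CLKPS = 0b0011 -> divide by 8 */
    sei();
}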

 

Jim

 



I think a much better and more flexible approach would be to configure the timer to e.g. 1ms ticks, and then count those ticks in software.  For 1ms you could use a prescaler of 4 and an OCR0A of 249 (250 - 1).  Then either set up an interrupt and count ticks in the ISR, or sit in a polling loop watching for ticks and counting them in the loop.

Last Edited: Wed. Nov 7, 2018 - 05:12 PM

As noted, the achievable delay is not restricted by the size of the timer.

For something like _delay_ms, one can run the timer at full speed and be accurate to a few CPU cycles. Just keep reading the timer and doing the right thing. An overflow has occurred when the current value is less than the previous value. Don't use the overflow flag; at best it complicates the logic.
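
For example, a sketch along these lines (assuming F_CPU = 1 MHz and Timer0 running at full speed with no prescaler; the name delay_ms_poll is made up):

#include <avr/io.h>
#include <stdint.h>

/* Busy-wait 'ms' milliseconds by reading TCNT0 at full speed and
 * accumulating elapsed ticks.  The modulo-256 subtraction handles
 * the wrap (equivalent to "current < previous means overflow"). */
static void delay_ms_poll(uint16_t ms) {
    uint32_t ticks = (uint32_t)ms * 1000UL;       /* 1000 ticks/ms at 1 MHz */
    uint8_t  prev  = TCNT0;

    while (ticks) {
        uint8_t now     = TCNT0;
        uint8_t elapsed = (uint8_t)(now - prev);  /* wraps correctly in 8 bits */
        prev = now;
        if (elapsed >= ticks)
            break;
        ticks -= elapsed;
    }
}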

 

If you want to have the 168 toggle an OCR bit with single-cycle accuracy while the CPU mostly does other things, that can be done, but it's trickier.

Absent another timer, you would need to read the timer you have at least once every 256 timer ticks. I'd recommend /1024 or /256 until close, then shift to full speed and set the toggle-enable bit. There are subtleties I will leave as an exercise for the reader.

"For every Christmas tree lit before Thanksgiving,
an elf drowns a baby reindeer"


kk6gm wrote:

I think a much better and more flexible approach would be to configure the timer to e.g. 1ms ticks, and then count those ticks in software.  For 1ms you could use a prescaler of 4 and an OCR0A of 249 (250 - 1).

 

For F_CPU = 1 MHz and a prescaler of 4:

Number of ticks per second = 1000000/4 = 250000, i.e., 250 ticks per ms.

But you used OCR0A = 249 (250 - 1). Why subtract 1?

kk6gm wrote:
Then either set up an interrupt and count ticks in the ISR, or sit in a polling loop watching for ticks and counting them in the loop.

Like this?

uint16_t i = 0;

while (i <= 1000)          // for 1000 ms delay
{
    OCR0A = 249;           // 1 ms
    while (OCR0A != TCNT0);
    TCNT0 = 0;             // reset counter
}

If this is incorrect, can you show an example?

 

EDIT: There's no prescaler of 4; after /1, the options start at /8.

 

ki0bk wrote:
You can also slow the cpu clock using the clkpr reg, this will also slow the clock to the timer as well giving a longer (slower) toggle.
 

I was reading that section today; when I'm done, I will give it a try.

 

skeeve wrote:
Absent another timer, you would need to read the timer you have at least once every 256 timer ticks.
 

Why 256 timer ticks?

Last Edited: Thu. Nov 8, 2018 - 06:45 PM

OK, my mistake in assuming a /4 prescaler. Just use /8 and an OCR0A of 124 (125 - 1). OCR is always set to (count - 1); that's just the way synchronous counters work: in CTC mode the timer counts from 0 up to OCR0A inclusive, so getting 125 ticks per period means OCR0A = 124.

 

Using your code as a starting point:

uint16_t i = 0;

// start timer here in CTC mode, generating 1 ms ticks

while (i < 1000)                     // for 1000 ms delay
{
    while (!(TIFR0 & (1 << OCF0A)))  // wait for OCF0A to be set
        ;
    TIFR0 = 1 << OCF0A;              // clear OCF0A by writing a 1 (yes!)
    i++;
}
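
And a sketch of the timer setup that loop assumes (F_CPU = 1 MHz, /8 prescaler, 1 ms CTC periods; the function name is mine):

#include <avr/io.h>

/* Timer0 in CTC mode for 1 ms ticks at F_CPU = 1 MHz. */
static inline void initTimer1ms(void) {
    TCCR0A = (1 << WGM01);   /* CTC mode, TOP = OCR0A */
    OCR0A  = 124;            /* 125 ticks x 8 us = 1 ms */
    TCCR0B = (1 << CS01);    /* prescaler /8 -> 125 kHz, 8 us per tick */
}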

 

Last Edited: Thu. Nov 8, 2018 - 07:33 PM