Need for an external prescaler for atmega644p?

#1

Hi,

I have a project which needs a 16-bit counter kept in sync with an external clock signal provided by another IC. I plan to use an ATmega644P and its 16-bit Timer/Counter1, which would be clocked by the external IC via the T1 pin.

However, the hard part is that this external clock signal should be prescaled by 32 to get the right counter speed. I understand from the ATmega644P datasheet that when driving Timer/Counter1 via the T1 pin, you cannot use the prescaler. Did I understand this right?

If this is the case, can anyone recommend a solution? Are there any cheap external prescaler chips in a very small package? At least my searches did not produce any results...

Thanks in advance!
-Slintone

#2

Look for counters rather than prescalers. A 74HC590 is an 8-bit counter.

John Samperi

Ampertronics Pty. Ltd.

www.ampertronics.com.au

* Electronic Design * Custom Products * Contract Assembly

#3

If you need a prescaler chip, use a 74HC4040.

What clock speeds are you measuring? You can only feed up to 8 MHz into the T1 pin when running the AVR at 16 MHz.

Another question: why can't you just count the input pulses and process them after dividing by 32 in software?

#4

If your external clock frequency is not higher than Fosc/4 (or even Fosc/2 - consult your datasheet), you may not need a prescaler at all. Just make a software timer extender (if you really need the full 16-bit result) and use the combined (timer + extender) value shifted 5 times to the right. If 11 bits are enough just discard the 5 LSb of a timer value - no extender is necessary in this case.
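For example, a minimal avr-gcc sketch of that extender idea (untested; the name t1_ovf_count is just illustrative, and it assumes Timer1 counts the external signal on T1 with its overflow interrupt enabled):

#include <avr/io.h>
#include <avr/interrupt.h>
#include <util/atomic.h>
#include <stdint.h>

volatile uint16_t t1_ovf_count;             /* the software "extender" */

ISR(TIMER1_OVF_vect)
{
    t1_ovf_count++;                         /* one tick per 65536 input pulses */
}

/* Combined (extender:TCNT1) value shifted right by 5 = input pulses / 32.
 * Glosses over the rare race where TCNT1 overflows just before the read. */
uint32_t pulses_div32(void)
{
    uint32_t combined;
    ATOMIC_BLOCK(ATOMIC_RESTORESTATE)
    {
        combined = ((uint32_t)t1_ovf_count << 16) | TCNT1;
    }
    return combined >> 5;
}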

Warning: Grumpy Old Chuff. Reading this post may severely damage your mental health.

#5

Thanks for the input. The external clock signal is 1 MHz and my AVR MCU is running on its internal 8 MHz clock. A prescaler of 32 would give me about 2 seconds between timer overflow interrupts, which would be optimal for this application.

I have tried prototyping by driving the 1 MHz external clock signal directly into T1 without any hardware prescaling, and it works OK with a software prescaler implemented in the timer overflow interrupt. But for the final product I cannot afford to have a timer overflow interrupt ~15 times per second. There is just too much other timing-critical stuff going on (like software-implemented digital communication) that occasionally needs to turn off all interrupts for a while.

The prescaler/counter chips you both suggested look interesting, but as it happens (of course) I'm also short on PCB real estate here... :) I shall look into their datasheets and try to find the smallest possible package...

-Slintone

#6

slintone wrote:
But for the final product I cannot afford to have a timer overflow interrupt ~15 times per second. There is just too much other timing-critical stuff going on (like software-implemented digital communication) that occasionally needs to turn off all interrupts for a while.

This just clearly shows an improper implementation of your "software-implemented digital communication". If you really "cannot afford to have a timer overflow interrupt ~15 times per second", then you will have problems even with your two-second interrupt.

Warning: Grumpy Old Chuff. Reading this post may severely damage your mental health.

#7

slintone wrote:
But for the final product I cannot afford to have a timer overflow interrupt ~15 times per second.

This sounds strange.
Such a small interrupt load should not harm anything.
Maybe you should rethink your concept.

slintone wrote:
I'm also in a lack of PCB real estate here... :)

Program an ATtiny13 or ATtiny10 as a prescaler.

Peter

#8

But what do you do with the external 1 MHz signal? Do you have to measure it somehow, or why don't you just use the AVR's own clock for the 2-second timer?

You can always chain two AVR timers. Feed the 1 MHz into a timer of your choice, and feed the output of that timer into the input of another timer. Zero external components, it just uses two timers.

#9

OK, I understand that this sounds a bit strange. The digital communication is an I2C variant, and I have been disabling all interrupts while sending one command. I had not thought this to be overkill, as there are many types of interrupts happening which might mess up the I2C message if triggered in a bad place. If you think that a timer overflow interrupt is so small that it could be triggered even between I2C bits, then of course I could disable only certain more time-consuming interrupts and keep the timer interrupt active. I have verified with a prototype that if I disable _all_ interrupts during I2C messages, the timer starts to drift too badly, because the timer overflow interrupt is not always served immediately when necessary. If I could use a timer whose overflow interrupt occurs only once in 2 seconds, this problem would of course not completely disappear, but the timer error would become so small that it wouldn't be a problem in my application.

Meanwhile, I have found another possible solution. I see that Timer/Counter2 on the ATmega644P can also be clocked externally via the TOSC1 pin. Unlike the T1 pin on Timer/Counter1, this input goes through a prescaler, so it would be possible to achieve the wanted 2-second timer loop.

But the ATmega644P datasheet is a bit vague here, and I can't tell whether it's allowed to input more than 32 kHz into TOSC1 (I would need to input 1 MHz)?

-Slintone

#10

You can interrupt the I2C transfers even for years and everything will work without any problems, because I2C, SPI and similar interfaces are SYNCHRONOUS.

Warning: Grumpy Old Chuff. Reading this post may severely damage your mental health.

#11

Jepael wrote:
But what do you do with the external 1 MHz signal? Do you have to measure it somehow, or why don't you just use the AVR's own clock for the 2-second timer?

This external 1 MHz signal is my only "accurate clock", as the MCU itself is running on its internal 8 MHz oscillator, which is not accurate enough. So I have to use this 1 MHz signal to maintain the more accurate clock that is needed for a number of purposes.

It might be possible for me to clock the AVR MCU directly from the 1 MHz signal, but it sounds too risky. I cannot be sure that the 1 MHz signal is always available, and if it's not, I need to light up an error LED instead of just having the AVR shut down. Also I'm not thrilled about lowering the MCU clock speed to 1/8.

Jepael wrote:
You can always chain two AVR timers. Feed the 1 MHz into a timer of your choice, and feed the output of that timer into the input of another timer. Zero external components, it just uses two timers.

OK, this sounds interesting. Do you mean simply by physically routing an output pin to an input pin, or is this also possible purely in software?

-Slintone

#12

Why not use the hardware I2C of the AVR?

You can use T0 as a prescaler (toggle a pin on compare match at TCNT0 = 15) and feed it into the T1 input.
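Something like this rough, untested setup for the ATmega644P (the 1 MHz goes into the T0 pin, and OC0A/PB3 is wired back to the T1 pin):

#include <avr/io.h>

void prescaler_init(void)
{
    /* Timer0: CTC, toggle OC0A on compare match, clocked from the T0 pin */
    TCCR0A = (1 << COM0A0) | (1 << WGM01);
    OCR0A  = 15;                                       /* toggle every 16 pulses -> /32 square wave */
    TCCR0B = (1 << CS02) | (1 << CS01) | (1 << CS00);  /* external clock on T0, rising edge */
    DDRB  |= (1 << PB3);                               /* OC0A (PB3) must be driven out */

    /* Timer1: count the divided signal arriving on the T1 pin */
    TCCR1B = (1 << CS12) | (1 << CS11) | (1 << CS10);  /* external clock on T1, rising edge */
    TIMSK1 = (1 << TOIE1);                             /* overflow every 65536 * 32 us, about 2.1 s */
}

The only extra "component" is a short trace from PB3 to PB1.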

Peter

#13

And you can't use an 8 MHz crystal for the AVR and then generate a precise 1 MHz clock for the "other" system?

#14

danni wrote:
Why not use the hardware I2C of the AVR?

I tried, but the AVR hardware I2C is not compatible with my sensor device, which uses 16-bit commands instead of the usual 8-bit ones.

danni wrote:

You can use T0 as a prescaler (toggle a pin on compare match at TCNT0 = 15) and feed it into the T1 input.
Peter

Yes, this seems to be the route of least compromise. I'll try this with my prototype.

#15

sparrow2 wrote:
And you can't use an 8 MHz crystal for the AVR and then generate a precise 1 MHz clock for the "other" system?

I could, but I'm trying to avoid the cost and space requirements of an extra crystal if I can. The PCB has to fit inside a 30 mm x 80 mm enclosure, so every square millimeter is valuable.

-Slintone

#16

slintone wrote:

I tried, but the AVR hardware I2C is not compatible with my sensor device, which uses 16-bit commands instead of the usual 8-bit ones.

Which sensor would this be? Can you post a link to the datasheet? Those kinds of issues are *very* rare, but not unheard of. I once ran into an I2C device which required a 5-bit internal block address after the 8-bit device address, or something like that. Edit: I just wanted to make sure that it really wants 16 bits of data before the ACK and that you cannot send it as two 8-bit transfers, each with an ACK.

But if you have a software I2C master implementation, what difference does it make whether there are 15 interrupts per second or one interrupt per two seconds? I use software I2C every day, and I have serial port data coming in which is much more important than the I2C comms, and both things still work. You just need to wrap your head around it to make it work.

I understand that not missing the 1 MHz ticks is a must - I would not want a wall clock to drift either. But counting the overflows in an interrupt and examining the results in the main code should be easy. What do you do in the interrupt? You don't even need interrupts if you make sure you poll the timer in the main loop more often than every 65 milliseconds (or, if you use the two-timer linking trick, the overflow time is an eternity, so to speak...).

Edit2: I also don't recommend using TOSC for anything other than a 32768 Hz crystal.

#17

MBedder wrote:
You can interrupt the I2C transfers even for years and everything will work without any problems, because I2C, SPI and similar interfaces are SYNCHRONOUS.

Ah, what a major brain fart on my side! I'm so used to playing with simple clockless serial protocols that the interrupt disabling was pure reflex.

So, thanks to your comments, I now see that as far as the I2C communication is concerned, this "problem" should be solvable entirely in software. But I need to investigate the other modules of the software, because there are other places where I have disabled interrupts as well (and probably for better reasons).

-Slintone

#18

Jepael wrote:

Which sensor would this be? Can you post a link to the datasheet? Those kinds of issues are *very* rare, but not unheard of.

I'm sorry, I cannot post the datasheet as it is not public. But the sensor IC wants to send 16 bits as one message with just one ACK. My understanding of the Atmel hardware I2C is that it wants to send an ACK for every received 8 bits (the standard I2C way).

But as said in the post above, I just figured out that allowing all interrupts during software I2C communication is not a problem. :)

Jepael wrote:

I understand that not missing the 1 MHz ticks is a must - I would not want a wall clock to drift either. But counting the overflows in an interrupt and examining the results in the main code should be easy. What do you do in the interrupt? You don't even need interrupts if you make sure you poll the timer in the main loop more often than every 65 milliseconds (or, if you use the two-timer linking trick, the overflow time is an eternity, so to speak...).

The remaining problem is that my software was not originally designed to react quickly to interrupt requests. Even after enabling interrupts during I2C messages, there is still plenty of other stuff going on where I have disabled ALL interrupts when I actually only need to disable some of them. So, there is just some code cleanup to do in order to make this work.

-Slintone

#19

By the sound of it, you need to sit down and have a serious look at your program design.

List the tasks / peripherals that need service, and specify an acceptable latency.

The whole point of hardware peripherals is to start them off, and they will IRQ when they have finished their job.

The actual service routines normally just fill / empty buffers and / or set flags.

As long as you always service one IRQ before the next one is due, your system is fully operational. So your 1 MHz input signal will take 256 µs to generate an 8-bit timer overflow IRQ, or even longer if you have a prescaler or use a 16-bit timer.

You disable a particular peripheral's interrupt enable briefly while you access any non-atomic variables. This should not affect the latency of other peripherals.

David.

#20

slintone wrote:

I'm sorry, I cannot post the datasheet as it is not public. But the sensor IC wants to send 16 bits as one message with just one ACK. My understanding of the Atmel hardware I2C is that it wants to send an ACK for every received 8 bits (the standard I2C way).

Yes, fair enough - now I am sure that you know what you are doing and that this is the only way.

I think David is right, you should reconsider the way your software works. But it is hard to give specific details beyond what has already been said: do the minimum that is absolutely necessary in interrupts, and let the main program handle the rest. And when the main program reads or writes variables that are shared with interrupts, disable interrupts (all of them, or one specific one) for the minimum amount of time (for example, read the shared variable into a temporary variable).

There are situations that need an exception to that, but it is hard to describe examples (like some specific timing needs) without knowing whether you need them.

Btw, lucky you, because you still have hardware and software resources available. One thing I am struggling with right now is determining the frequency of two signals, one of them near 16 or 32 kHz, armed with only a GPIO pin capable of an external interrupt - no timer there. Board space is not critical, so I could afford to prescale the signal with an external chip.

#21

Jepael wrote:
And when the main program reads or writes variables that are shared with interrupts, disable interrupts (all of them, or one specific one) for the minimum amount of time (for example, read the shared variable into a temporary variable).
A good way to do this is using the ATOMIC_BLOCK macro (assuming you are using avr-gcc (WinAVR) as your compiler):
#include <stdint.h>
#include <util/atomic.h>
. . .
volatile uint16_t isr_variable;
. . .
int main( void )
{
    uint16_t tmp;
    . . .
    for ( ;; )
    {
        . . .
        ATOMIC_BLOCK( ATOMIC_RESTORESTATE )
        {
            tmp = isr_variable;
        }
        . . . do stuff with tmp instead of isr_variable
    }
}

The above code grabs the value of isr_variable safely, then works on it knowing the value won't change while you're processing it.

Stu

Engineering seems to boil down to: Cheap. Fast. Good. Choose two. Sometimes choose only one.

Newbie? Be sure to read the thread Newbie? Start here!

#22

I will just add one thing:
You could make a routine that regularly (when you know you have time) measures the 1 MHz clock against your own clock speed; that can be done crystal-accurately. Over an hour that ratio will not change, as long as temperature, voltage etc. are roughly stable.
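For example, a rough sketch of taking one such reading (it assumes Timer1 is already counting the 1 MHz signal on T1, and should be called with interrupts off, away from a Timer1 overflow):

#include <avr/io.h>
#include <util/delay_basic.h>
#include <stdint.h>

/* Counts how many external 1 MHz pulses arrive during 10000 CPU cycles;
 * at a true 8 MHz core clock the result would be about 1250. */
uint16_t measure_clock_ratio(void)
{
    uint16_t start = TCNT1;
    _delay_loop_2(2500);        /* 4 cycles per iteration = 10000 CPU cycles */
    return TCNT1 - start;       /* uint16_t subtraction handles wrap-around */
}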

Jens

#23

flip flop divider? /2 /4 /8 /16 /32

#24

santacruzbob wrote:
flip flop divider? /2 /4 /8 /16 /32

I already suggested the 74HC4040 binary ripple counter, which has 14 flip-flops - no space for it. Rerouting the AVR timers is the smallest option.

#25

Thank you so much guys, this conversation has been a great lesson for me. I'm now in the process of refactoring the code to avoid unnecessary atomic blocks (i.e. disabled interrupts). If everything goes well, I won't be needing any new hardware.

But there are still some challenges. For example, one sensor requires me to transfer hundreds of bytes of data from it immediately when it pulls a pin down and causes a pin change interrupt (PCINT). I have not found any other reliable solution besides doing the transfer inside the PCINT interrupt handler. The PCINT interrupt triggers too often, and since the Timer1 overflow interrupt (my software prescaler) cannot run while another interrupt is being serviced, it skews my clock too much.

I tried implementing it so that the interrupt just sets a flag and the main program does the transfer when it sees the flag. But it was not fast enough, as the main program has quite a lot of calculation to do and cannot poll the flag often enough.

The scary idea of nesting interrupts has crossed my mind. On another project I have tried enabling interrupts while inside an interrupt handler, and it worked, but I guess it is too much of a hazard for program flow predictability and stability...

#26

Okay, so the PCINT interrupt can last as long as it needs, but the timer overflow still has to be handled while the PCINT interrupt is busy doing the transfer.

#27

Exactly. I experimentally tried enabling interrupt nesting, and it seems to work perfectly. I have tried to figure out the possible problem scenarios which might occur because of the nesting, but I'm actually unable to come up with anything.

I came up with a list of things to do in order to use nested interrupts "safely":

1) Manually prevent an interrupt from re-triggering while one instance of it is being processed. (Maybe not needed if this happens automatically?)

2) Make sure that there are no shared resources between any of the interrupt handler routines.

3) Make sure that the MCU's memory resources can cope (i.e. there is enough free SRAM) if all interrupts happen to nest at the same time.

Does this make any sense? Can you think of more rules? As far as I can tell my design is safe by these rules, and the system is actually working perfectly as we speak... :)

As I can't get rid of the massive PCINT interrupt handler, this is the only software-only solution I can think of after investigating things for quite a while.

-Slintone

#28

1) That's the real issue with sei() in interrupts - it's a question of re-entrancy. You either have to be sure that the same interrupt won't occur while it's being handled or, as you say, do something to prevent it - that is, turn off its ??IE bit. If you don't do this, then when the event occurs the ??IF flag will be set, the CPU will finish executing the current opcode of the ISR, and then the ISR will be re-entered via the vector table jump.
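A rough sketch of that pattern for the PCINT case on the ATmega644P (do_sensor_transfer() is just a placeholder name for the long transfer, not a real function):

#include <avr/io.h>
#include <avr/interrupt.h>

extern void do_sensor_transfer(void);   /* placeholder for the long, hundreds-of-bytes job */

ISR(PCINT0_vect)
{
    PCICR &= ~(1 << PCIE0);     /* block re-entry of this same interrupt */
    sei();                      /* let the Timer1 overflow ISR nest in */

    do_sensor_transfer();       /* timer ticks keep getting serviced meanwhile */

    cli();                      /* re-arm atomically before returning */
    PCIFR  = (1 << PCIF0);      /* design choice: discard re-triggers that arrived while busy */
    PCICR |= (1 << PCIE0);
}                               /* RETI re-enables global interrupts on exit */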