I need an idea


Hello, I need to sample (and convert) my analog signal every 5 ms (5.0 ms).
Here is the relevant part of my code, but I don't know if this is the best way:

#include <avr/io.h>
#include <avr/interrupt.h>
#include <avr/sleep.h>
#include <util/delay.h>        // <avr/delay.h> in older avr-libc versions

volatile uint16_t r[1];        // last conversion result

SIGNAL(SIG_ADC)                // ISR(ADC_vect) in newer avr-libc
{
   r[0] = ADC;
}

int main(void)
{
  double t = 0;
  f_init();                    // my own init routine (not shown)

  while (1)
  {
    // ... some code here

    TCCR1B = 0;                // stop Timer/Counter1
    t = 0.000125 * TCNT1;      // t is in ms (125 ns per tick at 8 MHz)
    _delay_ms(5 - t);          // pad the cycle out to 5 ms
    TCNT1 = 0;
    TCCR1B = (1 << CS10);      // start T/C1, no prescaler
    sleep_mode();              // idle mode, which starts the ADC
  }
  return 0;
}

I use the 16-bit Timer/Counter with no prescaler to catch the time needed for the sample and conversions, then I just subtract this time from 5 ms, restart T/C1, and go back into sleep mode.
The bad side is that I can't use T/C1 for any other purpose.
I would like to know: is there a better way to solve this?

Edit: conversation->conversions, english trouble :D

Last Edited: Fri. Sep 23, 2005 - 01:32 PM

Set up the timer to generate an interrupt. The interrupt sets a flag. In the main loop, check for the flag. That way the timer can still be multipurpose: one timer can be used for timing various things like ADC sampling, checking push buttons, updating displays, updating PWM output, etc.
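Ralph's flag idiom can be sketched as a minimal host-side model (the names tick_5ms and main_loop_step are invented for illustration; on a real AVR the first function would be the body of a timer-compare ISR, and the flag must be volatile so the compiler re-reads it in the loop):

```c
#include <stdint.h>

/* Host-side model of the "ISR sets a flag, main loop polls it" idiom.
   On an AVR the handler would be declared with ISR(TIMER1_COMPA_vect);
   here it is a plain function so the logic can be exercised directly. */
volatile uint8_t tick_5ms = 0;

/* Called by the 5 ms timer interrupt. */
void timer_isr(void)
{
    tick_5ms = 1;               /* just flag it; no work in the ISR */
}

/* One pass of the main loop: returns 1 if a sample was started. */
int main_loop_step(void)
{
    if (!tick_5ms)
        return 0;               /* nothing to do yet */
    tick_5ms = 0;               /* acknowledge the tick */
    /* start the ADC conversion here, run other 5 ms jobs, etc. */
    return 1;
}
```

The point of the split is that the ISR stays a few cycles long, while all the real work happens at main-loop priority.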

Ralph Hilton


A really impressive example of very bad programming style:

- avoid floats inside interrupts !

- avoid delay loops inside interrupts !

- avoid subroutine calling inside interrupts !

Simply use the timer compare interrupt, add the delay value (an integer!) to the compare register, and start the conversion.

Since the timer register is only a 16-bit integer, it makes absolutely no sense to calculate the time as a float.
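A minimal sketch of this compare-add idea, with a plain variable standing in for the OCR1A register; the 40000-tick period assumes the 8 MHz, no-prescaler clock implied by the 0.000125 ms-per-tick constant in the original code:

```c
#include <stdint.h>

/* Host-side model of "add the period to the compare register".
   40000 ticks = 5 ms at 8 MHz with no prescaler; on a real AVR the
   variable below would be the hardware register OCR1A. */
#define TICKS_5MS 40000u

uint16_t ocr1a_model = 0;

/* Compare-match handler: schedule the next tick and start the ADC. */
void compare_isr(void)
{
    ocr1a_model += TICKS_5MS;   /* wraps mod 65536, just as TCNT1 does */
    /* set ADSC here to start the conversion */
}
```

Because the addition wraps modulo 65536 exactly as the free-running counter does, the 5 ms spacing stays exact with no stopping, clearing, or restarting of the timer.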

Peter


Quote:
The bad side is that I can't use T/C1 for another purpose.

If your Timer1 range is either a divisor or a multiple of 5 ms, piggybacking
is fairly easy. (Divisor:) keep a post-scaler variable which counts overflows;
when it counts up to 5 ms, start the ADC. (Multiple:) set one of the OCR1x
registers for 5 ms; in the OC interrupt, start the ADC and add 5 ms to OCR1x
(modulo the Timer1 range). In both cases, the SIG_ADC will capture the completion.

In some newer devices (look for the ADTSn bits) you can start the ADC
automatically on either an overflow or an OC event.
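A sketch of the "divisor" (post-scaler) case, assuming purely for illustration a timer that overflows every 1 ms, so every fifth overflow marks a 5 ms boundary:

```c
#include <stdint.h>

/* Host-side model of the post-scaler variant: if Timer1 already
   overflows faster than 5 ms, count overflows and act on every Nth one.
   The value 5 assumes a 1 ms overflow period (an example only);
   pick N so that N * overflow_period = 5 ms. */
#define OVERFLOWS_PER_5MS 5

uint8_t postscale = 0;

/* Overflow handler: returns 1 on the ticks where the ADC should start. */
int overflow_isr(void)
{
    if (++postscale < OVERFLOWS_PER_5MS)
        return 0;               /* not yet: timer keeps its other jobs */
    postscale = 0;
    return 1;                   /* 5 ms boundary: start the ADC */
}
```

On the other overflows the timer remains free for whatever else it was doing, which is the whole point of piggybacking.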


danni wrote:
A really impressive example of very bad programming style:

- avoid floats inside interrupts !

- avoid delay loops inside interrupts !

- avoid subroutine calling inside interrupts !


Wait a second, what are you talking about? This is not inside interrupts; I only take the value of the ADC in the ISR. What's wrong with that?
danni wrote:

Since the timer register is only 16 bit integer it make absolutely no sense to calculate the time as float.
Peter

What do you mean, no sense? Obviously you don't understand my code.

@mckenney
I think you didn't understand me: my code works nicely, and I just want to know if there is any other solution that doesn't use a timer/counter.
I use T/C1 for catching the time between starting and finishing the ADC, because every conversion takes a different time; afterwards I subtract this from 5 ms, wait

_delay_ms(5-t); 

and start the ADC again.


The notion behind "piggybacking" is that you can consider the cost to be
a fractional timer, since you can get other uses out of the same timer. (I
find an N-millisecond clock tick to be useful for all sorts of things.)

Quote:
without using timer/counter

If your goal is to do timekeeping (a 5 ms tick) using exactly 0 timers, I
think it will be trickier. You might be able to do something with the fact
that an ADC conversion takes a fixed length of time (13 ADC clocks),
computable ahead of time. To this you would have to add the cost of the
ISR and any other work you're doing in the main() loop to get your "t"
value. These costs may vary by optimization level, and will certainly
need to be re-computed every time you change anything.

Another possibility might be to use Free-Running mode with a Really
Big (ADC) prescale value so as to force its 13 clocks to equal 5 ms;
the problem with this is that it also stretches the Sample/Hold period
(>500 us), which may not agree with the device you're sampling.
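The arithmetic behind that caveat, assuming the classic-AVR figure of 1.5 ADC clocks for the sample/hold window:

```c
/* Forcing one 13-clock conversion to span 5 ms fixes the ADC clock,
   and the sample/hold window (1.5 ADC clocks on classic AVRs) then
   stretches well past 500 us. */
double adc_clock_hz(void)
{
    return 13.0 / 0.005;                 /* 13 clocks per 5 ms = 2600 Hz */
}

double sample_hold_us(void)
{
    return 1.5 / adc_clock_hz() * 1e6;   /* about 577 us */
}
```

Note too that on the classic parts the ADC prescaler only goes up to /128, so from an 8 MHz system clock the slowest ADC clock is 62.5 kHz; actually reaching 2.6 kHz would mean slowing the CPU clock itself.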


mckenney, the problem isn't how to delay 5 ms; I can use the avr-libc function _delay_ms(double __ms), which performs a delay of __ms milliseconds:

_delay_ms(5);

The problem is how to catch the time between starting and finishing the ADC conversion, plus the time spent in some of my own functions. Like you say, an ADC conversion takes a fixed length of time (13 ADC clocks); let's call this time Tadc, and the time needed to execute my functions Tmy. When the ADC is finished, and before starting a new conversion, I must wait Twait = 5 ms - Tadc - Tmy. I was thinking that Twait would be constant, but it's not: every cycle I get a different Twait, and this is my problem. This is why I use the 16-bit Timer/Counter to catch the time Tadc + Tmy; I was wondering whether there is another solution for catching this time.


As has been mentioned before, set up the timer to interrupt every 5 ms. When it interrupts, set a flag to start your ADC. When that is done, do your calculations and wait for the next timer interrupt. The samples will ALWAYS be 5 ms apart, with no worry about how long the ADC and your code take (unless > 5 ms).

Randy


Hi Randy, I get it, thanks all for the help.
And perhaps danni (Peter) can explain why my code is:
"A really impressive example of very bad programming style"


One other thing you can do, if you'd like to not use any timers at all, is to set your ADC free-running at a known sampling rate. In the ADC interrupt handler, keep a little static counter: toss (or accumulate, or what have you) the ADC values until the counter hits the 5 ms mark, then store the result and raise a flag.

The downside, of course, is that your ADC is always running.
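A host-side sketch of that counting scheme; the 48-samples-per-5 ms figure assumes, for illustration, a 125 kHz ADC clock (13 clocks per free-running conversion, so roughly a 9.6 kHz sample rate). Since 5 ms is not an exact multiple of the conversion time, the resulting tick drifts slightly:

```c
#include <stdint.h>

/* Host-side model of the free-running idea: the ADC-complete interrupt
   fires at a known rate and a counter decides which results to keep.
   SAMPLES_PER_5MS = 48 assumes ~9.6 kHz (125 kHz / 13); substitute
   your own rate, and note 48 * 104 us is only approximately 5 ms. */
#define SAMPLES_PER_5MS 48

uint8_t adc_count = 0;

/* Free-running ADC-complete handler: returns 1 when this result is the
   one to keep (the 5 ms sample), 0 for the ones that are tossed. */
int adc_isr(uint16_t adc_value)
{
    (void)adc_value;            /* would be stored or accumulated here */
    if (++adc_count < SAMPLES_PER_5MS)
        return 0;
    adc_count = 0;
    return 1;
}
```

Accumulating the discarded values instead of tossing them gives free averaging as a side effect, at the cost of the always-on ADC noted above.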


slavko wrote:
Hi Randy, I get it, thanks all for the help.
And perhaps danni (Peter) can explain why my code is:
"A really impressive example of very bad programming style"

Sorry, it seems I was totally confused :?:

I had read the whole code as being the interrupt handler.

Naturally, in main() you can use floats and delay loops (even if they're not really needed).

I would start the AD conversion inside the timer compare interrupt and then add the desired delay time to the compare value.
The timer then runs continuously, and you can still use the overflow and the other compare interrupt.

Peter


A simple method that I use for slow-moving signals such as temperature and current is to simply start the next A/D conversion in the timer tick ISR. By 5 ms the previous one >>must<< be done; maybe add a simple check to kick-start the loop again if it gets hung up.

The sample can be stored in the ADC ISR as shown if desired. The processing can be done anytime it is convenient--in the ADC ISR; in the main loop based on a flag set in ADC ISR; in the main loop based on a timer tick flag; in the timer tick ISR itself. With this method there is no fussing with timer counts or timer operation once set up.
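Lee's scheme in miniature, with a plain variable standing in for the ADSC bit of ADCSRA (set by software to start a conversion, cleared by hardware when it completes); the return value here plays the role of the "simple check":

```c
#include <stdint.h>

/* Host-side model: adsc_model stands in for the ADSC bit
   (1 = conversion in progress). */
uint8_t adsc_model = 0;

/* 5 ms timer-tick handler. Returns 1 if the previous conversion had
   already finished (the normal case), 0 if it was somehow still
   pending and we restarted anyway. */
int tick_isr(void)
{
    int previous_done = (adsc_model == 0);
    adsc_model = 1;             /* start the next conversion */
    return previous_done;
}

/* ADC-complete handler: hardware clears ADSC; store the result here. */
void adc_done_isr(void)
{
    adsc_model = 0;
}
```

Once set up there is, as the post says, no fussing with timer counts at all: the tick both paces the sampling and bounds the available processing time.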

Lee

You can put lipstick on a pig, but it is still a pig.

I've never met a pig I didn't like, as long as you have some salt and pepper.


I solved my problem with mckenney's idea; since my English isn't very good, at first I did not understand what he said, but now it's OK :D