I hope the subject line is clear enough. Well, here is my problem: I use an ATmega64 to do three major jobs on a PCB:
- generate a clock for a switched-capacitor filter (LTC1067) with Timer1
- digitize the analog output of the RMS-to-DC converter that follows the filter on ADC0
- convert the values to ASCII and send them over the UART at 57.6 kbit/s to a PC running a terminal program that logs the results
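For the clock-generation job, a Timer1 CTC sketch along these lines is common. The 16 MHz system clock and the 100 kHz output frequency are assumptions for illustration, not taken from my actual setup; OCR1A has to be chosen for whatever clock the LTC1067 corner frequency needs:

```c
/* Sketch: Timer1 in CTC mode toggling OC1A to generate the filter clock.
   Assumptions (illustration only): F_CPU = 16 MHz, target clock 100 kHz. */
#include <avr/io.h>

void filter_clock_init(void)
{
    TCCR1A = (1 << COM1A0);               /* toggle OC1A on compare match */
    TCCR1B = (1 << WGM12) | (1 << CS10);  /* CTC mode (TOP = OCR1A), no prescaling */
    /* f_out = F_CPU / (2 * (OCR1A + 1)) = 16 MHz / (2 * 80) = 100 kHz */
    OCR1A = 79;
    DDRB |= (1 << PB5);                   /* OC1A is on PB5 on the ATmega64 */
}
```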
- the ADC clock runs at 125 kHz, so one ADC cycle is 8 µs. One conversion in free-running mode takes 13 cycles, so 13 × 8 µs = 104 µs. In theory I should get one result into my ring buffer every 104 µs.
- the UART transmits at 57600 baud, but with the usual 8N1 framing each byte costs 10 bits on the wire (start + 8 data + stop), so the effective throughput is 5760 bytes per second, not 57600 / 8 = 7200. My results in ASCII are 2-3 bytes long (one or two digits plus a CR), so in the worst case the UART can move at most 5760 / 3 = 1920 values per second, i.e. one transmission takes about 521 µs.
- that means the UART can only keep up with roughly every fifth ADC value (521 µs per transmission vs. 104 µs per conversion), so most samples get overwritten in the ring buffer before they are read?
I tried to solve the problem by sampling a defined number of ADC values: I set up another timer (Timer0) and triggered the ADC from it, trying trigger periods of 200 µs, 400 µs and even 500 µs.
- a 200 µs trigger period seemed to look OK, but the time base I calculated afterwards still did not match. I assumed the ADC is triggered every 200 µs; one conversion takes 13.5 cycles in auto-trigger mode, which at a 125 kHz ADC clock is 108 µs, so I expected one value every 200 µs. But that was not the case, I got fewer values. Maybe the int-to-ASCII conversion and the UART transmission take too long?
- at 400 or even 500 µs I received "0" in between my real values in the ring buffer, so I think the ADC was delivering values too slowly?
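For reference, here is the kind of Timer0-triggered sampling I have in mind. If I remember right, the ATmega64's ADC only offers free-running mode (no ADTS auto-trigger sources like on the ATmega8/16/32), so the trigger has to start each conversion in software from the compare-match interrupt. F_CPU = 16 MHz, the AVcc reference and the 200 µs period are assumptions for illustration, not necessarily my actual values:

```c
/* Sketch: fixed-rate sampling by starting one ADC conversion from a
   Timer0 compare-match ISR. Assumptions: F_CPU = 16 MHz, period 200 us. */
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t latest_sample;

void sampling_init(void)
{
    ADMUX  = (1 << REFS0);                 /* AVcc reference, channel ADC0 */
    ADCSRA = (1 << ADEN) | (1 << ADIE)     /* enable ADC + conversion-done IRQ */
           | (1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0); /* clk/128 = 125 kHz */

    /* Timer0: CTC, prescaler /32 -> 2 us per tick, 100 ticks = 200 us */
    TCCR0 = (1 << WGM01) | (1 << CS01) | (1 << CS00);
    OCR0  = 99;
    TIMSK |= (1 << OCIE0);
    sei();
}

ISR(TIMER0_COMP_vect)
{
    ADCSRA |= (1 << ADSC);                 /* start exactly one conversion */
}

ISR(ADC_vect)
{
    latest_sample = ADC;                   /* push into the ring buffer here */
}
```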
I hope someone can clear this up for me and help me get a correct time base. I appreciate any help.