Question: Why would the accuracy of the ADC conversions improve by going below the recommended lowest clock frequency for the ADC (ATxmega128A3U @ 12 MHz)?
- This is exactly what is happening on a project where the voltage of a LiPo battery is being constantly checked
- Now, the recommended values given by Atmel are the following:
- APPLICATION NOTE: Atmel AVR1300: Using the Atmel AVR XMEGA ADC
1.8 Conversion Speed (Page 13)
...Lowest ADC clock frequency for both XMEGA A and XMEGA D devices are 100kHz. See device datasheet for more information.
- DATASHEET: ATxmega256A3U / ATxmega192A3U / ATxmega128A3U / ATxmega64A3U
Table 36-9. Clock and timing. (Page 79)
...ADC Clock frequency -> Min. 100 kHz
- Working close to the lowest limit (12 MHz / 128 => 93.75 kHz) was giving me an error of -0.2 V
- I increased the frequency by reducing the prescaler: 12 MHz / 64 => 187.5 kHz, which in theory should improve the accuracy, but I was getting a bigger error of -0.4 V
- Finally, going in the other direction, 12 MHz / 512 => 23.43 kHz, I got the exact value, no errors!
- This is nevertheless, in theory, not recommended, so why does it improve all measurements? Even the ADC conversions of an external power supply show no errors at this slow clock.
- In posts like "Atmel Xmega ADC Problems & Solutions" on "Frank's Random Wanderings" I've seen that other people also find the performance improves below 100 kHz (62 kHz in his case), but nowhere have I seen an explanation
- Any ideas?
Any hint would be greatly appreciated!
Tools and Controller used:
- AVR Studio 5.1
- AVRToolchain: AVRGCC\126.96.36.199
- Controller: ATxmega128A3U
ADCA.CTRLB = ADC_RESOLUTION_12BIT_gc;                       // 12-bit resolution
ADCA.REFCTRL = ADC_REFSEL_INT1V_gc;                         // internal 1 V reference
ADCA.PRESCALER = ADC_PRESCALER_DIV512_gc;                   // 12 MHz / 512 => 23.43 kHz
ADCA.CH0.CTRL = ADC_CH_GAIN_1X_gc | ADC_CH_INPUTMODE_SINGLEENDED_gc;
ADCA.CALL = ReadSignatureByte(PRODSIGNATURES_ADCACAL0);     // ADCA calibration (the original loaded ADCBCAL0, i.e. ADCB's calibration, into ADCA)
ADCA.CALH = ReadSignatureByte(PRODSIGNATURES_ADCACAL1);
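For completeness, a hedged sketch of how a single blocking conversion might look with this setup. It uses only the standard XMEGA header names from <avr/io.h> (ADC_CH_START_bm, ADC_CH_CHIF_bm, ADC_ENABLE_bm); adca_read_ch0 is a hypothetical helper, not part of the original code, and it assumes the ADC has been enabled after the configuration above:

```c
#include <avr/io.h>

/* One blocking single-ended conversion on ADCA channel 0.
   Assumes ADCA was configured as in the snippet above and enabled with
   ADCA.CTRLA = ADC_ENABLE_bm; */
static uint16_t adca_read_ch0(uint8_t muxpos)
{
    ADCA.CH0.MUXCTRL = muxpos;                   /* e.g. ADC_CH_MUXPOS_PIN1_gc */
    ADCA.CH0.CTRL |= ADC_CH_START_bm;            /* start the conversion */
    while (!(ADCA.CH0.INTFLAGS & ADC_CH_CHIF_bm))
        ;                                        /* busy-wait for completion */
    ADCA.CH0.INTFLAGS = ADC_CH_CHIF_bm;          /* clear the flag (write 1) */
    return ADCA.CH0.RES;                         /* 12-bit result */
}
```

This is target-only code (it needs the XMEGA I/O registers), so it is shown as an untested sketch rather than a verified example.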