I've read the errata and some feedback about the A/D being 'broken' in the XMEGA "A" parts, and wonder whether the same applies to the "D" series.
What I want to do is measure the battery voltage (Li-ion) using the A/D unit. From what I've read, single-ended mode is subject to considerable 'uncertainty' of about ±6 bits. To be fair, Atmel does state that there is a 200-count 'noise' floor that needs to be measured and subtracted from all readings in single-ended mode.
The recommendation was to use differential mode. I would use a 2.5 V external reference (3.3 V power supply) and scale the battery voltage so that the expected 4.2 V maximum reading scales to 2.5 V. I would then set the negative input at 1.6667 V, which is the expected 2.8 V minimum battery voltage scaled by the same factor. This way a zero count represents the 2.8 V lower battery limit and the maximum count represents the 4.2 V maximum battery reading. I end up with 11 bits of useful data. (The negative input is generated by a resistor divider across the reference voltage.)
Only one A/D channel would be used, as described above. I only need to measure the battery voltage a few times a minute, so I could run the A/D in repeating mode and use an ISR to read each result and update a running average. The update rate could be anything that gives me at least 5 samples per averaging period, reporting the current voltage, say, 4 times per minute.
Does anybody see a problem with this, or is the hardware too 'broken' for this application?