I have noticed that the internal ADC inputs (bandgap, SCALEDVCC) have a fairly large offset error, typically in the order of 60-70mV.
Why is this? External measurements show an offset error of only about 4mV relative to external GND, which I attribute to internal GND sitting slightly higher. The datasheet says that when measuring internal inputs in signed mode you can select either internal or pad ground, but as far as I can tell the A3U only supports internal GND. In any case, the datasheet implies that there should be no offset error in signed mode.
When measuring in unsigned mode you do get a fixed offset, which is what seems to be happening here. Silicon bug, perhaps?
Just to elaborate on exactly what I did:
I am measuring the bandgap with the ADC in signed mode against the internal GND reference. My reference voltage is a nominal 2.048V and was measured with a calibrated Fluke meter to be 2.072V, an error of 1.15%.
When I measure external voltages I get a 1.2% error. That is with the offset between external and internal GND calibrated out. Gain error is not calibrated, as I don't need that much precision. I am fairly confident that the ADC is working well, since the error I get is very close to the reference voltage error.
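For clarity, this is what I mean by "calibrated out" (a minimal sketch with made-up numbers, not my actual code): the fixed GND offset is subtracted from each raw reading, while gain error is left uncorrected.

```python
# Sketch of the ground-offset calibration described above.
# GND_OFFSET_MV is a hypothetical value standing in for the measured
# difference between internal and external GND (~4 mV in my case).

GND_OFFSET_MV = 4.0

def calibrate(raw_mv: float, offset_mv: float = GND_OFFSET_MV) -> float:
    """Remove the fixed ground offset; gain error is deliberately left in."""
    return raw_mv - offset_mv

print(calibrate(1004.0))  # -> 1000.0 (a reading 4 mV high, corrected)
```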
When I measure the bandgap voltage I get a result of 1056mV, or 5.3% error. When I measure SCALEDVCC I get a result of 2939mV from a 3000mV supply (measured 3004mV on the meter). It seems like both have a fixed offset error of about 60mV.
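Working that out explicitly (assuming a 1.0V nominal bandgap, which is how I read the datasheet, and taking the Fluke's 3004mV as the true supply):

```python
# Quick arithmetic check of the apparent fixed offset on both internal inputs.
# Assumption: bandgap nominal is 1.0 V; the true supply is the meter's 3004 mV.

bandgap_read_mv = 1056       # ADC result for the bandgap channel
bandgap_nominal_mv = 1000    # assumed nominal bandgap voltage
scaledvcc_read_mv = 2939     # SCALEDVCC result, scaled back to supply volts
supply_mv = 3004             # supply as measured with the Fluke

# How far each reading sits from its expected value, in mV:
bandgap_offset_mv = bandgap_read_mv - bandgap_nominal_mv   # reads high
scaledvcc_offset_mv = supply_mv - scaledvcc_read_mv        # reads low

print(bandgap_offset_mv, scaledvcc_offset_mv)  # -> 56 65, both roughly 60 mV
```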
Note that the only thing which changes between measurements, which are made sequentially, is the ADC MUX setting.