Modern AVRs can measure their own VCC with no external parts, by configuring the ADC to use AVCC as the reference and the internal bandgap as the input (the input that is normally fed from the mux). This ability is handy, but the error is considerable. It varies from one AVR to another and is also a function of the supply voltage. Something strange seems to be happening inside the AVR.
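For anyone who wants to try it, here is roughly the setup, as a minimal sketch in C. I am assuming a mega324-class part, where MUX4:0 = 0b11110 selects the bandgap and the nominal bandgap voltage is 1.1 V; check the datasheet for your part, since the mux code and the nominal figure differ between AVR families.

/* Minimal sketch of the bandgap-vs-AVCC trick, for a mega324-class
 * part: MUX4:0 = 0b11110 selects the 1.1 V bandgap, REFS = 01
 * selects AVCC as the reference. */
#include <avr/io.h>

#ifndef F_CPU
#define F_CPU 8000000UL         /* adjust to your clock; util/delay.h needs it */
#endif
#include <util/delay.h>

static uint16_t adc_read_bandgap(void)
{
    ADMUX  = (1 << REFS0) | 0x1E;   /* reference = AVCC, input = 1.1 V bandgap */
    ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0);
                                    /* enable ADC, prescaler /128 */
    _delay_ms(1);                   /* give the bandgap time to settle */
    ADCSRA |= (1 << ADSC);          /* start one conversion */
    while (ADCSRA & (1 << ADSC))
        ;                           /* wait for it to finish */
    return ADC;
}

/* The bandgap reads as (1100 mV / VCC) * 1024 counts, so
 * VCC = 1100 mV * 1024 / counts. This uses the nominal 1.1 V figure,
 * which is itself only loosely specified, so expect the kind of
 * errors described below. */
static uint16_t vcc_from_bandgap(void)
{
    uint16_t counts = adc_read_bandgap();
    return (uint16_t)((1100UL * 1024UL) / counts);
}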
I tested three AVRs, varying the supply voltage from around 3 V to 4 V. A mega324pv gave an error that varied from -48 mV to -65 mV, a mega329pv from +82 mV to +110 mV, and a mega649pv from -46 mV to -70 mV. I have attached some graphs. The meandering curves look random, but they are repeatable.
I measured the supply voltage with a calibrated DMM and believe it is accurate to within +/-3 mV, and it is repeatable. I also used the DMM to measure the actual voltage of each AVR's internal reference.
I find the AVRs can measure voltage fairly accurately when used in the normal manner, with the reference voltage on the "reference" input and the voltage to be measured on the mux input. While running these tests, I also had the AVRs measure the supply voltage this way: I used a voltage divider to reduce the supply voltage to less than the internal reference voltage and connected the divided voltage to the ADC mux. Voltages measured this way had an error of less than 3 mV, usually less than 2 mV. Of course the measured voltage is only a fraction of the actual VCC, so converting it back to VCC multiplies both the reading and its error by the divider's scale factor. Even so, the computed VCC had less than 10 mV of error.
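For comparison, here is a sketch of the divider method. The 33k/10k divider into ADC0 is just a hypothetical example that keeps the divided voltage below the reference over my 3 V to 4 V test range (good up to about 4.7 V); the VREF_MV constant should be set to the reference voltage you actually measure with a DMM, not the nominal 1.1 V.

/* Sketch of the divider method: VCC feeds ADC0 through an assumed
 * 33k/10k divider, and the conversion uses the internal 1.1 V
 * reference in the normal way. */
#include <avr/io.h>

#define VREF_MV  1100UL  /* set to the reference voltage measured with a DMM */
#define DIV_NUM  43UL    /* divider scale factor (33k + 10k)/10k, as a fraction */
#define DIV_DEN  10UL

static uint16_t adc_read_ch0(void)
{
    ADMUX  = (1 << REFS1) | (1 << REFS0);  /* ref = internal 1.1 V, input = ADC0 */
    ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0);
    ADCSRA |= (1 << ADSC);                 /* start one conversion */
    while (ADCSRA & (1 << ADSC))
        ;
    return ADC;
}

static uint16_t vcc_from_divider(void)
{
    uint32_t counts     = adc_read_ch0();
    uint32_t divided_mv = (counts * VREF_MV) / 1024UL;   /* voltage at the ADC pin */
    return (uint16_t)((divided_mv * DIV_NUM) / DIV_DEN); /* scale back up to VCC  */
}

Note that the scale factor multiplies the measurement error along with the reading, which is why a 2-3 mV raw error still came out under 10 mV in the computed VCC.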
The fourth and last graph, in my second post, shows one such test of measuring VCC with a divider.
The conclusion is: if you want to measure VCC accurately, you need to do it the old-fashioned way, with a voltage divider hooked to the ADC mux input.