I've been using an ADC to monitor 3 voltages; a timer that fires every 1 millisecond kicks off the 3 conversions. I accumulate 500 of these and divide by 500 to get an average.
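For reference, here is a minimal sketch of that accumulate-and-average scheme. All names here are my own (the post doesn't show code); `adc_accumulate()` stands in for whatever runs in the conversion-complete handler on each channel.

```c
#include <stdint.h>

#define NUM_CHANNELS 3
#define NUM_SAMPLES  500

static uint32_t sum[NUM_CHANNELS];
static uint16_t count;
static uint16_t average[NUM_CHANNELS]; /* latest averaged 10-bit readings */

/* Call once per channel each time a conversion finishes (the 1 ms timer
 * starts the three conversions; this runs as each result comes back). */
void adc_accumulate(uint8_t channel, uint16_t raw10bit)
{
    sum[channel] += raw10bit;

    /* After the last channel of the 500th sample set, compute averages. */
    if (channel == NUM_CHANNELS - 1 && ++count == NUM_SAMPLES) {
        for (uint8_t ch = 0; ch < NUM_CHANNELS; ch++) {
            average[ch] = (uint16_t)(sum[ch] / NUM_SAMPLES);
            sum[ch] = 0;
        }
        count = 0;
    }
}
```

A 32-bit accumulator is enough here: 500 samples of at most 1023 stays well under 2^32.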
Interestingly, it seemed to work better in the breadboard environment, with better accuracy.
When testing it on a finished PCB, however, I noticed something odd: the reading would "step" into a value and stay there, even while the meter showed the voltage dropping steadily and linearly.
I freely admit I am trying to get more accuracy than the 10 bits would suggest, and on the breadboard my averaging was doing a good job.
Last night I wrote some test code to run on the breadboard versus the PCB, and it confirmed what I suspected was happening. Over a batch of 1000 conversions, it counted how many times each distinct ADC value was reported, and it could display up to 10 of these counts on the LCD.
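The tallying part of that test code can be sketched like this (again, my own naming; the post doesn't include the actual source). It keeps up to 10 bins, one per distinct ADC code seen, matching the 10 slots shown on the LCD:

```c
#include <stdint.h>

#define MAX_BINS 10  /* the LCD could display up to 10 distinct codes */

typedef struct {
    uint16_t code;   /* a 10-bit ADC result (0..1023) */
    uint16_t hits;   /* how many of the 1000 conversions returned it */
} bin_t;

static bin_t  bins[MAX_BINS];
static uint8_t bins_used;

void histogram_reset(void)
{
    bins_used = 0;
}

/* Call once per conversion result during the 1000-conversion run. */
void histogram_add(uint16_t code)
{
    /* If we've seen this code before, bump its count. */
    for (uint8_t i = 0; i < bins_used; i++) {
        if (bins[i].code == code) {
            bins[i].hits++;
            return;
        }
    }
    /* Otherwise open a new bin, if one of the 10 slots is free. */
    if (bins_used < MAX_BINS) {
        bins[bins_used].code = code;
        bins[bins_used].hits = 1;
        bins_used++;
    }
}
```

On the noisy breadboard all 10 bins fill up; on the quiet PCB only one or two bins ever get used, which is exactly the behavior described below.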
Sure enough, the breadboard was very noisy, with all ten slots filled all the time; ADC values ranged from, say, 769 to 781.
My theory was that on the PCB I was getting the accuracy I wanted when the input sat between ADC values, with the duty cycle between the two codes providing the extra resolution. But given all the noisy results on the breadboard, I wondered what I would get.
After loading the same code on the PCB, sure enough, only 2 values were reported. As the voltage dropped, you could clearly see the duty cycle (the number of times each value was reported) shifting from one ADC value to the next.
The "step" where the value would lock into a value for awhile is coming from the zone when only one ADC value is reported 1000 times because we don't know if those 1000 times are all at the top part of that value or the bottom part.
Yes, I know I was expecting more than the 10 bits can deliver, but I sure like the extra accuracy that can be obtained when it is in between ADC values.
It turns out that about a third of my range falls between ADC values (and is more accurate), and the other two-thirds does not.
Do some designs introduce random noise in an ADC circuit to get more accuracy?
What about a software method?
I am using a 100 nF cap in the voltage divider now. If I remove it and there is more noise, might the reading actually be more accurate without it?