I have a project where the hardware is designed such that the ADC is referenced to AVCC (3.3V). AREF is connected to a 100nF cap to ground.
I wanted to add temperature measurement, so I have to switch the ADC's voltage reference to the internal 1.1V bandgap reference.
Switching references requires a delay while AREF settles, something like 5ms in my case. I'd rather avoid that delay if possible, and its required length is undocumented. All the datasheet says is:
When the bandgap reference voltage is used as input to the ADC, it will take a certain time for the voltage to stabilize. If not stabilized, the first value read after the first conversion may be wrong.
"certain time" is not defined (resistance of the bandgap isn't specified), so I'm not sure I can count on that. Reducing the external capacitor to 10nF will help, but it's still not really reliable. I'd also argue that the datasheet's a little misleading, as many more than the "first" value read will be wrong.
I'd really like to avoid redesigning the hardware so that all of the ADC inputs are scaled for the 1.1V reference, if possible.
Is it reasonable/reliable to just use 3.3V as the reference for the temperature sensor instead? That's probably riskier than moving to 10nF plus a conservatively long delay, but maybe it's acceptable if only coarse temperature resolution is needed? (It does seem to "work"; I've sketched the rough resolution math below.)
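
The back-of-envelope resolution math I'm basing that on (hedged: the sensor gain of roughly 1mV/°C is a typical datasheet figure, and the real gain needs per-device calibration either way):

```c
#include <stdio.h>

int main(void)
{
    const double gain_mv_per_degc = 1.0;      /* approx. temp sensor gain, datasheet typical */
    const double lsb_1v1 = 1100.0 / 1024.0;   /* ~1.07 mV/LSB with the 1.1V reference */
    const double lsb_3v3 = 3300.0 / 1024.0;   /* ~3.22 mV/LSB with AVCC as reference */

    printf("1.1V ref: ~%.1f degC per LSB\n", lsb_1v1 / gain_mv_per_degc);
    printf("3.3V ref: ~%.1f degC per LSB\n", lsb_3v3 / gain_mv_per_degc);
    return 0;
}
```

So with a 3.3V reference each ADC step spans roughly 3°C instead of roughly 1°C.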
Or is redesigning all the analog inputs to work from the 1.1V reference the only reliable option?