I connected the output of a voltage divider, formed from a 10 kΩ resistor and an LDR:
                ADC o
                    |
                    |
    +5V o----10k----o----LDR----o GND
to the ADC input of an ATmega328 and got readings of about 850. Then I tried to power the voltage divider from an output pin of the microcontroller, and under the same light conditions the ADC readings dropped to about 650.
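For reference, here is a minimal sketch of the kind of code I use to take a reading (avr-gcc; the 16 MHz clock, supply pin PD2, and divider tap on ADC0/PC0 are placeholders for my actual wiring):

    #define F_CPU 16000000UL      /* assuming a 16 MHz clock */
    #include <avr/io.h>
    #include <util/delay.h>

    static uint16_t adc_read(uint8_t channel)
    {
        ADMUX  = (1 << REFS0) | (channel & 0x0F);  /* AVcc as reference */
        ADCSRA |= (1 << ADSC);                     /* start conversion */
        while (ADCSRA & (1 << ADSC))               /* wait until done */
            ;
        return ADC;
    }

    int main(void)
    {
        DDRD  |= (1 << PD2);                       /* supply pin as output */
        ADCSRA = (1 << ADEN)                       /* enable ADC... */
               | (1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0); /* ...clk/128 */

        for (;;) {
            PORTD |= (1 << PD2);                   /* power the divider */
            _delay_ms(10);                         /* let the node settle */
            uint16_t light = adc_read(0);          /* read the divider tap */
            PORTD &= ~(1 << PD2);                  /* cut power between readings */
            (void)light;                           /* ... use the reading ... */
            _delay_ms(1000);
        }
    }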
When I power the voltage divider from an output pin, the voltage at that pin, measured with a digital multimeter, is only 2.8 V instead of Vcc.
The outcome is that I have to use one calibration constant when powering the voltage divider from a pin and another when the voltage divider is powered straight from +5 V.
At first I thought I was starting the ADC measurement too soon after driving the supply pin HIGH, but I get the same results even when the supply pin is HIGH all the time. I assume the voltage drop has something to do with the relatively high resistance of the divider (10 kΩ plus about 47 kΩ for the LDR), but I would still like to be sure that is the cause.
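For what it's worth, a rough sanity check (taking my ~47 kΩ LDR estimate at face value): the divider should draw only about 2.8 V / (10 kΩ + 47 kΩ) ≈ 49 µA, which seems far below what an output pin should be able to source, and with the divider on +5 V the expected tap voltage is 5 V × 47/57 ≈ 4.1 V, i.e. an ADC reading around 845, which matches the ~850 I actually see.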
I'd like to power the voltage divider from the pin because that way the device wouldn't drain the battery when it isn't measuring light. If the cause of the voltage/reading drop really is the high load impedance, I might use a voltage follower or a transistor, or simply edit the code to use a different calibration constant. In that case (using the circuit without an op amp or a transistor), I'd like to know if there is some reason not to use an output pin under such conditions.
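In code, the calibration-constant workaround would look something like the sketch below; the constant names and values are placeholders, not measured ones:

    /* Placeholder full-scale readings for the two supply options;
       the real values would come from calibration. */
    #define CAL_FULL_SCALE_5V   850UL
    #define CAL_FULL_SCALE_PIN  650UL

    /* Normalize a raw ADC reading to a 0..1000 light level so the
       rest of the code doesn't care how the divider is powered. */
    static unsigned int light_level(unsigned int raw, int powered_from_pin)
    {
        unsigned long full = powered_from_pin ? CAL_FULL_SCALE_PIN
                                              : CAL_FULL_SCALE_5V;
        return (unsigned int)((raw * 1000UL) / full);
    }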