Reading the Atmel datasheets (the mega8L datasheet, to be exact), I find that the factory calibration byte for the internal oscillator is only valid for 5V operation.
At 3.3V the resulting operating frequency will be a bit lower (about 0.965MHz instead of 1MHz).
Add to this the ±1% calibration tolerance and the worst case is a frequency almost 5% below 1MHz. A serial com link will not be very reliable with a 5% frequency error.
So, how can one best solve this problem?
Of course, for hobbyist applications one can simply adjust the calibration byte a bit and hope it works well enough.
I am thinking it might be possible to do this more reliably. How about some kind of auto-calibration during ISP programming?
Let's say we download a very simple app to the AVR that sets up a timer interrupt at, say, 1kHz. On each timer interrupt the AVR sends a pulse to the PC over the ISP cable. A program on the PC counts the number of pulses during one second. If the PC program finds the frequency is off by too much, it adjusts the calibration byte a bit and restarts the AVR program for a new test. This can be repeated until a good enough calibration value is found.
When the calibration value is found, the real AVR application can be downloaded to the AVR together with the new calibration value.
The good thing is that the process suggested above could be fully automated, and it would result in a very reliable operating frequency.
Another advantage of the method described above is that the oscillator could optionally be calibrated to frequencies other than those provided by Atmel (1, 2, 4, 8MHz). A very handy feature, since none of the frequencies provided by Atmel are very UART-friendly.
What do you think? Is there a better solution to the problem?