I have a batch of boards using the ATmega168V-10AU (TQFP package), running at 3 V and a nominal 8 MHz from the internal calibrated RC oscillator. According to the data sheet, the oscillator is factory-calibrated as close as possible to the nominal frequency at 3 V and 25 °C (my room is actually at 23 °C), but I'm finding typical inaccuracies of 2% or more, usually on the slow side, requiring the OSCCAL calibration register to be adjusted by 4 or more steps.
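For reference, a minimal sketch of the kind of correction these boards are needing (avr-gcc / avr-libc; the step of 4 is just the typical adjustment quoted above, and each chip needs its own value):

```c
#include <avr/io.h>

int main(void)
{
    /* The factory OSCCAL value runs ~2% slow on these boards; needing
     * ~4 steps to fix ~2% suggests very roughly 0.5% per step here
     * (the exact step size is chip-dependent; see the OSCCAL-vs-frequency
     * curve in the datasheet). */
    OSCCAL += 4;

    /* ... USART init and the rest of the application ... */
    for (;;)
        ;
}
```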
Several of these boards have had CPUs replaced in the belief that their USARTs were faulty, but so many were failing that I went looking for a different cause, and that cause seems to be clock inaccuracy. An asynchronous serial clock really needs to be accurate to within 1%; 2% is acceptable if the signal is clean, but anything much beyond that causes serious data loss, and at 5% data loss is virtually guaranteed.
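To put numbers on that budget: the receiver samples each bit near its centre, so by the stop bit, 9.5 bit-times into a 10-bit 8N1 frame, a relative clock error e has shifted the sampling point by 9.5 × e bit-times; reception fails once that drift passes half a bit, i.e. around e = 5%, and the budget is shared between the two ends. On top of that sits the baud-rate divider error; a short avr-libc sketch (9600 baud is an illustrative choice, not necessarily what these boards run):

```c
#include <avr/io.h>

#define F_CPU 8000000UL   /* nominal internal RC clock */
#define BAUD  9600UL      /* illustrative rate only */

/* Normal-speed mode (U2X0 = 0): UBRR = F_CPU / (16 * BAUD) - 1, rounded.
 * Here that gives 51, so the actual rate is 8e6 / (16 * 52) = 9615.4,
 * a +0.16% error before the oscillator's own error is even counted. */
#define UBRR_VALUE ((F_CPU + 8UL * BAUD) / (16UL * BAUD) - 1)

static void usart_init(void)
{
    UBRR0H = (uint8_t)(UBRR_VALUE >> 8);
    UBRR0L = (uint8_t)UBRR_VALUE;
    UCSR0B = (1 << RXEN0) | (1 << TXEN0);    /* enable receiver and transmitter */
    UCSR0C = (1 << UCSZ01) | (1 << UCSZ00);  /* 8 data bits, no parity, 1 stop */
}
```

With the oscillator 2% slow, the end-to-end error against an accurate far end already eats most of the per-end budget, which matches the failures we're seeing.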
Does anyone know why these chips are not performing to spec, and just how far out we can expect this inaccuracy to be? Are the chips being calibrated correctly but drifting in storage, or is there an error in the calibration itself? Is the calibration likely to drift much over the life of the chip, and if so, by how much?