There are about 30 posts in the forums regarding the internal temperature sensors in the AVR processors. It's obviously a topic that has caused considerable pain and anguish in the past, and I have no desire to 'rake over the coals' again, so I'll keep this as specific and single-dimensional as I can.
We have a volume production unit ... it's for the UK market and is fine. It's a radio-based product and we want to export it to the States. The FCC only approves the transmitter it's used with over there down to -10 degrees Celsius.
The plan is/was to use the internal temperature sensor as a crude transmitter inhibit when the temperature drops into the region of -7 degrees or so.
The code is in place; the unit already uses analogue channels with the internal bandgap reference, so it's no great deal to read the internal channel for the temperature sensor. But the values we get back are way off from the typical values in the datasheet and vary greatly from processor to processor.
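For what it's worth, the read itself is along these lines (a stripped-down sketch rather than our production code, assuming the usual avr-libc register names for the 168P/328P family, the sensor on ADC channel 8, the 1.1 V bandgap as reference, and the ADC prescaler already set up elsewhere):

    #include <avr/io.h>
    #include <stdint.h>

    /* Read the internal temperature sensor: ADC channel 8 against the
     * 1.1 V bandgap reference. Assumes the ADC prescaler has already
     * been configured elsewhere in the application. */
    static uint16_t read_temp_raw(void)
    {
        ADMUX  = (1 << REFS1) | (1 << REFS0) | (1 << MUX3); /* 1.1 V ref, channel 8 */
        ADCSRA |= (1 << ADEN);                              /* make sure the ADC is on */

        /* Do two conversions and keep the second; the first one after a
         * reference/channel change is discarded while the bandgap settles. */
        for (uint8_t i = 0; i < 2; i++) {
            ADCSRA |= (1 << ADSC);
            while (ADCSRA & (1 << ADSC))
                ;
        }
        return ADC;   /* 10-bit result, roughly 1.07 mV per LSB */
    }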
It's not the +/-10 degrees, or quantization error, or the bandgap reference ... it's like 60 degrees off on some units.
We've characterized a few of the units and the reading-versus-temperature relationship is pretty linear (typically 1.23 mV per degree), but even the gradient of the line seems to vary from unit to unit.
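To put numbers on that, the per-unit model we end up with after characterization looks like this (the slope and offset here are illustrative only, roughly consistent with the 350 mV at 20 degrees figure below, not values you should expect from your own parts):

    /* Illustrative per-unit calibration constants from characterizing one
     * board; both numbers vary from unit to unit, which is the problem. */
    #define TEMP_SLOPE_MV_PER_C   1.23f    /* measured gradient, mV per deg C */
    #define TEMP_OFFSET_MV        325.0f   /* extrapolated output at 0 deg C  */

    /* Convert a raw 10-bit reading (1.1 V reference) to degrees Celsius. */
    static float raw_to_celsius(uint16_t raw)
    {
        float mv = (raw * 1100.0f) / 1024.0f;          /* counts -> millivolts */
        return (mv - TEMP_OFFSET_MV) / TEMP_SLOPE_MV_PER_C;
    }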
My specific question is this:
Has anyone had a reasonably sized batch of 168s that did perform as the datasheet says ... i.e. 314 mV +/-10 mV at 25 degrees?
(To put it into context, we're getting 350 mV at 20 degrees instead of the 309-ish mV we would expect.)
We can calibrate the units at room temperature as the last stage of the production/test process, but the range of variability and the unpredictability of the line gradient is killing us on this one. We can't afford to do two-point calibration, and with the current variability a single-point calibration won't cut it.
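For clarity, the single-point calibration we had in mind is nothing more exotic than this (a sketch only; the EEPROM location and the assumption of the datasheet-typical ~1 mV per degree slope are ours), and it's exactly that assumed slope that the unit-to-unit gradient variation breaks:

    #include <avr/io.h>
    #include <avr/eeprom.h>
    #include <stdint.h>

    /* Raw ADC reading stored at end-of-line test, taken at a known room
     * temperature. The EEPROM address and calibration temperature are made up. */
    #define CAL_EEPROM_ADDR   ((uint16_t *)0x00)
    #define CAL_TEMP_C        25

    static int16_t temp_from_single_point(uint16_t raw)
    {
        uint16_t cal_raw = eeprom_read_word(CAL_EEPROM_ADDR);
        /* With the 1.1 V reference 1 LSB is ~1.07 mV, and the datasheet-typical
         * slope is ~1 mV per degree, so treat one LSB as roughly one degree. */
        return (int16_t)CAL_TEMP_C + (int16_t)(raw - cal_raw);
    }

    /* Crude transmitter inhibit around -7 degrees C. */
    static uint8_t tx_allowed(uint16_t raw)
    {
        return temp_from_single_point(raw) > -7;
    }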