I have an ATtiny that I talk to using I2C, and one of the things I do is ask it for the temperature.
I'm seeing odd behavior ... when I ask for the temperature after the AVR has been idle for some time (an hour, say) it tells me one thing (say, 17 degrees C), but if I've been talking to it a lot via USI just a few seconds earlier, I get readings maybe 3-5 degrees higher (like 22 degrees C). Let it sit idle, ask again, and it's back at the low temperature; this is very repeatable. I haven't yet experimented to see how quickly the temperature rises or falls, or done a good offset calibration.
Is it realistic to think that I'm observing the die heating up from the work of handling several hundred USI requests? And if so, does anyone have suggestions for getting a more stable long-term temperature reading? I've not used this sensor before, so I don't really know what to expect. Should I be averaging over some sample interval rather than just returning the latest reading?
The code basically sits in idle state most of the time, with a 4 MHz clock, except that I currently have it constantly doing ADC conversions while it does so (the temperature sensor, plus some external voltages). There's a timer ticking; it pets the watchdog and flashes an LED. Basic power management tricks have been applied (nothing aggressive). Eventually it will take samples less often ... say, only after petting the watchdog.
Or else ... when the host talks to this slave via I2C, my current test rig gives it bursts of maybe 50, or 50 + 200, requests that both keep I2C active and involve a bit of work. ("In the field" such bursts will be rare.) The I2C clock is closer to 400 kHz than 100 kHz. So I could imagine those bursts of work heating the die up enough that the temperature sensor is misleading.