## LM335 and ADC


I really hate to revisit topics already covered, but my knowledge of basic electronics and AVRs is pretty limited. I have read the ADC section of the AVR datasheet about 10 times, and the LM335 datasheets quite a few times too, and I am drowning in the information.

I am using an ATmega16 with an LCD screen and wish to read temperature using the LM335, displaying the temperature in °C to 1 decimal place. So far I have made a voltage divider with the LM335; the resistor in series with it is about 37K (I just guessed a value). AREF is an external 5V. I have the code working to read the ADC to 10-bit resolution and display the result on the LCD screen. I am also reading from an LDR, so I know the code works OK. I just can't work out how to get the resolution and range from the LM335 that I need.

The output from the LM335 is supposedly 10 mV/K, so ideally I would like to read 1 mV per ADC step. That way each step would be 0.1°. Is that right? I have no idea how to make that happen. At the moment, if I warm up the LM335 by holding it with my fingers (I am guessing it would warm up by almost 10°), the reading from the ADC drops by about 4.

If someone can explain to me in "simple" terms how I proceed from here I would really appreciate it.

Hook a pot from 5V to GND and put the wiper on the A/D input. Get it to display the raw A/D value 0-1023, then convert that to integer mV 0-5000. That converts to integer 'half-degrees K' with a divide, and simple algebra gives °C and °F.
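That pipeline can be roughed out as plain C. This is only a sketch under the assumptions above (AREF = 5.000 V, 10-bit result, integer math throughout, as is usual on an AVR):

```c
#include <stdint.h>

/* Convert a raw 10-bit ADC reading (0-1023) to integer millivolts,
   assuming AREF = 5.000 V. 32-bit intermediate avoids overflow. */
uint16_t adc_to_mv(uint16_t adc)
{
    return (uint16_t)(((uint32_t)adc * 5000UL) / 1023UL);
}

/* The LM335 output is 10 mV/K, so 5 mV is half a kelvin:
   dividing mV by 5 gives the temperature in half-degrees K. */
uint16_t mv_to_halfdeg_k(uint16_t mv)
{
    return mv / 5;
}

/* Half-degrees K to whole degrees C: K = halfdeg/2, C = K - 273. */
int16_t halfdeg_k_to_deg_c(uint16_t halfdeg_k)
{
    return (int16_t)(halfdeg_k / 2) - 273;
}
```

As a check against the LM335's 2.982 V calibration point: 2982 mV gives 596 half-degrees K, which comes out at 25 °C.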

Imagecraft compiler user

Thanks for the help Bob. I did what you suggested: hooked a 10K pot from 5V to GND and put the wiper to the A/D. It displayed 0-1023 OK. "then cvt to integer mv 0-5000" — I multiplied the result by 4.8876 and it now displays 0-5000. Is that what you meant? Excuse my ignorance; I am not sure how I relate this to my project.

What I would like is the greatest resolution from 0 to 50°C. After reading the LM335 spec sheet again and again, I get the idea that it will output 2.7315V @ 0°C and 3.2315V @ 50°C. Is there some way I can measure between these two voltages using all 1024 steps?

Sorry for the dumb questions.

EDIT: I just checked the LM335 output with a voltmeter and it is 1.17V @ approx 23C so this is way wrong to begin with. I must have it wired up wrong. I have a 27K resistor from +5V to the V+(LM335) and the V-(LM335) to GND. The A/D on the AVR is connected to V+ on the LM335. Any ideas where I went wrong?

You can measure between these two voltages by attaching the (-) input of a differential amplifier to 2.7315V (make it with a resistive divider) and the (+) input to the LM335. Your amplifier should have a gain of 10 (making the 0.5V span into 5.0V).

You're in luck... the ATmega16 has a differential input with a gain of 10. Its accuracy isn't perfect, but it's probably better than a novice could build on the first try.

You can create your 2.7315V reference with a 10K in parallel with 49K on top, and 10K on the bottom.

Naturally, there are accuracy implications related to real parts instead of ideal.

EDIT: The LM335 needs a little bit more current than you get from a 27K resistor. iirc, it needs at least 400uA, and is characterized at 1.0mA. Resistor to 5V should be maybe 2.2K.
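As a quick sanity check on that divider suggestion (the resistor values are the ones named above; real parts with tolerance will land slightly off ideal):

```c
/* Sanity-check the 2.7315 V reference divider: 10K in parallel
   with 49K on top, 10K on the bottom, fed from 5.0 V. */
double divider_out(double r_top_a, double r_top_b,
                   double r_bottom, double vcc)
{
    /* Parallel combination of the two top resistors. */
    double r_top = (r_top_a * r_top_b) / (r_top_a + r_top_b);
    /* Standard resistive-divider output. */
    return vcc * r_bottom / (r_top + r_bottom);
}
```

`divider_out(10000.0, 49000.0, 10000.0, 5.0)` works out to about 2.7315 V, which is what the differential input's (-) pin needs.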

The LM335 puts out 10 mV per degree K, and I knew one count on the A/D was about 5 mV, so one can read to half a degree K; the 0-5V range then covers about 1000 half-degrees, a span of roughly 500 degrees. You want more resolution, so the idea of a diff amp with an offset adjust that amplifies just the range you want up to 0-5V will work fine.

Imagecraft compiler user

mneary wrote:
EDIT: The LM335 needs a little bit more current than you get from a 27K resistor. iirc, it needs at least 400uA, and is characterized at 1.0mA. Resistor to 5V should be maybe 2.2K.

Thanks, that made all the difference. Really basic stuff; I should have been able to work that out. I was confusing the LM335 with a thermistor, having read that the resistor used should be close to the thermistor's resistance. Way wrong.

Now I am sort of getting the hang of it. I finally realise what bobgardner's tips do for me too. My readout is now in mV, so when the voltage is 3V my reading is 3000, etc. It makes it much easier to see what is happening.

Now to work out how to use the differential input on the Atmega16.

Where are you from? Fill in your profile... maybe there are freaks near you!

Imagecraft compiler user

I am in Australia :D

Welcome!

Imagecraft compiler user

Hi

Never heard of that town.
Looked it up using zoomin.
Just south of Newcastle & very north of Sydney.

Ken

bobgardner wrote:
Welcome!

Thanks Bob. All the people in here have been a great help to me.
pykedgew wrote:
Hi

Never heard of that town.
Looked it up using zoomin.
Just south of Newcastle & very north of Sydney.

Ken

Yep, on the central coast of NSW. About an hour drive north of Sydney, half an hour south of Newcastle. Used to be a place full of holiday homes, but now it's just another suburb of Sydney with housing developments everywhere.

I have been following this thread and decided to take some time to review the LM335 datasheet. As a result, I have drafted the following and was wondering what others might think of what I have come up with. I haven't actually tried it on a microcontroller, as I don't have an LM335 temperature sensor, but it seems mathematically reasonable, at least to me.

Looking at the LM335 datasheet, it says:
0 Deg K = -273.15 Deg C

0V = 0 Degrees K

and,

Vo = 0.01V/degree K

Selecting a temperature range of 0 to 100 Degrees C:

Base calculations:
0 Deg C = 273.15 * 0.01V = 2.7315 VDC
100 Deg C = (273.15 + 100) * 0.01V = 3.7315 VDC

Side Bar:
For fine adjustment, page 5 of the LM335 datasheet has a scheme, in the form of a schematic, for correcting the slope and hence, the accuracy of the output voltage representing temperature.

Directly below that schematic diagram is a foot note that reads:

Quote:
"Calibrate for 2.982 V @ 25 Degrees C"

For a temperature of 25 Degrees C:
Vo = 0.01 * (273.15 + 25) = 2.9815 V, which tends to prove my base calculations. I leave the rest for you to prove out.

So, the LM335 output voltage range for a temperature range of 0 to 100 Degrees C is:
2.7315V < Vout < 3.7315V

In so many words, the output voltage of the LM335 temperature sensor from 0 to 100 Degrees C is:
2.7315V to 3.7315V = 0 to 100 Degrees C

Delta V (or change in output voltage) throughout the 0 to 100 degree temperature range of the LM335 is:
3.7315V - 2.7315V = 1.0000 VDC

You can use Y = mX + b, where:

Y = the temperature, in degrees C
m = the slope of the line of the linear graph (Delta Y)/(Delta X)
X = the LM335 output voltage (ultimately the ADC value) representing the measured temperature
b = the vertical offset from the origin of the line of the linear graph

To determine the temperature that the LM335 output voltage represents, first find the slope:

m = (Delta Y)/(Delta X) = (100 - 0)/(3.7315 - 2.7315) = 100 / 1 = 100

b = 0

But the X axis is offset so: Y = m(X - X1) + b
Where X1 = 2.7315

Y = m(X - 2.7315) + b = 100(X - 2.7315) + 0

Because b = 0, the equation reduces to:

Y = 100(X – 2.7315)

With ADC AREF = 3.7315 VDC, one 10-bit ADC step is 3.7315 / 1024 ≈ 0.0036 V, which corresponds to about 0.36 Degrees C per step through the equation above:

T = 100((ADC * 3.7315 / 1024) – 2.7315)

Note that only ADC readings above about 750 fall inside the 0 to 100 Degrees C span, so this arrangement gives direct measurement of temperature to roughly 0.4 Degrees C resolution. To read to 0.1 Degrees C or better, the 1 V span needs to be offset and amplified before the ADC rather than fed in directly.

I did this long winded example to show how I arrived at the solution I would use if wanting to measure temperature using the LM335.
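To make the mapping concrete, here is the Y = 100(X – 2.7315) equation as code. This is a sketch only: it assumes a 10-bit ADC with AREF = 3.7315 VDC and uses floating point for clarity, whereas an AVR implementation would normally use scaled integers instead:

```c
#include <stdint.h>

/* T(degC) = 100 * (Vout - 2.7315), where Vout is the LM335 output.
   With a 10-bit ADC and AREF = 3.7315 V, one count is
   3.7315 / 1024, roughly 3.64 mV. */
double lm335_deg_c(uint16_t adc_count)
{
    double vout = (double)adc_count * 3.7315 / 1024.0; /* counts -> volts */
    return 100.0 * (vout - 2.7315);                    /* volts -> degC  */
}
```

At 25 Degrees C the LM335 sits at 2.9815 V, which is an ADC count of about 818, and `lm335_deg_c(818)` comes back just under 25.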

You can avoid reality, for a while.  But you can't avoid the consequences of reality! - C.W. Livingston

Last Edited: Sun. Jan 14, 2007 - 06:49 AM

I did think of using an AVREF of 1.024V to get the 0.1 degree resolution, but wouldn't the LM335 need to output between 0 and 1.024V for that to work? Or have I got that all wrong?

Taipan wrote:
I did think of using an AVREF of 1.024V to get the 0.1 degree resolution, but wouldn't the LM335 need to output between 0 and 1.024V for that to work? Or have I got that all wrong?

I've corrected that...

Take another look at the math that I've outlined in my previous post.

You can avoid reality, for a while.  But you can't avoid the consequences of reality! - C.W. Livingston

I don't think you can have AREF as low as 1V; maybe on the 3.3V version you can, but not the 5V version.

I think I have worked it out. Just wondering if someone can tell me whether I got it right... or not. I have AREF set to external 5V. I have set ADMUX so that ADC1 is the positive differential input and ADC0 is the negative differential input, with 10x gain, by doing this:

`ADMUX = (1<<MUX3) | (1<<MUX0);`

I have made the 2.7315V reference using the voltage divider that mneary suggested and connected this to the negative differential input. I have connected the LM335 to the positive differential input. Is that the correct way to do it?

To test it I used a pot as a voltage divider, marked roughly where 2.7 to 3.2V falls, and connected it in place of the LM335. I got readings of approximately 0-500 over that voltage range, which is what I was expecting, and which should translate to 0.0 to 50.0 degrees C from the LM335. I think.

Another basic problem: so far I have only worked with integers, using itoa to convert to a string to write to the LCD screen. How would I do this with numbers to 1 decimal place?
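One common integer-only trick for the decimal display is to keep the temperature as tenths of a degree and split it when building the string. This is a sketch: it assumes the differential reading really does map to about one count per 0.1 °C (as the pot test above suggests), and it uses snprintf rather than any particular LCD routine:

```c
#include <stdio.h>
#include <stdint.h>

/* Format a temperature held as integer tenths of a degree
   (e.g. 234 means 23.4) into a string such as "23.4".
   Valid for non-negative values, which covers the 0-50 C range. */
void tenths_to_string(int16_t tenths, char *buf, size_t buflen)
{
    snprintf(buf, buflen, "%d.%d", tenths / 10, tenths % 10);
}
```

So if the differential ADC reads 234, the display shows 23.4, with no floating point and no extra itoa gymnastics.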