## Minimum ADC resolution for PT100 measurement


Hi!

I have a question regarding PT100 measurement. From your own experience (if you have any), would you use, or have you used, a built-in 12-bit ADC (like in the new XMEGA) for PT100 measurement? What was your experience with it? What limits did the system have?

I once used the 10-bit ADC of the AT90CAN128 for PT100 measurement, but it was only ±4 °C accurate. From that experience, I'm not sure whether a 12-bit ADC would be enough for ±1 °C or better accuracy.

If you have experience in this field, please feel free to share.

Thanks

A 12-bit ADC has 4 times the count of a 10-bit ADC. So if you only got 4 °C resolution out of the 10-bit ADC, you would be at 1 °C resolution with the 12-bit ADC.
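As a rough sanity check, the ideal step size scales directly with the bit count (a sketch; the 0–400 °C span is an assumed illustration, since the real span depends on the front end and reference):

```python
# Ideal step size per LSB over a hypothetical 0-400 degC span.
# The real span depends on front-end gain, reference voltage, and
# the (slightly nonlinear) PT100 resistance-temperature law.
SPAN_DEGC = 400.0  # assumed full-scale range, illustration only

res = {bits: SPAN_DEGC / 2**bits for bits in (10, 12)}
print(res)  # two extra bits quarter the step size
```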

But the resolution also depends on the magnitude of the temperature you are attempting to measure, right?

The thing to consider, as well, is that the finer the resolution of the ADC (number of bits), the harder it is to manage the noise in the connecting wiring and surrounding circuitry, forcing you to consider a 4-wire RTD measuring scheme.

There will be some point where the real-world noise level starts approaching the resolution of the ADC, and extraordinary measures will be required to minimize the ambient noise.

After all, if the ambient noise is two or three times the minimum voltage an ADC can represent, the least significant bit or two of any measurement system become useless.

You can avoid reality, for a while.  But you can't avoid the consequences of reality! - C.W. Livingston

http://cds.linear.com/docs/LT%20Journal/LTC2400_1100_Mag.pdf
See "Figure 3.Half-bridge interface" to a 24-bit Delta Sigma ADC.

"Dare to be naïve." - Buckminster Fuller

A PT100 produces a resistance change on the order of 0.385 ohm/°C, with a base resistance of 100 ohms at 0 °C.

There must have been something really screwy with the setup if a ten-bit A/D produced a measurement with a tolerance of ±4 °C.
I use a commercial 12-bit A/D (inside a PLC module) and a class A RTD sensor for an error of 0.3 °C ±1 bit.

ignoramus wrote:

There must have been something really screwy with the setup if a ten-bit A/D produced a measurement with a tolerance of ±4 °C.

Well, looking at the AT90CAN128 datasheet I would have expected a little bit better results too, but it's basically not a very precise ADC.

Right now I'm looking at a quite cheap solution, which is a little bit overkill (regarding resolution and precision), but I like the basic idea behind it - ratiometric measurement - because then all you need is one precision resistor.
I'm reading Microchip's Application Note AN1154, "Precision RTD Instrumentation for Temperature Sensing": http://ww1.microchip.com/downloa...

and I'm having a hard time understanding the statement on page 3, in the "RA Tolerance and Measurement Accuracy" section: if I use a 1% precision resistor, why would I have a 20 °C measurement error?

According to the application note, RTD is calculated with Equation 1:
RTD = Ra*(Code/(Max-Code))

Which means to me that if Ra has a 1% tolerance, then the error on RTD will also be 1%.

A 1% error on a PT100 at 0 °C is 100*0.01 = 1 ohm.
A PT100 has roughly a 0.385 ohm/°C temperature coefficient, so a 1 ohm absolute error would give a little over 2 °C of error. BUT DEFINITELY NOT 20 °C.

Correct me if I'm wrong.
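The arithmetic above can be written out explicitly (a sketch; 0.385 ohm/°C is the nominal PT100 sensitivity near 0 °C):

```python
R0 = 100.0      # PT100 resistance at 0 degC, ohms
TOL = 0.01      # 1% resistor tolerance
ALPHA = 0.385   # nominal PT100 sensitivity near 0 degC, ohm/degC

err_ohm = R0 * TOL          # worst-case resistance error
err_degc = err_ohm / ALPHA  # equivalent temperature error
print(err_ohm, err_degc)    # 1.0 ohm, about 2.6 degC
```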

With an RTD you are looking at a fixed change in resistance per degree C.
At zero degrees C the resistance of a PT100 RTD is 100 ohms.

The bias resistor used in the example is 6800 ohms.
A 1% change in 6800 would represent significantly more than a 1% change in the RTD to maintain constant voltage across the RTD at some reference temperature.
A better way would be to use a PNP transistor biased for fixed-current-source operation (and temperature compensated with a diode junction in the bias network).
This way the RTD will operate from a constant current source, which will not be affected by RTD resistance changes.

ignoramus wrote:
With an RTD you are looking at a fixed change in resistance per degree C.
At zero degrees C the resistance of a PT100 RTD is 100 ohms.

The bias resistor used in the example is 6800 ohms.
A 1% change in 6800 would represent significantly more than a 1% change in the RTD to maintain constant voltage across the RTD at some reference temperature.

A better way would be to use a PNP transistor biased for fixed-current-source operation (and temperature compensated with a diode junction in the bias network).
This way the RTD will operate from a constant current source, which will not be affected by RTD resistance changes.

And you want to use as small a bias current as you can possibly get away with, to avoid "Self heating" of the RTD element.

You can avoid reality, for a while.  But you can't avoid the consequences of reality! - C.W. Livingston

ignoramus wrote:
With an RTD You are looking at fixed change in resistance per degreeC.
At zero degrees C the resistance of a PT100 RTD is 100 ohms.

Yes I know that. I wrote that.

Quote:
The bias resistor used in the example is 6800 ohms.
1% change in 6800 would represent significantly more than 1% change in the RTD to maintain constant voltage across the RTD at some reference temperature.

If I have an equation like this:

x = a*y
and I introduce a 1% change in "y" by multiplying it by 1.01, then I suppose that "x" will change its value by 1% too.

So back to the original equation in the Application Note:
RTD = Ra*(Code/(Max-Code))

I would think that the calculated value of RTD will be off by 1% compared to the real value if Ra happens to be 1% higher (or lower) in value (6868 ohms) than what is calculated with (6800 ohms).
So this is more of a calculation error in this case, because the real value of Ra is not known (unless one measures it very precisely).
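The proportionality argument is easy to check numerically with Equation 1 from the app note (the Code and Max values below are arbitrary placeholders):

```python
def rtd_from_code(ra, code, max_code):
    # Equation 1 from AN1154: RTD = Ra * (Code / (Max - Code))
    return ra * code / (max_code - code)

code, max_code = 3000, 65536      # arbitrary example reading
nominal = rtd_from_code(6800.0, code, max_code)
actual = rtd_from_code(6800.0 * 1.01, code, max_code)  # Ra really 1% high
rel_err = (actual - nominal) / nominal
print(rel_err)  # 0.01: a 1% Ra error passes straight through to RTD
```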

Just wondering... how do you verify and validate your PT100 circuit? Send it out to some specialized lab that compares it to an even more accurate temperature sensor?

Quote:
And you want to use as small a bias current as you can possibly get away with, to avoid "Self heating" of the RTD element.

Self-heating starts to have a measurable effect on a PT100 above 500 µV, according to the app note.

My question regarding this: if you use a small current, then you have a small voltage drop across the PT100, and then the noise immunity will be worse?

lammelm wrote:

If I have an equation like this:

x = a*y
and I introduce a 1% change in "y" by multiplying it by 1.01, then I suppose that "x" will change its value by 1% too.

No, it will be the 1% error, multiplied by the value of a, plus or minus the error in a.

You can avoid reality, for a while.  But you can't avoid the consequences of reality! - C.W. Livingston

jayjay1974 wrote:
Just wondering... how do you verify and validate your PT100 circuit? Send it out to some specialized lab that compares it to an even more accurate temperature sensor?

For now we are talking about whether a 1% bias resistor makes a 20 °C error or just 2 °C.
This is pretty easy to validate.
My aim is an accuracy of around ±0.1 °C, for which I will have to ask some professional people to validate it, or get a very precise thermometer.

lammelm wrote:
Quote:
And you want to use as small a bias current as you can possibly get away with, to avoid "Self heating" of the RTD element.

Self-heating starts to have a measurable effect on a PT100 above 500 µV, according to the app note.

My question regarding this: if you use a small current, then you have a small voltage drop across the PT100, and then the noise immunity will be worse?

That is what differential amplifiers, Wheatstone bridges, 4-wire measuring schemes, and good shielding against ambient electrical noise are for.

Single-ended measurement schemes are for low accuracy applications.

Somewhere in the design process, you need to determine what the acceptable resolution and accuracy requirement needs to be and design accordingly.

Good design engineers do a fine job of controlling sensitivity and noise, and they always get the specified range and resolution, right? And those same engineers have something else to deal with that you don't - cold junction compensation.

You can avoid reality, for a while.  But you can't avoid the consequences of reality! - C.W. Livingston

The problem with the 1% spec is as follows:

assume constant temperature -> RTD does not change resistance
assume constant supply to the bias circuit

Voltage at the input to the ADC is

Vsupply * Rrtd / (Rbias + Rrtd)

Let's put in some numbers:
Vsupply = 5 volts
Rrtd = 100 ohms
Rbias = 6800 ohms
so the ADC input is 5/69 volts, about 72.5 mV

Now let's see what happens when Rbias changes by 1% (to 6868 ohms).

The voltage into the ADC is 5/69.68 volts, about 71.8 mV.

We now need to ask ourselves what does this correspond to in terms of temperature error.

In other words, what would the value of Rrtd have to be to give us this reading if we still assume Rbias is 6800?

Does this help with the 1% conundrum?
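To put a number on it, here is the same divider worked through in code (a sketch of the post above; 0.385 ohm/°C is the nominal PT100 sensitivity). With the bias resistor actually 1% high but still assumed to be 6800, the inferred RTD value comes out about 1 ohm low, i.e. roughly 2.6 °C of error - not 20 °C.

```python
VSUP = 5.0       # supply voltage, volts
R_RTD = 100.0    # true RTD resistance (0 degC), ohms
R_BIAS = 6800.0  # nominal bias resistor, ohms
ALPHA = 0.385    # nominal PT100 sensitivity, ohm/degC

def divider_v(r_rtd, r_bias):
    # ADC input voltage of the RTD / bias-resistor divider
    return VSUP * r_rtd / (r_bias + r_rtd)

# Voltage actually measured when the bias resistor is 1% high
v_meas = divider_v(R_RTD, R_BIAS * 1.01)

# Resistance we infer while still assuming R_BIAS = 6800
r_inferred = R_BIAS * v_meas / (VSUP - v_meas)

err_degc = (r_inferred - R_RTD) / ALPHA
print(r_inferred, err_degc)  # about 99.01 ohm, about -2.6 degC
```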

There is some very basic physics you can utilise to calibrate the RTD sensor.

Triple point of water and boiling point of water at STP.
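Those two fixed points give a simple two-point linear correction (a sketch; 138.51 ohms is the nominal IEC 60751 PT100 value at 100 °C, and a real calibration would use the Callendar-Van Dusen equation rather than a straight line):

```python
def two_point_cal(r_low, r_high, t_low=0.01, t_high=100.0):
    # Linear resistance-to-temperature map pinned at the triple
    # point of water (0.01 degC) and the boiling point (100 degC).
    slope = (t_high - t_low) / (r_high - r_low)
    return lambda r: t_low + slope * (r - r_low)

to_temp = two_point_cal(100.0, 138.51)  # nominal PT100 readings, ohms
print(to_temp(119.4))  # a mid-scale reading, roughly 50 degC
```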

ignoramus wrote:
There is some very basic physics you can utilise to calibrate the RTD sensor.

Triple point of water and boiling point of water at STP.

And you need to get some clean water :) Tap water with different dissolved minerals can shift these points significantly - by a few to several tenths of a °C.

A formula's sensitivity to small changes in parameters is fairly complex to calculate algebraically.

OTOH, you have a calculator. Put in some test values. Then alter one of the parameters by 1% and recalculate. Observe the result.

In practice, many parameters can change by a few percent.

David.
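David's perturb-and-recalculate approach can be automated for any formula (a sketch, using the divider from earlier in the thread as the example):

```python
def sensitivity(f, params, delta=0.01):
    # Relative change in f(**params) when each parameter is bumped
    # by delta (here 1%) on its own, the others held fixed.
    base = f(**params)
    out = {}
    for name, value in params.items():
        bumped = dict(params, **{name: value * (1 + delta)})
        out[name] = (f(**bumped) - base) / base
    return out

def divider(vsup, r_rtd, r_bias):
    return vsup * r_rtd / (r_bias + r_rtd)

s = sensitivity(divider, {"vsup": 5.0, "r_rtd": 100.0, "r_bias": 6800.0})
print(s)  # vsup: +1%, r_rtd: just under +1%, r_bias: about -1%
```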