## Discriminating Small Resistances


I am working toward some electronics for model rockets and I've come up against a problem that I'm not sure how to solve. As part of the error checking in the ignitor circuit I want to be able to check the continuity state, which to me means three possibilities: valid, open-circuit, and shorted. The main problem is that a valid resistance will be in the range of 0.5 ohms to 10 ohms, somewhere out at the end of a 2' wire. I think I can handle the open-circuit-to-valid transition, but I just can't figure out how to deal with the low end. A compounding problem is that since this is an ignitor, I want to use the absolute minimum current possible to make the reading. I can take it in short bursts, of course, to minimize the heating, but to be on the ultra-safe side I would like to keep the reading current below 5 mA.

So, are there any ideas on how to determine the difference between 0.5 ohms and 0 ohms at the end of what might as well be a two-foot antenna?

Martin Jay McKee

P.S. Lest anybody think that I'm leaving all the hard work to everyone else: at a later date I'll also need to figure out protection from the ignition voltage, and an analog method (as opposed to the three-state one) of reading the resistance proper.

As with most things in engineering, the answer is an unabashed, "It depends."

Stealing Proteus doesn't make you an engineer.

Let's do the Ohm's Law dance for a bit here. Assume your 5 mA maximum. 10 ohms will give you a 50 mV drop and 0.5 ohms will give you a 2.5 mV drop. That's not much to work with, even with Arnold's 4-wire system, which is about your only hope.

The common scheme with 4 wires is to put the current through two wires (one on each side of the test resistor). With no current through the other two, they have no voltage drop of their own and read the actual voltage across the resistor.
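To make that arithmetic concrete, here is a quick sketch of the voltage drops involved (Python used purely as a calculator; the 5 mA figure and 0.5-10 ohm range come from the posts above):

```python
# Four-wire (Kelvin) sensing: the test current flows through one pair of
# wires, while a separate sense pair carries essentially no current,
# so lead resistance adds no drop to the measured voltage.
def sense_voltage(i_test, r_igniter):
    """Voltage across the igniter as seen by the sense pair (Ohm's law)."""
    return i_test * r_igniter

v_low = sense_voltage(0.005, 0.5)    # 0.0025 V = 2.5 mV at the low end
v_high = sense_voltage(0.005, 10.0)  # 0.05 V = 50 mV at the high end
print(v_low, v_high)
```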

You might consider something like 50 mA, on for a few milliseconds and off for many seconds. Since your tester generates the pulses, it "knows" when they occur and can measure precisely while the current is flowing. This gives you a 10x improvement, and your voltage range becomes 25 mV to 500 mV. A 10-bit ADC using a Vref of 5 V has about 5 mV resolution, so you should be able to distinguish between a short and a 0.5 ohm igniter. Using a lower reference, say 2.5 V, would give you a further 2x improvement in resolution.
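A quick sanity check of those numbers (the 50 mA current and ADC parameters are taken from the post above):

```python
# 10-bit ADC resolution and expected counts at a 50 mA test current.
def lsb(vref, bits=10):
    """Size of one ADC step in volts."""
    return vref / 2**bits

def counts(v, vref, bits=10):
    """Raw ADC reading for input voltage v (ideal, no noise or offset)."""
    return int(v / lsb(vref, bits))

v_short = 0.050 * 0.5   # 25 mV across a 0.5-ohm igniter
v_max = 0.050 * 10.0    # 500 mV across a 10-ohm igniter
print(lsb(5.0))                                  # ~4.9 mV per count at Vref = 5 V
print(counts(v_short, 5.0), counts(v_max, 5.0))  # 5 and 102 counts
print(counts(v_short, 2.5))                      # 10 counts at Vref = 2.5 V
```

Five counts above a dead short is workable but leaves little margin for noise, which is why the lower reference helps.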

So, you SHOULD be able to do it without any op-amps or anything special other than the 4-wire setup. Almost any AVR with an ADC should be able to handle it.

Jim

Until Black Lives Matter, we do not have "All Lives Matter"!

Thanks, both. I'm not sure where my head went, as I totally forgot about the four-wire setup. That will certainly improve the situation. And yes, it was the 2.5 mV drop that was most troubling.

As for an increased test current, it may be safe to increase it some, but probably not by a factor of ten. The trouble is that some ignitors will trigger all the way down to 75 mA (with a longer pulse, granted), and I don't want to tempt fate: a factor-of-two safety margin is possible, and four is better. Then again, I was more or less planning on amplification anyway, so it won't be a problem if an op-amp or three is needed in the long run.

I'll see what I can work out, having gotten a bit more direction.

Martin Jay McKee

As with most things in engineering, the answer is an unabashed, "It depends."

Here's an example of how you could check the ignitor circuit with an ATtiny25:

The ADC of the ATtiny25 can operate in unipolar differential mode. In this mode the differential voltage between two input pins is amplified (1x or 20x) by the internal gain stage and then fed into the ADC.

In unipolar-differential mode a differential voltage in the range of 0V to VREF/GAIN is converted into a 10-bit integer value.

With VREF = the internal 2.56 V reference (tolerance: 2.33 V...2.79 V)
and GAIN = 20x, the ADC can measure voltages in the range of at least 0 V to 2.33 V / 20 = 116.5 mV.

If PB2 is set LOW, a current of approx. 10 mA will flow into the ignitor circuit, causing a voltage drop in the range of 5 mV...100 mV.

This voltage drop is amplified by 20 and then quantized by the ADC.

With VREF = 2.79 V you get a worst-case resolution of approx. 2.79 V / 20 / 1024 = 136 µV.
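The range and resolution figures above can be checked directly (the reference tolerance, gain, and bit depth are the values quoted in this post):

```python
# Unipolar differential mode: input range and resolution with the
# internal 2.56 V reference (2.33-2.79 V tolerance), 20x gain, 10 bits.
GAIN = 20
BITS = 10

def input_range(vref):
    """Largest differential input voltage the gain stage can pass."""
    return vref / GAIN

def resolution(vref):
    """Smallest voltage step the ADC can distinguish."""
    return vref / GAIN / 2**BITS

print(input_range(2.33))  # 0.1165 V: the worst-case 116.5 mV range
print(resolution(2.79))   # ~0.000136 V: the worst-case ~136 uV step
```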

The circuit needs to be calibrated. I recommend you read the application note AVR120: Characterization and Calibration of the ADC on an AVR.

From the ATtiny AVR series you could use:
ATtiny25/45/85 (8 pin)
ATtiny261/461/861 (20 pin)

From the ATmega series you could use almost any AVR with 40 pins and up with a built-in ADC. Usually these parts come with a selectable 10x gain and support only a bipolar differential mode, which gives you only 9-bit resolution. With a 10x gain you would also have to double the test current.

Don't start the ADC conversion directly after enabling the current source; the input network needs to settle first. With the given values you should wait (15 kOhm + 15 kOhm) * 10 nF * 5 = 1.5 ms.
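That settling figure works out as five RC time constants of the input network (the 15 kOhm / 10 nF values come from the schematic described above):

```python
# After ~5 time constants the input has settled to within ~0.7% of its
# final value, which is close enough to start converting.
R_TOTAL = 15e3 + 15e3  # two 15 kOhm resistors in series, in ohms
C = 10e-9              # 10 nF, in farads
N_TAU = 5              # number of time constants to wait

settle_s = R_TOTAL * C * N_TAU
print(settle_s)  # 0.0015 s, i.e. the 1.5 ms quoted above
```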

Regards
Sebastian


Silly idea... why not charge a little capacitor up to a known voltage, then connect it to the ignitor with a MOSFET or similar, and measure the time it takes the cap to discharge?

Sebastian,
Thank you very much for the circuit and explanation. Happily (and somewhat surprisingly, given my analog design skills) I was thinking along the same lines, though as I've been working on a different part of the design most of the day, I hadn't put values on anything. I also hadn't thought about the disadvantages of the bipolar differential mode; yet more landmines to watch out for.

I have read through the application note (AVR120), but it has been a while and I've never needed the extra accuracy it can give. Now is certainly the time to look at it more closely.

Jayjay,
I'll have to think about that a bit; there would be a tradeoff between the resolution (how long the discharge is) and how much power is discharged, and I'm not sure where it would sit in comparison. But it's an idea I should at least be able to disregard in an educated way.

Thanks again.

Martin Jay McKee

As with most things in engineering, the answer is an unabashed, "It depends."

Martin, note that calibration can be done very easily. If your application operates in a small temperature range, you can expect that the offset of the internal amplifier won't drift significantly, so you don't even need to measure the amplifier offset with the ADC.

For calibration it would be sufficient to do a two-point measurement:
1. Replace the ignitor circuit with a 0.5 Ohm resistor, measure the voltage drop across the resistor via the ADC, and store the result in the internal EEPROM.
2. Repeat the process with a 10 Ohm resistor.

For calibration you may consider oversampling the reference signal and averaging the result.
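A minimal sketch of that two-point scheme (the raw ADC counts below are made-up placeholders; real values would come out of the EEPROM after calibration):

```python
# Two-point calibration: measure known 0.5-ohm and 10-ohm references,
# store the readings, then interpolate linearly between them.
def make_converter(counts_lo, counts_hi, r_lo=0.5, r_hi=10.0):
    """Return a function mapping raw ADC counts to resistance in ohms,
    assuming the channel is linear between the calibration points."""
    slope = (r_hi - r_lo) / (counts_hi - counts_lo)
    return lambda raw: r_lo + (raw - counts_lo) * slope

adc_to_ohms = make_converter(counts_lo=40, counts_hi=760)  # placeholder counts
print(adc_to_ohms(40))   # 0.5 ohms at the low calibration point
print(adc_to_ohms(760))  # 10.0 ohms at the high calibration point
```

The linear fit also absorbs the gain error of the amplifier, which is why only two reference resistors are needed.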

jayjay's idea is interesting, but I think it would be a bit tricky because of the very low resistance of the ignitor circuit:
- The cable resistance would influence the result.
- Discharging a 2.2 µF capacitor from 5 V to 1.5 V via a 0.5 Ohm resistor takes about 1.3 µs. Thus you would need a large capacitor to get reasonable results, or you would have to run the AVR at its maximum frequency.
- You would have to test at which capacitor size your ignitor circuit is triggered by the test pulse.
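The 1.3 µs figure follows from the standard RC discharge formula, t = RC * ln(V0/V1), with the values from the post above:

```python
import math

# Time for a capacitor c to discharge from v0 to v1 through resistance r.
def discharge_time(r, c, v0, v1):
    return r * c * math.log(v0 / v1)

t = discharge_time(0.5, 2.2e-6, 5.0, 1.5)
print(t)  # ~1.32e-06 s: about 1.3 us, far too fast for an AVR to time
```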

Regards
Sebastian

For a simple, highly sensitive milliohmmeter, try

http://iq-technologies.net/proje...

This is what I had hoped I'd find, in preference to my post above.

I built this unit and used it with great success in my previous venture (a bare-board testing service).

Your ignitor ought not to be triggered by a 0.1 V peak signal.