How to Scale ADC value?


Greetings folks -

 

This question has its roots in a hardware issue, so let's start there.

 

This involves my M328P accelerometer/data logger system that I sell. At the present time, I just allow it to die when the battery runs down. This can potentially result in the loss of up to 24 hours of data as the data file on a uSD card is closed, and a new one opened, every 24 hours. Also, however, there IS the potential for faulty card operation if the supply voltage drops below the spec'd minimum and this could lead to invalid data being stored; in many ways, this is even worse than data loss.

 

So, my plan is to save a reading of the reference taken with a known good battery and 3.3V supply (as ADC reference) during manufacture, and save that in EEPROM. When the battery discharges, Vcc will decrease. Since it is the ADC conversion reference, Vref READINGS will increase. I will choose some arbitrary threshold, perhaps (3.3/3.0 = 1.1) of the original reading, as the cutoff point. Since there are tolerances involved (esp. for the reference), I want the cutoff value to be a proportion of the original reading, not some fixed number.

 

So, we finally get to the programming question. Is there some process other than floating point that I can use to compute the cutoff value? I can get close enough if the ratio is 1.1, but what if I want the cutoff to be 2.9V? Then the ratio becomes 1.138. Maybe 1.125 would be close enough, and that would be easy to do. But is there something more general that does not involve floating point? Note that I plan to store the scaled value in EEPROM, so that it does not have to be computed over and over.

 

Ideas/suggestions/etc. are thus solicited.

 

Many thanks

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

Last Edited: Thu. Aug 23, 2018 - 09:25 PM

 

 

But, is there something more general that does not involve floating point

Well, how much more general? If the allowed % droop is always the same (say, for example, 13.8%), then a quick integer scaling can readily be applied. Of course, ultimately you are just checking against an integer ADC value.

 

So say you read an initial CAL reading of 341 (for a 1.1V internal ref and AVcc = ADCref = 3.3V). As AVcc droops with the battery, this initial value will climb.

 

So, for ~13.8%, in the AVR CAL MODE simply take your initial cal reading, multiply by 291 (easy), then divide by 256 (easier: toss the low byte). Your CAL reading of 341 thus becomes 388 (rounded) & is stored for future use.

 

note 291/256=1.136 (gives 13.6%)

 

Note that you should be prepared to multiply 1023 by a cal value (say, up to 500 max); 1023 * 500 needs up to 19 bits for the result.

 

Thereafter you steadily read your ADC & compare to this stored threshold (388)...when you cross the line, shutdown!

 

You could instead constantly recalc the threshold based on merely storing the cal reading (341), or store the cal reading & constantly rescale the ADC in a similar calculation.

However, it seems more efficient to calc the threshold one time, during cal, then simply compare each ADC reading against it.

 

If you want to vary the %, during cal, it's not hard to figure out a corresponding multiplier...keep the divider at 256.

   

 

 

 

 

    

When in the dark remember-the future looks brighter than ever.   I look forward to being able to predict the future!


Thanks.

 

Need to think about that a bit.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


In general I do integer scaling via V2 = (M * V1) / D. Then I make D a power of 2 so I end up with V2 = (M * V1) >> S. If you make S equal to 8 or 16, I've found that avr-gcc is smart enough to just whack off the low byte or word, making it even quicker (but it's rare that scaling needs to be extremely fast anyway).

 

Keep in mind that your maximum error using this method should never exceed 1/(2^S), or half that if you round up before the shift.  Also remember that your intermediate (M * V1) may overflow an integer and you may need to promote at least one of the values (e.g. by declaring M as a uint32_t) to assure a 32-bit intermediate result.

 

EDIT: For your example of 1.138, you could do

 

Scaled_val = (74575UL * REF_val) >> 16.

 

74575 is (3.3/2.9) * 65536, rounded (a ratio of about 1.1379). The 65536 is in the multiplier because of the later divide by 65536 (>>16). The intermediate value will not overflow as long as REF_val is no more than 57592.

Last Edited: Thu. Aug 23, 2018 - 10:42 PM

I do not understand why you want to do this.

 

The internal reference of the ADC is pretty accurate, and if you use Vcc as the reference and measure Vref, there are also no components added (such as resistor dividers) that could introduce additional inaccuracies.

The biggest inaccuracy is probably your 3.3V "known good" power supply, and you don't want to fold that into your calculations.

So simply use a fixed value as the cut-off point, and add some averaging or smarter filtering just to be sure your backup function does not get triggered by some rogue ADC reading.

 

If you really want to calibrate, then calibrate to real parameters.

You could, for example, run the AVR from a big capacitor, then disconnect the power supply. During discharge of the capacitor, continuously write ADC values to a serial port or EEPROM until the AVR browns out. Then add a margin, and use that as the point to take action.

 

You can also run the AVR from the big capacitor, do a uSD write, and measure the voltage drop across the capacitor caused by the write.

With a 1F capacitor you may be able to do a final uSD write even if the battery is suddenly taken out of the system. And (try to) add a mark that the file might be unreliable, but it is of course probably better to add CRCs to your files as an integrity check.

Doing magic with a USD 7 Logic Analyser: https://www.avrfreaks.net/comment/2421756#comment-2421756

Bunch of old projects with AVR's: http://www.hoevendesign.com

Last Edited: Fri. Aug 24, 2018 - 12:15 AM

@ka7ehk wrote:

Need to think about that a bit.

Me too!  I'm still waiting for the spinning in my head to stop...

Greg Muth

Portland, OR, US

Xplained/Pro/Mini Boards mostly

 

Make Xmega Great Again!

 


Greg_Muth wrote:

@ka7ehk wrote:

Need to think about that a bit.

Me too!  I'm still waiting for the spinning in my head to stop...

I promise, it's not hard when it finally "clicks".  The biggest thing is to be able to calculate your limits to prevent intermediate value overflow (especially critical if you use a 16-bit intermediate and don't promote up to a 32-bit intermediate).


If you log the battery voltage as well, you can post process the data.
The other choices are fixed-point or floating-point calcs. If you're not concerned about code size or execution time, and you stay within the confines of its precision, then floating point is a simple solution. With fixed point, you get to decide the precision.


I need to determine the shutdown threshold only once, BUT I am getting low on code space: now at about 87% on the M328, with a bit more processing still to add, so I don't want to use up memory needlessly. There are no other FP operations in the whole system.

 

Right now, I am thinking that 1.125 is an adequate factor. It will be above the minimum supply voltage for the uSD (at least the one I looked at). And, at this voltage, both the battery and the boosted supply drop pretty fast with operating time. At that point, I would just like to close the data file properly so the data is preserved, shut down everything else except the RTC to maintain a time-base correction capability, and that's it.

 

So, I think that is what I will do. Thanks for the suggestions, folks. This really has helped me think through what is really necessary, here, and what isn't.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


I do not understand why you want to do this.

I think he is actually doing what you suggest...

 

You have 2 voltages, the internal 1.1V fixed & the external.  Either can be used as the ADC ref & either can be measured by the ADC.  One is always measured as a ratio of the other.

 

A) If you pick the higher ext voltage as the ADC ref, then the known internal ref can be measured against it (and, mathematically, the reverse, since the internal voltage is already known).

 

B) If you pick the internal 1.1V ref as the adc ref, then anything measured must be supplied at a lower voltage--implying an energy-robbing resistor divider on the 3.3V.

 

Hence option A is "best" & since both options are merely ratios, the resulting goodness should be nearly identical.

 

 

@Muth:

I'm still waiting for the spinning in my head to stop...

   but it adds stability!   cheeky

When in the dark remember-the future looks brighter than ever.   I look forward to being able to predict the future!

Last Edited: Fri. Aug 24, 2018 - 12:59 AM

avrcandies wrote:

A) If you pick the higher ext voltage as the adc ref, then the known internal ref can be measured against it (& mathematically the reverse---since the internal voltage is already known.).

 

B) If you pick the internal 1.1V ref as the adc ref, then anything measured must be supplied at a lower voltage--implying an energy-robbing resistor divider on the 3.3V.

 

Hence option A is "best" & since both options are merely ratios, the resulting goodness should be nearly identical.

Agree

 

Quote:

@Muth:

I'm still waiting for the spinning in my head to stop...

   but it adds stability!   cheeky

"Ahh, I finally get it!"...topples onto the floor...


Guess I'm missing the key concept here.

 

If you have one I/O pin free, then couldn't you just use the pin to turn on either a high side PFET transistor switch, or a low side NFET, to turn on a resistor divider connected to the battery?

The resistor divider can use high values with a small cap, or low values (parallel resistance < 10K) without a cap.

It will draw a little bit of current, but the battery voltage doesn't change that quickly, and only episodic sampling ought to be necessary.

 

Depending upon the amount of code space available, one could tweak the sampling interval based upon the current battery value.

(Take more frequent samples as the voltage gets closer to the threshold, (yes, a vicious cycle...)).

 

One could use the internal ADC reference (as is alluded to above), or turn on an external (precision) reference with the same I/O pin used to turn on the resistor divider.

 

The up side of all of this is that then one simply stores an ADC value as the threshold, no conversions required.

If the characteristics of the "recommended" battery are known, then one might not even need to use the EEPROM, one might even get away with hard coding the threshold.

 

I guess it boils down to how miserly one wants to be with the battery's energy, and by just how much does episodic ADC sampling of the signal actually shorten the battery life?

 

Note that most of my battery projects run on a car battery, not something the size of a pencil eraser, so take the above for what it's worth!

 

JC 

 

 

 


The up side of all of this is that then one simply stores an ADC value as the threshold, no conversions required.

With the divider and other variabilities, a conversion to a comparison value (threshold) is needed at some point.

In either method, if only a sloppy trip level is needed, a fixed number can be hard coded (the other method uses no resistors, so perhaps a little less slop).

In either method, most AVRs have a +/-10% internal-ref tolerance... that's a lot of slop!

 

In the past I'd add the external FET, etc., but this other way saves parts (FET, resistors, etc.) & at least seems pretty dirt simple.

When in the dark remember-the future looks brighter than ever.   I look forward to being able to predict the future!


This is a product. I don't want to mod the board; it would cost me (relatively) a lot. But I want to improve the operation of the unit. One way of doing this is to make the end of battery life cleaner: close open files on the uSD card, for example. So, I can use the currently unused M328P ADC and the unused internal voltage reference. By using Vcc as the ADC reference, I can measure the voltage reference. When Vcc starts to drop, I know that the battery is near the end of its life and it is time to shut things down, cleanly.

 

I measure Vref using Vcc as reference because Vcc > Vref and I need no voltage divider (a mod that would also use extra power). So, it is very close to "free" (certainly, hardware-wise). And, it gives more predictable performance to the user. That is what I am after.
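On the ATmega328P, that measurement might look roughly like this (a register-level sketch; the settling delay and the MUX value for the bandgap channel should be checked against the datasheet, and F_CPU here is an assumed clock):

```c
#define F_CPU 8000000UL  /* assumed clock; set to the real value */
#include <avr/io.h>
#include <util/delay.h>

/* Read the internal 1.1V bandgap using AVcc as the ADC reference
   (REFS0 = AVcc ref, MUX[3:0] = 1110 = bandgap channel).
   As the battery sags, AVcc drops and this reading climbs. */
uint16_t read_bandgap(void)
{
    ADMUX  = (1 << REFS0) | (1 << MUX3) | (1 << MUX2) | (1 << MUX1);
    ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1); /* enable, clk/64 */
    _delay_ms(1);                 /* let the reference settle */
    ADCSRA |= (1 << ADSC);        /* start conversion */
    while (ADCSRA & (1 << ADSC))  /* busy-wait until done */
        ;
    return ADC;                   /* 10-bit result */
}
```

The returned value would then be compared against the scaled threshold stored in EEPROM at cal time.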

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


Another option would be to just shorten the time between closing the files to something

less than 24 hours, possibly configurable by the end user for their particular needs.

 

--Mike

 


Jim, you should not HAVE open files on the SD except for the very limited window when you write: open, seek to end, append, close. Don't leave files open all the time. FAT is notorious for corruption as it has no form of error recovery, so if the storage media is removed or power fails in mid-use, the allocation chain can be completely screwed. Back in the old PC DOS days, who remembers how often you'd run chkdsk and there'd be "orphaned chains" and so on? If you ever saw a FILE0000.CHK you know what I mean!


Cliff -

 

I'd like to do what you suggest, but I am concerned about the time and battery energy required to open the file. Everything has to be complete in 10ms and it typically happens every 100ms. The 10ms time comes from the rate at which the sensor provides raw samples and wakes up the logger system. The 100ms time is set by the typical number of raw samples that are combined into a "processed sample".

 

And then there is all that energy required to open and close the file every 100ms.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

Last Edited: Fri. Aug 24, 2018 - 02:38 PM

I'd log the sector reads to see how many we're talking about. Of course it will depend on the file layout on the disk. The fopen() will have to read directory sectors to find the file entry, which may involve more sector reads depending on how "busy" the directory is. Similarly, if you seek to a position that spans several AUs, there may be a number of FAT reads.


Thanks. I really don't know how to do the things you suggest. But, I will try.

 

As far as "file layout", I start with an "empty" disk, add ALOG0.csv, close it 24 hours later, then open ALOG1.csv, wash and repeat. So, I have no idea how things are arranged on the disk.

 

I simply write to the buffer (which I think is 256 bytes) and FatFS appears to flush that to the disk when it gets full. Each data write (a csv record) from my system (at the typical 100ms interval) is typically 40-45 bytes. 

 

Some have suggested a shorter file open time. Given the data and the post processing needed, that would start to be problematic for the end user. 

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


Opening and closing the file will not have an effect on post processing.
After creating the file, subsequent writes would append to the file. The user sees no difference. The benefit to your system is that the filesystem spends more time in a completed state.


I was not suggesting that opening and closing a file would affect post processing. What WOULD affect it is if the files are made shorter in duration, so that there are more of them for a given interval (say, several months or more). For the data this generates, there seems to be a "sweet spot" for post processing between files that are too large and too many files, and that spot seems to be in the vicinity of a 24-hour file duration.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


I wasn't suggesting changing the file structure. Create the same-size files with exactly the same data, but just don't leave the file open when not writing (instead, for each write, open it again and seek to the end before you write more).


Kartman is the one who seemed to suggest that I was relating file opening and closing to changes in post processing, which I did not intend to suggest.

 

I am concerned about the extra time and battery energy required to re-open the file for every write. The uSD card is probably the dominant energy user in the entire system; it appears to draw significantly more current while FatFS is actually writing to it. And I only have a 10ms window to get everything related to file storage completed, so I am worried about the extra time that opening and closing adds to the process. Sorry, but I am having a hard time quantifying all this, which I know I really need to do; it does not seem to be all that consistent across different brands of uSD cards.

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net

Last Edited: Sat. Aug 25, 2018 - 04:22 PM

ka7ehk wrote:

Greetings folks -

 

This question has its roots in a hardware issue, so let's start there.

 

This involves my M328P accelerometer/data logger system that I sell. At the present time, I just allow it to die when the battery runs down. This can potentially result in the loss of up to 24 hours of data as the data file on a uSD card is closed, and a new one opened, every 24 hours.

 

Whenever I do something that involves periodically recording data, I ALWAYS do a complete Open, Write, Flush, Close sequence precisely for the reason that the running code may be interrupted (by the user or, as in your case, by a low voltage condition). As far as the extra current needed to do this... use a correspondingly larger capacity battery pack.

 

You may also wish to use a CPU supervisor (typically a 3-pin TO-92 part that watches Vdd and pulls RESET low at a certain voltage).

 

Maxim / Dallas makes parts like these... and you could use one not to reset the CPU, but to drive a port pin that tells your program when the battery voltage has fallen to a certain point, so it can save, flush and close file(s) before the voltage drops further.

Gentlemen may prefer Blondes, but Real Men prefer Redheads!


This is an existing product. If I could have increased the battery pack size, I would have done so, long ago. Ditto, other changes mentioned. This is ONLY an attempt to improve an existing system. 

 

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net


ka7ehk wrote:

This is an existing product. If I could have increased the battery pack size, I would have done so, long ago. Ditto, other changes mentioned. This is ONLY an attempt to improve an existing system. 

 

Jim

 

OK then, what about using an SRAM buffer (an array) to record, say, 1 second's worth of readings (I assume that would be 10 readings?) or maybe 10 seconds' worth (100 readings), then do one big Open, Write, Flush, Close on the "big block" of data?

 

That would decrease the number of open/close cycles by a factor of 10 (or 100), and yet the most data you could lose due to battery depletion would be 1 second's worth (or 10 seconds' worth).

 

You said that each data block is about 45 bytes... so 10 seconds worth would be only 4500 bytes. Hopefully you have enough free SRAM for that?

 

To trigger a "write big block" operation you could use something like a uint32_t counter modulo 100: start at zero, increment each time you write to the RAM array, then dump the array to the SD card each time the counter hits a multiple of 100 (and the write to the array has finished).

 

Maybe?

 

Gentlemen may prefer Blondes, but Real Men prefer Redheads!


That is a great idea. I may not be able to make a buffer that large, but the principle is worth checking.

 

Thanks

Jim

Jim Wagner Oregon Research Electronics, Consulting Div. Tangent, OR, USA http://www.orelectronics.net