How long does sample and hold take?


In the mega64 and CAN128 data sheets, for free running or auto trigger modes, it says:

"The actual sample and hold takes place 1.5 ADC clock cycles after the start of a normal conversion"

On the timing diagram, sample & hold is shown as a single event.

But I need to know when the sample takes place and when the hold takes place. Perhaps in the data sheet they mean that the sample starts at 0 and the hold takes place after 1.5 ADC clock cycles. Does anybody know?


Look at your AT90CAN128 data sheet section “Analog Input Circuitry”. This is on page 273 of the 4250E CAN 12/04 revision sheet.

It shows an equivalent circuit representation of the sample and hold as a resistance/capacitance type of circuit. It takes time to charge the sample capacitor, and the ADC input circuit impedance that you design has the greatest effect on this sample time. It is always necessary to filter out any ADC input frequencies that are over half the ADC sample rate (Nyquist). Just calculate your programmed ADC clock speed, divide by the number of ADC cycles per conversion, and divide the total in half. Any ADC input signal frequency higher than this will not sample correctly, no matter what.
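As a rough worked example of that calculation (the numbers here are only illustrative, not from the data sheet), a small C sketch:

/* Nyquist estimate: ADC clock divided by cycles per conversion gives the
   sample rate; half of that is the highest input frequency the ADC can
   represent. Values below are assumptions for illustration. */
#include <stdio.h>

int main(void)
{
    double adc_clock_hz = 200000.0;       /* programmed ADC clock (example)  */
    double cycles_per_conversion = 13.0;  /* normal single-ended conversion  */
    double sample_rate_hz = adc_clock_hz / cycles_per_conversion;
    double nyquist_hz = sample_rate_hz / 2.0;

    printf("sample rate: %.1f Hz, Nyquist limit: %.1f Hz\n",
           sample_rate_hz, nyquist_hz);
    return 0;
}

With these example numbers the sample rate is about 15.4 kHz, so anything above roughly 7.7 kHz at the input needs to be filtered out.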

The single sample and hold event shown in the documentation is the point at which the sample gate is shut off. Up until this shutoff event, the sample and hold capacitor has been connected and charging or discharging as a function of its R/C time constant (again, your circuit design has a major effect on its responsiveness or lack thereof).
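To get a feel for that time constant, here is a sketch only; the ~14 pF sample/hold capacitance is taken from the analog input circuitry figure, and the source resistance is an assumed example:

/* Rough settling-time estimate for the sample capacitor. Settling to
   within 1/2 LSB of a 10-bit result needs about ln(2^11) ~ 7.6 time
   constants. Both component values are assumptions for illustration. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    double r_source_ohm = 10000.0;   /* example source impedance          */
    double c_sh_farad   = 14e-12;    /* assumed S/H capacitance (~14 pF)  */
    double tau          = r_source_ohm * c_sh_farad;
    double settle_s     = tau * log(pow(2.0, 11.0));

    printf("tau = %.0f ns, ~10-bit settling time = %.2f us\n",
           tau * 1e9, settle_s * 1e6);
    return 0;
}

With a 10 kΩ source that works out to roughly 1 µs, which is why a low-impedance driver makes the sample window much less critical.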

After the sample gate is shut off, the capacitor will hold its charge value at the same level, without responding to any input changes, until the ADC measurement cycle is finished.

The only thing we do know for sure is when the hold takes place, as shown in the data sheet as the “sample & hold” point. We also know that the sample circuit takes time to acquire a sample, as determined by its R/C timing, so the sample period cannot be instantaneous (it will take some period of time as determined by the circuit). The simplest assumption is that the sample period is shown on the data sheet timing diagrams as the time between “MUX and REFS Update” and “sample & hold” (this may or may not really be correct). ATMEL could clarify this.

No ADC sample and hold will ever be an exact representation of a constantly changing input signal. By following the design guidelines in the data sheet, it should be a very usable approximation of the actual input voltage at the moment the sample capacitor is isolated by disconnecting the sample gate.

All of the above only applies to single-ended ADC samples. The differential input samples have different considerations, covered in the data sheet.

The answer is that the sample takes place over a period of time, and the hold is the shutoff point after which input voltage changes are ignored until the next sample. As long as you stay well within the Nyquist limit, try experimenting with different input impedances to see what works best for your input voltages. Even if ATMEL gave you a hard number for the sample period, your circuit input impedance would still have the largest effect on sampling accuracy.


Quote:

The simplest assumption is that the sample period is shown on the data sheet timing diagrams as the time between “MUX and REFS Update” and “sample & hold” (this may or may not really be correct). ATMEL could clarify this.

Some other microcontrollers allow adjustment of the sample time to accommodate (say) high input impedance signals. It would be nice if we also had some control in AVRs. I suppose that we >>do<< have some--if you run a 50kHz ADC clock vs. a 200kHz ADC clock, wouldn't the sample time be 4x as long?
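As a rough check on that (just arithmetic on the 1.5-cycle figure, nothing extra from the data sheet):

/* The 1.5-ADC-cycle sample window scales with the ADC clock period, so a
   slower ADC clock gives a proportionally longer sample time. */
#include <stdio.h>

int main(void)
{
    double clocks_hz[] = { 50000.0, 200000.0 };   /* example ADC clocks */
    for (int i = 0; i < 2; i++) {
        double sample_time_us = 1.5 / clocks_hz[i] * 1e6;
        printf("ADC clock %.0f kHz -> sample window %.1f us\n",
               clocks_hz[i] / 1000.0, sample_time_us);
    }
    return 0;
}

That gives 30 µs at 50 kHz versus 7.5 µs at 200 kHz, i.e. the 4x difference suggested above.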

Lee

You can put lipstick on a pig, but it is still a pig.

I've never met a pig I didn't like, as long as you have some salt and pepper.


I should be OK, as my input is from an op-amp, so it's a low-impedance source.

I have no issues with Nyquist... the inputs are low frequency, but there are a lot of them to scan, so I'm using the maximum 1 MHz ADC clock.

I found a new feature on the CAN128 ADC (not sure whether the mega128 has it): there is a high speed mode where greater accuracy can be achieved at high ADC clock rates. As is to be expected, this comes at the expense of power consumption.

Unfortunately the ADC characteristics in the datasheet have a few blanks, so the actual accuracy has not yet been stated.
A nice feature, if it really makes any difference.
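For what it's worth, here is a minimal sketch of how that might be set up in avr-gcc C, assuming the ADHSM bit in ADCSRB described for the CAN128 and a 16 MHz CPU clock (check the register and bit names against your own datasheet revision):

/* Enable the ADC with the high speed mode bit set and a /16 prescaler,
   giving a 1 MHz ADC clock from a 16 MHz CPU clock (assumed values). */
#include <avr/io.h>

static void adc_init_high_speed(void)
{
    ADMUX   = (1 << REFS0);                 /* AVcc reference, channel ADC0 */
    ADCSRB |= (1 << ADHSM);                 /* high speed conversion mode   */
    ADCSRA  = (1 << ADEN) | (1 << ADPS2);   /* ADC enabled, prescaler /16   */
}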


Hello,
Having the same question as sparkymark, it would be helpful to know whether Mike's assumption is correct: is sampling (switch closes) started as soon as ADSC is set, and stopped (switch opens) after 1.5 ADC cycles?
Is this correct?

many thanks in advance
gerhard


Blast from the past!
I'm fairly certain that the sample starts straight away and the hold takes place after 1.5 ADC cycles.


Take a look at the AT90CAN128 revision 12/04 data sheet table 99 on page 270. If you interpret “Sample & Hold” to really mean “Sample Time Until Hold”, then the sample period will be 14.5 cycles, 1.5 cycles or 2 cycles from start of conversion, depending on which type of ADC conversion is being done.


Hello...

This old topic hasn't been answered yet.
I just needed that data too, so I tried to measure it.
The result seems to be that the sampling time is 1 ADC clock and the hold starts as per the datasheet.

I measured it in my own way:

cbi ADC pin          ; drive the ADC pin low
restart ADC          ; must restart the prescaler!
nop, nop, ...        ; variable number of NOPs
sbi ADC pin          ; just a one-cycle high pulse
cbi ADC pin

wait for the ADC to complete
show the result

By varying the number of NOPs I searched for the point where the single-cycle pulse changes the ADC result from 0 to something else.
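In case it helps, here is a rough C version of that measurement idea (a sketch, not the code actually used above; the port/pin names for ADC0 are assumptions):

/* Drive the ADC0 pin low, start a conversion (re-enabling the ADC so the
   prescaler restarts), wait a variable number of cycles, emit a short high
   pulse, then read the result. Sweeping delay_cycles shows where the pulse
   starts to affect the conversion, i.e. where the sample window lies. */
#include <avr/io.h>
#include <stdint.h>

static uint16_t probe_sample_window(uint8_t delay_cycles)
{
    DDRF  |= (1 << PF0);            /* ADC0 pin driven as an output        */
    PORTF &= ~(1 << PF0);           /* start low                           */

    ADMUX  = (1 << REFS0);          /* AVcc reference, channel ADC0        */
    ADCSRA = 0;                     /* disable so the prescaler restarts   */
    ADCSRA = (1 << ADEN) | (1 << ADSC) | (1 << ADPS2);  /* start conversion */

    while (delay_cycles--)
        __asm__ __volatile__("nop");    /* variable delay before the pulse */

    PORTF |=  (1 << PF0);           /* short high pulse                    */
    PORTF &= ~(1 << PF0);

    while (ADCSRA & (1 << ADSC))    /* wait for the conversion to finish   */
        ;
    return ADC;                     /* non-zero once the pulse was sampled */
}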