Measuring DC supply ripple voltage

#1

Hi!

I would like to measure the ripple voltage on the output of a DC-DC step-down converter.

The DC-DC step-down converter uses the TPS5410 integrated-switch IC from Texas Instruments, which has a 500 kHz switching frequency. On the output I use a 100 µH/2 A inductor and a 100 µF/50 V low-ESR capacitor as the output filter. According to my own estimate and the TI SwitcherPRO simulation, the ripple voltage should be 10 mV or less.
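For reference, here is my back-of-the-envelope calculation as a small Python sketch; it uses the standard first-order buck ripple formulas (not TI's exact model), and the ESR and input voltage values are assumptions on my part:

# Rough buck-converter output ripple estimate (standard first-order formulas).
f_sw = 500e3   # TPS5410 switching frequency [Hz]
L    = 100e-6  # output inductor [H]
C    = 100e-6  # output capacitor [F]
ESR  = 0.05    # capacitor ESR [ohm] - assumed, no datasheet value at hand
Vin  = 12.0    # input voltage [V] - assumed
Vout = 5.0     # output voltage [V]

D  = Vout / Vin                         # ideal duty cycle
dI = (Vin - Vout) * D / (L * f_sw)      # peak-to-peak inductor ripple current [A]
dV = dI * (ESR + 1.0 / (8 * f_sw * C))  # peak-to-peak output ripple voltage [V]
print("dI = %.1f mA, dV = %.2f mV (pk-pk)" % (dI * 1e3, dV * 1e3))

With these numbers it prints about 58 mA and 3 mV peak-to-peak, which is where the "10 mV or less" figure comes from.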

The problem is that my PicoScope 2205 USB oscilloscope has only 8-bit vertical resolution, and to measure the 5 V output of my DC-DC converter it has to use the ±10 V input range.
That means one LSB of the ADC is 20 V / 256 ≈ 78 mV.
So measuring 10 mV of ripple is impossible: all I see is roughly 78 mV of quantization noise from the ADC.
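To put numbers on it, a quick sketch (the ±50 mV figure is what I believe to be the scope's smallest input range, so treat that as an assumption):

# LSB size of an 8-bit ADC for a given full-scale (peak-to-peak) input range.
def lsb_mv(range_pp_volts, bits=8):
    return range_pp_volts / 2**bits * 1e3

print(lsb_mv(20.0))  # +/-10 V range:  ~78 mV per LSB
print(lsb_mv(0.1))   # +/-50 mV range: ~0.39 mV per LSB (smallest range - an assumption)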


Can somebody suggest a solution for this?

Thanks

#2

Switch your 'scope input to AC coupling (if it doesn't support this, add a 0.1 µF capacitor in series with the input). Then use any scale you wish, e.g. 10 mV/div.
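The series cap together with the scope's input resistance forms a high-pass filter, so it's worth a quick sanity check; assuming the usual 1 Mohm scope input (I haven't checked the 2205 specifically), the corner sits far below your 500 kHz ripple:

import math

R  = 1e6     # scope input resistance [ohm] - typical value, assumed for the 2205
C  = 0.1e-6  # series coupling capacitor [F]
fc = 1 / (2 * math.pi * R * C)  # high-pass corner frequency
print("fc = %.2f Hz" % fc)      # ~1.59 Hz, well below the 500 kHz ripple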

Warning: Grumpy Old Chuff. Reading this post may severely damage your mental health.

#3

MBedder wrote:
Switch your 'scope input to AC coupling (if it doesn't support this, add a 0.1 µF capacitor in series with the input). Then use any scale you wish, e.g. 10 mV/div.

Thank you very much. Such a simple and elegant solution.

(Btw, I should really learn how to use a scope properly :) )