Trying to understand unity gain amplifier

#1

My background is software, and my knowledge of electronics is pretty limited. I'm trying to understand the purpose of a unity gain amplifier.

As I understand it, the purpose is to match output and input impedance. If a device providing input voltage/current to the op-amp has high output impedance, then it must draw more power from its source for any given output than a device with lower output impedance. Is that right?

If so, I don't see the problem with feeding its output voltage/current into a device with low input impedance. Wouldn't the input device just take what it needs and be happy?

Would someone mind showing me what I'm missing? Why is the op-amp needed?

Thanks

#2

Consider a device with an output impedance of 1 kΩ connected to an input with an impedance of 1 Ω. What you get is effectively a voltage divider: the input only sees the fraction R_in / (R_out + R_in) of the source voltage. In the example above, if the output device tries to give out 1 V, the input device only sees about 0.001 V as a result of the impedance mismatch.

If you swap the values, then the input would see ~0.999 V.
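
Since your background is software, here is the same arithmetic as a quick Python sketch (the function name and values are just for illustration):

# The source's output impedance and the load's input impedance form a
# voltage divider: the load sees V * R_in / (R_out + R_in).
def divider(v_source, r_out, r_in):
    return v_source * r_in / (r_out + r_in)

print(divider(1.0, 1000.0, 1.0))   # ~0.001 V: 1 kΩ output into 1 Ω input
print(divider(1.0, 1.0, 1000.0))   # ~0.999 V: 1 Ω output into 1 kΩ input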

For high-frequency signals, impedance matching is even more important, to prevent reflections and increased cable loss.
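
For the curious, the standard reflection coefficient shows why matching matters at high frequencies. A quick sketch (the 50 Ω figure is just a common line impedance, not something from this thread):

# Reflection coefficient where a transmission line meets its load:
# gamma = (Z_load - Z_line) / (Z_load + Z_line); 0 means no reflection.
def gamma(z_load, z_line):
    return (z_load - z_line) / (z_load + z_line)

print(gamma(50.0, 50.0))    # 0.0: matched, nothing reflects back
print(gamma(1000.0, 50.0))  # ~0.905: mismatched, most of the wave reflects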

Quote:
Wouldn't the input device just take what it needs and be happy?

No, the input does not take what it needs; how could it? It simply uses whatever signal is present, and in a "high output impedance into low input impedance" scenario, the signal present is attenuated by the voltage divider mentioned above.

#3

So the problem would be that the output device would have to draw an excessive amount of current from its source to provide what the input device needs? That makes sense.

And the point of the op-amp is to amplify what the output device provides to give the input device what it needs?

Thanks

#4

Depends on what you are trying to transfer...
If the intention is to transfer voltage, the receiver's input impedance needs to be as large as possible, since, as kherseth said, the output impedance of the transmitter and the input impedance of the receiver form a voltage divider. The receiver will only "see" the voltage that falls across its input impedance.
If the intention is to transfer power instead of voltage, then the input impedance and the output impedance must be exactly the same (Google: Maximum Power Transfer Theorem).
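
To see the maximum power transfer point numerically, here is a small sweep (the 50 Ω source is hypothetical, chosen just for the example):

# Power delivered to the load: P = (V / (R_source + R_load))**2 * R_load.
# Sweep the load resistance and find where the delivered power peaks.
r_source, v = 50.0, 1.0
best = max(range(1, 200), key=lambda r: (v / (r_source + r)) ** 2 * r)
print(best)  # 50: load power peaks when R_load equals R_source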

Unity gain amplifiers are used in the first case, to transfer voltage: they have a very large input impedance and a very low output impedance, so they are able to "receive" the full voltage signal and "transmit" it with very little loss. Imagine that you have a transmitter and a receiver with similar impedances; your receiver will get a signal attenuated by half. The solution? Put a unity gain amplifier in the middle of the two, as in the sketch below.
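
Putting numbers on that last scenario (the component values are made up, and the buffer is treated as ideal):

# Direct connection between equal impedances vs. a unity gain buffer in between.
def divider(v, r_out, r_in):
    return v * r_in / (r_out + r_in)

r_tx = r_rx = 1000.0                    # equal transmitter/receiver impedances
print(divider(1.0, r_tx, r_rx))         # 0.5 V: direct connection loses half

# Ideal buffer: huge input impedance, tiny output impedance, gain of 1.
r_buf_in, r_buf_out = 1e12, 1.0
v_buf = divider(1.0, r_tx, r_buf_in)    # ~1.0 V arrives at the buffer input
print(divider(v_buf, r_buf_out, r_rx))  # ~0.999 V arrives at the receiver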

Daniel Campora http://www.wipy.io

#5

Quote:

So the problem would be that the output device would have to draw an excessive amount of current from its source to provide what the input device needs?

Forget about the current drawn; the purpose of using unity gain amps has nothing to do with that.

Daniel Campora http://www.wipy.io

#6

I think the whole purpose of a unity gain amp is to amplify the current, since it is not amplifying the voltage.
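
To put rough numbers on that, reusing the made-up impedances from the buffer sketch earlier in the thread:

# The signal source supplies almost no current; the buffer's own power
# supply delivers the load current instead.
i_from_source = 1.0 / (1000.0 + 1e12)   # ~1 pA drawn from the transmitter
i_into_load = 0.999 / 1000.0            # ~1 mA delivered to the receiver
print(i_from_source, i_into_load)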

#7

True, that's another of the reasons to use a unity gain amp. If the receiver draws more current than the transmitter is able to deliver, an amp with a current capability large enough to handle the receiver's demands is a good choice.

Daniel Campora http://www.wipy.io

#8

Ok, I think I understand now. Thanks very much.