## Trying to understand unity gain amplifier

My background is software, and my knowledge of electronics is pretty limited. I'm trying to understand the purpose of a unity gain amplifier.

As I understand it, the purpose is to match output and input impedance. If a device providing input voltage/current to the op-amp has high output impedance, then it must draw more power from its source for any given output than a device with lower output impedance, is that right?

If so, I don't see the problem with feeding its output voltage/current into a device with low input impedance. Wouldn't the input device just take what it needs and be happy?

Would someone mind showing me what I'm missing? Why is the op-amp needed?

Thanks

Consider a device with an output impedance of 1 kΩ connected to an input with 1 Ω impedance. What you get is effectively a voltage divider: if the output device tries to deliver 1 V, the input device only sees about 0.001 V because of the impedance mismatch.

If you swap the values, then the input would see ~0.999V.
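The divider arithmetic above can be sketched in a few lines of Python (the 1 kΩ / 1 Ω values are just the example figures from this post):

```python
def divider_output(v_source, z_out, z_in):
    """Voltage seen by the receiver when a source with output
    impedance z_out drives an input with impedance z_in."""
    return v_source * z_in / (z_out + z_in)

# 1 kOhm output driving a 1 Ohm input: almost all the signal is lost
print(divider_output(1.0, 1000, 1))   # ~0.000999 V
# Swapped: 1 Ohm output driving a 1 kOhm input: nearly the full signal
print(divider_output(1.0, 1, 1000))   # ~0.999 V
```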

For high-frequency signals, impedance matching matters for a different reason: it prevents reflections and reduces cable loss.

Quote:
Wouldn't the input device just take what it needs and be happy?

No, the input does not "take what it needs"; how could it? It simply sees whatever signal is present, and in a high-output/low-input impedance scenario that signal is attenuated by the voltage divider described above.

So the problem would be that the output device would have to draw an excessive amount of current from its source to provide what the input device needs? That makes sense.

And the point of the op-amp is to amplify what the output device provides to give the input device what it needs?

Thanks

Last Edited: Thu. Jan 26, 2012 - 04:07 PM

Depends on what you are trying to transfer...
If the intention is to transfer voltage, the receiver's input impedance needs to be as large as possible, since as kherseth said, the output impedance of the transmitter and the input impedance of the receiver form a voltage divider. The receiver will only "see" the voltage that falls across its input impedance.
If the intention is to transfer power instead of voltage, then the input impedance and the output impedance must be exactly the same (Google: Maximum Power Transfer Theorem).
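The power-transfer case can be checked numerically: sweep the load impedance against a fixed source impedance (the 50 Ω figure here is just an illustrative assumption) and see where the delivered power peaks.

```python
def load_power(v_source, z_out, z_load):
    """Power dissipated in the load when a source with output
    impedance z_out drives it (simple series circuit)."""
    i = v_source / (z_out + z_load)   # loop current
    return i * i * z_load             # P = I^2 * R in the load

z_out = 50.0  # assumed source impedance for illustration
powers = {z: load_power(1.0, z_out, z) for z in (10, 25, 50, 100, 200)}
best = max(powers, key=powers.get)
print(best)  # 50 -- maximum power when z_load equals z_out
```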

Unity gain amplifiers are used in the first case, to transfer voltage: they have a large input impedance and a very low output impedance, so they can "receive" the full voltage signal and "transmit" it with very little loss. Imagine a transmitter and a receiver with similar impedances; the receiver would get a signal attenuated by half. The solution? Put a unity gain amplifier between the two.
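A quick sketch of the "buffer in the middle" scenario. The buffer's impedance figures below are assumptions chosen to be typical of an op-amp voltage follower (very high input impedance, near-zero output impedance), not values from this thread:

```python
def divider(v, z_out, z_in):
    """Voltage divider formed by source output and receiver input."""
    return v * z_in / (z_out + z_in)

Z_SOURCE  = 1000.0   # transmitter output impedance (matched pair example)
Z_LOAD    = 1000.0   # receiver input impedance
Z_BUF_IN  = 1e12     # assumed buffer input impedance
Z_BUF_OUT = 0.1      # assumed buffer output impedance

# Direct connection: similar impedances halve the signal.
direct = divider(1.0, Z_SOURCE, Z_LOAD)

# With the buffer: it sees nearly the full 1 V, then re-drives
# the load from its own very low output impedance.
buffered = divider(divider(1.0, Z_SOURCE, Z_BUF_IN), Z_BUF_OUT, Z_LOAD)

print(round(direct, 3), round(buffered, 3))  # -> 0.5 1.0
```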

Daniel Campora http://www.wipy.io

Last Edited: Thu. Jan 26, 2012 - 04:12 PM

Quote:

So the problem would be that the output device would have to draw an excessive amount of current from its source to provide what the input device needs?

Forget about the current drawn; the purpose of a unity gain amp has nothing to do with that. It is there to preserve the voltage, as explained above.
