Post History
Answer
#1: Initial revision
<blockquote>If I have a filter stage between an antenna and an LNA, wouldn't I want the load impedance (input to the LNA) to be large with respect to the source (antenna) in order to not attenuate the signal even further?</blockquote>

Not necessarily. What you want is maximum <i>power</i> transfer, not maximum voltage. Power is the limiting factor, and the domain where signal-to-noise ratio really needs to be considered. If you have an antenna with 50 Ω output impedance and an LNA with 1 kΩ input impedance, for example, then much of the power received by the antenna goes unused, and signal-to-noise ratio suffers as a result.

Consider the extremes. If the antenna is completely unloaded, it delivers no power because the output current is 0. If the antenna is shorted, it delivers no power because the output voltage is 0. Maximum power transfer happens when the load impedance matches the source impedance, which is 50 Ω in this example.

In the case of a 1 kΩ LNA on a 50 Ω antenna, you want something in between that presents a 50 Ω load to the antenna but drives the LNA with 1 kΩ impedance. When no power is lost, such a converter steps the voltage up by a factor of sqrt(1000 / 50) ≈ 4.5 relative to the matched 50 Ω point. Compare that to connecting the LNA directly to the antenna: the mismatch then wastes about 7.4 dB of the available power, so proper matching buys you roughly 7.4 dB of signal, which goes straight into signal-to-noise ratio.

The above assumes a perfect lossless converter, which of course doesn't exist. However, various L-C networks or even RF transformers lose far less than that. Let's say your matching network has 3 dB of loss. That means you're still about 4.4 dB better off than with the original impedance mismatch.
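To put numbers on this, here is a quick back-of-the-envelope sketch in Python. It computes the mismatch loss of the direct 50 Ω → 1 kΩ connection and the element values for one common realization of the "something in between", a single-section lowpass L-match. The 100 MHz operating frequency is a hypothetical stand-in (the original doesn't specify one), and component loss is ignored:

```python
import math

R_SOURCE = 50.0   # antenna output impedance, ohms
R_LOAD = 1000.0   # LNA input impedance, ohms (example from the answer)
FREQ = 100e6      # assumed operating frequency, Hz (hypothetical)

# Mismatch loss of a direct connection: how much of the antenna's
# available power is wasted by the 50-ohm-to-1-kilohm mismatch.
gamma = (R_LOAD - R_SOURCE) / (R_LOAD + R_SOURCE)   # reflection coefficient
mismatch_loss_db = -10 * math.log10(1 - gamma**2)
print(f"direct-connection mismatch loss: {mismatch_loss_db:.1f} dB")

# Lowpass L-match: series L on the 50-ohm side, shunt C across the 1-kilohm side.
q = math.sqrt(R_LOAD / R_SOURCE - 1)      # network Q, fixed by the impedance ratio
x_series = q * R_SOURCE                   # series reactance, ohms
x_shunt = R_LOAD / q                      # shunt reactance, ohms
l_series = x_series / (2 * math.pi * FREQ)            # henries
c_shunt = 1 / (2 * math.pi * FREQ * x_shunt)          # farads
print(f"series L: {l_series * 1e9:.0f} nH, shunt C: {c_shunt * 1e12:.1f} pF")

# Voltage step-up of an ideal lossless match, relative to the matched 50-ohm point.
print(f"voltage step-up: {math.sqrt(R_LOAD / R_SOURCE):.2f}x")
```

Running it gives about 7.4 dB of mismatch loss, roughly 347 nH and 6.9 pF for the L-match at 100 MHz, and the 4.5x voltage step-up. Note that a single L-section is narrowband and its Q is fixed by the impedance ratio; a real front end might cascade sections or use a transformer for more bandwidth.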