# Filter Impedance Consideration

+2
−0

In a lot of textbooks, filters seem to be designed around matching source and load impedances. However, if I have a filter stage between an antenna and an LNA, wouldn't I want the load impedance (the input to the LNA) to be large with respect to the source (the antenna) in order to not attenuate the signal even further?

I guess another way to phrase this is: why do we care about maximum power transfer at this stage, rather than preserving as much of the signal amplitude as possible?


+2
−0

Apart from the need to match impedances to prevent signal reflections (and the knock-on problem of signal nulls at your receiver's LNA input), an antenna "expects" to be terminated in the "right" impedance in order to give its best performance.

The ratio of the E-field (volts per metre) to the H-field (amps per metre) of a radio wave has units of volts per amp and that, as you probably know, is ohms. In other words, the signal travelling towards your antenna experiences the "impedance of free space", which is about 377 ohms. In effect your antenna is an impedance transformer, converting a signal with an impedance of 377 ohms to something around 50 ohms (antenna type dependent). This effectively reduces the voltage by a factor of $\sqrt{\frac{377}{50}}$, if it were at all possible to measure voltage in free space directly.
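As a quick sanity check on those numbers, the impedance of free space and that voltage ratio can be computed directly from textbook constants:

```python
import math

# Impedance of free space: Z0 = sqrt(mu0 / eps0)
mu0 = 4 * math.pi * 1e-7        # permeability of free space, H/m
eps0 = 8.8541878128e-12         # permittivity of free space, F/m
z0 = math.sqrt(mu0 / eps0)      # ≈ 376.7 ohms

# Voltage step-down when the antenna transforms ~377 ohms to 50 ohms
ratio = math.sqrt(377 / 50)     # ≈ 2.75

print(f"Z0 ≈ {z0:.1f} ohms, voltage ratio ≈ {ratio:.2f}")
```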

An antenna is like a fishing net: it has an effective surface area (despite possibly looking like just a single wire), and that "fishing net" is "capturing" both an electric field (volts per metre) and a magnetic field (amps per metre). Multiplied together, these give watts per square metre; hence the antenna's effective area is "capturing" watts (usually sub-picowatts).

To convert that power into a signal voltage, you need a resistance, i.e. the load resistance the antenna requires to work most effectively.
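To put rough numbers on that, here's a small sketch; the received power of −100 dBm is an assumed figure, not from the question:

```python
import math

# A weak received signal: -100 dBm captured by the antenna (assumed figure)
p_dbm = -100
p_watts = 10 ** (p_dbm / 10) * 1e-3   # 0.1 pW

# RMS signal voltage developed across a matched 50-ohm load: V = sqrt(P * R)
r_load = 50
v_rms = math.sqrt(p_watts * r_load)

print(f"{p_watts:.1e} W -> {v_rms * 1e6:.2f} uV rms across {r_load} ohms")
```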

Looking at things from another direction: 50 ohms (a pretty normal standard for radio input and output impedance) does not generate a whole lot of thermal noise voltage for a given bandwidth. This is important because thermal noise can be a very significant factor limiting how small a signal a particular radio design can effectively receive and demodulate. Keeping to a common impedance (i.e. 50 ohms or thereabouts) is a way of keeping the signal-to-noise ratio as good as possible.
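For a feel of the numbers, the Johnson-Nyquist noise voltage $\sqrt{4kTRB}$ for 50 ohms versus 1 kohm, assuming an arbitrary 1 MHz bandwidth at room temperature:

```python
import math

# Johnson-Nyquist noise voltage: v_n = sqrt(4 k T R B)
k = 1.380649e-23      # Boltzmann constant, J/K
T = 290               # standard noise temperature, K
B = 1e6               # example bandwidth: 1 MHz (assumed)

for R in (50, 1000):
    v_n = math.sqrt(4 * k * T * R * B)
    print(f"{R:>5} ohms -> {v_n * 1e6:.2f} uV rms noise in 1 MHz")
```

The higher the resistance, the larger the noise voltage it contributes, which is the point being made about keeping impedances modest.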

But if you have an LNA that has a 1 kohm input impedance (resistive), then there's absolutely no problem in using a filter that converts the impedance from around the 50 ohm mark up to 1 kohm. You wouldn't use resistors, of course, because that just wastes signal power; instead you would use L-networks, T-networks or $\pi$-networks to convert impedance without power loss.
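As an illustration only (the 100 MHz frequency and the low-pass topology are assumptions, not from the question), the component values for a lossless L-network stepping 50 ohms up to 1 kohm work out as:

```python
import math

# Lossless L-network: series L on the 50-ohm side, shunt C on the 1-kohm side
r_low, r_high = 50, 1000
f = 100e6                           # assumed design frequency, Hz
w = 2 * math.pi * f

q = math.sqrt(r_high / r_low - 1)   # network Q ≈ 4.36
xl = q * r_low                      # required series inductive reactance
xc = r_high / q                     # required shunt capacitive reactance

L = xl / w                          # ≈ 347 nH
C = 1 / (w * xc)                    # ≈ 6.9 pF

print(f"Q = {q:.2f}, L = {L * 1e9:.0f} nH, C = {C * 1e12:.1f} pF")
```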

+2
−0

> If I have a filter stage between an antenna and an LNA, wouldn't I want the load impedance (the input to the LNA) to be large with respect to the source (the antenna) in order to not attenuate the signal even further?

Not necessarily. What you want is maximum power transfer, not maximum voltage. Power is the limiting factor, and the domain in which the signal-to-noise ratio really needs to be considered.

If you have an antenna with a 50 Ω output impedance and an LNA with a 1 kΩ input impedance, for example, then much of the power received by the antenna will go unused, and the signal-to-noise ratio will suffer as a result.

Consider the extremes. If the antenna is completely unloaded, then no power is delivered by it because the output current is 0. If the antenna is shorted, then no power is delivered by it because the voltage is 0. The maximum power transfer happens when the load impedance matches the source impedance, which is 50 Ω in this example.
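That argument is easy to verify numerically; a minimal sketch sweeping the load resistance on a 50 Ω source:

```python
# Power delivered to a load from a 50-ohm source, swept over load resistance.
# Demonstrates that delivered power peaks when R_load equals R_source.
v_oc = 1.0     # open-circuit source voltage (arbitrary)
r_src = 50

def delivered_power(r_load):
    i = v_oc / (r_src + r_load)     # loop current
    return i * i * r_load           # power dissipated in the load

powers = {r: delivered_power(r) for r in (0, 10, 50, 250, 1000)}
best = max(powers, key=powers.get)
print(best)   # 50: the matched load receives the most power
```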

In the case of a 1 kΩ LNA on a 50 Ω antenna, you want something in between that presents a 50 Ω load to the antenna but drives the LNA from a 1 kΩ source impedance. When no power is lost, such a converter actually steps the voltage up by a factor of sqrt(1000 / 50) ≈ 4.5 relative to what a matched 50 Ω load would see. Compared with connecting the 1 kΩ LNA directly to the antenna, the matched arrangement delivers about 5.5 times more power, roughly a 7.4 dB improvement over the original impedance mismatch.

The above assumes a perfect lossless converter, which of course doesn't exist. However, various L-C networks or even RF transformers can come close. Say your matching network has 3 dB of loss: that still leaves you about 4.4 dB better off than with the original impedance mismatch.
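For the sceptical, here's a quick check of the matched-versus-direct numbers, assuming an ideal lossless matching network:

```python
import math

# Compare power into a 1-kohm LNA connected directly to a 50-ohm antenna
# versus through a lossless matching network.
v_oc, r_src, r_lna = 1.0, 50, 1000   # arbitrary open-circuit voltage

# Direct connection: simple voltage divider into the LNA input
v_direct = v_oc * r_lna / (r_src + r_lna)
p_direct = v_direct ** 2 / r_lna

# Lossless match: the LNA receives the full available power V^2 / (4 R_src)
p_matched = v_oc ** 2 / (4 * r_src)

gain_db = 10 * math.log10(p_matched / p_direct)
print(f"improvement ≈ {gain_db:.1f} dB")   # ≈ 7.4 dB
```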