Is there a way to reliably measure antenna return loss outside a lab?
Assume I'm a complete beginner at RF.
Is there a way to measure the return loss of an antenna in such a manner that I can reliably reproduce the measurement later on?
What I'm talking about is producing the antenna's characteristic graph, showing what frequency it is matched to and how wide the match is: the classic frequency vs. dB plot with a dip at the expected centre frequency.
I've never quite managed to do this in a satisfying manner. I can do it in two ways. The manual way involves a spectrum analyser with a tracking generator and a 50 Ω directional coupler: I connect the tracking generator to the input of the directional coupler and measure how much energy bounces back from the antenna. Alternatively, I have access to an antenna measurement instrument that does all of this automatically.
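To make the arithmetic behind that measurement concrete, here is a minimal sketch (in Python) of how a reflected-power reading from the coupled port turns into return loss and VSWR. All the dBm values and the `coupling_db` figure are made-up example numbers, not readings from my actual setup:

```python
# Sketch: deriving return loss and VSWR from a directional-coupler
# measurement. Example numbers only; a real measurement must use the
# coupler's actual coupling factor and calibrated power readings.

def return_loss_db(p_forward_dbm: float, p_coupled_dbm: float,
                   coupling_db: float = 0.0) -> float:
    """Return loss in dB: forward power minus the reflected power,
    where the reflected power is the coupled-port reading corrected
    by the coupler's coupling factor."""
    p_reflected_dbm = p_coupled_dbm + coupling_db
    return p_forward_dbm - p_reflected_dbm

def vswr(rl_db: float) -> float:
    """Voltage standing wave ratio computed from return loss."""
    gamma = 10 ** (-rl_db / 20)  # magnitude of reflection coefficient
    return (1 + gamma) / (1 - gamma)

# Hypothetical readings: 0 dBm forward, -34 dBm at the coupled port
# of a 20 dB coupler -> -14 dBm actually reflected -> 14 dB return loss.
rl = return_loss_db(0.0, -34.0, coupling_db=20.0)
print(rl)                  # 14.0 dB return loss
print(round(vswr(rl), 2))  # about 1.5:1
```

Sweeping the tracking generator across frequency and plotting `return_loss_db` at each point is, as I understand it, exactly the dip graph described above.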
Using either method, I get a graph that looks somewhat correct. The antennas are typically 433 MHz or 902 MHz, 0 dBi omnidirectional, with a bandwidth of at most ±50 MHz from centre.
However, if I nudge the setup and place the antenna slightly differently, or simply leave it alone and repeat the measurement another day, the dip can move some ±30 MHz. I've tried using a fixture so that the antenna is mounted and grounded the same way every time, but there is still considerable variation.
I'm not using any attenuator; could that be a problem? Am I wrong to think the spectrum analyser should be able to cope with its own tracking generator's signal?
Or am I naive to think I can do this accurately outside a lab? Does EMI really affect the measurements that much?