Many inexpensive VSWR meters have variable sensitivity: a potentiometer is used to adjust the ‘set’ level to full scale, and then a switch is operated to read the VSWR directly from the meter scale.
It is often observed that the measured VSWR depends on the power level. In most cases, this is attributable to a flaw in the design.
The figure above shows the circuit of one commercially produced VSWR meter for hams of the type being discussed.
The relationship between the RF voltage into the diode / filter capacitor circuit and the DC output is not linear, because of the diode voltage drop.
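As a rough sketch of why this matters (assuming a simple series-diode peak detector with a fixed, hypothetical 0.3 V diode drop, not the actual circuit in the figure), the DC output is not proportional to the RF peak voltage:

```python
def detected_dc(v_rf_peak, v_diode=0.3):
    """Idealised peak detector: DC out is the RF peak minus a fixed diode drop.
    The 0.3 V figure is an assumption, roughly a germanium or Schottky diode."""
    return max(0.0, v_rf_peak - v_diode)

# The DC/RF ratio changes with level, so the detector is not linear:
print(detected_dc(10.0) / 10.0)  # 0.97 at a 10 V peak
print(detected_dc(1.0) / 1.0)    # 0.70 at a 1 V peak
```

A real detector's transfer is smoother than this hard-threshold model, but the level dependence it shows is the point.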
The meter is usually calibrated with the instrument set for minimum sensitivity, ie the scale markings are laid out for the condition where the RF input voltage is highest and the diode voltage drop is relatively small.
When sensitivity is increased, a smaller RF input voltage produces full scale deflection, so the diode voltage drop is larger relative to the RF input voltage, and the relationship between RF voltage and DC output of the detector is not the same.
The effect of this is that at maximum sensitivity, such an instrument will indicate a lower VSWR than at minimum sensitivity (at which it is usually calibrated), and the error increases as sensitivity is increased.
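To illustrate the mechanism, here is a minimal sketch under assumed conditions: the forward and reverse detectors are modelled as ideal peak detectors with a fixed, hypothetical 0.3 V diode drop, and the meter is assumed to indicate VSWR from the ratio of the two detected DC voltages. Because the reverse sample is the smaller voltage, the diode drop erodes it proportionally more, so the indicated VSWR falls as the drive level falls:

```python
def detected_dc(v_rf_peak, v_diode=0.3):
    """Idealised peak detector: DC out is RF peak minus a fixed diode drop."""
    return max(0.0, v_rf_peak - v_diode)

def indicated_vswr(v_fwd_peak, true_vswr, v_diode=0.3):
    """VSWR inferred from the ratio of detected reverse to forward DC."""
    rho = (true_vswr - 1) / (true_vswr + 1)  # true reflection coefficient
    v_rev_peak = rho * v_fwd_peak            # reverse sample is smaller
    vf = detected_dc(v_fwd_peak, v_diode)
    vr = detected_dc(v_rev_peak, v_diode)    # diode drop hurts vr relatively more
    rho_ind = vr / vf                        # indicated reflection coefficient
    return (1 + rho_ind) / (1 - rho_ind)

# True VSWR 2.0: at a 20 V forward peak (low sensitivity, high power) the
# meter reads close to 2.0, but at a 2 V forward peak it reads well low.
print(indicated_vswr(20.0, 2.0))  # ~1.96
print(indicated_vswr(2.0, 2.0))   # ~1.55
```

The specific voltages and the 0.3 V drop are illustrative assumptions; the direction of the error, an optimistic (low) VSWR indication at high sensitivity, matches the behaviour described above.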
For this reason, unless you have verified that the scale markings are reasonably accurate at high sensitivity, you should only accept the VSWR value read from the instrument at lower sensitivity (ie higher input power).
Advice to use low power readings, or worse, to calibrate or test instruments at low power, is flawed.
© Copyright: Owen Duffy 1995, 2017. All rights reserved. Disclaimer.