Digital directional wattmeter – based on G8GYW – part 2

Digital directional wattmeter – based on G8GYW – part 1 laid out the basis of a project. This article discusses some changed code and calibration.

Changed code

Most of the code was changed, importing work done on other projects.

The important thing is that the code provides for a third order polynomial curve fit to measured data.
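By way of illustration, here is a minimal sketch in C of evaluating such a fit using Horner's method; the function and coefficient names are illustrative assumptions, not the actual firmware identifiers.

// Sketch only: evaluate P = a0 + a1*v + a2*v^2 + a3*v^3 using Horner's method,
// where v is the detector voltage and a[0..3] are the fitted coefficients.
float polyPower(float v, const float a[4]) {
  return ((a[3] * v + a[2]) * v + a[1]) * v + a[0];
}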

Also included is a calibration mode which displays the calculated voltage at the forward and reverse detectors given the nominal 1% voltage dividers in the circuit and the measured ADC reference voltage on this chip.
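A minimal sketch in C of that calculation, assuming a 10 bit ADC and the nominal 18k/10k divider used in the vreff calculation later in this article (names are illustrative):

// Sketch only: voltage at the detector, given a raw 10 bit ADC reading (0-1023),
// the measured ADC reference voltage aref, and the nominal 18k/10k divider.
float detectorVolts(unsigned int adc, float aref) {
  return adc / 1023.0 * aref * (18000.0 + 10000.0) / 10000.0;
}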

The display is quite different; several options were explored for practicality when tuning an ATU at low power. Values are not displayed unless there is good confidence in them. For example, forward power must be above 50mW to be displayed, and likewise, VSWR will only be displayed where reflected power is greater than 50mW.
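The gating might look something like the following sketch; the 50mW thresholds are as described above, but the display routines are illustrative placeholders rather than the actual firmware functions.

#include <math.h>

void displayPower(float watts);   // illustrative placeholder
void displayVswr(float vswr);     // illustrative placeholder

// Sketch only: suppress display of values where confidence is low.
void updateDisplay(float pFwd, float pRev) {
  if (pFwd < 0.05) return;                  // need at least 50mW forward to display
  displayPower(pFwd);
  if (pRev >= 0.05) {                       // need at least 50mW reflected for VSWR
    float rho = sqrtf(pRev / pFwd);         // magnitude of reflection coefficient
    displayVswr((1.0f + rho) / (1.0f - rho));
  }
}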

The code also provides for storing the calibration parameters in EEPROM rather than inline in code.
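A possible shape for that stored block is sketched below; the field names and types are assumptions, and the actual layout is shown in the formatted EEPROM dump later in this article.

// Sketch only: calibration parameters held in EEPROM rather than inline in code.
struct CalData {
  float vreff;   // forward detector voltage scaling (see vreff calculation below)
  float vrefr;   // reverse detector voltage scaling
  float a[4];    // third order polynomial coefficients a0..a3
};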

Calibration

The wattmeter was calibrated against a Bird 43 wattmeter with a 100H element, using a 100W transmitter and 10 and 20dB attenuators.

As a first step, the attenuators were measured with a NanoVNA and found to have 10.3 and 20.3dB attenuation respectively.

A series of measurements was made, and the Bird 43 power reading, attenuation, and indicated forward detector voltage were tabulated.

From these, the power input to the DUT was calculated, and an estimate of uncertainty was added to produce a table of V, P, and symmetric uncertainty of P.
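As an illustrative example of that calculation (the figures here are not from the measurement table), with the attenuators between the Bird 43 and the DUT, \(P_{DUT}=P_{Bird}\,10^{-A/10}\), so a 100W Bird 43 reading through the 20.3dB attenuator implies \(100 \times 10^{-2.03} \approx 0.93\) W into the DUT.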

Above are the calcs, and the table in green was pasted into Veusz to perform the curve fits.

Above is a plot of the measured and calculated RF power vs displayed detector voltage, and second and third order polynomial curve fits.

The second order model is good, probably good enough to use, and the third order model is better.

The calibration looks good from 50mW to 10W, a range of 23dB.

One of the code changes was to allow separation of calibration data from executable code by storing the former in EEPROM, so multiple instances use the same code, but different EEPROM images.

Above is a formatted dump of the EEPROM structure for the calibrated prototype.

An approximate calibration

For an approximate calibration, one approach is to use the same curve fit coefficients and correct the vreff and vrefr values to match the individual chip (there is quite a wide tolerance on the underlying AREF voltage).

Initially calculate \(vreff_{nom}=vrefr_{nom}=AREF_{nom} \frac{18000+10000}{10000}\).
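For example, with the nominal 1.1V AREF mentioned below, \(vreff_{nom}=vrefr_{nom}=1.1 \times \frac{28000}{10000}=3.08\) V.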

My suggestion is to measure AREF with an accurate meter (there is a test point on the board), and use that actual value to calibrate vreff and vrefr (initially).

WARNING: The following procedure might not be safe with some transmitters.

Load the cal mode firmware and run 5W or so into it with an open circuit on the output jack, and write down the forward and reverse detector voltages displayed (top and bottom rows respectively). Now calculate \({vrefr}={vreff} \frac{v_{fwd}}{v_{rev}}\). This will approximately calibrate the difference in the voltage dividers and the detector responses.
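For example (illustrative figures only), if the display showed \(v_{fwd}=2.10\) V and \(v_{rev}=2.00\) V, then \(vrefr=vreff \times \frac{2.10}{2.00}=1.05\,vreff\).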

The file nominal.ddwm included in the GitHub repo is a binary EEPROM image for nominal AREF (1.1V). It could be used for a rougher calibration.

A work in progress…