Antenna Factor is often given and used as a parameter of an antenna (system).
An antenna with (nearly) constant AF can be quite convenient for simple field strength measurement, as the AF value establishes a simple relationship between antenna terminal voltage and the external electric field strength.
Antenna Factor (AF) is the ratio of field strength to antenna terminal voltage. Dimensionally, \(AF=\frac{E}{V}=\frac{V/m}{V}=\frac{1}{m}\), so AF has units of 1/m; it can also be expressed in dB as \(AF_{dB}=20 \log_{10}(AF)\text{ dB/m}\).
It is not uncommon, but lazy and wrong, to express AF in bare dB rather than dB/m.
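To make the relationships concrete, here is a minimal Python sketch of the conversions above; the function names and example values are illustrative only.

```python
import math

def af_to_db(af_per_m):
    """Convert linear Antenna Factor (1/m) to dB/m: AF_dB = 20*log10(AF)."""
    return 20 * math.log10(af_per_m)

def field_strength(v_terminal, af_per_m):
    """Field strength E (V/m) from antenna terminal voltage V (V): E = AF * V."""
    return af_per_m * v_terminal

# Illustrative values: AF = 10 /m (20 dB/m), 1 mV at the antenna terminals.
af = 10.0
print(af_to_db(af))               # -> 20.0 (dB/m)
print(field_strength(1e-3, af))   # -> 0.01 (V/m), ie 10 mV/m
```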
In practice, an antenna may have AF specified at its own terminals based on an assumed load of 50+j0 Ω. When it is used with a 50+j0 Ω receiver connected via 50 Ω coax, an adjustment for the (frequency dependent) coax loss may be appropriate, though in many practical scenarios the error contributed by a short section of coax is insignificant in terms of overall measurement uncertainty.
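In this matched case the adjustment is simple: the coax loss reduces the voltage reaching the receiver, so the system AF referred to the receiver input is the antenna's AF plus the coax loss in dB. A sketch, with a hypothetical helper name and illustrative figures:

```python
def af_at_receiver_db(af_antenna_db, coax_loss_db):
    """System AF (dB/m) referred to the receiver end of matched 50 ohm coax.

    Coax loss reduces the voltage delivered to the receiver, so the AF
    relating field strength to receiver terminal voltage is the antenna's
    AF plus the (frequency dependent) coax loss at the measurement frequency.
    """
    return af_antenna_db + coax_loss_db

# Illustrative: 20 dB/m at the antenna, 0.8 dB coax loss -> 20.8 dB/m.
print(af_at_receiver_db(20.0, 0.8))
```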
The case is different for a receiver of, say, 80+j0 Ω input impedance used with a length of 50 Ω coax: the coax has loss and transforms the receiver input impedance to something else at the antenna's terminals, so the antenna calibration may no longer apply.
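A rough illustration in Python of why this happens: the standard lossy-line equation transforms the receiver's 80+j0 Ω to a different impedance at the antenna terminals. The line parameters below are assumed figures for illustration, not data for any particular cable.

```python
import math, cmath

def z_at_line_input(z_load, z0, gamma, length_m):
    """Impedance looking into a lossy line terminated in z_load.

    Standard lossy-line transformation:
    Zin = Z0 * (ZL + Z0*tanh(gamma*l)) / (Z0 + ZL*tanh(gamma*l))
    where gamma = alpha + j*beta is the complex propagation constant (1/m).
    """
    t = cmath.tanh(gamma * length_m)
    return z0 * (z_load + z0 * t) / (z0 + z_load * t)

# Assumed figures: 80+j0 ohm receiver on 10 m of 50 ohm coax,
# alpha = 0.01 Np/m, velocity factor 0.66, at 30 MHz.
f, vf, length = 30e6, 0.66, 10.0
beta = 2 * math.pi * f / (vf * 299792458.0)
gamma = complex(0.01, beta)
print(z_at_line_input(80 + 0j, 50.0, gamma, length))
# The result differs from 80+j0, so an AF calibrated for a 50+j0 ohm load
# at the antenna terminals no longer applies exactly.
```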
Knowledge informs the issues and their solutions: understand the concepts, analyse the specifications in their intended context, and design the test setup including any adjustment factors that may be needed.
I see some specious online discussion by the gathered experts stating that \(AF=\frac{R_{freespace}}{R_{receiver}}\), but the concept is flawed in many ways: the fact that it is dimensionally wrong is a hint, and it is naive in other respects.
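A quick dimensional check, using only the definitions above, makes the first flaw explicit:

\[
\frac{R_{freespace}}{R_{receiver}}=\frac{\Omega}{\Omega}=1 \;\text{(dimensionless)}, \qquad AF=\frac{E}{V}=\frac{V/m}{V}=m^{-1}.
\]

A dimensionless ratio of resistances cannot equal a quantity of dimension 1/m.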