APRS: does pre-emphasis make much difference?

This morning I have been observing a weak APRS signal from a digipeater located on a prominent hill about 200km away, VK1RGI-1.

VK1RGI-1 is characterised by a pre-emphasised transmission, but only about 1.5kHz deviation, about 6dB low.
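The 6dB figure follows from the deviation ratio, taking 3kHz as the nominal APRS deviation (a sketch; the nominal figure is an assumption from common practice):

```python
import math

nominal_dev_hz = 3000   # assumed nominal APRS peak deviation
observed_dev_hz = 1500  # VK1RGI-1's measured deviation

# Recovered audio amplitude is proportional to deviation, so the deficit is:
deficit_db = 20 * math.log10(observed_dev_hz / nominal_dev_hz)
print(round(deficit_db, 1))  # about -6.0 dB
```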

Nevertheless, VK1RGI-1 packets can be fairly reliably decoded at a strength that does not show any segments on the IC2200H S meter, less than -115dBm (by measurement). We might expect that the ambient noise temperature is around 2,000K, and the noise power captured in a 15kHz wide receiver would be around -125dBm (allowing 1dB of line loss), so S/N on VK1RGI-1 should be around 10dB, sufficient for good BER on most Bell 202 modem chips.
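The noise and S/N figures above can be sketched with the usual kTB calculation (assumptions as stated in the text: 2,000K noise temperature, 15kHz noise bandwidth, 1dB line loss):

```python
import math

k = 1.380649e-23   # Boltzmann's constant (J/K)
t_noise_k = 2000   # assumed ambient noise temperature
bw_hz = 15e3       # assumed receiver noise bandwidth
line_loss_db = 1   # assumed line loss
signal_dbm = -115  # measured signal level at the antenna

# Thermal noise power kTB, converted to dBm
noise_dbm = 10 * math.log10(k * t_noise_k * bw_hz * 1000)  # about -123.8 dBm
noise_at_rx_dbm = noise_dbm - line_loss_db                 # about -124.8 dBm

snr_db = signal_dbm - noise_at_rx_dbm
print(round(snr_db, 1))  # about 9.8 dB, i.e. around 10dB S/N
```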

It is ham folklore that signals have to be full scale to decode; when that is observed, something is wrong.

Observation of Kenwood mobiles, which do not use pre-emphasis and tend to run deviation around 3kHz, is that they need to indicate S5 on the S meter for fairly reliable decodes, and never decode at S3 (-107dBm). They must therefore be at least 8dB stronger than VK1RGI-1 to decode at all, and another 2dB gives fairly reliable decoding (unless the signal is flutter affected).

The difference is a little surprising considering that VK1RGI-1 is already 6dB low in deviation.

We might expect that digipeater or iGate receivers that do not use de-emphasis, whilst catering for non-compliant transmitters, are probably towards 10dB less sensitive to compliant pre-emphasised signals.

Compliance with standards and conventions is very low in ham radio, especially in APRS.


  • It is possible to reliably decode quite weak APRS signals, down to about 10dB above the noise.
  • Failure to comply with the pre-emphasis convention disadvantages non-compliant trackers when received on a compliant de-emphasised receiver.