QRSS observation utilities

This article describes some utilities used to facilitate collection and analysis of grabber images.

Archiving a set of grabber images from a website

Some grabbers show an image online for 10min until it is replaced. To avoid sitting watching these in real time, there is advantage in downloading and archiving the images automatically.

I use nnCron LITE to schedule downloads.

The line

5-59/10 * * * * c:\qrss\getit

added to the cron.ini file will run the batch script c:\qrss\getit.bat at 5min past the hour, and every 10min after that.

This entry can be commented out when not needed by inserting a # symbol at the beginning of the line.
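The minute field of that cron entry can be expanded programmatically to check what it does; a minimal Python sketch (the function name, and its restriction to the simple 'a-b/s' form used here, are my own for illustration):

```python
def expand_minute_field(field):
    """Expand a cron minute field of the 'a-b/s' form into the list of
    minutes at which the job fires. Illustrative only; real cron minute
    fields have more forms (lists, wildcards, etc.)."""
    rng, _, step = field.partition('/')
    lo, _, hi = rng.partition('-')
    return list(range(int(lo), int(hi) + 1, int(step or 1)))
```

For example, expand_minute_field('5-59/10') gives [5, 15, 25, 35, 45, 55], i.e. xx:05, xx:15, ... xx:55.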

Download and rename...

@echo off
rem to be run by cron at xx:x5
rem add the following line to cron.ini
rem 5-59/10 * * * * c:\qrss\getit

rem if you do not have TZ set in your environment, set it appropriately here
rem set TZ=EST-10

rem set CWD to archive directory
cd \qrss\archive

for /F %%i in ('timestmp -i -g') do set timestmp=%%i
set timestmp=%timestmp:~2,9%0

::curl -o G6AVK\%timestmp%.jpg http://www.qsl.net/g6avk/capt.jpg
curl -o W4HBK\%timestmp%.jpg http://www.qsl.net/w4hbk/SL1.jpg
::curl -o VK1OD\%timestmp%.jpg http://vk1od.net/guest/capture/capt.jpg

Above is a batch file that I use under Win7 to get the image files and save them with a meaningful name.

@echo off
rem to be run by cron at xx:x5
rem add the following line to cron.ini
rem 5-59/10 * * * * c:\qrss\getit

rem if you do not have TZ set in your environment, set it appropriately here 
set TZ=EST-11

set LFTP=c:\lftp\lftp

rem set CWD to archive directory
cd \qrss\archive

for /F %%i in ('timestmp -i -g') do set timestmp=%%i
set timestmp=%timestmp:~2,9%0

::curl -o G6AVK\%timestmp%.jpg http://www.qsl.net/g6avk/capt.jpg
::curl -o W4HBK\%timestmp%.jpg http://www.qsl.net/w4hbk/SL1.jpg
::curl -o VK1OD\%timestmp%.jpg http://vk1od.net/guest/capture/capt.jpg

::%LFTP% -c get1 -o NMSU/%timestmp%.jpg -c http://qrss.nmsu.edu/spotC.jpg
%LFTP% -c get1 -o W4HBK/%timestmp%.jpg -c http://www.qsl.net/w4hbk/SL1.jpg
::%LFTP% -c get1 -o ZL2IK/%timestmp%.jpg -c http://zl2ik.com/Argo.jpg
::%LFTP% -c get1 -o G6AVK\%timestmp%.jpg -c http://www.qsl.net/g6avk/capt.jpg

Above is a variation using LFTP as the download tool. It has proven more reliable than curl. 

Correct setting of environment variable TZ is vital to getting correct GMT timestamps (though Windows 'knows' your time zone, it doesn't automatically create or update this environment variable).

The utility program TIMESTMP.EXE is my own work, and a link below allows download. As used here, it sets an environment variable to the GMT timestamp (YYMMDDHHM0).
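The effect of the timestmp call and the substring operation in the batch file can be sketched in Python (grab_timestamp is a hypothetical name for illustration, not part of the utility):

```python
from datetime import datetime, timezone

def grab_timestamp(now=None):
    """GMT timestamp in YYMMDDHHM0 form: the trailing minute digit is
    forced to 0, matching the 10-minute capture interval."""
    now = now or datetime.now(timezone.utc)
    return now.strftime('%y%m%d%H%M')[:9] + '0'
```

So a grab fetched at 2012-08-31 14:57 GMT is filed as 1208311450.jpg.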

The program curl is available using the link below. It is a flexible download / upload program.

As can be seen from the script, I file the images from each grabber in a separate directory.

The filenames collate in time order, so selecting a set for Rot'n'Stack is made easy.
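Because the names collate, selecting a contiguous time span is a simple sort and filter; a sketch (select_span and the sample filenames are illustrative):

```python
def select_span(filenames, start, stop):
    """Return archive images whose YYMMDDHHM0 names fall in [start, stop].
    Works because the names collate lexically in time order."""
    return [f for f in sorted(filenames)
            if start <= f.split('.')[0] <= stop]
```

For example, select_span(['1208311450.jpg', '1208311510.jpg', '1208311500.jpg'], '1208311500', '1208311510') returns the last two grabs in time order, ready to feed to Rot'n'Stack.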

@echo off

for %%F in (*.jpg) do convert %%~nF.jpg -quality 50 c\%%~nF.jpg 

Some grabbers create images with unnecessarily high quality at the expense of file size. The above batch file uses ImageMagick's convert command to recompress the files at quality 50 and copy them to the c subdirectory. For example, images from W4HBK are reduced in size by almost 90%, and the reduction in visual quality for the purpose is insignificant.

Publishing a grabber

A grabber is published by copying its images to a web server periodically.

Firstly, the computer clock needs to be synchronised to an accurate time server. I used Dimension4.

After evaluating Argo, Spectrum Lab was chosen for the grabber.

Spectrum Lab has a facility to capture images periodically and also to run arbitrary commands periodically.

Setting capture in Spectrum Lab

Under Options/ScreenCapture, on the PeriodicActions tab, the screen is captured to two files on the hour and every 10min, then upload.bat is used to upload one of the files to the website using FTP.

cd \qrss

\bin\lftp\lftp -f upload.lftp
timestmp -igb0 -c "lftp RC=%ERRORLEVEL%" -f upload.log

The file transfer could be run directly from Spectrum Lab, but the use of upload.bat allows creation of a log file to monitor operation. LFTP has been used for the transfer as it has good error recovery, it succeeds where others fail.

set ssl:verify-certificate no
set xfer:log-file lftp02.log
set xfer:log yes
set net:max-retries 5

open -u user,password vk1od.net
put capture/capt.jpg
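The upload.log written by upload.bat above can be scanned for failed transfers; a sketch (failed_uploads is an illustrative name, and the exact line format is an assumption based on the timestmp command in the batch file):

```python
def failed_uploads(log_lines):
    """Return log lines whose 'RC=' value is nonzero, i.e. failed transfers.
    Assumes lines like '1208311450 lftp RC=0' as written by upload.bat."""
    bad = []
    for line in log_lines:
        if 'RC=' in line:
            rc = int(line.rsplit('RC=', 1)[1].split()[0])
            if rc != 0:
                bad.append(line)
    return bad
```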

The uploaded image file (of constant name) is displayed by a PHP web page.

The file with the variable name is for archival purposes.

rem run this from cron with the following entry in cron.tab
rem 2 * * * * \qrss\mirror.bat

cd \qrss
rem delete archive images more than a day old
python clean.py
\bin\lftp\lftp -f images.lft

nnCron Lite is used to schedule a batch file at 2min after the hour. The batch file (mirror.bat) above runs a Python script to clean up old files, and then LFTP to mirror the archive files in the local directory to the web server. The mirror function includes deleting files on the server that are no longer in the local directory (eg after the clean process).

import glob
import os
import time

MAX_AGE = 86400  # maximum age in seconds (1 day)

# get list of archived capture files
files = glob.glob('SL*.jpg')
now = time.time()
for file in files:
  # for each file calculate age, and if too old, delete
  if now - os.path.getmtime(file) > MAX_AGE:
    os.remove(file)

Above is clean.py, a Python script to delete image files named SL*.jpg that are more than 1 day old.

open -u user,password vk1od.net
mirror --reverse --delete -I SL*.jpg --verbose capture/ /

LFTP is used to perform the mirror operation; above is the script for LFTP. The FTP account used is rooted at a subdirectory of the web tree; without that facility, the destination should be the path of an appropriate subdirectory.
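The effect of --reverse --delete can be illustrated with a toy function (remote_deletions is purely illustrative; lftp does this internally):

```python
def remote_deletions(local_files, remote_files):
    """With mirror --reverse --delete, files present on the server but
    no longer present locally (e.g. removed by clean.py) are deleted."""
    return sorted(set(remote_files) - set(local_files))
```

For example, if clean.py has removed SL1.jpg locally, the mirror pass deletes it from the server as well, keeping the two directories in step.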

The server is configured to allow directory listing of the directory that contains the archived capture files.

Web display page

If PHP is available on the web server, it provides the facility for a smarter web page.

Intelligent refresh time

printf(" <META HTTP-EQUIV=\"REFRESH\" CONTENT=\"%d\">\n",600+rand(30,45)-time()%600);
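The expression sets the refresh interval to the time remaining until the next 10-minute boundary, plus 30-45s of random jitter, so the page reloads shortly after each new capture and clients do not all hit the server at once. The same calculation in Python (refresh_seconds is an illustrative name):

```python
import random

def refresh_seconds(now, lo=30, hi=45):
    """Seconds until just after the next 10-minute capture: time to the
    next 600s boundary, plus lo..hi seconds of random jitter."""
    return 600 - int(now) % 600 + random.randint(lo, hi)
```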

Displaying an offline graphic

$filename = 'capture/capt.jpg';
if (file_exists($filename) && (time()-filemtime($filename))>3600) {
  echo '<td><img src="Offline.png" height="551" width="1121"></td>';
} else {
  echo '<td><img src="capture/capt.jpg" height="551" width="1121"></td>';
}

Excessive image size

Image files do not need to be huge to be useful; indeed, making them too large detracts from their usefulness as it creates upload/download issues, storage issues etc. The image file below was 85kB before addition of the number graphics. In Spectrum Lab, JPEG quality of 50% is quite sufficient, and images that display without zooming on most modern screens suit most users, so less than 1200 pixels wide and less than 600 pixels high seems a good compromise (the image below is 1121x551). Forcing zoom of images degrades quality in most cases!

Maximising usefulness of the display


Above is a grab with my beacon running locally. Some key points in the display:

  1. datetime in UTC;
  2. audio levels set so that strong signals do not push the background noise off the graph;
  3. audio levels set so that signals do not exceed the maximum of the graph;
  4. minute timescale displayed and clock synchronised to UTC; a little over 10 minutes span is ideal for most purposes;
  5. actual radio frequency shown on the spectrum scale (rather than audio frequency);
  6. a palette that permits observation of signal amplitude from the waterfall display.

Save an archive of the grabs as explained earlier, and make it available for users. Dropboxes appeal to some, but they are quite intensive on bandwidth, especially with large histories, and are less useful to end users with limited bandwidth.



Version Date Description
1.01 02/07/12 Initial.
1.02 31/08/12  PHP web page, upload logging. 
1.03 03/12/12  Image notes. 

© Copyright: Owen Duffy 1995, 2017. All rights reserved. Disclaimer.