Morse CW decoding


When scheduling an observation like

I have the impression that the gain is pushed high when selecting CW. Is that true? I can imagine why, if that is the case.

What demodulation is used for the audio in this case? I think it is still FM.
Wouldn't AM be better in this case?

What do you use to decode this kind of transmission, in either case:
directly with the satnogs-client, or after downloading the audio from the site?


The gain looks more or less OK, but the offset of the waterfall needs to be adjusted. I would set it to be between -85 and -45 dB.

It uses an SSB demodulator, which is why you can hear the CW. The tone is low because of frequency error, either on the satellite or your receiver. It would nominally be at 1 kHz.
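To illustrate the point above, here is a minimal numpy sketch (not the actual satnogs-client code) of why a CW carrier comes out of an SSB demodulator as an audio tone: a carrier offset from the dial frequency by some amount simply becomes an audio tone at that offset, so a frequency error moves the tone away from the nominal 1 kHz. The sample rate and offset values are illustrative assumptions.

```python
import numpy as np

fs = 48000          # audio sample rate in Hz (assumption)
f_offset = 1000.0   # CW carrier sits 1 kHz above the SSB dial frequency (assumption)

t = np.arange(fs) / fs
iq = np.exp(2j * np.pi * f_offset * t)  # key-down CW carrier at complex baseband

# USB-style detection: the carrier is already single-sided (positive
# frequency only), so taking the real part yields the audio directly.
audio = iq.real

# Measure the dominant audio frequency: it lands at the carrier offset.
spec = np.abs(np.fft.rfft(audio))
tone = np.argmax(spec) * fs / len(audio)
print(tone)  # → 1000.0
```

If the receiver (or the satellite) is off by, say, 500 Hz, `f_offset` effectively becomes 500 Hz or 1500 Hz, and the tone you hear shifts by the same amount.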

Thank you for your answer.

So it is SSB (of course, sorry for the question), and the audio demodulator indeed depends on the mode we selected.

OK, so the gain is correct, but why is the background noise around -85 dB in this observation, while in observation 21932
(same day, four hours earlier, same configuration) the background is
around -75 dB? What causes this change in background level?
Is it due to the change of frequency scale (which should account for about -6 dB)?

I guess the waterfall offset can be adapted from the line
set cbrange [-100:-50]
of the gnuplot script. From what I usually see in my results, -85 would be a good value.

For the tone, I think I am within about 1 ppm of error, because I checked my RTL-SDR
before using it, but it may have drifted a bit; I haven't checked it recently. Anyhow,
that alone can account for up to about 150 Hz of error.
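For reference, a quick back-of-the-envelope check of the ppm figure above (the downlink frequency used here is an illustrative VHF value, not taken from the observation):

```python
# Frequency error in Hz caused by an oscillator error given in ppm.
f_rx = 145.9e6   # example VHF downlink frequency in Hz (assumption)
ppm = 1.0        # oscillator error in parts per million

err_hz = f_rx * ppm * 1e-6
print(err_hz)    # → 145.9
```

So at VHF, 1 ppm of TCXO error is already on the order of the ~150 Hz mentioned, which is enough to noticeably shift a CW tone from its nominal 1 kHz.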


Why do you expect the two waterfalls to have similar levels? Are you using the exact same setup with the exact same settings?

Anyway, you can probably adjust your gain to a lower level, if you wish. The absolute level does not matter. What matters is the signal to noise ratio and that the receiver is not overloaded. Since I can clearly distinguish different signals in the satellite downlink, your receiver does not seem to be overloaded, and that’s why I said the gain seems OK.


I don't know about the satnogs tuning, but the antenna, preamp, RTL-SDR and what
I launch are exactly the same; the only changes are the decoding mode and the satellite.

I understand about SNR, but I still wonder why the noise level
changes without any apparent change in the setup.

What change am I missing?

Sorry, I missed that the two observations were from the same ground station.

The waterfall in observation 21935 has a bandwidth of 48 kHz. The one in observation 21932 has a bandwidth of 200 kHz. So the two waterfalls have different FFT resolution (Hz per pixel). The one with the higher bandwidth collects more noise power into each pixel. Whether that fully explains the difference I cannot tell.
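The bandwidth difference can be quantified. Assuming both waterfalls are rendered with the same number of pixels across (the pixel count cancels out anyway), each pixel of the wider waterfall integrates proportionally more noise power:

```python
import math

bw_wide = 200e3    # waterfall bandwidth of observation 21932, Hz
bw_narrow = 48e3   # waterfall bandwidth of observation 21935, Hz
pixels = 512       # waterfall width in pixels (assumed equal for both)

# Noise power per pixel scales with the bandwidth each pixel covers,
# so the level difference in dB is just the bandwidth ratio.
ratio_db = 10 * math.log10((bw_wide / pixels) / (bw_narrow / pixels))
print(round(ratio_db, 1))  # → 6.2
```

That is about 6.2 dB, close to the -6 dB guessed earlier, though still short of the ~10 dB difference (-75 vs -85 dB) seen between the two observations.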

OK, thanks. It was my last guess.