Missing Waterfall with usb_claim_interface error -6

After almost a year of running smoothly, my VHF station started to misbehave over the weekend with corrupted waterfalls, so I took the opportunity to move to a new SD card and a fresh SatNOGS client install.
It was a very smooth transition and I was up and running in about half an hour.
However, after 3 successful observations, the station stopped showing the waterfall.

I SSHed into the Pi and found the following:

> pi@satnogs-vhf:/tmp/.satnogs/data $ systemctl status satnogs-client
> ● satnogs-client.service - SatNOGS client
>    Loaded: loaded (/etc/systemd/system/satnogs-client.service; enabled; vendor preset: enabled)
>    Active: active (running) since Mon 2020-04-27 07:48:57 PDT; 3h 36min ago
>  Main PID: 308 (satnogs-client)
> Tasks: 73 (limit: 2200)
>    Memory: 641.9M
>    CGroup: /system.slice/satnogs-client.service
>        +- 308 /var/lib/satnogs/bin/python3 /var/lib/satnogs/bin/satnogs-client
>        +- 746 /usr/bin/python3 /usr/bin/satnogs_fsk_ax25.py --soapy-rx-device=driver=rtlsdr --samp-rate-rx=2.048e6 --rx-freq=145825000 --file-path=/tmp/.satnogs/data/receiving_satnogs_2108779_2020-04-
>        +- 792 /usr/bin/python3 /usr/bin/satnogs_cw_decoder.py --soapy-rx-device=driver=rtlsdr --samp-rate-rx=2.048e6 --rx-freq=145910000 --file-path=/tmp/.satnogs/data/receiving_satnogs_2108780_2020-0
>        +- 850 /usr/bin/python3 /usr/bin/satnogs_cw_decoder.py --soapy-rx-device=driver=rtlsdr --samp-rate-rx=2.048e6 --rx-freq=145961200 --file-path=/tmp/.satnogs/data/receiving_satnogs_2108781_2020-0
>        +- 913 /usr/bin/python3 /usr/bin/satnogs_amsat_fox_duv_decoder.py --soapy-rx-device=driver=rtlsdr --samp-rate-rx=2.048e6 --rx-freq=145920000 --file-path=/tmp/.satnogs/data/receiving_satnogs_210
>        +- 949 /usr/bin/python3 /usr/bin/satnogs_cw_decoder.py --soapy-rx-device=driver=rtlsdr --samp-rate-rx=2.048e6 --rx-freq=145910000 --file-path=/tmp/.satnogs/data/receiving_satnogs_2108783_2020-0
>        +- 988 /usr/bin/python3 /usr/bin/satnogs_cw_decoder.py --soapy-rx-device=driver=rtlsdr --samp-rate-rx=2.048e6 --rx-freq=145855000 --file-path=/tmp/.satnogs/data/receiving_satnogs_2108784_2020-0
>        +-1023 /usr/bin/python3 /usr/bin/satnogs_amsat_fox_duv_decoder.py --soapy-rx-device=driver=rtlsdr --samp-rate-rx=2.048e6 --rx-freq=145880000 --file-path=/tmp/.satnogs/data/receiving_satnogs_210
> 
> Apr 27 10:34:27 satnogs-vhf satnogs-client[308]: [INFO] Using format CF32.
> Apr 27 10:41:16 satnogs-vhf satnogs-client[308]: usb_claim_interface error -6
> Apr 27 10:41:16 satnogs-vhf satnogs-client[308]: usb_claim_interface error -6
> Apr 27 10:41:16 satnogs-vhf satnogs-client[308]: [INFO] Using format CF32.
> Apr 27 11:00:54 satnogs-vhf satnogs-client[308]: usb_claim_interface error -6
> Apr 27 11:00:54 satnogs-vhf satnogs-client[308]: usb_claim_interface error -6
> Apr 27 11:00:54 satnogs-vhf satnogs-client[308]: [INFO] Using format CF32.
> Apr 27 11:09:42 satnogs-vhf satnogs-client[308]: usb_claim_interface error -6
> Apr 27 11:09:42 satnogs-vhf satnogs-client[308]: usb_claim_interface error -6
> Apr 27 11:09:42 satnogs-vhf satnogs-client[308]: [INFO] Using format CF32.

It sounds a lot like the last few posts in this issue.

Is there anything I can test or do before I reboot?
I would hate to lose any troubleshooting data with a reboot.

Thanks.

I’m both sad and happy that someone else is experiencing this problem.

A reboot “solved” the problem for me, but as you mentioned, that also made it impossible to reproduce the problem (at least so far).

This should show which process has locked the SDR:

sudo apt-get install lsof
sudo lsof | grep '/dev/bus/usb'

If I start rtl_test (to simulate using the SDR, since my station doesn’t have an observation at the moment) I get this:

rtl_test  28474                               pi  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
rtl_test  28474                               pi    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
rtl_test  28474 28480 rtl_test                pi  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
rtl_test  28474 28480 rtl_test                pi    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003

You should get something similar, but not rtl_test. The second field is the process ID. Running ps -ef | grep 28474 (substitute 28474 with the process ID from your result) might yield some interesting information.
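
If you want to combine the two steps, something like this should also work (an untested sketch: lsof -t prints only the PIDs of the processes holding the USB device nodes, and ps -fp then shows each full command line):

# show the full command line of every process holding a USB device node
for pid in $(sudo lsof -t /dev/bus/usb/*/* 2>/dev/null | sort -u); do
    ps -fp "$pid"
done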

This is what lsof shows:

pi@satnogs-vhf:/tmp/.satnogs/data $ sudo lsof | grep '/dev/bus/usb'

satnogs_f 746 satnogs mem CHR 189,4 8025 /dev/bus/usb/001/005
satnogs_f 746 satnogs 9u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_f 746 752 satnogs_f satnogs mem CHR 189,4 8025 /dev/bus/usb/001/005
satnogs_f 746 752 satnogs_f satnogs 9u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_f 746 753 satnogs_f satnogs mem CHR 189,4 8025 /dev/bus/usb/001/005
satnogs_f 746 753 satnogs_f satnogs 9u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_f 746 759 soapy::so satnogs mem CHR 189,4 8025 /dev/bus/usb/001/005
satnogs_f 746 759 soapy::so satnogs 9u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_f 746 782 satnogs_f satnogs mem CHR 189,4 8025 /dev/bus/usb/001/005
satnogs_f 746 782 satnogs_f satnogs 9u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_f 746 783 satnogs_f satnogs mem CHR 189,4 8025 /dev/bus/usb/001/005
satnogs_f 746 783 satnogs_f satnogs 9u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_f 746 787 satnogs_f satnogs mem CHR 189,4 8025 /dev/bus/usb/001/005
satnogs_f 746 787 satnogs_f satnogs 9u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 792 satnogs 10u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 792 satnogs 11u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 792 799 satnogs_c satnogs 10u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 792 799 satnogs_c satnogs 11u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 792 805 soapy::so satnogs 10u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 792 805 soapy::so satnogs 11u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 792 819 satnogs_c satnogs 10u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 792 819 satnogs_c satnogs 11u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 792 821 satnogs_c satnogs 10u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 792 821 satnogs_c satnogs 11u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 850 satnogs 10u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 850 satnogs 11u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 850 857 satnogs_c satnogs 10u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 850 857 satnogs_c satnogs 11u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 850 863 soapy::so satnogs 10u CHR 189,4 0t0 8025 /dev/bus/usb/001/005
satnogs_c 850 863 soapy::so satnogs 11u CHR 189,4 0t0 8025 /dev/bus/usb/001/005

I had to snip a LOT off as it just kept scrolling, but you get the idea.

The ps command returned the following:

pi@satnogs-vhf:/tmp/.satnogs/data $ ps -ef | grep 850

satnogs 850 308 0 09:43 ? 00:02:11 /usr/bin/python3 /usr/bin/satnogs_cw_decoder.py --soapy-rx-device=driver=rtlsdr --samp-rate-rx=2.048e6 --rx-freq=145961200 --file-path=/tmp/.satnogs/data/receiving_satnogs_2108781_2020-04-27T16-43-10.out --waterfall-file-path=/tmp/.satnogs/data/receiving_waterfall_2108781_2020-04-27T16-43-10.dat --wpm=12 --rigctl-port=4532 --gain-mode=Overall --gain=32.8 --antenna=RX --decoded-data-file-path=/tmp/.satnogs/data/data_2108781
pi 1584 503 0 14:22 pts/0 00:00:00 grep --color=auto 850

It seems that as soon as the first observation failed, all the others just kept piling up behind it.
Fail once, fail forever.

I am in no real rush to get back up and running.
Thanks for the help so far. Is there anything else I can test?

Great information. I have no further ideas at the moment, but hopefully the SatNOGS team can chip in. Or maybe I’ll dream up some ideas in my sleep :slight_smile:

I’ve seen that USB claim error a lot in cases where a previous observation’s task was still running (and indeed still claiming the USB device) after it should have ended.

Please check whether any abandoned processes are left over the next time this happens.
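
One way to spot them (just a sketch; the elapsed-time column makes anything that has outlived its observation stand out, and the [s] in the pattern keeps grep from matching itself):

# list SatNOGS flowgraph processes with their parent PID and elapsed running time
ps -eo pid,ppid,etime,cmd | grep '[s]atnogs_'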

So for now, should I just reboot to get back to a working state?
Or how do I check whether there is an abandoned process?

Hm, let me see the output of the following, please:

ps -aux | grep "satnogs_"

pi@satnogs-vhf:/tmp/.satnogs/data $ ps -aux | grep "satnogs_"

pi 1633 0.0 0.0 7352 536 pts/0 S+ 14:50 0:00 grep --color=auto "satnogs_"

Looks pretty clean :joy:

I can’t do much today as I’m at work.

Are you familiar with running a gr-satnogs flowgraph from CLI?

No, I am not. This is the first time I have kicked the tires on the new build.

OK, then you should probably just reboot to get your station working again. I can offer to help dig into this the next time it happens, when I’m hopefully not at work. If you’re lucky, someone else might jump in…
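
For reference, the flowgraphs are ordinary executables, so one way to exercise the SDR outside the client is to start one by hand. The following is only a sketch, modelled on the command line visible in your systemctl output above; the frequency and output paths are example values, and I haven’t checked which arguments have usable defaults:

# run a CW decoder flowgraph manually against the RTL-SDR (Ctrl-C to stop)
/usr/bin/python3 /usr/bin/satnogs_cw_decoder.py \
    --soapy-rx-device=driver=rtlsdr \
    --samp-rate-rx=2.048e6 \
    --rx-freq=145825000 \
    --gain-mode=Overall --gain=32.8 --antenna=RX \
    --file-path=/tmp/test_audio.out \
    --waterfall-file-path=/tmp/test_waterfall.dat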

Not sure if it helps, but here is the output from a healthy station. Note that all references are from the same process ID:

pi@satnogs-fhab-1:~ $ sudo lsof | grep '/dev/bus/usb'
satnogs_f 29177                          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177                          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29183 satnogs_f          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29183 satnogs_f          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29184 satnogs_f          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29184 satnogs_f          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29186 frame_fil          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29186 frame_fil          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29187 json_conv          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29187 json_conv          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29188 tcp_rigct          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29188 tcp_rigct          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29189 udp_msg_s          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29189 udp_msg_s          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29190 soapy::so          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29190 soapy::so          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29191 fir_filte          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29191 fir_filte          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29192 coarse_do          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29192 coarse_do          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29193 pfb_arb_r          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29193 pfb_arb_r          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29194 quadratur          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29194 quadratur          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29195 pfb_arb_r          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29195 pfb_arb_r          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29196 dc_blocke          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29196 dc_blocke          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29197 ogg_encod          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29197 ogg_encod          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29198 delay27            satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29198 delay27            satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29199 fir_filte          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29199 fir_filte          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29200 quadratur          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29200 quadratur          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29201 moving_av          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29201 moving_av          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29202 vco_c24            satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29202 vco_c24            satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29203 multiply_          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29203 multiply_          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29204 fir_filte          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29204 fir_filte          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29205 quadratur          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29205 quadratur          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29206 dc_blocke          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29206 dc_blocke          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29207 clock_rec          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29207 clock_rec          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29208 binary_sl          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29208 binary_sl          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29209 frame_dec          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29209 frame_dec          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29210 frame_dec          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29210 frame_dec          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29211 iq_sink7           satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29211 iq_sink7           satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29212 waterfall          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29212 waterfall          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29213 satnogs_f          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29213 satnogs_f          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003
satnogs_f 29177 29214 satnogs_f          satnogs  mem       CHR      189,2                 8517 /dev/bus/usb/001/003
satnogs_f 29177 29214 satnogs_f          satnogs    9u      CHR      189,2       0t0       8517 /dev/bus/usb/001/003

Ah, mine are all in batches, as if the first one failed and the others just queued up behind it.

I am going to give it a few hours before I reboot, in case someone else has something they want me to check… I would really like to help others get to the bottom of this and get it fixed… if two stations have had the issue, others are bound to run into it as well.

A few hours later and the Pi has hard crashed. The SD card light (the green LED) is solid on and I can not SSH into it any more.
A power cycle to reboot, and we have scheduled a few passes for the night.
We’ll see what the SoCal night brings.

Random idea (which unfortunately turns out to be true quite often): a corrupted SD card?
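
If you want a quick indication before reflashing anything, the kernel log usually complains when an SD card is going bad. This is just a generic Linux check, nothing SatNOGS-specific:

# look for SD/mmc I/O errors or filesystem complaints in the kernel log
dmesg | grep -iE 'mmc|i/o error|ext4'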

I suspect that’s what might have happened to the original 8 GB card that had been running the station for the past year.
This time, with the new build, I used a brand new, high quality 32 GB card straight out of the package.
It’s possible. Highly unlikely, but possible.

Also, this is a very straightforward build: an RTL-SDR v3, no bias-tee scripts, a straight out-of-the-box SatNOGS station.

The station failed again last night, so right now it is in the fault condition.
Passes are being scheduled, but no waterfalls are being uploaded and the files are piling up.

Are there any other tests I can run before I reboot it?

If you can, please also post your Advanced > Support output.
Are you on an RPi 3?

------------[ copy here ]------------

{
  "versions": {
    "satnogs-client": "1.3.1",
    "satnogs-client-ansible": "202004031132",
    "satnogs-flowgraphs": "1.1.1-1",
    "gr-satnogs": "2.1.2-1",
    "gr-soapy": "2.1.2-1",
    "gnuradio": "3.8.1.0~rc1-2",
    "satnogs-config": "0.10.1"
  },
  "state": {
    "is-applied": false,
    "pending-tags": null
  },
  "system": {
    "distribution": {
      "DESCRIPTION": "Raspbian GNU/Linux 10 (buster)",
      "RELEASE": "10",
      "CODENAME": "buster",
      "ID": "Raspbian"
    },
    "pending-updates": false,
    "platform": {
      "system": "Linux",
      "node": "satnogs-vhf",
      "release": "4.19.97-v7+",
      "version": "#1294 SMP Thu Jan 30 13:15:58 GMT 2020",
      "machine": "armv7l",
      "processor": ""
    },
    "memory": {
      "total": 971059200,
      "available": 178212864,
      "percent": 81.6,
      "used": 737644544,
      "free": 102244352,
      "active": 290447360,
      "inactive": 378675200,
      "buffers": 2002944,
      "cached": 129167360,
      "shared": 1556480,
      "slab": 43167744
    },
    "disk": {
      "total": 30789259264,
      "used": 3762208768,
      "free": 25728819200,
      "percent": 12.8
    }
  },
  "configuration": {
    "satnogs_antenna": "RX",
    "satnogs_api_token": "[redacted]",
    "satnogs_rf_gain": "32.8",
    "satnogs_rx_samp_rate": "2.048e6",
    "satnogs_soapy_rx_device": "driver=rtlsdr",
    "satnogs_station_elev": "1017",
    "satnogs_station_id": "908",
    "satnogs_station_lat": "33.49625",
    "satnogs_station_lon": "-117.087264"
  }
}
------------[ copy end ]-------------

Yes, Raspberry Pi 3

I am at a loss. The closest thing I can think of is a USB power issue… but why after the update? It makes no sense.
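
If you want to rule out power, the Pi firmware keeps an under-voltage/throttling flag you can query. This assumes the stock Raspbian vcgencmd tool, which should be present on the standard image:

# 0x0 means no under-voltage or throttling has been detected since boot
vcgencmd get_throttled
# the kernel log also records under-voltage events
dmesg | grep -i voltage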

Thanks for taking a look.
It is a genuine Raspberry Pi power supply, but I will change it tonight when I get home, just the same.
I have rebooted the station.