Uploading IQ files to S3


#1

It’s sometimes useful to have IQ data available for debugging, or decoding frames that are off frequency. I’ve made a simple script to automatically upload the IQ data to S3.

Notes:

  • Saving IQ will cost about 100-500 MB of storage for each pass, depending on pass length and sample rate. The extra write IOPS may also cause SD cards to fail sooner.
  • You can end up with a lot of space used in S3, which will cost you each month. You can set up a lifecycle policy to manage how long observations are kept.
  • The SatNOGS client stores IQ files as interleaved shorts, and for most programs they will need to be converted to complex. You can do this in GNU Radio with File Source -> IShort To Complex -> File Sink
  • Once converted, you can use them in GQRX with file=file-path-complex.raw,rate=48000,repeat=false,throttle=false as the device
  • This script deletes the file regardless of whether the upload succeeded - this is deliberate for my use case, as I don't want the SD card filling up
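As a sketch of the short-to-complex conversion mentioned above (the file names are examples, and this uses numpy via python3 purely for illustration rather than the GNU Radio flowgraph):

```shell
# Convert interleaved-short IQ to complex64 (assumed layout: int16 I,Q pairs).
python3 - <<'PY'
import numpy as np

# Write a tiny example ishort file: four I/Q sample pairs.
np.array([100, -100, 200, -200, 300, -300, 400, -400],
         dtype=np.int16).tofile('iq-short.raw')

# Load interleaved int16, scale to [-1, 1], pack as complex64.
raw = np.fromfile('iq-short.raw', dtype=np.int16).astype(np.float32) / 32767.0
iq = (raw[0::2] + 1j * raw[1::2]).astype(np.complex64)
iq.tofile('iq-complex.raw')
print(len(iq))  # prints 4
PY
```

The resulting iq-complex.raw can then be opened in GQRX with the device string from the note above.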

In AWS:

  1. Create an S3 bucket
  2. Create an IAM user with programmatic access. Note down the access key and secret key.
  3. Attach an IAM policy to the user that grants access to the bucket - change YOURBUCKETNAMEHERE to your bucket name:
    {
     "Version": "2012-10-17",
     "Statement": [
         {
             "Effect": "Allow",
             "Action": [
                 "s3:PutAccountPublicAccessBlock",
                 "s3:GetAccountPublicAccessBlock",
                 "s3:ListAllMyBuckets",
                 "s3:ListJobs",
                 "s3:CreateJob",
                 "s3:HeadBucket"
             ],
             "Resource": "*"
         },
         {
             "Effect": "Allow",
             "Action": "s3:*",
             "Resource": "arn:aws:s3:::YOURBUCKETNAMEHERE"
         },
         {
             "Effect": "Allow",
             "Action": "s3:*",
             "Resource": "arn:aws:s3:::YOURBUCKETNAMEHERE/*"
         }
     ]
    }
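To act on the lifecycle note above, here is a sketch of a rule that expires uploads automatically; the 30-day retention and the rule ID are assumptions, and YOURBUCKETNAMEHERE is again your bucket name:

```shell
# Write an S3 lifecycle rule that deletes objects after 30 days (assumed retention).
cat > /tmp/iq-lifecycle.json << 'EOF'
{
  "Rules": [
    {
      "ID": "expire-iq",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Expiration": { "Days": 30 }
    }
  ]
}
EOF
# Validate the JSON locally before applying it.
python3 -m json.tool /tmp/iq-lifecycle.json > /dev/null && echo "lifecycle rule OK"

# Then apply it (requires credentials allowed to put the lifecycle configuration):
#   aws s3api put-bucket-lifecycle-configuration \
#     --bucket YOURBUCKETNAMEHERE \
#     --lifecycle-configuration file:///tmp/iq-lifecycle.json
```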
    

On the station:

  1. Install AWS CLI
    sudo apt-get install python-pip
    sudo pip install awscli
    sudo pip install requests==2.2.1 # revert the requests library install that awscli does
    
  2. Setup a directory for storing IQ data. This could be tmpfs, a folder or an external disk.
    sudo mkdir /iq
    sudo chown satnogs:satnogs /iq
    
  3. Configure AWS CLI - set your access key and region
    su satnogs -s /bin/bash
    aws configure
    
  4. Install the script - change YOURBUCKETNAMEHERE to your bucket name. If you already have a post-observation script running, just incorporate this into it.
    cat << 'EOF' > /tmp/iq-upload
    #!/bin/bash
    # Copies IQ files to S3. Deletes the file regardless of whether the upload succeeded
    
    # Create a folder /iq - you can make this a separate mount to a tmpfs or external hdd
    # Make sure `satnogs` user has access to it
    # Install aws-cli - https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-install.html
    # configure the aws-cli creds from the satnogs user `su satnogs -s /bin/sh`
    # Run satnogs-setup and set the post observation script to be this file with {{ID}} as the first argument
    #    eg - /usr/local/bin/iq-upload {{ID}}
    
    BUCKET="YOURBUCKETNAMEHERE"
    mv /iq/iq.raw /iq/$1.raw
    nohup sh -c "/usr/local/bin/aws s3 cp /iq/$1.raw s3://$BUCKET/$1.raw; rm /iq/$1.raw" >/dev/null 2>&1 &
    EOF
    sudo mv /tmp/iq-upload /usr/local/bin/iq-upload
    sudo chmod a+x /usr/local/bin/iq-upload
    
  5. Run satnogs-setup
    • Advanced -> SATNOGS_POST_OBSERVATION_SCRIPT -> /usr/local/bin/iq-upload {{ID}}
    • Advanced -> ENABLE_IQ_DUMP -> Yes
    • Advanced -> IQ_DUMP_FILENAME -> /iq/iq.raw
    • Apply
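Before relying on a real pass, the rename step of the script can be dry-run by hand; the scratch directory and the observation ID below are made up, and `true` stands in for the S3 copy:

```shell
# Simulate what iq-upload does with the dump file, minus the actual upload.
IQDIR=/tmp/iq-dryrun              # scratch dir standing in for /iq
mkdir -p "$IQDIR"
printf 'fake iq data' > "$IQDIR/iq.raw"   # stand-in for the SatNOGS IQ dump
ID=1234567                                # stand-in for {{ID}}
mv "$IQDIR/iq.raw" "$IQDIR/$ID.raw"       # same rename the script performs
ls "$IQDIR"                               # shows 1234567.raw
```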

#2

I’m now using a modified version of this:

BUCKET="satnogs-iq"
mv /datadrive/iq.raw /datadrive/$1.raw 
nohup sh -c "sleep 600; /usr/bin/flock -n upload /usr/local/bin/aws s3 mv --exclude '*' --exclude 'iq.raw' --recursive  --include '*.raw' '/datadrive/' s3://$BUCKET/"  >/dev/null 2>&1 &

with IQ being saved on the /datadrive/ external disk. This way, failed uploads are cached until the next observation.
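The flock -n in that one-liner is what stops two observations from uploading at once; a minimal sketch of the behaviour (the lock file path is an example):

```shell
# First command takes the lock and holds it for a moment;
# the second sees the lock is held and bails out instead of blocking.
flock -n /tmp/upload.lock sh -c 'echo "holder: got lock"; sleep 1' &
sleep 0.2
flock -n /tmp/upload.lock echo "second: got lock" \
  || echo "second: lock held, skipped"
wait
```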


#3

Might be a good place to ask: why are my file names from the post-observation script ending up like this?

64 -rw-r--r-- 1 satnogs satnogs 38M Jul 15 07:38 IQlinux; GNU C++ version 6.2.0 20161010; Boost_106100; UHD_003.009.005-0-unknown??20190715073855.raw

the /var/lib/satnogs/pos.sh is:
#!/bin/sh
# try date without quotes: DATE=date "+%Y%m%d%H%M%S"
DATE=`date +%Y%m%d%H%M%S`
/bin/mv /mnt/usb/iq.raw "/mnt/usb/IQ${DATE}.raw"

$ cat /etc/default/satnogs-client
IQ_DUMP_FILENAME="/mnt/usb/iq.raw"
SATNOGS_POST_OBSERVATION_SCRIPT="/var/lib/satnogs/pos.sh"

I always have to find by inum and rename :\
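Not a diagnosis of where the UHD banner comes from, but the quoting experiment the comment in pos.sh hints at is worth spelling out: without backticks or $( ), DATE=date is just a plain assignment, so nothing is substituted. A minimal comparison (the POSIX $( ) form is equivalent to backticks and survives forum formatting):

```shell
# Broken: the shell treats DATE=date as a temporary environment assignment
# and then tries to run +%Y%m%d%H%M%S as a command.
# DATE=date +%Y%m%d%H%M%S

# Working: command substitution captures date's output.
DATE=$(date +%Y%m%d%H%M%S)
echo "IQ${DATE}.raw"        # e.g. IQ20190715073855.raw
```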


#4

Not sure how, but it seems stdin is somehow getting into either the mv command or the date command… which I didn’t think was possible…


#5

Not an answer to your question, but I can give some hints on code quoting here:

Use three backticks followed by the syntax-highlighting tag of your choice (you can omit the tag if you only want to paste plain text), like ```bash, to start a highlighted code block. At the end, add another three backticks to close the code block.
Like:

#!/bin/sh
# try date without quotes: DATE=date "+%Y%m%d%H%M%S"
DATE=`date +%Y%m%d%H%M%S`
/bin/mv /mnt/usb/iq.raw "/mnt/usb/IQ${DATE}.raw"

You can also use single backticks to mark a single line or a few words as code, like: char array[15]


#6

Solution was to read the above carefully - this works without the funny GNU C++ stuff :)

SATNOGS_POST_OBSERVATION_SCRIPT="/var/lib/satnogs/pos.sh {{ID}}"

$ cat pos.sh
/bin/mv /mnt/usb/iq.raw "/mnt/usb/$1.raw"