Introduction and lots of questions

Greetings, I happened on this project while looking for antenna rotator designs. I am part of a weather satellite project and was wondering if there was an easy way for us to share the data?

The project is on GitHub and is called Raspberry NOAA V2. It would be trivial for us to update the software to optionally upload .wav from the three NOAA Sats and Meteor M2. It would also be easy to upload the QPSK from the M2 decode. Our project makes a local web page for stations to view their images so we have the imagery available as well.

There used to be a project that took the pristine images from the NOAA sats and made them publicly available for people to stitch together. Most of us are not generating the pristine but if that were of use, it is an available option.

We have more than 1000 stations in operation and there is currently no data sharing. About half, guessing, are producing long passes with quality antennas. Would any of this data be of use to this project?

Since the data would not pass through the SatNOGS software we would have to make our own API style interface. Is there documentation for how the upload works?

Thank you,
Colin K6JTH


@pierros Maybe you can point Colin in the right direction?


Hi @colinsk, thanks for reaching out and welcome!

Some quick answers and a long one:

That sounds like an interesting project! Let’s keep it in mind.

One of the main missions of the SatNOGS project is to collect and share data from satellites, both telemetry and payload data. So yes, these data would be useful and would help with our mission.

Currently there are three ways for uploading data to SatNOGS:

  1. In SatNOGS Network, when an observation is scheduled on one of the stations, that station is permitted to upload the results of the observation and associate them with the observation ID.

  2. In SatNOGS DB, Network or third-party clients upload data frames via the SiDS protocol.

  3. In SatNOGS DB, some clients (this is still experimental) upload the waterfall as data, together with some metadata, using the HDF5 format to transfer these data to DB. We call this the Artifact endpoint (see below).
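To make the second way concrete, here is a rough sketch of building a SiDS-style frame submission. The field names follow the SiDS convention (noradID, source, timestamp, frame), but this is an illustration only; check the SatNOGS DB documentation for the exact endpoint and accepted fields before relying on it.

```python
from datetime import datetime, timezone
from urllib.parse import urlencode

def build_sids_payload(norad_id, callsign, frame_hex, timestamp=None):
    # Field names follow the SiDS convention; verify them against the
    # SatNOGS DB docs before use.
    ts = timestamp or datetime.now(timezone.utc)
    return {
        "noradID": str(norad_id),  # e.g. 25338 for NOAA 15
        "source": callsign,        # station identifier / callsign
        "timestamp": ts.strftime("%Y-%m-%dT%H:%M:%SZ"),
        "frame": frame_hex,        # demodulated frame as a hex string
    }

# A SiDS submission is then an HTTP POST of this payload, e.g. with
# body = urlencode(build_sids_payload(25338, "K6JTH", "1A2B3C"))
```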

Our current plan, and relatively soon, is to move all the data to DB and use only the 2nd and 3rd ways. In the far future we may be able to merge those two as well. For our discussion, a quick definition: we call all the results of an observation, together with their metadata, Artifacts, whatever they are; for example an image, a frame, an audio file, an I/Q file, etc.

That said, it would be great to have stations of your project sending data to DB. Currently none of the above ways seems compatible; however, the 3rd one is designed to accept, sooner or later, any kind of Artifact from Network or third-party clients.

The current documentation of the API can be found here; however, it is probably missing some details, especially around Artifacts, as that part is still in an experimental phase.

The Artifacts endpoint currently expects a request authenticated with an API key, carrying an HDF5 file as data and, optionally, an observation ID when the upload comes from a Network observation. The HDF5 file should contain the metadata of the observation, such as observation start and end time and satellite details, as well as the actual data.
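A minimal sketch of such a request, based only on the description above: the URL, the `Token` header scheme and the `artifact_file`/`observation_id` field names are assumptions here, to be confirmed against the API docs. Building a prepared request (instead of sending it) lets a station inspect exactly what would go over the wire.

```python
import requests

# Assumed endpoint; confirm against the SatNOGS DB API documentation.
DB_ARTIFACTS_URL = "https://db.satnogs.org/api/artifacts/"

def build_artifact_request(api_key, hdf5_bytes, observation_id=None):
    data = {}
    if observation_id is not None:
        # Only present when the upload relates to a Network observation.
        data["observation_id"] = str(observation_id)
    req = requests.Request(
        "POST",
        DB_ARTIFACTS_URL,
        headers={"Authorization": f"Token {api_key}"},          # API-key auth
        files={"artifact_file": ("artifact.h5", hdf5_bytes)},   # the HDF5 payload
        data=data,
    )
    return req.prepare()

# Sending would then be: requests.Session().send(prepared_request)
```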

As we want to move forward with the Artifacts endpoint, it would be great to have one more project join this process and give valuable feedback. Some of the challenges I can think of and we can start discussing are:

  1. What types of Artifacts are we going to expect?
  2. What kind of metadata should be added to the HDF5 files for each type of Artifact?
  3. How should the HDF5 files be structured so they can carry one or more types of Artifacts?
  4. How are we going to visualize and give access to all these types of Artifacts?
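As a starting point for point 2, here is a hypothetical minimal metadata set for an audio Artifact. Every key name is a suggestion for discussion, not an agreed SatNOGS schema; in an HDF5 file these would typically be stored as attributes alongside the dataset.

```python
from datetime import datetime, timezone

def audio_artifact_metadata(norad_id, start, end, sample_rate, frequency_hz):
    # All key names are hypothetical, for discussion only.
    return {
        "artifact_type": "audio",   # audio / image / waterfall / iq ...
        "norad_cat_id": norad_id,   # satellite identifier
        "start": start.isoformat(), # observation start (UTC)
        "end": end.isoformat(),     # observation end (UTC)
        "sample_rate": sample_rate, # Hz, for the .wav payload
        "frequency": frequency_hz,  # center frequency in Hz
    }

meta = audio_artifact_metadata(
    25338,
    datetime(2021, 3, 1, 12, 0, tzinfo=timezone.utc),
    datetime(2021, 3, 1, 12, 15, tzinfo=timezone.utc),
    11025,
    137_620_000,
)
```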

@colinsk I’m sorry for the long answer, but I thought it would be useful to give a good overview of how things are now and where we aim to go. I understand that uploading files this way is not as simple as you may have expected. An alternative would be to create another, maybe temporary, API endpoint, but I would like to avoid that, as it would add extra development and maintenance effort. One last comment: we can start with a bare minimum and build on it; this is what we are currently experimenting with by uploading the waterfall as data.

Let me know your thoughts.


Thank you for the detailed reply. You know a lot more about satellite data in general than I do. Forgive me if I state the obvious; I just want to make sure you know what data we collect.

A normal day on a station looks like this:
At midnight, the station downloads TLE data and calculates the elevation of every visible pass of NOAA 15, NOAA 18, NOAA 19 and Meteor M2. Each station sets an elevation threshold; passes below it are not recorded. Passes often overlap. Meteor data is preferred and always takes over the receiver at the start of its pass; otherwise the first NOAA pass has priority. The pass is captured as a .wav file. After the pass, the data is processed and an image set is created. There are options for which workflow is used to process the images. The images are then put on a local server and normally have no public-facing interface. Each pass is processed into many enhancements.
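The scheduling rule described above (elevation threshold, Meteor takes over, otherwise the first pass wins) could be sketched roughly like this; the pass structure and values are made up for illustration and do not come from the actual Raspberry NOAA V2 code.

```python
def schedule(passes, min_elevation):
    # Drop passes below the station's elevation threshold.
    usable = [p for p in passes if p["max_elevation"] >= min_elevation]
    usable.sort(key=lambda p: p["start"])
    chosen = []
    for p in usable:
        if chosen and p["start"] < chosen[-1]["end"]:  # overlaps the last pick
            if p["sat"] == "Meteor M2" and chosen[-1]["sat"] != "Meteor M2":
                chosen[-1] = p   # Meteor takes over the receiver
            continue             # otherwise the earlier pass keeps priority
        chosen.append(p)
    return chosen

# Illustrative data: start/end as minutes past midnight.
passes = [
    {"sat": "NOAA 18",   "start": 0,  "end": 10, "max_elevation": 40},
    {"sat": "Meteor M2", "start": 5,  "end": 15, "max_elevation": 50},
    {"sat": "NOAA 15",   "start": 20, "end": 30, "max_elevation": 10},
]
picked = schedule(passes, min_elevation=20)
```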

The hardware at a station is usually a V-dipole or a QFH antenna tuned to 137 MHz to 138 MHz. Some stations use filters and preamplifiers, depending on the noise levels near the station. Then an SDR and a Raspberry Pi for computation. Setting up the software takes about 45 minutes, so the barrier to entry for a station is very low.

There is an ever-growing configuration file that sets all the capture and processing options. It also has settings to post images to email, Twitter and Discord.

We have all of the data regarding the start and end of each pass and its path through the sky. We generate some extra visualizations for the pass, including a spectrogram showing the noise levels, a polar plot showing the path of the satellite, and a histogram of the raw image to help with gain settings.

Adding an upload to this post-processing is really easy, and adding the specifics of the pass is also easy. We already take some of the information you need and annotate the image with it.

Writing software to pull this data back and stitch large mosaics together is not very difficult, but due to our resources it would be a later project. We already have the image side of the code, but setting up automation to choose the images is a more difficult task.

Here is a typical processed image from Meteor