SatNOGS and orbit determination - Collaboration with Celestrak

[-- for context, this post is a continuation of one taking place here, started by @sarah_srogers, regarding the SATNOGS community’s ability to help her find her Cubesat, once it’s deployed. @pierros graciously moved my thoughts here, so it wouldn’t clutter up Sarah’s thread, 'cause like a forum noob :unamused: I dropped a thought grenade into a perfectly straightforward conversation, and it kinda esploded over everything… but my intent was NOT to commit threadjacking, so Thx again for the mod-move ! --]

So now the only thing left is to wait for TLE’s to show up…
and wait…
and wait…
and wait some more…

and then, after an almost unbearable delay, where you’ll be wringing your hands in despair over your wayward 3U “baby” (they’re so cute when stowed), you get that first “official” TLE and hurry off to perform your first pass check-in…
only to then fume and chafe over how the TLE you’ve got isn’t exactly the right one for YOUR bird (because you never hear it during the scheduled pass), but might in fact be a catalog mistag (NORAD’s, not the SATNOGS DB’s), and then wait some more until the whole system corrects itself.
(just don’t go holding your breath while you wait folks…)

So, with the above prognostications, I guess I’m finally out of lurker mode…
Time to introduce myself, (in the wrong forum, of course:roll_eyes:), announce a few things, and propose a few ideas, which I will certainly move over to the proper forums, once someone tells me what they should be.

I’m long winded though, so go grab your favorite libation, and settle in for my “pitch”.

Hey everyone… name’s Frank Snyder, located in Huntsville, AL.
Aerospace Engi-nerd, Software Parseltongue, Electronics Hacker, Maker, and warranty breaker.
I enjoy clipping copper in the garage, and long walks in the rain (to the mast) to test real-world SWR.

…and in full disclosure up front, I work for a small software company called Analytical Graphics, Inc. (AGI for short). We make a few commercial software products you might have heard of, some open-source ones as well (CesiumJS), and also host & run some services you might not know about.
This isn’t a shill post though…

Specifically, I work with a team at AGI that includes T.S. Kelso (yes, THAT Kelso of :star:Celestrak fame :star:), who has been doing an amazing job trying to reconcile the data that comes from HIS official TLE sources (cough… 18th SPCS) with the realities of real-world observations.
He’s up to his neck in conflicting information, often caused by the very real problem of limited resources, overworked people, and Murphy’s Law hanging over everything you do. The 18th guys do try their best to get this stuff right… but sh*t happens, then rolls downhill, and somewhere at the end of that messy trail are those “little guys” who hope to fly serious science missions in (n)U sized packages (where Int n={1,2,3…}), but with very few resources to actually produce their own ephemerides, or TLE’s.
The mechanics of performing an OD (orbit determination) on a satellite aren’t all that hard. It’s basic math & physics at work, with Sir Isaac & his Perturbation Band members setting the beat, and a healthy dose of filtering just to keep Murphy (& Kalman) happy.
Associating astrometric observations to actual Satellite Identity, however, is MUCH harder than you might think, and when multiple new tracks get deployed in quick succession, like for a OneWeb, or a Starlink, or a Cygnus/ISS deployment, well things get very confusing, very quickly.

So, why am I here ?

I want to change that, and I think the SATNOGS network is in a unique position to do so, if it chooses to, because every individual ground station contributor will obviously have a say in this. I’ve stood up a few (still testing) stations to dig into the possibility of using raw IQ measurements (pre-Doppler correction) to extract said Doppler values and feed them into yet another obscure-sausage-making process that performs the Orbit Determination and ObsAssociation (aka ID match), aaaaand… ultimately spits out TLE’s at the end. Maybe, with enough observations, those TLE’s could be better for everyone, especially this community.

Yes, there are a few of you actually doing that kind of work with your own ground stations, and I’d very much like to chat with you all… but in due time.

Right now, though, I’d like to ask a question pertinent to “finding” a brand new bird in orbit, because that’s exactly what Sarah is hoping to do in the next few days, weeks, or (shudder) months…
Which might not be that straightforward…
Like, it doesn’t sound like there’s a SATNOGS scheduling mode akin to an “Amber Alert” for Cubesats, right?

Has anyone here tried it though?
Like, has anyone here tried to point a somewhat directional beam, maybe squished down in elevation, and a little wider in “azimuth”, down at their ground station’s horizon and waited for something… anything… maybe even the thing you’re looking for, to wander through ?
Basically create an RF fence for birds to cross and be counted & cataloged as they radiate on their merry way.

You’d still need something to guide you on where to broadly point, like some notional or “first guess” TLE that’s more closely associated with the deploying vehicle’s ephemeris at the time of release (ISS, Cygnus, etc.). Since those TLE’s are probably quite well characterized, the initial “guess” they give you on which azimuth to steer to and park at, waiting for a pass that could arrive early or late (depending on how the vehicle actually deployed into the orbital plane), should be useful.
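
To make that concrete, here’s a rough sketch of the kind of pointing calculation I have in mind, using the skyfield library and a placeholder ISS-like TLE standing in for the deployer’s element set. The station coordinates and the 5-degree “fence” elevation are just illustrative numbers, not anything SatNOGS does today:

```python
# Rough sketch (not SatNOGS code): use the deployer's TLE as a first-guess
# pointing source, park the antenna at the azimuth where that orbit crosses
# a low elevation "fence", and listen for early/late arrivals.
# Requires: pip install skyfield
from skyfield.api import EarthSatellite, load, wgs84

ts = load.timescale()

# Placeholder ISS-like TLE -- substitute the deployer's current element set.
line1 = "1 25544U 98067A   20001.00000000  .00001000  00000-0  25000-4 0  9990"
line2 = "2 25544  51.6400 208.0000 0005000  70.0000 120.0000 15.49000000200000"
deployer = EarthSatellite(line1, line2, "DEPLOYER", ts)

station = wgs84.latlon(34.73, -86.58, elevation_m=200)  # Huntsville-ish

# Find when the deployer rises above a low "fence" elevation in the next 12 h,
# and note the azimuth at that instant -- that's roughly where to park.
t0 = ts.now()
t1 = ts.tt_jd(t0.tt + 0.5)
times, events = deployer.find_events(station, t0, t1, altitude_degrees=5.0)
for t, event in zip(times, events):
    if event == 0:  # 0 = rise above the fence elevation
        alt, az, _ = (deployer - station).at(t).altaz()
        print(f"{t.utc_iso()}  park near az {az.degrees:.1f} deg, el ~5 deg")
```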

This is a very different “mission” than the data/telem collection that SATNOGS was originally formed to do, but it’s an equally important one, and the network should be very capable of taking it on.

< enhancementTagforGitLabPurposes>
I’m suggesting that SATNOGS can help lift the veil on both OD/TLE fitment, as well as ID verification.

  • SATNOGS should be able to schedule an “Amber Alert” pass for vehicles that got “lost” in the TLE bureaucracy, or haven’t been found yet to be assigned one.
  • SATNOGS ground stations that participate in this find & go seek behavior should also capture the raw IQ of what they “see”/hear for the purposes of collating them into an Orbit Determination sequence.
< /enhancementTagforGitLabPurposes>

There should be some mechanism, as part of the multiple ranking systems already built into the Network, to drill into the Identity issue.
Right now there are assessments of the overall quality of the RF signal, aka the observation, but nothing that touches on the deeper reasons for a failure. TLE mistagging is one of those issues that you want to find, and stomp on, as soon as you can, because everything the network does hinges on the quality of those TLE’s. It’s in the best interest of the Network to assist in policing that particular aspect of the system.
…because like it or not, with so many other large constellations of commercial birds getting launched in the months & years to come, the problems with TLE quality, and Identity are just going to get worse.

Possibly the SATNOGS solutions could eventually form into a DB extension of RF “signatures” for the things you WANT to be tracking in the DB, tracing back to what frequency/modulation/etc… was actually observed so comparisons could be made, even automated. I saw that there was already work going on behind the scenes to auto-qualify good observations, but there’s equal importance in qualifying why failures occurred.

It’s one thing to expect a BPSK signal around ~430 MHz, but if you observe a completely different RF signature that still exhibits a satellite-grade Doppler shift, then you can raise a question about the ID for that pass. i.e. “I heard {something}, but it was CW down on 2m… so was there a mistag?”
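
Something like the toy check below is all I’m imagining for a first cut. The field names, thresholds, and the flag_possible_mistag helper are all made up for illustration; nothing like this exists in the SatNOGS codebase today:

```python
# Illustrative only: the field names and thresholds here are invented, not
# part of SatNOGS. "expected" would come from the DB transmitter entry for
# the scheduled target; "observed" from whatever demod/classifier ran.
def flag_possible_mistag(expected, observed, freq_tol_hz=50e3):
    reasons = []
    if observed["mode"] != expected["mode"]:
        reasons.append(f"mode {observed['mode']} != expected {expected['mode']}")
    if abs(observed["center_freq_hz"] - expected["downlink_freq_hz"]) > freq_tol_hz:
        reasons.append("carrier far from the published downlink frequency")
    if observed.get("doppler_like", False) and reasons:
        # Something satellite-like was heard, just not the thing we scheduled.
        return True, reasons
    return False, reasons

expected = {"mode": "BPSK", "downlink_freq_hz": 435.000e6}
observed = {"mode": "CW", "center_freq_hz": 145.900e6, "doppler_like": True}
print(flag_possible_mistag(expected, observed))
```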

I don’t see those kinds of sub-qualifiers in the color-coded ranking that currently exists, but it might be good to have a discussion about adding them. There’s a TON of stuff that gets mis-cataloged all the time. Just ask T.S. the next time you get a chance to meet him… the stories he could tell :wink:

Anyway, I’ll wrap up this poor excuse for a novella by asking the more senior crew here to point me to the right section of the community forums to continue this discussion… or to the door if it’s not something they think would fit into the SATNOGS vision. I’m new here, and completely understand that rocking the boat is often frowned upon.
I’ll still continue to bring both my UHF & VHF stations online to become a productive member of this society, because… reasons, but I’ll keep my efforts on ID and OD improvements down to a dull roar…

Cheers,

-Lucky


Hello @luckyjunknowitz, and welcome to our community. I took the liberty of splitting off the post as a separate topic since this will be a rather lengthy discussion. I (and other core SatNOGS team members) will be responding soon to the issues/ideas you raised, since we have already been working on a lot of them. Cheers!


I did this very thing for the release of HuskySat-1 (on its first two scheduled orbits, with my amateur radio groundstation). We didn’t have keps yet, so most used ISS keps. Tracking a 70cm cross yagi to the ISS, I waited until I saw the signal that had been described on the AMSAT mailing list (which showed up about 5 minutes after the ISS). I didn’t record I/Q (nobody asked for it), but it would’ve been trivial to do so. You can’t really do this now with SatNOGS (at least not easily), as there is no built-in way to monitor in real time and adjust the “listening” center frequency and/or pass length, nor any way to take over rotators (if that’s what you are using). Having said that, I know some station owners have modified their stations to do some of this stuff.

–Roy
K3RLD


OK so here is a more lengthy high-level response on a couple of points:

First of all, thank you for reaching out and joining our community @luckyjunknowitz . Collaboration and coordination among all space-related projects/organizations/entities are critical for any viable way forward.

This is not entirely true. SatNOGS has never been just for data/telemetry acquisition since its creation. A testament to that is db.satnogs.org acting as a crowd-sourced source of truth regarding missions, especially around their COMMS information.

Since SatNOGS is a Libre Space Foundation project, it abides by the Libre Space Manifesto and the pillars it puts forward. Open Data is one of them, and orbital information (and generally Space Situational Awareness) falls well within the goals of SatNOGS.

As you can imagine, various technical, legal, and operational aspects need to be figured out around using SatNOGS extensively for that purpose. Still, we have been making good progress on all those fronts. (pinging @cgbsat @fredy and others to chime in with the technical details around them).

I would like to reaffirm that orbital determination and contribution to an open-source, open-data, neutral SSA through SatNOGS is a definite goal that we are all committed to.

Scheduling strategy and auto-scheduling tuned around LEOP needs are a principal development direction of network.satnogs.org indeed. We already have mechanisms in place to prioritize things, but we need more development and testing before we deploy those on the whole Network.

I will let other core members chime in on the technical aspect of this, but I would like to use this as an example of the resource needs that SatNOGS has. Storing raw IQ can be a huge undertaking, especially if you operate at the scale we do today (350+ stations and thousands of observations per day).

Currently, SatNOGS is supported solely by Libre Space Foundation (in terms of development and infrastructure funding) with hundreds of thousands of dollars per year. Moving forward, we would like to be able to do more with additional sustainability funds made available by users of the Network (Missions, Researchers, Companies), maintaining the open data, open-source aspect of our project, staying true to our Manifesto. This sustainability aspect should be key whenever we consider additional development and resource ideas and needs around the project.

Collaborations are critical for open ecosystems, and we couldn’t be happier with the influx of users and contributors we see in our projects. That said, it would be great to see more of that. So, again, @luckyjunknowitz, thanks for bringing this to the next level, on top of our existing lightweight collaboration with the always helpful T.S. Kelso.

Related note 1: RF tracking (active/passive etc.) is just one of the ways to achieve SSA. Various members of our community have been super active around optical tracking too, and a combination of both could make a massive difference in the landscape.

Related note 2: Libre Space Foundation is active on the space segment too, exploring technologies (like our upcoming QUBIK missions) that could help with LEOP from within the space segment itself.


You may find this video from the Open Source Cubesat Workshop helpful. “Orbit Determination Using the SatNOGS network and Orekit”


I would like to add that with the eventual move to using https://gitlab.com/librespacefoundation/satnogs/gr-satnogs/-/commit/8072219a8ac94a802bd166a25e672bd0796b4424 (I know it’s probably a while till everything is ready), this will basically allow using STRF with the waterfall files directly to do orbital stuff. You also have “New software: SatNOGS waterfall tabulation helper”, which is useful right now for doing TLE stuff with the current waterfalls.


@pierros Thanks for the assistance in carving out the proper place for these discussions, and thanks again for the welcome !

I figured that in my wording, and narrative, I might trigger some “reactions”, :wink: and hopefully some productive discussions.

I fully appreciate the delicate balance that needs to happen when flirting with the {legal | technical | ops} aspects of performing SSA, even part-time. When my company stood up the COMSpOC a few years ago, we had to tread very cautiously, yet methodically, to get it to where it is today. Other groups that perform similar commercial functions deliberately put in place policies that self-regulate dropping into that “awkward” portion of the SSA conversation. Policies such as “…if it isn’t in the catalog, we won’t track it…”
All perfectly understandable, and to a certain degree, reasonable. (quibbling on this can happen on the upper and lower side bands…)

I’ve done a lot of different jobs over the years, and held many equally varied titles, and the one I find myself in now can best be described as “dot connector”. If I see potential in collaboration, to the benefit of all parties, I’m encouraged to pursue it. That’s, in large part, what prompted me to come out of forum-lurker-mode and shotgun you all with some ideas.
Also, I can literally feel the frustration that T.S. has when he has to wrangle with the wants and needs of his Celestrak “groupies” and the realities of how those TLEs come into being. He really wants those data sets to be right (it’s a matter of personal & professional pride when they are), both for the benefit of those concerned with conjunctions and for the broader uses as well.

In various past discussions he and I have had, I kept reinforcing how a community-driven system, similar to how the “amateur” ADS-B receiver world operates, could very much help his un-enviable job of herding cats…errrr Sats, and you don’t always need a full-3+ DOF tracker to participate and add value. This network is living proof of that.

He has a hard enough time maintaining those lists of “active” (aka, uh… data-producing? radiating? powered?) satellites because he has to ferret out that information piecemeal. Here in the SATNOGS DB there’s a running record of who’s still radiating, who’s started drifting, and how long since others have fallen silent.
(I’m assuming all that’s just a few SATNOGS API queries away from being automated…)
So some of this collaboration doesn’t require dramatic changes, just judicious mining of the data already there, and I’m kind of poking around here to see what fits.
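
As a rough illustration of the kind of “mining” I mean, here’s a minimal sketch. It assumes the public /api/satellites/ endpoint on db.satnogs.org and a status-like field in the response; double-check the real field names against the API docs before relying on it:

```python
# Minimal sketch of "mining what's already there": pull the satellite list
# from db.satnogs.org and bucket it by reported status. Assumes the public
# /api/satellites/ endpoint; field names should be verified against the
# actual API docs before relying on this.
import requests
from collections import Counter

resp = requests.get("https://db.satnogs.org/api/satellites/", timeout=30)
resp.raise_for_status()
satellites = resp.json()

status_counts = Counter(sat.get("status", "unknown") for sat in satellites)
print(status_counts)

# e.g. list a few of the ones still marked alive, by NORAD ID
alive = [sat for sat in satellites if sat.get("status") == "alive"]
for sat in alive[:10]:
    print(sat.get("norad_cat_id"), sat.get("name"))
```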

I’m also an engineer at heart, so I love sudden deep dives into narrow technical challenges, just to see if my idea of a solution might work.
You brought up the very real issue of Raw IQ data being rather “bloaty” for storage and distribution, at scale.

Has there been discussion about decentralizing the storage of that particular (niche) data capture ?

It’s really only important and needed by a smaller sub-group of the broader SATNOGS community interested in the OD value it might provide, and is already decentralized in a way with the work of individuals, such as @cgbsat and others playing with tools like STRF.

Maybe it doesn’t need to be collated in the same way as the other Obs.

I’m gonna use a pop-culture reference here, which may ring a bell or two for some.
What about taking a page from the Napster book (from the good ol’ “bootleg” mp3 days) and creating a Torrent-based decentralized system…?
Let those ground stations who want to participate in the raw IQ collection also “volunteer” their own storage and access to the data. After all, not everyone is going to want (or be able) to allocate the additional load on their ISP cap for transferring the data, even if they collect & process it… although I personally think the bandwidth cost of sharing IQ data for the greater good could be offset by simply NOT binge-watching the newest sensation on NETFLIX… but I digress.

So maybe exploring a distributed IQ “audio” sharing network, a-la-Napster, with maybe a bit of Git-like distributed repo replication thrown in, just in case some seriously enthused folks want to ensure that the greatest hits of BUGSAT Doppler never disappear.

Then there’s the possibility of using other techniques to extract just what you need out of the Raw IQ, and simply store that. Taking a page from the Radio Astronomy book here.
I was on a very cool Hackaday.io chat yesterday and user @0xCoto pointed me to this project: VIRGO (major thanks dude, if you’re lurking around here too… really want to know when LyraNet comes online). So while that particular code might not have an immediate fit, if the raw IQ can be decomposed into something much lighter (on either the client or server side) for long-term storage, then let’s have at it.
Again, I’ll offer this up as a separate conversation, where OD gurus from all sides might opine. I know a few on my side with strong views… :wink:

I still think the lowest hanging fruit here ties back to the problem of Identity. You don’t always need a “pristine” signal capture to glean enough information about whether the thing that’s radiating is what you think it is. In fact, I’d hazard a guess that plenty of the “bad” or “fail” observations contain just enough information to serve as exclusion matches for such comparisons.

So thanks again for the welcome, and looking forwards to the discussions that ensue !

…also, best of luck to the recent crop of on-orbit “graduates” in finding, and communicating with their Cubesats !!


@jebba yeah, thanks for posting those links !

I watched that presentation not too long ago, and @Noeljanes’ work is of great interest to me (and my colleagues). Normally we shoot for using much higher tolerances on the measurements, to reduce the eventual residual magnitudes, but then again, most of those “requirements” are derived for doing OD for GEO birds. They tend to have the buckets of $$$, by the nature of their business, so naturally a commercial company like ours would chase after that work first, and use “only the very best” observations (predominantly optical) for doing that work. :wink:

Where I’m very curious is in the application of “coarser” measurements for LEO OD. What you give up in measurement quality could potentially be made up in quantity, and somewhere in the overall OD voodoo fitment (not my primary specialty, to be honest…) that volume of observations results in better state estimates.
Now, they may never be good enough estimates to produce a great initial fix, so you’d still want to rely on the obs done by “others” and their skateboard-half-pipe radars (those LeoLabs guys are rockin’ it in more ways than one!!), but the passive-RF one-way-Doppler stuff might be sufficient to maintain some reasonable accuracy over the lifespan of the mission, and thus reduce the load on the radar systems.
Also, with the non-cooperative radar method, you still carry some ambiguity on the Identity issue. (Yes, I know there are methods for gleaning uniqueness from radar returns… don’t judge me! :wink:)

The one issue that Noel raises towards the end is particularly concerning. For narrow-band signals (like CW or beacons), gleaning how the signal is being Doppler shifted is (mostly) straightforward. For more complex waveforms, that could get rather ugly, and fast… and that’s where I’m going to be leaning on the knowledge of others to do the requisite sanity-check on its potential.
Has there been demonstrable success in using the same Doppler-extraction methods on telemetry data signals as on something like a narrowband carrier, CW beacon, etc.?
I just don’t know, but I aim to find out…

For signal “fingerprinting” though, and associative Identity matching, the complexity of the waveform and its periodic nature actually help pin things down.

Of course there’s the work coming out of the GNURadio Army Signal Classification Challenge, from 2018, which shows promise.
In addition to the ever-increasing modern approach of “throwing AI at it” to assist in the classification, there’s some interesting work being done on fingerprinting the unique signatures of the transmitter hardware itself, deep down in the “wiggle” of the local oscillators, which are, more or less, unique in their own right.
What’s uncertain is exactly how clean of an IQ stream you need to perform that level of ID, and more importantly, how much training data you need to properly configure the “artificial-stupid” to do its job properly.

So there are even increasing levels of depth that each focus (OD assistance, and ID assistance) can take here. Some of it is just mining the timing and logged occurrences of successful, and failed, observations, but others dive deeper down the IQ rabbit-hole… all the way to RF wonderland.

Thanks again !!

-f


@KD9KCK oh, very cool, on both counts!!

…followed by a sinking feeling that I’m gonna need to stand up (yet another) set of experimental/test GS’s here to play with the beta-ware prior to it being pushed to production branch… like a literal set of ghost stations to mirror those stable ones I intend to put online for the Network’s greater good. Fortunately I’m getting better with tweaking Ansible… and Amazon-US still has stockpiles of Pi’s and Nooelec SDR’s/LNA’s (for now)

Starting to get up to my neck in Pi’s here, and I can only split/amplify the signal from the antennas so many times before weird sh*t happens… :wink:
also, I think I’m gonna need a bigger PoE switch…

For those who are doing experimental/tweaking stuff on x86 systems and running concurrent instances of the SATNOGS Client, each pointing at its own USB SDR, can you point me to links where this configuration is better described, or discussed?

Thanks !

-Lucky

I’m toying with that:


I’ll further draw attention to this post where you see the entire kabuki dance of “which one’s MY satellite” play out in all its splendor.

Good times !!

It’s like reliving my childhood boy scout “Snipe Hunt” days all over again… (those in other countries probably have a similar indoctrination process to adulthood), except it’s with expensive hardware in orbit…

Not trying to play on anyone’s past painful experiences here (well, maybe a little :sunglasses:), but this particular pattern is likely to continue and become magnified into the future. I’d simply like to channel those frustrations, and emotional/visceral feelings to fix {something}…

At least in that forum post, there were more colorful waterfalls than language at the near-futility of chasing your tail in circles.

Cheers

-Lucky

Sure, if you’re in the group of people who have sent me an email at 0xcoto@protonmail.com, I’ll be sure to notify you when the LyraNet network is launched later this month.

There are two main issues with raw I/Q data acquisition. This idea has been out for a while, and I wish it had been investigated at OSCW 2019 in a bit more detail.

  1. Having the raw I/Q data recorded (instead of just the FFT samples) means your observation file is huge, so expect users’ micro SD cards to get filled quickly (since most SatNOGS ground stations operate on a Raspberry Pi). Having to upload the raw I/Q data to e.g. share it with the community implies you’re also limited by the speed of your internet connection, so how effectively can you do this?
  2. And even more importantly, something that no one seems to consider is timing accuracy. Most people think they’ll achieve something fancy by combining their I/Q data with another station’s data. The problem is that the clocks of the SDRs need to be accurately synchronized and the observations need to begin at the exact same time (or you need to know the exact timestamp at which each sample was taken), which is very difficult to accomplish.

I’m not sure how people would benefit from having raw I/Q data, though. Getting a phased array to work at such long baselines is tricky, and will not offer much sensitivity compared to just averaging the FFT samples (waterfalls) between different observers. This also has some limitations, but the difficulty is nothing like raw I/Q data acquisition and combination.


Hey @0xCoto, thanks for chiming in. You raise a couple of good points, so let me clarify.

  1. Having the raw I/Q data recorded (instead of just the FFT samples) means your observation file is huge, so expect users’ micro SD cards to get filled quickly (since most SatNOGS ground stations operate on a Raspberry Pi). Having to upload the raw I/Q data to e.g. share it with the community implies you’re also limited by the speed of your internet connection, so how effectively can you do this?

So, regarding the storage of the raw IQ on the Pi, I certainly wouldn’t advise that. Not only is there the “filling up the card” issue, but the frequent writes tend to shorten the lifespan of the μSD card itself.

Rather, I would advise anyone wanting to capture the existing post-Doppler-shifted “raw” IQ (for whatever reason) to mount a thumb drive onto the Pi’s file system and configure the SATNOGS client IQ_DUMP_FILENAME variable to point to that mounted path. Even USB 2.0 write speeds (with buffering) are sufficient to dump the data, and a thumb drive supposedly handles the wear & tear better than the μSD card itself. Also, thumb drives are cheap, and if one turns into a yearly “consumable”, that’s fine by me, personally.

and further speaking for myself, in case anyone is curious:

  • I have each Pi + SDR, sitting on a VLAN I carved out for “chatty” little IoT devices I “sort of” trust :thinking:, located at the end of a PoE/Cat5e cable, closer to the antenna outside. I didn’t want to keep an actual HDD or SSD out there, so just a “standard” 128GB thumb drive mounted on the Pi, with the IQ dumps happening there.
  • I use a SATNOGS_POST_OBSERVATION_SCRIPT to rename each dump file appropriately and log any observation metadata I want along with the .raw file, although all that data is already stored in the SATNOGS DB, so just having the Obs # associated with the .raw file is good enough (a rough sketch of such a script follows this list).
  • Next, since the thumb drive would eventually fill up over time, I have cron jobs, similar to log rotations, running on the Pi to periodically push the files over the IoT network, to a general-purpose data-storage NAS I have at the house. Plenty of room there for hoarding RF samples… :roll_eyes:
  • From there, my normal backup processes ensure that the data doesn’t go “poof” in an unforeseen event, and if I chose to, I could push the files further out across the interwebs, expose them for torrent-exchange, trade them like Pokémon cards… whatever. At this stage the data is off the Pi, and I’m more free to do whatever I want with it.
  • NOTE: To keep the total process load on the Pi’s CPU at a reasonable level, and give priority to the GR Flowgraph/SDR processing, I only let the cron push files off the Pi to the home NAS during lulls in the ground station activity, and only files that are obviously a few hours old, or older. There’s plenty of room on the thumb drive to last a while, even if my station gets booked non-stop (which it probably won’t). This may not be strictly necessary but I really don’t want anything jacking with the SDR activity, and I “think” Raspbian can sometimes prioritize file I/O over other processes down at the IRQ level… someone check me on this plz!
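
For the curious, the post-observation hook I mentioned above is roughly the sketch below. It assumes the client passes the observation ID as the first argument (e.g. via an {{ID}} template in the script setting; check your client docs) and that IQ_DUMP_FILENAME points at the thumb-drive mount. The paths and names are just my own choices, not anything official:

```python
#!/usr/bin/env python3
# Sketch of a SATNOGS_POST_OBSERVATION_SCRIPT hook, not official tooling.
# Assumes the client is configured to pass the observation ID as argv[1]
# (e.g. via the {{ID}} template variable -- verify against your client docs)
# and that IQ_DUMP_FILENAME points at /mnt/iqdump/iq.raw on the thumb drive.
import shutil
import sys
from datetime import datetime, timezone
from pathlib import Path

IQ_DUMP = Path("/mnt/iqdump/iq.raw")        # matches IQ_DUMP_FILENAME
ARCHIVE_DIR = Path("/mnt/iqdump/archive")   # picked up later by the cron push

def main() -> None:
    obs_id = sys.argv[1] if len(sys.argv) > 1 else "unknown"
    if not IQ_DUMP.exists():
        return  # nothing dumped for this pass
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    target = ARCHIVE_DIR / f"obs_{obs_id}_{stamp}.raw"
    shutil.move(str(IQ_DUMP), str(target))  # rename so the next pass starts fresh

if __name__ == "__main__":
    main()
```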

So the thumb drive “buffers” the IQ filling up, and the NAS acts as the archive where I can “torture” the data to my heart’s content. I don’t plan on keeping this stuff forever, mind you. Spring cleaning rules are in full effect, and that extends to digital cruft…
At the moment it’s mostly to accumulate enough sample data to start playing with ideas around Sat & Signal “fingerprinting” (i.e. the ID problem), and eventually, once I have the pre-Doppler-shift IQ saved off, applying variations of @cbassa’s STRF code to extract the actual Doppler shift, which may not ever take place on the Pi itself… As far as I’m concerned, the Pi is fine performing the role of an RF capture instrument, with some basic data forwarding thrown in to keep it lean & healthy.

The real heavy lifting on the IQ can take place, well…
In a house, With a mouse,
In a box, With a fox,
or simply
On a cloud, (or wherever huevos verdes con jamón are found)…
I’m not too proud to put it there, or anywhere.

At some point, should I attempt to “throw AI at this” similar to the work @jebba linked to above, I should have a good starting set of training data for wherever that takes me…:scream:

of course the whole point of me chiming in from the get-go was to find out if others are working on similar threads, explore possibilities of collaborating at-scale, and (maybe) help evolve the system into being able to perform these “secondary” or “tertiary” tasks if called upon to do so…

There are other posts around these forums, like @mwheeler’s post on pushing the IQ to AWS S3 buckets, or @mfalkvidd’s post expanding on that effort and widening the destination to a broader set of cloud storage. Both are excellent resources to review for anyone considering this.

I won’t make any assumptions on the upload speeds and bandwidth allotments, because everyone’s connectivity is obviously different. If you happen to be at the end of a “soda straw” ISP connection, then maybe this store-and-forward thing isn’t the right fit… Every ground station owner/operator will obviously need to make that choice.

Now, on the more complex concern about timing accuracy and clock synchronizing…

…The problem is that the clocks of the SDRs need to be accurately synchronized and the observations need to begin at the exact same time (or you need to know the exact timestamp at which each sample was taken), which is very difficult to accomplish.

You are absolutely right.
Getting clocks so precisely aligned that you get phase coherence across each ground station’s samples is a VERY tricky problem… if your intent is to do TDOA/PhaseDOA interferometry from multiple stations to the same bird, over the same pass, all at the same time… but that’s not the approach I was thinking about, so thanks for pointing that out.
I should have clarified my thoughts better, and the technique I’d like to explore is better described at the STRF link that I provided above.

TL;DR version: I don’t want to combine the IQ itself. I want to extract the Doppler shift measured for each observation IQ and combine that in a more traditional OD process. The links that @jebba shared to @Noeljanes presentation also do a better job of describing what I’m looking to combine, and how.
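
For anyone wondering what “combine that in a more traditional OD process” boils down to, it’s just the textbook one-way Doppler relation turning each (time, received frequency) point into a range-rate measurement. A tiny sketch, with no particular tool’s API implied:

```python
# Textbook one-way Doppler: each (time, received frequency) point from an
# observation becomes a range-rate measurement that an OD filter can ingest.
# Not tied to any particular tool; f_emit must be known (or solved for).
C = 299_792_458.0  # speed of light, m/s

def range_rate(f_received_hz: float, f_emit_hz: float) -> float:
    """Radial velocity (m/s, positive = receding) from the Doppler shift."""
    return C * (1.0 - f_received_hz / f_emit_hz)

# e.g. a 435.000 MHz beacon heard at 435.008 MHz is closing at ~5.5 km/s
print(range_rate(435.008e6, 435.000e6))  # about -5514 m/s
```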

Now, in an ideal world, I would LOVE to have phase coherence across as many of the ground stations as possible, just to allow for the opportunity to precisely derive angles to the bird over time, instead of relying on one-way Doppler, but I’ll work with what I can get… :wink:

Hope that clears up what I was trying to say !

Cheers,

-Lucky

You do not require raw I/Q data to derive Doppler shifts, and just as I expected, STRF does not analyze the raw I/Q data either, according to the README. It performs the data analysis in the post-FFT stage. Apparently (as far as I understand) it takes raw I/Q data as input because it expects the user to stream the data live from their SDR to STRF, but that does not have to be the case. I haven’t really looked at the code, but you should be able to input the FFT samples and skip the Fast Fourier Transform computation phase that is traditionally required to convert raw I/Q to FFT samples. STRF should then proceed to the analysis of the FFT samples (i.e. the waterfall) and give the desired output as needed.
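
For readers less familiar with the distinction, here is roughly what the raw-I/Q-to-FFT-samples step looks like. This is a generic numpy sketch for illustration, not STRF’s actual implementation:

```python
# Generic illustration (nothing STRF-specific) of the raw-I/Q -> FFT-samples
# step being discussed: chop the complex baseband stream into blocks, FFT
# each block, and keep only the power spectra. The waterfall is just these
# rows stacked over time, and it is far smaller than the raw I/Q itself.
import numpy as np

def iq_to_waterfall(iq: np.ndarray, fft_size: int = 4096) -> np.ndarray:
    """iq: complex baseband samples. Returns (num_rows, fft_size) power in dB."""
    num_rows = len(iq) // fft_size
    blocks = iq[: num_rows * fft_size].reshape(num_rows, fft_size)
    spectra = np.fft.fftshift(np.fft.fft(blocks, axis=1), axes=1)
    return 10.0 * np.log10(np.abs(spectra) ** 2 + 1e-12)

# e.g. 10 s of noise at 48 kHz complex sampling -> a handful of waterfall rows
rng = np.random.default_rng(0)
iq = (rng.standard_normal(480_000) + 1j * rng.standard_normal(480_000)).astype(np.complex64)
print(iq_to_waterfall(iq).shape)   # (117, 4096)
```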


My thoughts on the topic. Some qualitative changes and results can be obtained if:

  • The files contain GPS-synchronized timestamps, which allows calculating the position of the signal source from several receiving stations.

  • Receiving stations use old SSDs (60-120 GB); this solves the issues of reliability and volume of stored data.

  • Receiving stations use the BitTorrent File System to provide the data to everyone…

Hi Frank,

Just catching up on this thread, mostly cherrypicking some of your questions from the initial post. I’ll probably have more to add in the next day or so.

SatNOGS/LSF has already chosen to do so. gr-satnogs and satnogs-client now have functionality to log timestamped waterfalls. In the future, these waterfall files will be uploaded to the SatNOGS network to allow extraction of Doppler curves from the waterfall files. Even though these have been Doppler corrected, this correction can be undone to obtain received frequency vs time, and allow comparison against other TLEs (a la ikhnos from @fredy, or as input to my STRF software).
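
As a sketch of what “undoing” the correction means (my shorthand here, not the actual satnogs-client or STRF code): if you know which TLE the station tracked with, you can recompute the Doppler offset it removed at each timestamp and add it back to the waterfall’s frequency axis.

```python
# Sketch of "undoing" the Doppler correction; not how satnogs-client or STRF
# actually implements it. Given the TLE the station tracked with, recompute
# the Doppler offset that was removed at each timestamp and add it back to
# the waterfall's frequency axis to recover received frequency vs time.
# Requires: pip install skyfield numpy
import numpy as np
from skyfield.api import EarthSatellite, load, wgs84

C = 299_792_458.0  # m/s

def predicted_doppler_hz(sat, station, t, f_center_hz):
    """Doppler offset (Hz) the client would have corrected away at time t."""
    pos = (sat - station).at(t)
    # radial velocity = projection of relative velocity onto the line of sight
    range_rate_m_s = 1000.0 * np.dot(pos.position.km, pos.velocity.km_per_s) / pos.distance().km
    return -f_center_hz * range_rate_m_s / C

ts = load.timescale()
# Placeholder TLE -- use the element set the station actually tracked with.
line1 = "1 25544U 98067A   20001.00000000  .00001000  00000-0  25000-4 0  9990"
line2 = "2 25544  51.6400 208.0000 0005000  70.0000 120.0000 15.49000000200000"
sat = EarthSatellite(line1, line2, "TARGET", ts)
station = wgs84.latlon(34.73, -86.58, elevation_m=200)
f_center_hz = 435.000e6

# (time, displayed offset in Hz) pairs read off the corrected waterfall
waterfall_points = [(ts.utc(2020, 1, 1, 12, 0, s), 0.0) for s in range(0, 600, 10)]

received = [
    (t.utc_iso(), f_center_hz + off + predicted_doppler_hz(sat, station, t, f_center_hz))
    for t, off in waterfall_points
]
```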

This would greatly expand the coverage of Doppler tracking beyond what I, and a few other STRF users, have been able to do with STRF alone. From my experience, it should be totally feasible to generate TLEs that are accurate enough for antenna pointing and satellite identification, though likely not usable for applications that require high accuracy (conjunction analysis etc).

For now, functionality needs to be added to SatNOGS to upload the waterfalls and give them a place in the network.

Yep, we’ve done that with the 25-meter Dwingeloo dish for Spaceflight Inc.'s SSO-A mission; just point at the orbital plane and let all the cubesats pass through.

Note that this is mostly overkill though, as you can get Doppler curves with an omni-directional antenna. Even without decoding telemetry you can use frequency vs time to link a Doppler curve to a CSpOC TLE, and use the public information about the transmitter to link that TLE to a satellite to identify it.


A post was split to a new topic: On 18 October 2024, SSTV should be transmitted by UMKA-1 (RS40S)