I worked as a researcher on orbital debris during university and now work full time on this and adjacent problems. My full write-up on the approach for using cheap radar data to assess orbital debris is here on my blog.

Orbital Debris

There has been a considerable increase in the number of things humans have lofted into space, specifically into low-Earth orbit (LEO), over the past ~6 years. In 2016, about 6,000 satellites had ever been launched into space, according to the United Nations Office for Outer Space Affairs. Of those, about 4,000 were still orbiting the planet, many of them launched by the USA and USSR between 1957 and 1999. At the time of writing this post (EOY 2022), there are about 8,500 satellites in orbit, and more than 12,000 satellites total have been thrown into space by humans.

Earth is a shared resource, and LEO is no different. The exponential growth in LEO use by satellites is not harmless and could result in some nasty outcomes, namely Kessler Syndrome. When objects in orbit break up, the debris that is created often does not immediately deorbit and, depending on the altitude, can remain floating in the sky for eons. Except it doesn’t float – it careens around the planet at thousands of miles per hour with ludicrous amounts of kinetic energy. The danger is obvious, but what isn’t is the positive feedback loop of destruction and debris that feeds on the now numerous satellites in orbit: Kessler Syndrome. Even without the syndrome, space debris is a dangerous and rather expensive threat – assets worth billions of dollars in space have to be monitored increasingly carefully to guard against collisions, and it doesn’t always work.
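To put “ludicrous amounts of kinetic energy” into rough numbers, here’s a quick back-of-envelope (the 1 kg fragment mass is an arbitrary illustrative choice; ~7.8 km/s is a typical LEO orbital speed):

```python
# Back-of-envelope kinetic energy of a 1 kg debris fragment in LEO
m = 1.0        # kg, illustrative fragment mass
v = 7_800.0    # m/s, typical LEO orbital speed (~17,500 mph)
ke = 0.5 * m * v**2
print(f"{ke / 1e6:.0f} MJ")   # ~30 MJ, roughly the energy of 7 kg of TNT
```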

Space Environmentalism

Dr. Moriba Jah, who was recently named a MacArthur Fellow for his work on the subject, is an astrodynamicist and prominent orbital-debris researcher. He and his colleagues have developed ASTRIAGraph, a platform that aggregates various data sources into a comprehensive catalogue of space objects, as well as visualization tools like Wayfinder, freely available to the public. In particular, the novel statistical methods employed in these tools highlight the prominent discrepancies between data sources: data on space objects is often anything but tidy.

Jah and other researchers have been advocating for “space environmentalism” for decades. That is, treating near-Earth orbit as a finite resource that should be governed by norms of behavior, accessible data sharing, and policies & regulations that support sustainable and equitable international use of space. Like most natural resources, it is endangered by carefree use by the rich and the few in spacefaring nations, which is indeed everyone’s problem.

Where Debris is Headed

Commercial services in this space can take the form of heavy-duty number crunching from companies offering Space Situational Awareness Software as a Service (SSASAAS lol) like Slingshot or Privateer, or efforts to develop space tugs or sophisticated orbital robotic platforms for satellite servicing. Currently, debris is officially tracked by the U.S. Space Command’s 18th Space Defense Squadron: about 100 people tracking a catalogue of objects that has doubled to ~30,000 in the past six years. For a bit of anchoring on the issue, the International Space Station (ISS) is a $150bn manned outpost in space that has had to maneuver to avoid such debris several times in the past year, mostly due to a recent Russian anti-satellite test. If a large chunk of debris were to slam into the ISS and the result weren’t completely catastrophic (i.e., loss of life), at the very least one of the 16 modules would need to be replaced. At ~$4k/kg, that’s roughly $250mm just to launch – not exactly chump change, even on NASA’s dime.
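Unpacking that arithmetic (this just inverts the figures above; the implied mass is what the numbers work out to, not a published module spec):

```python
price_per_kg = 4_000    # USD/kg, approximate launch cost to LEO
launch_cost = 250e6     # USD, the ~$250mm figure above
implied_mass_t = launch_cost / price_per_kg / 1000
print(f"~{implied_mass_t:.0f} t of replacement hardware to orbit")   # ~62 t
```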

The lock-step proliferation of debris and commercial interest in space has created a market for mitigating the risk of collisions ruining extremely expensive equipment. Commercial operators and researchers in the public domain are stepping up where those 100 Space Command workers simply cannot, and it’ll be tough to lose money preventing collisions: upper-bound estimates for satellites in orbit by 2030 are on the order of 100,000.

The system I worked to develop in Jah’s lab is known as Re-entry Analyses from Serendipitous Radar Data (RASR). The RASR project expands on NASA’s proof-of-concept work and leverages its small database of detections to extend an automated detection system into the real-time domain. RASR sifts through real-time atmospheric radar data and detects re-entry phenomena based on fall-velocity signatures, with its neural-network detection architecture outputting a confidence value for each detection.
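To give a feel for what a fall-velocity screen might look like, here is a minimal illustrative sketch – not the actual RASR pipeline, which uses a trained neural network rather than a threshold; the threshold value and confidence heuristic below are made-up assumptions:

```python
import numpy as np

def fall_signature_confidence(radial_velocity, fall_threshold=-50.0):
    """Crude screen for fall-velocity signatures in a weather-radar velocity
    profile: flag echoes descending far faster than precipitation ever does.

    radial_velocity: 1-D array of Doppler velocities (m/s, negative = falling)
    Returns a confidence-like score in [0, 1]. Purely illustrative -- RASR
    feeds velocity signatures to a trained neural network, not a threshold."""
    candidates = radial_velocity < fall_threshold
    if not candidates.any():
        return 0.0
    # Larger excess over the threshold -> higher confidence
    excess = np.abs(radial_velocity[candidates]) - abs(fall_threshold)
    return float(np.clip(excess.mean() / 100.0, 0.0, 1.0))

# Synthetic profile: ordinary weather echoes plus a fast-falling streak
rng = np.random.default_rng(0)
profile = np.concatenate([rng.uniform(-10, 10, 60), [-120.0, -150.0, -135.0]])
print(fall_signature_confidence(profile))   # high score from the streak
```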

Data is Messy

As a final thought, I wanted to highlight the immensity of the data to be gathered and the titanic statistical challenge of hacking through it. Data on space objects is collected across the globe using large radar facilities of varying resolutions, at frequencies on the order of GHz (usually X-band). There are many difficulties in getting accurate position and velocity data on space objects, including atmospheric interference, the speed of the objects distorting the reading (inverse synthetic-aperture effects), and distortion due to rotation of the object. Due to this incomplete information, keeping track of debris in space is a wicked problem.

To predict collisions (or conjunctions, as they are called in industry), 30,000 objects with position and velocity vectors must be assessed on a sliding time window that can range from minutes to 10 days. Different mathematical methods for conjunction assessment are used, the simplest and most popular being the “two-dimensional” probability of collision (Pc) computation put forward in the ’90s, in which the closest distance between two objects is found from an assumed Gaussian distribution of the position vectors and the objects are propagated numerically forward in time on Keplerian orbits. More modern approaches, like Coppola’s method, are superior in that they also account for uncertainties in velocity over time.
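As a sketch of what that 2-D Pc computation looks like numerically – the miss vector, covariance, and hard-body radius below are made-up illustrative numbers, and a real pipeline would first project the 3-D states and covariances into the encounter plane:

```python
import numpy as np
from scipy.integrate import dblquad
from scipy.stats import multivariate_normal

def pc_2d(miss_vector, combined_cov, hard_body_radius):
    """'Two-dimensional' probability of collision: integrate the relative-
    position Gaussian over a disk of the combined hard-body radius in the
    encounter plane (the plane perpendicular to the relative velocity)."""
    rv = multivariate_normal(mean=miss_vector, cov=combined_cov)
    # Integrate the PDF over the disk in polar coordinates (note the r Jacobian)
    integrand = lambda r, theta: r * rv.pdf([r * np.cos(theta), r * np.sin(theta)])
    pc, _ = dblquad(integrand, 0.0, 2.0 * np.pi, 0.0, hard_body_radius)
    return pc

# Example: 120 m miss distance, 50 m / 100 m position sigmas, 20 m combined radius
miss = np.array([120.0, 0.0])         # metres, in the encounter plane
cov = np.diag([50.0**2, 100.0**2])    # combined position covariance
print(f"Pc ~ {pc_2d(miss, cov, 20.0):.2e}")
```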

Just to drive it home: if I have 30,000 objects that I want to propagate forward in time just 2 days, that is on the order of 10 billion equations to be solved. Then I must account for errors and propagate uncertainties, which even under simple frameworks is 20x more. Finally, there is the issue of rooting through the results as fast as possible to sort out the potential collisions – about 50 exaFLOPs (5×10^19 floating-point operations) worth of computation. Definitely doable, but absolutely no fun if you’re constantly dealing with edge cases and incomplete data. And certainly not when billions of dollars are on the line.
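A quick sanity check on those orders of magnitude (the 1 s step size and 6-dimensional state are my assumptions, chosen to show how the counts land):

```python
objects = 30_000
steps = 2 * 24 * 3600    # 2 days propagated at 1 s resolution
state_dim = 6            # position + velocity components per object
evaluations = objects * steps * state_dim
print(f"{evaluations:.1e} equation evaluations")               # ~3.1e10, i.e. ~10 billion
print(f"{evaluations * 20:.1e} with uncertainty propagation")  # the 20x overhead
```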

Comments

TL;DR: The "mitigation" in the title is more accurately debris tracking using... something jargon something.