
The right to dark and quiet skies.

The 2024 session of the United Nations Committee on the Peaceful Uses of Outer Space (COPUOS) kicks off in Vienna. Isar extends its Series C. Virgin Galactic books a crew for a Delta flight. And more.





Our 2024 N2K CyberWire Audience Survey is underway. Make your voice heard and get in the running for a $100 Amazon gift card. Remember to leave us a 5-star rating and review in your favorite podcast app.


Miss an episode? Sign up for our weekly intelligence roundup, Signals and Space, and you’ll never miss a beat. And be sure to follow T-Minus on LinkedIn and Instagram.

T-Minus Guest

Our guest today is Philip Weiss, Vice President of Software Engineering at Orbital Sidekick. We discuss Orbital Sidekick’s mission and how they use AWS infrastructure to enhance their work.

You can learn more about AWS Aerospace and Satellite on their website.

Selected Reading

COPUOS 2024 

U.S. Statement - Agenda Item 6 - 67th Session of the COPUOS - June 2024 - U.S. Mission to International Organizations in Vienna 

Leveraging commercial technologies for sovereignty: Isar Aerospace extends Series C to over EUR 220m with strong commitment from NATO Innovation Fund

Stratolaunch Unlocks New Flight Capabilities for Roc Launch Platform

Virgin Galactic Announces New Research Flight Contract With Repeat Customer | Business Wire

Mayo woman set to become first Irish person in space

SERA and NASRDA Partner to Send First Nigerian to Space - SpaceWatch.Global 

ExLabs wins funds to accelerate space robotics - SpaceNews


ESA - Monitoring marine litter from space is now a reality

T-Minus Crew Survey

We want to hear from you! Please complete our four-question survey. It’ll help us get better and deliver you the most mission-critical space intel every day.

Want to hear your company in the show?

You too can reach the most influential leaders and operators in the industry. Here’s our media kit. Contact us at space@n2k.com to request more info.

Want to join us for an interview?

Please send your pitch to space-editor@n2k.com and include your name, affiliation, and topic proposal.

T-Minus is a production of N2K Networks, your source for strategic workforce intelligence. © N2K Networks, Inc.

[MUSIC] We here at N2K Networks took a publishing break yesterday, as it was the Juneteenth holiday here in the United States. But yesterday was also the kickoff of the 67th session of the United Nations Committee on the Peaceful Uses of Outer Space. Hm, I wonder if there's anything going on that they might be concerned about. [MUSIC]

>> Today is June 20th, 2024. I'm Maria Varmazis, and this is T-Minus. [MUSIC] COPUOS 2024 kicks off in Vienna. Isar extends its Series C. Virgin Galactic books a crew for a Delta flight. And our guest today is Philip Weiss, Vice President of Software Engineering at Orbital Sidekick. We'll be discussing Orbital Sidekick's mission and how they use AWS infrastructure to enhance their work. So stay with us for that chat. [MUSIC]

>> Happy to be back on this Thursday. Let's take a look at our intel briefing. Lots of items are on the agenda at the Committee on the Peaceful Uses of Outer Space, or COPUOS, in Vienna over the next few days, including the ongoing evolution of international laws and norms regarding commercial, governmental, and military activities in space; space debris; cooperation between regions and the global space economy; space sustainability and lunar sustainability; and, as we've covered on the show in the past, the right to dark and quiet skies.

In a prepared statement on the ways and means of maintaining outer space for peaceful purposes, US Representative Richard Buenneke emphasized the United States' view that an unnamed UN member state (reading between the lines, it's Russia) would be violating international space law if it put nuclear weapons into orbit as a counterspace measure. Nothing new there, but clearly the focus on nuclear weapons in space is not going away anytime soon. We've got a link to the full statement by Representative Buenneke in the show notes if you want to read it in full.
In any case, it's going to be a busy week in Vienna as COPUOS runs until next Thursday, and of course we'll be keeping an eye on any noteworthy developments for you.

Isar Aerospace of Germany has extended its Series C funding round by over 65 million euros, bringing its Series C total to more than 220 million euros, equivalent to about 236 million US dollars. The Series C includes backing from the NATO Innovation Fund, marking its first direct investment in a satellite launch service provider. We've been talking a little about the NATO Innovation Fund this week; as you might remember, it also provided funding to the UK-based Space Forge. As for Isar, it has raised over 400 million euros across all of its funding rounds. The funds raised in the Series C specifically will support the industrial-scale production of Isar's Spectrum launch vehicles, aiming to meet the growing demand for satellite launches from both private and public sectors.

Virgin Galactic has announced a new research flight contract with the International Institute for Astronautical Sciences for a flight on Virgin's upcoming Delta commercial service. Three researchers will comprise the crew on the upcoming Delta ship, which will fly no earlier than 2026: Dr. Shawna Pandya, who we've interviewed on T-Minus in the past; Dr. Norah Patten, who will become Ireland's first astronaut; and previous Virgin Galactic flyer Kellie Gerardi.

And while we're on the topic of being a nation's first astronaut, let's add another announcement in that vein to the list. Nigeria's National Space Research and Development Agency, or NASRDA, has announced that it has partnered with the Space Exploration and Research Agency to send Nigeria's first astronaut to space on an upcoming Blue Origin New Shepard spaceflight. Dr.
Matthew Adepoju, Director General and Chief Executive of NASRDA, said this is "a step towards the realization of our objective of putting a person in space, as stated in our national space policy and program, and we're excited to work with SERA," that would be the Space Exploration and Research Agency, "and Blue Origin, to provide this once-in-a-lifetime opportunity."

Some great progress for hypersonic flight fans from the Stratolaunch team. They have successfully completed a series of envelope expansion flights for Roc, the world's largest flying aircraft, hitting a new operational altitude of 35,000 feet and a speed of Mach 0.63. These successful tests will allow Stratolaunch to optimize future Talon-A mission operations in order to achieve sustained hypersonic flight. And speaking of which, earlier this year Stratolaunch achieved its first successful powered flight of the Talon-A1 vehicle, reaching speeds of nearly Mach 5. All of these recent flights included team training and also demonstrated Stratolaunch's ability to increase overall flight cadence to align with the Department of Defense's needs for hypersonic testing. Putting it all together, the company is preparing for the maiden flight of its fully recoverable Talon-A vehicle, the TA-2, which is expected later this year.

Exploration Laboratories, or ExLabs, has received a $1.9 million tactical funding increase, or TACFI, from SpaceWERX. This boost will help ExLabs speed up development of its autonomous space object capture tech. The funding follows a $1.7 million SpaceWERX SBIR contract awarded last year. In a statement to SpaceNews, ExLabs CEO and co-founder Matthew Schmidgall said the new TACFI investment will support prototype development and testing of ExLabs' ACQR autonomous capture robot, preparing it for an orbital test flight.
Phantom Space has received a finding of no significant impact, also known as a FONSI, at SLC-5, Vandenberg Space Force Base, where the company is approved to conduct up to 60 launches annually. This was the final hurdle before Phantom Space can start turning this former NASA Scout launch site into a two-pad site for launches of its Daytona rockets.

And it's been an exciting time for all the rocketeers at the Spaceport America Cup, as there were 32 launches on the first day of flights, which was yesterday. The airspace did open at 7am this morning, though of course a lot of the teams couldn't get out to the site right away, and there are still 90 teams left to launch today and tomorrow, with today's first launch kicking things off at 8:35am Mountain time. For more on that, we have an update from the field at the Spaceport America Cup from our very own T-Minus producer, Alice Caruth. Over to you, Alice.

>> Hello, Maria, from the vertical launch area at Spaceport America. I'm here to cover the events of the 2024 Spaceport America Cup, which has been going on for the last week here in Las Cruces, New Mexico. Day one of launches occurred yesterday, and 32 teams were able to take off before the weather caused delays and shut down the range. That weather also caused further delays in the afternoon and into the early evening, as a haboob came in, which is a dust storm, and managed to destroy some of the tents that the teams have been working underneath to keep them out of the elements. So, unfortunately, teams had to work overnight here at Spaceport America, with volunteers from the Experimental Sounding Rocket Association and some of the students as well, to try and get those tents back up ahead of day two of launches. It has caused a bit of a delay for the second day of launches, but I'll be bringing you updates on that on Friday's show.

>> Thanks, Alice, and here's hoping for no haboobs tomorrow.
[Music] And that wraps up our briefing for today. As always, we have lots of links for you in the show notes, including a few items that we didn't cover in the briefing, like some staff moves at Space Force and divided opinions over Russian space expertise boosting China's moon ambitions.

Hey, T-Minus crew, every Thursday we sit down with industry experts in a segment called Industry Voices, all about the groundbreaking new products, services, and businesses emerging around the world. Every guest on Industry Voices has paid to be here. We hope you'll find it useful to hear directly from businesses about the challenges they're solving and how they're doing it. Today you'll hear from Philip Weiss at Orbital Sidekick about how they use AWS infrastructure to enhance their mission. Visit space.n2k.com/aws to learn more. [Music]

Today's guest is Philip Weiss, Vice President of Software Engineering at Orbital Sidekick. They're a customer of AWS Aerospace and Satellite, and we discussed how AWS infrastructure enhances their missions. [Music]

>> My name is Philip Weiss. I'm the Vice President of Software Engineering at Orbital Sidekick. What OSK does is operate a constellation of five hyperspectral satellites. To understand what hyperspectral cameras are, think of how a normal photo gets split into its components of red, green, and blue. Our satellites split the spectrum from ultraviolet all the way up to infrared into about 470 different spectral bands, or colors, and then we use that information to look for things. Our main customer base is the oil and gas industry. An oil pipeline company needs to know when it has leaks. Methane or oil that's leaking reflects light slightly differently than its surroundings. Just like water reflects blue, when you're looking at a scene in 470 different spectral bands, methane reflects its own distinctive color, and we use that to tell when our customers have leaks. Outside of oil and gas, we can look at vegetation health.
We can tell when vegetation is healthy and when it's not. We can look for specific minerals; I was just chatting with a potential customer about looking for specific kinds of rocks. If they are spectrally significant, we can see that, as long as there are no clouds. So that's what we do.

>> Wow. So there's making things more visible, but also making things that you can't see visible, in a manner of speaking. That's got to be a fascinating intersection of not just what space data can provide, but also these different industries. You mentioned the energy industry; as many industries are controlling emissions and decarbonizing, that's got to be a fantastic place to be right now.

>> Yeah, at scale from space we can see things that are a lot harder, or a lot more expensive, to see from the ground or locally. If you have an oil and gas pipeline that runs through very remote areas, or a mountainous area that is hard to traverse, that's not difficult for us to see from space, as long as there are no clouds, like I said. And we can dig out things that could cause major problems. I don't remember exactly how many years ago it was, but there was an oil or gas pipeline that exploded on the edge of a city in Washington State and caused a lot of damage. We would very much like to prevent that sort of thing. We'd like to help make the carbon that we still have at least a lot safer, or even just cut down on the emissions from the remaining carbon that's out there. If there's a leak, and we can detect it months before it would be found at the next traversal of the pipeline, that is a very interesting thing for us.

>> We must be talking about an unbelievable amount of image data, given that it sounds like images are being refreshed rather quickly. This has got to be a lot of data.

>> Yeah.
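As an aside on the detection idea described above: one classic, widely taught way to compare a pixel's spectrum against a known material signature is the spectral angle between the two vectors, which is insensitive to overall brightness. This is an illustrative sketch only; the interview doesn't say which algorithm Orbital Sidekick actually uses, and the signature values here are made up.

```python
import math

def spectral_angle(pixel, reference):
    """Angle (radians) between two spectra; 0 means a perfect shape match."""
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm_p = math.sqrt(sum(p * p for p in pixel))
    norm_r = math.sqrt(sum(r * r for r in reference))
    # Clamp to guard against tiny floating-point overshoot outside [-1, 1].
    cos_theta = max(-1.0, min(1.0, dot / (norm_p * norm_r)))
    return math.acos(cos_theta)

# A brighter copy of the same signature still matches (angle ~ 0),
# while an unrelated spectrum does not.
ref = [0.1, 0.4, 0.9, 0.3]            # hypothetical 4-band signature
brighter = [0.2, 0.8, 1.8, 0.6]       # same shape, twice the brightness
print(round(spectral_angle(brighter, ref), 6))  # prints 0.0
```

In a real system this comparison would run per pixel across all ~470 bands rather than the toy 4-band vectors shown here.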
>> So that's actually one of the places where AWS comes in as a huge benefit to us. If you think of a normal image, the number of megapixels is what people think of when they're shooting with a camera. Ours, because we're splitting things up into so many more bands, get really, really large. A typical image for us will be 4,000 pixels by maybe 1,000 pixels (I don't know the exact numbers, and it varies depending on how we collect it), but for each of those pixels, there are 470 values that we have to collect. So those files, when we drop them in, are huge. Being able to store them efficiently and process them quickly matters; the amount of data is immense. And it's unlike the stuff where I've worked before, where you just have massive numbers of transactions. We don't have massive numbers of transactions, but the amount of data that we have to pore through is still large.

>> What kind of file sizes are we talking about?

>> From 100 megabytes at the small end to, once things are fully processed, sometimes up to 10 gigabytes, which is very, very difficult to process through quickly. One of the reasons we choose to do this with a cloud provider is that trying to grind through that on a pretty powerful single machine that sits in a closet somewhere is slow, because those can only do so much at a time. With AWS, we can just spin up all sorts of compute resources and work on specific bands at a time, multiple images at a time, particular areas of images at a time, and we can have multiple pieces of that spread out across all sorts of compute.

>> Okay, a 10-gigabyte file. As you said, that's a hefty image. If that's on the regular, how are you extracting meaningful insights from it in a way that's not taking you all forever?

>> The basic idea for how this works is that a few times a day, our satellites will drop information to us whenever they're over a ground station that we can use.
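Those numbers roughly check out: a 4,000 × 1,000-pixel cube with 470 values per pixel is already in the multi-gigabyte range before any processing. The 16-bit sample depth below is an assumption for illustration; the interview doesn't state the actual raw bit depth.

```python
# Back-of-envelope size of one hyperspectral data cube, using the
# dimensions mentioned in the interview. The 2-byte (16-bit) sample
# size is an assumption; the real raw format isn't stated.
rows, cols, bands = 4_000, 1_000, 470
bytes_per_sample = 2  # assumed 16-bit samples

cube_bytes = rows * cols * bands * bytes_per_sample
print(f"{cube_bytes / 1e9:.2f} GB")  # prints 3.76 GB
```

That puts a single uncompressed capture squarely between the 100 MB and 10 GB file sizes quoted in the interview.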
They drop as many files as they can during a short period. We literally trigger this as soon as a file shows up in AWS S3: for each of those files, we start a processing pipeline. We use AWS Step Functions for those, and each file will literally go through a number of functions within Step Functions. First we locate where the image is. That's not as trivial as I would have thought when I first started; the image isn't just magically right there, you have to find out specifically where it is. Then you have to do what's called spectral calibration for those files. We don't just have three colors to do it for; we have to do it for 470. There are huge amounts of stuff you have to go through correcting all of this. Another step we then have to do is georeferencing and orthorectification of the data, which takes elevation into account. That can be extremely time-consuming, because you have to actually get an elevation map for the entire area that your image covers. Then we calculate a whole bunch of other things: for instance, the actual methane map that we normally produce, the oil map, or a cloud cover map. Can we figure out where the clouds are and not use the portions of the image where there are clouds, or do additional processing? When we do cloud cover, we have both the parts where there are clouds and the parts below the clouds that are covered by shadow, and we ask what extra processing we have to do for shadowed areas in order to bring out the information that we need. I should note our sensors are not active sensors. These are not like radar, where you send out a signal and calculate the amount of time it takes to come back. These are entirely passive sensors, kind of like a normal camera. So say you're looking at a dark spot on your normal camera image.
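The per-file flow described (locate the image, spectrally calibrate all the bands, georeference and orthorectify against an elevation map, then derive product maps) can be sketched as a simple chain of steps. In production each step would be a Lambda-backed Step Functions task; this toy version only illustrates the shape, and every function and field name here is hypothetical, not Orbital Sidekick's actual code.

```python
# Toy sketch of the per-file pipeline: each function stands in for one
# Step Functions task. All names and fields are illustrative.

def locate(scene):
    scene["located"] = True          # pin down where the image actually is
    return scene

def spectral_calibration(scene):
    scene["calibrated_bands"] = 470  # calibrate every spectral band, not 3
    return scene

def orthorectify(scene):
    scene["orthorectified"] = True   # elevation-aware georeferencing
    return scene

def product_maps(scene):
    # derive end products: methane, oil, and cloud-cover maps
    scene["products"] = ["methane", "oil", "cloud_cover"]
    return scene

PIPELINE = [locate, spectral_calibration, orthorectify, product_maps]

def process(scene):
    """Run one file through every pipeline stage in order."""
    for step in PIPELINE:
        scene = step(scene)
    return scene

result = process({"key": "captures/scene-001.dat"})
print(result["products"])  # prints ['methane', 'oil', 'cloud_cover']
```

In the real system, the S3 object-created event for each downlinked file would kick off one such pipeline execution automatically.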
If you change your white balance or other settings, you can still see things there. So when we have shadowed areas, we can do additional work and still be able to see things.

>> I'm just trying to imagine taking on this kind of compute. If you were trying to home-grow this, would that even be possible?

>> It can be done on your local computers. You don't want to. For instance, when I started, the folks who were at the company previously had built out some experimental compute pipelines using a very, very beefy machine (I don't even know what hardware went into it) that sits in a closet in our office. And when we got imagery, in that case it was not from space; it was from aerial flights with the same kinds of sensors. The time it would take to process it all: you would start it at nine o'clock at night, and you would come in the next morning and it would be halfway done.

>> If it didn't fail halfway through, right?

>> If it didn't fail halfway through, or for parts of it, just because you have to grind through it all. This is where something like the cloud comes in. However many files you have (there's probably a practical limit to how much compute AWS actually has, but we're coming nowhere close to hitting it), say we've got 50 files. What we want to do is process them all at the same time, so that they all finish in about 20 minutes, which is more or less what it takes for us to process each file right now. We can do them all at the same time, and it doesn't actually cost us any more than if we did them one after the other, because we use Lambda functions for the actual compute processing. Occasionally we may run into limits on how many we can do at once, but so far we haven't really hit any limitations like that.
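The fan-out idea just described (50 files finishing in roughly the time of one, because each gets its own worker) can be sketched locally with a thread pool standing in for per-file Lambda invocations. This is a minimal illustration of the pattern, not OSK's actual code; `process_file` is a hypothetical placeholder for the real pipeline.

```python
# Fan-out sketch: process every file concurrently rather than serially.
# A ThreadPoolExecutor stands in for AWS Lambda's per-file scaling.
from concurrent.futures import ThreadPoolExecutor

def process_file(name: str) -> str:
    # Real work (calibration, orthorectification, product maps) goes here.
    return f"processed:{name}"

files = [f"scene-{i:03d}.dat" for i in range(50)]

# All 50 run at once; wall-clock time approaches one file's duration,
# not 50 times one file's duration.
with ThreadPoolExecutor(max_workers=len(files)) as pool:
    results = list(pool.map(process_file, files))

print(len(results))  # prints 50
```

The cost observation from the interview is what makes this attractive on Lambda: fifty short invocations in parallel are billed the same as fifty run back to back.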
And because of that, whenever we get image drops, the record I've seen so far, from image acquisition on the satellite to when it's fully processed, is about 34 minutes. That includes transmission time, delivery into an AWS bucket, and then the processing going through.

>> You blew my mind a little bit. Sorry, that's so fast. You went from hours to minutes. That's crazy.

>> That was a very opportune data collection: the image was taken just before we hit a downlink window, so it was almost immediately downlinked, then transferred into an AWS bucket, and that triggered everything. When I saw the numbers I thought, wow, that was very, very fortuitous. One of the things that we will be doing, though, is prioritizing certain pieces of data. If we do some compute on board, we'll be able to say, here's an image where we think there is a very high-probability methane leak. Then, at the next downlink opportunity, which generally happens about every 90 minutes right now (as we get more downlink stations with capable radios, those will come up a lot more frequently), we can downlink the data, process it, and get that insight out to our customers. No one has to push any buttons for any of this to happen, either at AWS or on our end. And that's also very different from the pipeline that ran on a single computer. Some of that we couldn't have done completely automatically, because we had to collect the hard drives off a plane, and somebody literally had to physically carry them into a place where we could upload them. And I've got to say, a lot of oil pipelines are not in places with great internet.

>> Yeah, they're often quite remote, right? So trying to get some of that uplinked...
>> Coming from space, we in many cases have better internet than we would driving around the Texas high country trying to find a place to uplink it. With the old pipeline, someone had to push a button to run it. Someone had to look it over. Someone had to figure out just how much that machine could process at a time and then split the work into chunks so that it could run. We don't have to think about that. We launched satellites four and five in March, and all we had to do was update a couple of database entries to record the IDs of the satellites, so we could recognize them as they come in; nothing had to be done on the AWS side afterwards. Just put a couple of new database entries in, and it literally just starts spinning up. We don't have to say, give us more compute; they just give it to us when we want it.

>> Wow. Okay. You all still have some of the first data that you ever downloaded from your satellites, and you've now gone back and reprocessed it with AWS.

>> There is literally almost no data that's wasted in what we do. Some things may be somewhat less useful than others, but we can look for all sorts of things. It might be a little bit late, for a methane leak, to process six-month-old or three-month-old data, but we can still tell, for instance, things like crop health during that period. The data is always good for that particular time. So if you come up with something else that you want to look for, now we can do that. If a mineral is something that you can see spectrally, we can go back and look through all of our imagery and reprocess it with a particular algorithm to find that mineral as of last year. Look for new sources of a particular mineral, or what your crop health was at a particular time. Or say we're doing research on speciation, the makeup of a particular forest: we can go back and look at old data for that.

>> We'll be right back.

>> Welcome back.
Some unexpected good news on fighting plastic pollution thanks to space, specifically data from Copernicus. No doubt you've heard about patches of plastic pollution floating around our oceans, but unless those plastic patches are particularly dense, they are nigh impossible for satellites to see. Often these slicks, or litter windrows, are thin, ribbon-like streaks of plastic waste that are also short-lived before they sink down to the ocean floor. So being able to find them and track them over time can be a useful window into where plastic pollution is coming from and who is doing the polluting.

Thanks to work done by an ESA Discovery-funded consortium of space companies and research institutes, researchers were able to scan images of the entire Mediterranean Sea, taken by Copernicus's Sentinel-2 satellites over a six-year period at a 10-meter resolution, to find these litter windrows. And oh, did they find them. Thousands of them, most over a kilometer in length, some as much as 20 kilometers long. Thanks to this work, we now have the most complete marine litter map of the Mediterranean ever made and a better understanding of where this plastic pollution is coming from. And now we know how to find these litter windrows with the satellites that we already have in orbit. In this case, it was a specialized supercomputer algorithm poring over the imagery that did all the heavy lifting. And since no new satellites are needed for this (although new ones would be nice), the project team says this kind of trash-spotting tech is ready for other parts of the world too. Reusing existing satellites to reduce pollution here on Earth: that is a pretty green space story if I've ever heard one.

[Music] That's it for T-Minus for June 20th, 2024, brought to you by N2K CyberWire. For additional resources from today's report, check out our show notes at space.n2k.com.
We're privileged that N2K and podcasts like T-Minus are part of the daily routine of many of the most influential leaders and operators in the public and private sector, from the Fortune 500 to many of the world's preeminent intelligence and law enforcement agencies. This episode was produced by Alice Caruth, and our associate producer is Liz Stokes. We are mixed by Elliott Peltzman and Tré Hester, with original music by Elliott Peltzman. Our executive producer is Jennifer Eiben. Our executive editor is Brandon Karpf. Simone Petrella is our president. Peter Kilpe is our publisher. And I'm your host (I'm back!), Maria Varmazis. Thanks for listening. We'll see you tomorrow. [Music]
