Episode 34 of the Space Industry podcast is a discussion with Maciej Ziaja, Machine Learning Software Engineer at satsearch member company KP Labs on machine learning in space.
Episode show notes
KP Labs is a Poland-based company offering AI computers and software to bring autonomy to demanding space missions. In the podcast we discuss:
- The factors driving the increase in adoption of on-board data processing (OBDP) and artificial intelligence (AI) capabilities on satellites
- Examples of how algorithms and processing tools can support different missions
- Advice on how to integrate hardware and software for AI use
- Predictions on where such advances could take NewSpace in years to come
Here are some additional resources mentioned or referenced in the podcast:
- Benchmarking Deep Learning for On-Board Space Applications
- Quantifying the “Unquantifiable”: How to Estimate the Robustness of the On-board AI for Hyperspectral Image Analysis
- Antelope: Towards on-board anomaly detection in telemetry data using deep learning
KP Labs’ portfolio
Oryx is a modular flight software tool developed for the mission control of small satellites. It manages all satellite tasks – namely, processing telecommands sent by the operators, monitoring the power budget, executing pre-defined schedules, managing emergencies and handling data.
The KP Labs Antelope can work as an On-Board Computer (OBC) with an optional Data Processing Unit (DPU), or as a standalone DPU. The OBC is the powerful heart of the satellite, responsible for satellite control and basic tasks such as communication handling, monitoring the satellite’s subsystems, handling the classic Fault Detection, Isolation and Recovery (FDIR) mechanism, and performing planned tasks.
The KP Labs' Lion is a data processing unit for advanced operations with the use of artificial intelligence and on-board data processing. The Lion DPU is dedicated to micro and small satellites weighing between 50 and 500kg.
Leopard is a CubeSat standard-compliant Data Processing Unit (DPU) designed for the application of AI solutions in space. It uses Deep Neural Networks to process data on board and features an FPGA to implement deep learning algorithms. The system has a throughput of up to 3 Tera Operations per second.
KP Labs’ The Herd is a set of AI-powered algorithms designed for Earth Observation (EO) data analysis.
Oasis is a single-board, CubeSat PC-104 compatible electrical ground support equipment that serves as an interface between a PC running satellite system simulators and the hardware engineering model.
Please note that while we have endeavored to produce a transcript that matches the audio as closely as possible, there may be slight differences in the text below. If you would like anything in this transcript clarified, or have any other questions or comments, please contact us today.
Hywel: Hello everybody. I’m your host Hywel Curtis. And I’d like to welcome you to the space industry by satsearch, where we share stories about the companies taking us into orbit. In this podcast, we delve into the opinions and expertise of the people behind the commercial space organizations of today who could become the household names of tomorrow.
Before we get started with the episode. Remember, you can find out more information about the suppliers, products, and innovations that are mentioned in this discussion on the global marketplace for space at satsearch.com.
Hello everybody. And welcome to the episode. I’m joined today by Maciej Ziaja, Machine learning software engineer at satsearch member company KP Labs.
KP Labs is based in Poland and develop solutions that bring greater autonomy to space. Today we’re going to be discussing the algorithms and hardware that can enable the running of machine learning in space missions.
Hi, welcome to the episode. Great to have you here. Is there anything you’d like to add to that?
Maciej: Hi, pleased to be here. Let’s get to the topic.
Hywel: Okay, fantastic. So this is a really interesting topic, one that’s getting a lot of attention in the industry as the amount of data and the capabilities that satellites have has increased.
And obviously there are limitations on communication between satellites, and in particular communication with the ground. Those are the sorts of problems these solutions are trying to overcome, as well as enabling new capabilities, new business models, and new research opportunities. So onboard data processing, or OBDP, and the use of AI are becoming increasingly important in the industry, like I just mentioned. What do you think is driving this?
Maciej: So firstly, let’s really understand and highlight the difference between the traditional approach, how things used to be done in the past, versus the new approach with onboard data processing. Throughout this talk I’ll be mainly focusing on Earth Observation satellites, which capture images of Earth and process them.
In this context, in the traditional scheme of things, satellites perform little to no independent tasks, and they are treated as data-gathering devices that send bursts of images to the ground stations via a radio connection. This connection that enables sending data from satellites to a ground station is called the downlink.
After the satellite captures photos, it sends them down to Earth, and the processing happens inside a data center. This is how things used to be done. The new approach is quite the opposite: gather data and process it on board the satellite, making satellites aware of the data they have captured. In this new approach we transmit only the results of the data that has been processed; we send valuable results, and we don’t have to process everything.
We don’t have to transfer everything and then process it inside a data center. So I believe there are two factors driving this: the limitations of the old approach and the emerging possibilities of the new one. Let’s get started with what was wrong, or problematic, with the old way of processing data.
So the main limitation was communication, because in most scenarios you can only communicate with your satellite when it flies over your ground station. If you want more contact time, you have to build more ground stations, which is very expensive, and your satellite is still very dependent on those ground stations.
This creates a lot of problems, because communication is scarce, it is really expensive to build ground stations, communication can take long, there is latency in the whole process, and there can be communication gaps, mainly when the satellite flies over parts of the world where you don’t have ground stations. To really give you an insight into how big this problem is, and how different the new way of processing data will be: in the old way, only 15 to 20% of what satellites transfer is useful.
Around 80% of the photos sent via the downlink connection in the old way of doing things are of no value. So communication is really expensive, and we waste it on sending data that may be cloudy or blurred. There is a lot of room to improve. Now, what is driving this shift?
Also, as I said, it is the possibility of something new with new technologies and new space missions. And I want to go a little off track and talk about this shift in thinking about satellites in a wider context, because it is closely connected with onboard data processing: there has been a shift towards more agile and more rapid development of satellites that are more focused on AI experiments and new technologies, and that is the wider reason for this movement towards AI.
For example, at KP Labs we work on something called the Smart Mission Ecosystem, which grants great flexibility and ease of use. Previously you could either buy a pre-made, hard-to-customize satellite or create your own solution from scratch, but then you had to do the heavy lifting of building your custom design.
In this new paradigm, with this new approach, we really want to hit the sweet spot between building things from scratch and buying something that is hard to customize. We provide the Smart Mission Ecosystem, which comes with many building blocks for creating your solution: our hardware, our accelerators, our software for satellites, and machine learning algorithms.
They are provided as building blocks, and they are not only easy to build on top of, but they also come with many convenient tools and environments for testing. All of that creates a new wave of satellites and a new approach, alongside the emerging popularity of the CubeSat standard and smaller satellites. What I have described is a new approach built from blocks that enable onboard data processing at the lowest level.
It stems from gradual advancement in more widely available technology, such as microprocessors, things that are not necessarily connected to satellites, but that companies like KP Labs can use to create building blocks from those basic components. The end user can then create a smart mission out of those basic building blocks.
So there is a ladder of cooperation: new technologies at a very rudimentary level, companies that can create building blocks from those technologies, and end users that can use these building blocks to create smart missions in the end.
Hywel: Okay, fantastic. Yeah, that’s really interesting. We’re used to dealing with certain standards and levels of performance in space, but when you take a step back and think about it, for any terrestrial application, if you were only getting 20% useful data, that just wouldn’t be acceptable.
So that’s really illuminated why solutions like onboard data processing are required. In terms of the technologies and systems that you’ve just mentioned, what sort of new capabilities can these solutions bring for satellite missions?
Maciej: AI on board a satellite can not only help the payload and the mission objective, but also aid the satellite’s operation.
Again, in the older approach we were limited by the cost of communication, expensive communication, and the satellite was very dependent on ground stations. A lot of the images we sent, as I said, are cloudy or shaken, and really unusable; the numbers on how many of them are of no value are quite shocking. So the most rudimentary task you can perform with onboard data processing and AI is to discard the data that is of no value before you transfer it to Earth, by making the satellite aware of what it gathers. You can also mask out the parts of images that are cloudy.
You use AI to detect clouds on images and mask them out, and this really helps image compression algorithms. So even without actually doing anything smart on the satellite, the most basic thing to do is to help compression and transmission. At the most basic level you don’t have to change too much, but you can discard what is of no value.
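As a rough illustration of the idea, the sketch below replaces detected cloud pixels with a constant before compression. The brightness-threshold "detector" and the toy image are invented stand-ins (a real system would use a trained segmentation model), but they show why masked images compress better:

```python
import random
import zlib

# Toy stand-in for an on-board cloud detector: a trained model would
# output a per-pixel cloud probability; here a simple brightness
# threshold acts as a hypothetical placeholder.
def cloud_mask(image, threshold=200):
    """Return a mask marking pixels considered cloudy (very bright)."""
    return [[pixel >= threshold for pixel in row] for row in image]

def mask_clouds(image, mask, fill=0):
    """Replace cloudy pixels with a constant so they compress well."""
    return [
        [fill if m else p for p, m in zip(row, mrow)]
        for row, mrow in zip(image, mask)
    ]

def compressed_size(image):
    """Bytes needed to store the image after lossless compression."""
    flat = bytes(p for row in image for p in row)
    return len(zlib.compress(flat))

# A toy 8x8 "image": noisy ground pixels plus a bright cloudy band.
random.seed(0)
image = [[random.randint(0, 150) for _ in range(8)] for _ in range(8)]
for row in image[2:5]:                     # rows 2-4 are "cloud"
    for i in range(8):
        row[i] = random.randint(220, 255)

masked = mask_clouds(image, cloud_mask(image))
# Constant-filled cloud regions compress better than noisy cloud pixels.
print(compressed_size(image), compressed_size(masked))
```

The constant fill value is arbitrary; the point is that uniform regions cost almost nothing after compression, so the downlink budget is spent on the useful ground pixels.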
And this is already a huge improvement, but it is just the beginning, because cloud detection can be treated as a preprocessing step for image analysis. So really the cornerstone, the starting point, is cloud detection and making the satellite aware of its data. Again, this is just the beginning. For example, in the Smart Mission Ecosystem, as I said, we have a lot of very convenient tools and software to aid the mission itself.
For example, we offer a modular software system called Oryx, which comes with a scripting engine that enables you to change the satellite’s mode of operation on the fly. You can rewrite what your satellite should do without touching the low-level software, and you don’t have to think about the bridge between hardware and software. With the modular scripting engine in Oryx, you can just write what your satellite should do in a very user-friendly scripting language called Lua, update the script, and the scripting engine will understand it and adjust the satellite to your needs. So this is another side of the revolution that enables us to perform these onboard data processing tasks. But when it comes strictly to onboard data processing and AI, we again have to differentiate between the smart satellite and the smart payload.
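Oryx’s scripting is in Lua; purely to illustrate the concept of updating mission behaviour without touching the flight software, here is a hypothetical Python sketch of a scripting engine. The `sat` API and its commands are invented for this example and are not Oryx’s real interface:

```python
# Sketch of an uploadable-script pattern: the low-level flight software
# exposes a fixed command API, and operators swap behaviour by
# uploading a new high-level script.
class ScriptEngine:
    """Runs a small, uploadable 'mission script' against a fixed API."""

    def __init__(self, api):
        self.api = api          # low-level commands exposed to scripts
        self.script = None

    def upload(self, source):
        """Accept a new script, e.g. received over the uplink."""
        self.script = compile(source, "<mission-script>", "exec")

    def run(self):
        """Execute the current script; only `sat` is visible to it."""
        exec(self.script, {"sat": self.api})

log = []
api = type("SatAPI", (), {
    "capture": staticmethod(lambda: log.append("capture")),
    "downlink": staticmethod(lambda: log.append("downlink")),
})

engine = ScriptEngine(api)
engine.upload("sat.capture()\nsat.downlink()")   # mission v1
engine.run()
engine.upload("sat.capture()")                   # "on the fly" update
engine.run()
print(log)  # → ['capture', 'downlink', 'capture']
```

The design point is the separation: the script only ever sees the sanctioned `sat` commands, so an updated script cannot corrupt the low-level software underneath.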
The first is about aiding the satellite operation itself. We can do this by using AI to analyze telemetry data, because satellites have onboard sensors that monitor different parameters. This is what the smart satellite is about: it can help to analyze the logs and the telemetry and provide onboard maintenance.
It keeps your satellite in good health and avoids failures. On the other hand, we have the second branch, the smart payload, which offers AI capabilities regarding the mission objective, such as image analysis and image processing for Earth Observation tasks.
Hywel: Two sides of the coin, but really interesting that the same solutions enable both. You’ve mentioned a number of these already, but I wonder if you could give some further specific examples of how algorithms and processing tools can support different kinds of missions.
Maciej: Yeah, there are tons of exciting ideas and specific examples we can give. Let’s start with maintenance and telemetry, because they are really interesting. Some anomalies and failures can be really straightforward.
Let’s take the example of out-of-bounds failures. To detect such a failure, like, for example, a low-voltage power supply failure, you can just threshold the telemetry data. This is a really straightforward thing to do. However, there are many failure scenarios that are harder to detect, called contextual anomalies.
Nowadays, this kind of anomaly can only be detected by AI. What are contextual anomalies? They occur when every single signal on board the satellite is correct, and it all seems fine. However, when you gather the signals and compare how they relate to each other, it may turn out that even though each one of them is perfectly fine, in combination they are in a situation that should never occur during normal operation.
And this is really important: detecting such a failure is only possible with AI. But if it is possible on board the satellite with AI, you can really take some steps; you can try to mitigate the problem automatically. This is a huge difference, because if you were to try to mitigate failures with the traditional approach, you would have to gather the logs, wait for your satellite to fly over a ground station, transmit those logs, and analyze them on the ground. By the time you detect a failure, it may be too late to do anything; you may have missed the moment when you could have taken action.
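To make the distinction concrete, here is a minimal Python sketch with two hypothetical telemetry channels, illumination and panel current, whose numbers and thresholds are entirely invented. In practice the relationship between channels would be learned by a model rather than hand-written; the hard-coded rule only stands in for that learned relationship:

```python
def out_of_bounds(sample, low, high):
    """Classic threshold check: flags a single signal leaving its range."""
    return not (low <= sample <= high)

def contextual_anomaly(illumination, current, gain=0.5, tolerance=0.1):
    """Flags samples where each signal is fine on its own, but their
    relationship (current ~= gain * illumination) is violated."""
    expected = gain * illumination
    return abs(current - expected) > tolerance

# Healthy sample: in bounds, and the relationship holds.
assert not out_of_bounds(0.4, 0.0, 1.0)
assert not contextual_anomaly(0.8, 0.4)

# Contextual failure: both signals individually look fine...
illumination, current = 0.8, 0.05
assert not out_of_bounds(illumination, 0.0, 1.0)
assert not out_of_bounds(current, 0.0, 1.0)
# ...but their combination should never occur in normal operation
# (full sun, yet almost no panel current).
assert contextual_anomaly(illumination, current)
```

The threshold check sees nothing wrong with either channel; only the cross-channel model catches the failure, which is exactly the class of anomaly described above.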
This is why you want to be able to detect the failure on board the satellite and have some failure-handling scenarios, so that in case something happens you can detect it with AI and start rescuing the satellite in orbit without having to wait for a communication session to begin. This is really important, and it makes a huge difference. I have another quite shocking number here: from the year 2000 to 2015, a little over 40% of small satellite missions experienced at least partial failure, according to data provided by NASA. That’s an insane number, and smart maintenance and fault handling are crucial to mitigating this problem and really helping satellites operate more reliably. That was more about the smart satellite, but now let’s move on to the smart payload, because this is where maybe the most interesting and most spectacular things may come.
At KP Labs we work on several machine learning algorithms for Earth Observation. I have already mentioned the most basic task of cloud detection, because it helps utilize data transmission better, but the possibilities go far beyond that. For example, we have worked on smart farming algorithms, and with AI it is possible to estimate soil parameters from a satellite image.
This is a really interesting idea; it’s something that is hard to wrap your mind around when you think about it. You can estimate, for example, how much potassium the ground contains just from a satellite image.
Hywel: Just from the image of the surface.
Maciej: Yes, but actually it is something more than an image of the surface, because while we do capture images of the surface, we work with very specific images.
They are generally called hyperspectral images. Even though they capture what the ground looks like at the most basic level, they cover the light spectrum beyond visible wavelengths; they are very feature-rich and capture data in infrared, ultraviolet, or radio frequencies. You can infer such complex parameters as soil quality just from a satellite image, which is really interesting and was never heard of previously, until AI and hyperspectral data came into play.
So you can really think of those images as scientific scans made with scientific apparatus, rather than the normal images we are comfortable with; they are much more than just RGB images. Alongside our domestic work on soil analysis, we organized an open-for-all challenge on this topic of soil quality assessment, and we provide the data and metrics for the competition we have prepared. Because hyperspectral data is so feature-rich, we advertise the competition with the slogan See Beyond Visible: hyperspectral instruments create images, but the images contain things beyond what you can see with your own eyes, which is a really insane idea. This initiative, this competition on soil analysis, is supported by the European Space Agency, so I definitely encourage everyone listening to this podcast to try it out.
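A soil-parameter estimator of this kind would normally be a deep network trained on labelled hyperspectral cubes. As a hedged sketch of the underlying idea, here a tiny least-squares fit on a single synthetic band stands in for the model; the band reflectances, the potassium values, and the exact linear relationship are all made up for illustration:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y ~= a*x + b on one spectral feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Synthetic "training data": reflectance in one infrared band vs a lab
# potassium measurement (entirely invented numbers; here y = 20*x + 10).
band = [0.10, 0.20, 0.30, 0.40, 0.50]
potassium = [12.0, 14.0, 16.0, 18.0, 20.0]

a, b = fit_linear(band, potassium)
predict = lambda x: a * x + b
print(round(predict(0.25), 2))  # → 15.0, interpolating between samples
```

A real pipeline would use hundreds of bands and a nonlinear model, but the shape is the same: spectra in, lab-measured soil parameters as targets, and a fitted predictor that then replaces the physical sampling run.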
You can join the competition, and there are some nice prizes for the winners, so definitely check it out. Going back to the benefits of using AI versus the traditional approach: again, we are doing a lot of these comparisons, and you may ask what the difference is between analyzing soil with AI and a satellite versus how it is actually done now. We experienced this firsthand when we were gathering data for our competition.
We were cooperating with people who work daily with soil analysis, and the way they do it now is that a guy has to rent an ATV, drive around crop fields, gather probes of ground, pack them into a larger box, and send them all to a laboratory, where someone actually has to run analyses with scientific apparatus.
You have to organize people, equipment, vehicles, and a laboratory, and it’s a process that can take weeks or even months to cover a large area of crop fields. With satellites, we are close to the point where you can just order an analysis of soil with a satellite, click a button, and have results delivered in hours, or perhaps one day in minutes.
So it’s, again, a really interesting idea, and the numbers are hugely on the side of satellites. We are not only working on soil analysis; there’s another interesting and novel idea regarding super-resolution tasks. Super-resolution is a technique for enhancing the quality of images, enlarging them with extra details created by AI.
You may have used it if you have Photoshop installed, because nowadays, if you stretch an image and enlarge it, you may be using an AI algorithm built into Photoshop to add some details, sharpen the image, and enhance the quality. This is how this technology has mainly been used: for aesthetic purposes and for manual analysis.
However, now, together with the European Space Agency, we are running a project for using super-resolution as a data fusion step, as a preprocessing step to aid further data processing. For example, imagine you have an observation task like monitoring the spread of fires, or tracking vessels, and you apply super-resolution before you apply AI to the final task.
For vessel tracking, for example, you can use super-resolution to first enhance the image and make it sharper, so it’s easier for the final algorithm to detect the vessels, because it gets better-quality images at the input. This is a very novel application of super-resolution in the space environment, and this is what we are working on. Beyond what we work on at KP Labs, there are many more use cases that may come into play, and there are people around the world preparing different AI solutions for tasks like, again, vessel tracking, anomaly detection, and natural disaster monitoring.
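The pipeline shape described above (enhance first, then run the task model) can be sketched as follows. The learned super-resolution network is replaced here by nearest-neighbour upsampling purely to show the plumbing, and the "vessel detector" is a toy brightness threshold; both are invented stand-ins:

```python
def upscale(image, factor=2):
    """Stand-in for a super-resolution model: nearest-neighbour zoom.
    A real SR network would hallucinate plausible detail instead."""
    out = []
    for row in image:
        wide = [p for p in row for _ in range(factor)]
        out.extend([wide] * factor)
    return out

def detect_bright(image, threshold=200):
    """Toy 'vessel detector': coordinates of bright pixels."""
    return [(r, c) for r, row in enumerate(image)
                   for c, p in enumerate(row) if p >= threshold]

raw = [[10, 250],
       [10, 10]]
enhanced = upscale(raw, 2)      # 4x4 image; the bright target is enlarged
hits = detect_bright(enhanced)  # downstream model sees the bigger target
print(len(hits))                # → 4: the bright pixel became a 2x2 patch
```

The point of the chaining is that the final detector never sees the raw low-resolution frame; it always operates on the enhanced one, which is exactly the preprocessing role described for super-resolution above.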
Some time ago, I came across a really interesting model to predict the possibility of car accidents at crossroads from satellite photos. Yeah, it’s really cool, and it’s connected to safety, so a really important topic. The AI was able to take a photo of a network of roads and estimate which crossroads were most prone to accidents and crashes.
So it’s a great tool for city planners and so on. The possibilities are really endless, and there are many benefits in general to using onboard data processing and AI. One thing to note is that, in general, running AI on a satellite is an asset and not a must, so we can deploy it any way we want; we are very flexible.
Nevertheless, there are some use cases that are uniquely available thanks to onboard processing; the best example is the online risk management with telemetry analysis and failure prevention. However, there are some new, exciting ideas that are close to being implemented, like in-orbit rendezvous operations, where a swarm of devices, satellites, or space vessels with a high degree of independence can cooperate. This really requires a high level of autonomy to coordinate actions between different vessels and satellites in orbit.
Hywel: Yeah, that’s amazing. These are really interesting use cases, and quite a few things stood out to me. The online telemetry tracking and failure prevention is fascinating, because presumably the larger the satellite, and the smarter the payloads on the satellite, the more important such capabilities become, because the satellites are more expensive and heavier.
You want that ability to track the telemetry and prevent what failures you can. So that’s interesting. And yeah, at satsearch we talk a lot about viewing both sides of the marketplace; there are lots of comparisons, a lot of discussion of competition in the industry, and of what the alternative solutions to things are, but you never think of it in terms of the sort of applications you’re talking about.
You never think of it as a satellite versus a guy in a truck driving around collecting soil samples, but for the end user, that’s the comparison they have to make when they’re trying to decide which is the best solution to find out the soil quality.
So, really interesting stuff; thank you very much for sharing all that. We’ve talked a lot about the processing capabilities and the results that such onboard processing and AI can bring you. What sort of hardware is required in order to use it in space?
Maciej: AI in general is very demanding software to run, in any conditions; I would say it is perhaps one of the most demanding types of software in terms of performance. When we talk about running AI inference in domestic conditions, graphics cards are often used as accelerators. They may be built into your computer, and they are used in data centres; they are great, they perform great, and they are a standard.
However, what works well on the surface is not that suitable for space missions. When it comes down to satellites, the trade-offs between power consumption and performance are really crucial for efficiency management, because there is always a risk of your satellite running out of juice. This is a really important risk, and something you have to be aware of.
So the first thing is that the trade-off between performance and saving energy is really important, more important than in normal conditions. The hardware flexibility factor also has a different context in space, because on the ground you can easily swap GPU models in your computer; you can just take out parts of your computer and install new ones.
However, this can’t be done in orbit; unfortunately, you can’t just take out pieces of your satellite. For this reason, at KP Labs our accelerators are mainly FPGA-based, and this gives us flexibility. FPGAs are very specific devices: normal processors can be programmed with different software, and computers can perform different tasks because they run different pieces of software. FPGAs are like that, but they are even more programmable, because not only the software but also the hardware can be resynthesized on the fly. You can describe different hardware, and FPGAs are able to readjust on the fly at the hardware level.
This means that different trade-offs can be balanced during the mission. For example, you can prioritize single-core performance over many cores, or vice versa, in orbit, and you can also reprioritize when energy saving is crucial and you are close to running out of juice. So flexibility, power consumption, and performance are the key factors in modern AI hardware for space.
When it comes down to KP Labs and what we offer, we have three different AI accelerators for satellites: Antelope, Leopard, and Lion. Let’s walk through them. Antelope is the lightest unit, and it is dedicated to the smart satellite, so we are talking about those maintenance tasks, anomaly detection, and failure prevention; Antelope is an AI accelerator designed to run algorithms that help the satellite’s operation. Leopard, meanwhile, is a medium-sized all-rounder, more focused on the smart payload.
It is designed for image processing for Earth Observation tasks, like cloud detection, segmentation, and soil analysis. So this is the difference between the smart satellite and the smart payload. Last but not least is Lion, which is the bigger brother of Leopard, our biggest unit, and it is designated for the most demanding space missions: whereas Antelope and Leopard are designed for CubeSat missions, Lion works with standard space missions for satellites over 50 kilograms, so it is definitely for the biggest context. I have told you about all those algorithms and hardware pieces and what we can do with them.
But we work hard not only on prototyping them, but also on turning these possibilities into real-life advantages, in space missions implemented in real life. This is why we have launched our in-house satellite mission called INTUITION-1, which will carry a Leopard on board and perform smart farming and image analysis tasks.
Furthermore, we have invested in making sure that our development environment is really user-friendly. We put a lot of emphasis on telemetry, safety, and robustness, and on the development and simulation environments, to make sure that you develop your satellite in a very convenient and very robust way.
This is why we support a second mission called PW-Sat3, a mission run by students and scholars from the Warsaw University of Technology in Poland. We provide them with our solutions, and we are really eager to cooperate with them and offer our support. PW-Sat3 will carry Antelope and test the telemetry analysis algorithm.
We are really looking forward to cooperating with the students and scholars and to trying out how user-friendly and efficient our development toolchains and solutions are, and how an end user can benefit from them. Once again, I would like to highlight the flexibility of our hardware and software, because Leopard is based on FPGAs that can be reprogrammed in orbit to support different hardware accelerator architectures, and Oryx, our modular software, enables you to reprogram satellite operations. This leads to very reusable, very adjustable satellites.
It means that a single satellite can support different goals and missions as time progresses. Because of these wide capabilities and the variety of tasks at hand, we call INTUITION-1, our mission with Leopard on board, a blank laboratory that can perform diverse experiments and try out different possibilities during the same mission.
Hywel: Very interesting to understand what some of the technical considerations are. You’ve mentioned some of the things engineers would need to consider when trying to implement their own solutions using onboard edge processing and AI: the importance of considering power efficiency, and, in the first instance, thinking about how you’re using the tools, whether it’s to enable that smart satellite concept and analyze your own telemetry data, or whether it’s a smart payload situation, or of course both.
What are the sort of technical considerations that an engineer would need to factor in, and the trade-offs they may need to think about, when integrating the hardware and software into a satellite and deploying it on orbit?
Maciej: Yeah, this is a really important topic, because engineering is what drives satellites. When it comes to satellite drivers and software development, it used to be a pretty daunting task in the past: a lot of things had to be written from scratch, and often you had to use very obscure hardware and program it from the very beginning.
However, huge progress has been made with the emerging NewSpace paradigm, and nowadays, with technologies like our Smart Mission Ecosystem, creating satellite software can be a much more developer-friendly, user-friendly task. Our development process, and what we offer in the form of building blocks, encourages good programming practices with modern coding standards.
This means that we build highly testable code that works with CI environments, and this is really important in software development, because nowadays programming tools and DevOps are becoming more and more indispensable. Every software developer will really appreciate this process of creating highly testable code that works well with existing software tools.
We really put emphasis on unit and end-to-end testing. This not only means that our satellite software is robust and that you can be confident in how it works, but also that you can create your own software on top of it with great convenience, using the software tools we provide. So not only are the building blocks really great and created in a very conscious way with good coding practices, but you can also follow the way we create software; our development tools provide a way of creating software that is really user-friendly and very convenient. On the other hand, AI in space is a relatively new thing, and it’s still maturing; at KP Labs we work really hard on pushing the frontiers of AI on board satellites and raising the technology readiness levels of new solutions.
As mentioned, the hardware for AI in orbit is pretty novel and very different from standard GPUs. It comes with a ton of benefits, and I have praised the flexibility of our hardware and FPGAs, but it comes at the cost of a lot of parameters to fine-tune and tweak. At first glance it is really hard to know how to get the optimal configuration for the mission at hand, because you have so many possibilities. This is why we have developed a custom benchmarking process for robust AI deployment on board satellites. There are tons of existing benchmarks, and the majority of them are great; they will work fine for choosing your hardware platform, your hardware vendor, or your hardware family.
They are great for this purpose. However, after you have chosen what you want to use and have it delivered, these kinds of benchmarks tell you really nothing about how to best utilize what you have chosen, how to fine-tune it, how to tweak it. This is how our benchmarking process differs.
It comes into play after you have bought your hardware: you can use our benchmarking process to fine-tune the unit. We created this process while evaluating our own hardware, the Leopard AI accelerator. You can think of it as painting the landscape of what the hardware is capable of under different circumstances. For example, we ran our benchmarking process with different-sized neural networks, with different use cases, operational modes and configurations. In mathematical terms, you can think of it as a multidimensional matrix or tensor of various scenarios, various configurations and various parameters, with the results quantified and measured in relation to power consumption.
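The "tensor of scenarios" idea above can be sketched in a few lines. This is a rough illustration only: the parameter axes, numbers, and the efficiency metric are invented for this example and are not KP Labs' actual benchmark, which runs real inference workloads on the FPGA and measures real power figures.

```python
# Sketch of a benchmarking sweep: enumerate every combination of
# configuration axes and record one result per cell, forming a
# multidimensional grid ("tensor") of scenarios.
import itertools

network_sizes = ["small", "medium", "large"]
modes = ["low_power", "balanced", "max_throughput"]
clocks_mhz = [100, 200]

def measure(size, mode, clock):
    # Placeholder model standing in for measured throughput (GOPS)
    # and power draw (W); a real harness would measure these on hardware.
    base = {"small": 50, "medium": 200, "large": 800}[size]
    factor = {"low_power": 0.5, "balanced": 1.0, "max_throughput": 1.5}[mode]
    throughput = base * factor * (clock / 100)
    power = 1.0 + 1.5 * factor * (clock / 100)
    return throughput, power

results = {}
for size, mode, clock in itertools.product(network_sizes, modes, clocks_mhz):
    gops, watts = measure(size, mode, clock)
    results[(size, mode, clock)] = gops / watts  # efficiency: GOPS per watt

# Pick the configuration with the best performance-per-watt,
# i.e. "turn the knobs" based on the whole landscape, not a single point.
best = max(results, key=results.get)
```

Once such a grid exists, the same data answers many questions: the best configuration under a power cap, how throughput scales with network size, and so on, which is what gives confidence about future deployments.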
So we know how to turn the knobs in a way that really works as intended, and the trade-offs regarding our hardware are balanced in a very conscious way. This not only helps us to scale our current hardware solutions, the things we work on now, but also provides us with confidence and knowledge about future deployments.
We can easily predict what we are capable of, what can be easily implemented, and what the best path is to take if we were to invent and deploy something new, because we have such a broad understanding of how our hardware scales. This benchmarking process was applied to Leopard.
However, the benchmarking workflow we have developed is publicly available. So we encourage all AI-in-space practitioners to check it out, follow our steps, perform benchmarks in a similar spirit and evaluate their own hardware. That way the general knowledge of this field grows with continuing experiments, and we all become more and more mature with the hardware platforms in our possession.
Hywel: Thank you. We’ll link to that benchmarking process in the show notes as well, or provide some more info. And you mentioned that it’s a solution that enables you to prepare for future deployments. I wondered if you could expand on that a little and think about future deployments of AI and onboard processing in general, in the years to come, not just over the next sort of mission schedule.
What capabilities and opportunities do you think satellite data end users and mission designers are likely to have access to?
Maciej: Okay, so this is a really interesting question, because it is all just starting. We are at the beginning of this path towards truly autonomous space systems, and in the coming years we will see the technology readiness levels of the solutions I have described rising.
Easy-to-use, feature-rich satellite development systems are nearly here, and the accessibility of onboard data processing and AI in space will enable us to see more complex solutions being built. We will see things like satellite swarms and fully autonomous satellites coming soon. We can also expect a merger between cloud computing services and satellite technologies, and it is starting to happen right now.
Amazon Web Services, the most popular provider of cloud services, already offers a satellite ground station service. This is very basic for now, to be fair, just a starting point, but we can expect a moment when satellite missions will be part of a bigger autonomous system: swarms, or connections between different computers forming semi-autonomous systems of automated decision making.
This is really interesting. AI plays a huge role in automating Earth observation processes in general, and onboard data processing will definitely decrease the latency of satellite operations, so businesses will be able to order a satellite observation analysis on demand, with AI insights delivered on the spot.
Maybe even ordinary people: every Joe will be able to order a satellite observation with a few clicks and have some interesting insight delivered by AI. So in the far future, AI and onboard data processing will hopefully help launch even more exciting sci-fi-like missions, like smart space vessels and robots, and there are some interesting ideas regarding mining in space, which are really interesting to me if you take a very broad perspective. But the other things I mentioned are really close at hand in the years to come.
Hywel: Quite a vision of the future. I think there are so many things there that many people in the industry, and followers of the space industry, would like to see happen.
And maybe in some cases expect to see happen by 2022, and obviously the sort of technologies you’re talking about form a core part of enabling that. So that’s brilliant. Thank you very much for the insights you’ve shared today. I think they’ll be really interesting to the community.
On behalf of satsearch and all of our listeners to the Space Industry podcast, I’d like to say thank you very much for spending time with us and sharing that information.
Maciej: Thank you very much. It was a great opportunity to talk with you.
Hywel: Thank you. And to all the listeners out there, you can find out more about KP Labs, its smart mission ecosystem, and all of the tools and technologies that have been discussed today on the satsearch platform. We’ll link to the details, and you’re free to use the request system to get in touch with the company and find out more.
We’ll also share some of the resources that were mentioned. We would encourage you, if you have any interest in this area or any professional applications for the tools and technologies discussed, to get in touch. And finally, I’d just like to say thank you all for listening and spending time with us on the Space Industry podcast. We’ll be back soon.
Thank you for listening to this episode of the Space Industry by satsearch. I hope you enjoyed today’s story about one of the companies taking us into orbit. We’ll be back soon with more in-depth, behind-the-scenes insights from private space businesses. In the meantime, you can visit satsearch.com for more information on the space industry today, or find us on social media if you have any questions or comments.
To stay up to date, please subscribe to our weekly newsletter. You can also get each podcast on demand on iTunes, Spotify, the Google Play store, or whichever podcast service you typically use.