Sifting out low-quality Earth Observation data – with AIKO

Webinar

Earth Observation (EO) missions have a variety of challenges to contend with, one of which is simply… clouds. This problem led satsearch member AIKO to develop an onboard tool to improve EO mission data collection, called cloudy_CHARLES.

The video below is footage of a live demo of cloudy_CHARLES. AIKO product manager Paolo Madonia explains how this system can help to more efficiently (and cost-effectively) collect and downlink a greater volume of useful Earth Observation data through the intelligent detection of clouds.



AIKO’s portfolio

On the pages below you can find more details of AIKO’s portfolio of space solutions and software tools:

The AIKO cloudy_CHARLES is onboard imagery-processing software for cloud segmentation. With CHARLES, the quality of data is assessed directly onboard and only the relevant frames are sent to the ground.

The AIKO orbital_OLIVER is a software solution that enables autonomous spacecraft operations onboard the satellite, thereby improving its performance.

The AIKO speedy_SKYLAR is a software tool that enables optimal routing for communications within a satellite constellation. SKYLAR helps find a path to send satellite data back to Earth, even when there is no direct contact with the ground stations.


Demo transcript

Hi, everybody! Welcome to the webinar today. 

We’re just getting started here. I’m just gonna give people about two minutes to just join in.

You know what it’s like: you get those notifications popping up and have to remember what you were doing, where you are, and get the headphones set up. So we’re just giving people a little bit of time to get in.

Very pleased to see so many people joining us today from, as far as I can tell, lots of different countries around the world. That’s why we try to hold these events in a time zone where people from both the US, where obviously a large part of the market is, and Europe can easily attend, but obviously we can’t meet everybody’s needs.

So if you’re staying up late or getting up early to hear from us today, we really appreciate that, thank you. And for anybody unable to make it for time zone reasons, we will of course be providing access to the information and material after the event.

Hope 2023 has started well for everybody. 2022 was, I think, another tough year on the heels of the COVID pandemic. Here in the UK it’s been raining for about 48 years; I don’t know whether that’s the case wherever you guys are, but hopefully January has been a bit kinder weather-wise than it has here.

So I think the number of people who’ve come in has stabilized, so I think we can get started properly.

So, hi, my name is Hywel Curtis and I’m the head of marketing at SatSearch, the global marketplace for space. As part of our content program for both the buyers and suppliers in the industry, we run various types of output: the podcast, blog posts, and these live events, webinars, and demos. So we’re really grateful you’re able to spend time with us here today. And we’d like to say a big thank you to our guest Paolo Madonia from Aiko. Aiko is an Italy-based company working on various aspects of autonomous space missions, and in this event today, Paolo is going to present the company’s solution for improving Earth observation data, specifically focusing on clouds. He’ll share information about what the system does, how it works, and the benefits it can bring to Earth Observation missions, which, as I’m sure the people in the audience know, is a critical challenge in a very important part of the industry today.

So just a couple of housekeeping things: we’ll have the chat function open if you would like to use the chat to speak to each other. We will also enable you to put your questions to Paolo; we will run these at the end, and I will chair that.

But please, if you can, use the Q and A function on Zoom, which is on the toolbar, which could be at the top, the bottom, or the side of your screen, wherever you have it. Put those questions there and we’ll run through them at the end.

We’ll get through as many questions as we can in the time available, while respecting everybody’s schedules, so if anything is left unanswered, please do feel free to follow up with us or with Paolo after the session. But it’s not me that you’re here to listen to, so without further ado, I’d like to hand over to Paolo and say welcome to the session.

Thank you very much Hywel for the introduction and welcome everybody this afternoon or morning or evening depending on where you are. I’m gonna share my screen.

So meanwhile, I’m sorry to hear that the weather has been so bad in the UK, Hywel.

But what we’re gonna talk about today is probably related to this. So thank you again for the introduction.

I am Paolo Madonia and I’m the product manager at Aiko.

And today we are going to talk about one of our products, cloudy_CHARLES, and how this software product, in its onboard application, can improve Earth Observation missions by filtering out data that turns out to be unprofitable.

First, I’d like to say a few words about Aiko. Hywel already said a few things.

We are a software house based in Italy, specifically in Turin, and we focus on delivering artificial intelligence solutions for the aerospace industry.

So far, we have already flown some of our products, and we have been involved in projects and activities in various domains, most notably earth observation, telecommunications, and deep space. We have three products currently on the market, of which cloudy_CHARLES is one, and here we have summarized briefly some of the major partnerships we have established throughout the last five years. Aiko was founded in 2017 and our team currently numbers about 28 people.

Actually, we just hit 30, of which 25% hold a PhD degree. So, I said that we deliver artificial intelligence solutions; we build software products, and we do so not in an unconnected way. We don’t build tools that are separated from each other; we build with the idea of an integrated ecosystem. These pieces of software can work alone, but they work better if they collaborate with each other, if they are deployed together. Specifically, we have three lines of activity. The first one concerns smarter satellite operations, most notably onboard software, which could be onboard artificial intelligence or onboard data processing. We also have tools for the enhancement of human operators’ activities on the ground.

And finally, we are developing solutions to make the operations of constellations, most notably mega-constellations, more scalable. Today, we are going to focus on the first of these three lines of activity, smarter satellite operations, and more specifically onboard data processing.

So what cloudy_CHARLES is, is a tool that is able to process and analyze data live on board the spacecraft. At this point, you have probably guessed that we are going to talk about earth observation, and actually, before diving into the specifics of CHARLES, I’d like to touch on a few points about the earth observation sector, and specifically the optical missions used in earth observation.

Thanks to optical satellites orbiting earth, we are nowadays able to get pictures like this one pretty easily. They’re quite common; it’s quite easy to download them too. We got this one from ESA; it was taken by Sentinel-2, and it shows an area close to our headquarters in Turin. You can spot Milan at the bottom of the image, and you can see Lake Maggiore here in the north. A few years ago, pictures of this kind were almost incredible to see around, and now we are used to this kind of data. They’re really common, and they are used in so many fields: from agriculture to infrastructure, logistics, maritime applications, aviation, and of course defense; there’s a large number of applications for optical imagery. And we get some beautiful views of our planet. Here we see Rome, for example: here’s the city, and there you can spot the airport. I know I’m probably biased, showing you just pictures of Italy, but I wanted to show you what beautiful data we can get nowadays with these assets in orbit. The level of detail that we get from optical satellites with different sensors, especially with multispectral or hyperspectral sensors capable of probing different wavelengths, is incredible.

However, by showing you this picture, I’m not telling the full story here. What we might get, in fact is something like this.

So the previous two pictures were pretty clean, but as Hywel was saying earlier on, in the UK it has been raining forever now, and they’ve been covered by clouds most likely. That’s what we could get if we were to randomly take a picture from the sky down at our planet: the earth is actually covered by clouds. In this case, we are seeing a volcano in Southeast Asia. If we were interested in looking at the volcano, we would be fine, I would say, with this picture. But if we were interested in looking south of this mountain, we would not be able to, because with this acquisition our sight is hampered by the many clouds in between. So yeah, our planet is covered in clouds; about 65% of its surface is covered on average. This is a picture from the US Naval Observatory, and this poses a problem for optical observation, because due to clouds we have to delete, to discard, 30-40% of the data that we send down to earth. To better understand why this is really a problem, I’d like to take one or two minutes to analyze the flow that is usually followed from acquisition to the use of the imagery data product. So let’s consider a generic earth observation optical mission.

We start, as I said, with the acquisition. Currently we are located on the spacecraft, at the space segment level. After the images are acquired, once the satellite is in sight of a ground station, the data is sent down to earth. Now, as you can imagine, in this data packet we can have some frames that are clean, with a clear sky, and some others featuring clouds, partially or completely, with frames totally obscured by clouds. So in order to sell or distribute this data, in the case, for example, of the Sentinel missions that are open to free access, we need to analyze this data and make sure that we are not going to distribute useless information. So once we get the images down to earth, we analyze them, we keep just the good ones, and then we are able to distribute them.

Now, you probably already spotted the issue: what we are doing is filtering down on earth. This would be fine if the downlink didn’t have a cost, but that’s not the case. Downlink is actually quite expensive, so having to download this data, spend money to get it down to earth, and then discard it right away is a serious waste issue.

Again, I remind you that we discard 30-40% of the data that we send down to earth.

So in terms of money, by looking at the data generated by optical satellites, and at how many optical satellites are going to be orbiting the earth over the next few years, we can get a glimpse of how much money is lost in this wasteful downlink activity. Over the last few years, in 2021, the estimates span from €30 million up to €70 million, and you can see that with the rising number of optical constellations in orbit, this number is set to rise sharply over the next few years. So I believe that at this stage the problem that we have to face is clear to everybody and we are all on the same page.
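To make the arithmetic behind figures like these concrete, here is a rough back-of-envelope sketch in Python; every number in it (daily data volume, discard rate, per-GB downlink cost) is an illustrative assumption, not a figure from AIKO or the talk.

```python
# Back-of-envelope estimate of money wasted downlinking cloudy frames.
# Every number below is an assumption for illustration only.
daily_data_gb = 500.0        # raw imagery a constellation downlinks per day
discard_rate = 0.35          # 30-40% of frames discarded after downlink
cost_per_gb_eur = 1.0        # assumed blended downlink cost per GB

wasted_per_year_eur = daily_data_gb * discard_rate * cost_per_gb_eur * 365
print(f"~EUR {wasted_per_year_eur:,.0f} per year wasted on unusable frames")
```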

So I believe that’s enough talking about the problem, and it’s time to talk about the solution. At Aiko, we developed cloudy_CHARLES specifically to tackle this issue in the remote sensing market. cloudy_CHARLES is onboard software, as I mentioned already, and the technology behind it is deep learning. It has been conceived specifically for owners of earth observation satellites. Deep learning is a technology that allows us to train machines to do specific tasks, and what cloudy_CHARLES is trained to do is identify clouds in optical imagery data. It takes as input the frame that has been acquired by the spacecraft’s optical system, processes it, and then outputs a cloud mask.

Here you see cloudy_CHARLES in action on a frame, and here at the bottom you can see the cloud mask. Thanks to this cloud mask, we are able to calculate the cloud coverage percentage in that frame, and depending on that number, the spacecraft is then able to keep the frame or delete it. This is actually a decision that is up to the end customer: they choose the threshold value they are fine with.
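To make the keep-or-discard logic concrete, here is a minimal sketch of what such a decision could look like, assuming the model returns a per-pixel cloud-probability mask; the function name and threshold are hypothetical, not AIKO’s actual API.

```python
# A minimal sketch of the onboard keep/discard decision, assuming the
# model returns a per-pixel cloud probability mask; names are hypothetical.
import numpy as np

def keep_frame(cloud_mask: np.ndarray, threshold_pct: float = 50.0) -> bool:
    """Return True if the frame's cloud coverage is below the customer threshold."""
    # Binarize the mask (pixel is "cloud" if probability > 0.5), then
    # compute the fraction of cloudy pixels as a percentage.
    coverage_pct = 100.0 * (cloud_mask > 0.5).mean()
    return coverage_pct < threshold_pct

# Example: a random mask stands in for real model output.
mask = np.random.rand(512, 512)
print(keep_frame(mask, threshold_pct=50.0))
```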

So by using cloudy_CHARLES, we are able to cut downlink costs, because we are no longer sending down useless, unprofitable data, and on the other hand, we are also able to provide actionable insights for onboard artificial intelligence agents, for example. This is an extra feature and I’m not gonna dwell too much on it during this talk. But if you recall what I said at the beginning, at Aiko we believe in providing solutions that are integrated: we could use the information extracted by cloudy_CHARLES about the cloud coverage over an area, for example, to replan the mission on the go and avoid taking further acquisitions of that area for as long as it’s cloudy.

One last point on this brief overview of CHARLES: I said it’s been conceived for owners or operators of earth observation satellites, but it can also be deployed at the payload level, so it can be a differentiator for payload providers, with the optical system, the sensor, already able to decide whether to keep an image or discard it right away after the acquisition.

So I want to go back for a sec to what we saw earlier. That was the flow that I presented earlier on: we have the acquisition, we store all of this data, and we send it down to earth regardless of the quality of the data, because we don’t know it yet. Then, after we analyze the data, we are able to keep only the good frames and distribute them. With cloudy_CHARLES on board, what we can do is analyze data right after the acquisition, so that when we get to the downlink phase, what we get down to earth is only the good data. This, of course, gives the first advantage that I mentioned: it allows us to cut downlink costs. But a second advantage that I didn’t mention so far is that it also shortens the time to get access to the data, not just because we are cutting the analysis phase on the ground, but also because, since we are downloading less data to earth, we are getting access to it more quickly; fewer passes over the ground stations are needed to get our hands on the data. So these are the two key improvements provided by cloudy_CHARLES.
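As a rough illustration of that second advantage, the sketch below estimates the ground-station passes needed with and without onboard filtering; the downlink rate, pass duration, and daily data volume are all assumed values, not mission figures.

```python
# A back-of-envelope sketch (all numbers assumed) of how onboard filtering
# reduces the ground-station passes needed to retrieve a day's acquisitions.
import math

daily_data_gb = 100.0          # assumed raw imagery generated per day
downlink_rate_mbps = 300.0     # assumed X-band downlink rate
pass_duration_s = 8 * 60       # assumed usable contact time per pass

gb_per_pass = downlink_rate_mbps * pass_duration_s / 8 / 1000  # Mb -> GB

def passes_needed(data_gb: float) -> int:
    return math.ceil(data_gb / gb_per_pass)

print("without filtering:", passes_needed(daily_data_gb))            # all frames
print("with ~35% discarded onboard:", passes_needed(0.65 * daily_data_gb))
```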

And this is how, let’s say, the scenario of an earth observation mission changes when cloudy_CHARLES is used.

So concerning the features of this product: the first one is accuracy. With cloudy_CHARLES we are getting F1 score values above 90%; I’m gonna touch on that in a few moments when we get to the live demo of CHARLES. It is 100% software, as I said, and this has the first effect of being deployable on a variety of platforms. We designed it to be compatible with several commercial off-the-shelf platforms, and I’m gonna share the hardware compatibility specifications with you in a few slides, in a few minutes.

Then, by using a deep learning based approach, we are able to deploy cloudy_CHARLES on different systems; it is payload agnostic. We can retrain the model to meet the specifications of a given camera, of a given optical chain. So we provide flexibility also towards the sensor and the instrumentation used to acquire the data.
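As an illustration of what retraining for a specific payload can involve, here is a minimal, self-contained sketch using TensorFlow/Keras (the framework named later in the talk); the tiny network, band count, and random data are hypothetical stand-ins, not AIKO’s actual model.

```python
# A minimal sketch of payload-specific retraining, assuming TensorFlow/Keras.
# The toy model, shapes, and random data are hypothetical stand-ins.
import numpy as np
import tensorflow as tf

# Toy segmentation network: multispectral frame in, per-pixel cloud mask out.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 4)),            # 4 assumed bands
    tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(1, 1, activation="sigmoid"),     # cloud probability
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Stand-in for the customer's training data: frames plus ground-truth masks.
frames = np.random.rand(8, 128, 128, 4).astype("float32")
masks = (np.random.rand(8, 128, 128, 1) > 0.5).astype("float32")

# Fine-tune on the new sensor's imagery.
model.fit(frames, masks, epochs=2, batch_size=4, verbose=0)
```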

Lastly, it is pretty lightweight, on the order of a few megabytes; we’re gonna see that in a sec. And it’s upgradable: not only can we upgrade and improve the model once it’s in orbit, we can also deploy it for the first time on satellites that are already in orbit.

So if you already have a satellite in orbit and you’re thinking it’s a pity you’re only discovering this now, no worries: we can also deploy it if your satellite is already there.

Just a few words on the flight heritage. The development of cloudy_CHARLES started basically three years ago, and back then it was called Clarity; that was the prototype version of cloudy_CHARLES. When it was still called Clarity, we had our first flight, and back then we were among the first to run deep learning data processing models in orbit. Building on the positive results of that first flight, we continued the development throughout the years until 2022, until last year, and this culminated with the first release of cloudy_CHARLES a few weeks ago.

Now we are currently at the stage where we are officially launching cloudy_CHARLES on the market, with the first flight scheduled to occur in the first quarter, so in a few weeks you’re gonna be hearing more about this, and with an early access program that is going to start next month. With this program, we are going to provide discounted access to the product, and we are also going to provide extra support in training the model and in deploying the model on your platform. So if you’re interested in knowing more about cloudy_CHARLES and how this tool can improve your mission or save you costs, the early access program is probably the best time to approach this new technology.

So at this stage, I’ve probably shown you too many slides, so I’m gonna pause the presentation for a sec and switch over to a very brief demo of cloudy_CHARLES. So, what you’re going to see here, in this window.

I’m sorry. Okay, I’ve been told that there’s an issue with the screen sharing, so I’m gonna quickly stop screen sharing and share that again. Okay, waiting for confirmation. Okay, we have it.

So, as I was saying, in this window you’re gonna see what’s behind cloudy_CHARLES, let’s say, and we’re gonna test it with the different boards that we have available here at Aiko.

So, enough talking; I’m just gonna take an image and explain live with examples. What you see here on the left is the image we are going to run cloudy_CHARLES on, and on the right we have the ground truth mask, the so-called ground truth. This is the known-for-sure position of the clouds in the frame, which we usually don’t have when we are processing data live in space on the platform. But we use the ground truth for training the model and also for validating the model after we train it.

So in this case, this is part of a dataset that we use for training and validating CHARLES; we have the ground truth available, and the ground truth allows us to probe and calculate the performance of our tool. So I’m going to start running CHARLES on the first of our boards. Here we have the Google Coral, and I’m gonna start from the bottom here. In this interface, from the bottom, we get the FPS, the frames per second that CHARLES is able to process. The inference time was about 40 milliseconds, which means that at this speed it would be able to analyze about 20 to 25 images per second. The cloud coverage computed here is about 17%, which is usually deemed to be quite good; as far as our customers have told us, a good threshold is usually between 50 and 60%, but that again depends strongly on the customer and the application. Somebody could be fine with 70%, and somebody else only with lower values. As you can see here, the F1 score is just a bit lower than 90%, but this can depend on the platform used, because the model deployed on different platforms is optimized in different ways.
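For readers unfamiliar with the metric, this is roughly how an F1 score can be computed from a predicted cloud mask and a ground-truth mask; the synthetic masks below are stand-ins for real data.

```python
# A minimal sketch of computing an F1 score from a predicted cloud mask
# and a ground-truth mask (binary arrays); the data here is synthetic.
import numpy as np

def f1_score(pred: np.ndarray, truth: np.ndarray) -> float:
    tp = np.logical_and(pred, truth).sum()      # cloud pixels found
    fp = np.logical_and(pred, ~truth).sum()     # clear pixels flagged as cloud
    fn = np.logical_and(~pred, truth).sum()     # cloud pixels missed
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

truth = np.random.rand(512, 512) > 0.6          # synthetic ground truth
pred = truth.copy()
pred[:40] = ~pred[:40]                          # corrupt a band to simulate errors
print(f"F1: {f1_score(pred, truth):.3f}")
```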

Here we get a slightly lower F1 score, and we get a, well, not slightly, a lower speed of inference with the Intel Myriad, which is not a surprise because it’s less performant than the Google Coral. We can go over to the Jetson Nano: we get an FPS value which is halfway between the two previous boards, and we get an F1 score of about 86%, which again is a bit worse than the Coral.

We can run this test again with a different image, for example; this one has some more clouds in it. So here, for example, we get an F1 score of almost 92%. Again, the Coral processes the image in about 40 milliseconds, and the Myriad performs a bit better here. So we always have to take average values, as I mentioned earlier: the average performance we are getting with CHARLES so far, on different datasets, has been above a 90% F1 score.

So now that I’ve shown you some of the performance of CHARLES, I’d like to show it as if it were in action. I’m gonna select this full dataset, and I’m gonna select the cloud coverage threshold; for the sake of this run, I’m gonna put it at 50%. By setting this parameter, we are going to get an indication of how much data would be discarded on board, and so how much money, in percentage, we could save in downlink expenses.
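The batch experiment that follows can be pictured as a loop like the sketch below, where the per-frame coverages are random stand-ins for real model outputs and the threshold is swept the way it is in the demo.

```python
# A minimal sketch of the batch experiment shown in the demo: run the
# coverage check over a whole dataset and report the discarded fraction.
# The random "coverages" stand in for real per-frame model outputs.
import numpy as np

rng = np.random.default_rng(0)
coverages = rng.uniform(0, 100, size=200)   # assumed per-frame cloud coverage, %

def discarded_fraction(coverages, threshold_pct):
    return float((coverages > threshold_pct).mean())

for threshold in (50, 60):
    frac = discarded_fraction(coverages, threshold)
    print(f"threshold {threshold}%: {frac:.0%} of frames discarded before downlink")
```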

So I’m gonna start this experiment. What you see here is… I don’t know, we have a problem here, probably. Yeah, so that’s the beauty of the live session; we tried this up until five minutes ago. Okay, our IT team has been telling me just to restart it, so I’m gonna select this again. Here we go. I’m gonna put 50 again. I’m waiting for the go from my team. Okay, I believe I have to refresh again. Yes, because I refreshed before the boards were available. So, okay, strip mode.

So, as I was saying before we had this brief issue, what you’re going to see here is a series of acquisitions, as if the satellite was hovering over an area and continuously acquiring images. You’re going to see, from time to time, some of these squares turning orange, and that’s going to tell you when the cloud coverage threshold is exceeded. This is happening here, and now it’s happening also for the Jetson Nano. I haven’t said that the three panels refer to the three boards that we are currently using. The Coral and the Nano have already finished, they were the fastest ones, and the Myriad is currently still running. So while the Myriad finishes its acquisitions, let’s have a look at the numbers down here.

So the data discarded according to the estimations made by the Coral was about 32%, whereas with the inference performed by the Nano it was about 34%, so a slightly higher value. And now the Myriad has finished, and also in this case we get 34%.

If we wanted to use a different coverage threshold, we can of course do that; we would get different values, depending on how big the difference is. So I’m just going to go with around 60% here. Running the model again, you can see that in this case the Coral is running pretty steadily at about 23 FPS. The Nano is a bit slower, but you are going to see the two of them finishing basically at the same time. That’s not an error, that’s fine, because on one hand the Coral is faster in performing the inference, so in understanding how many clouds there are in the frame, but the CPU that is paired with the hardware accelerator performing the inference is less powerful in the case of the Coral, so we have a slight bottleneck in the processing of this information.
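A toy calculation makes the point about the bottleneck: in a simple two-stage pipeline, end-to-end time is set by the slower stage, so a faster accelerator behind a weaker CPU can finish a batch no sooner. All timings below are assumed.

```python
# A toy illustration (all timings assumed) of why a faster accelerator can
# still finish a batch at the same time: end-to-end throughput is set by the
# slowest stage, inference or CPU-side pre/post-processing.
def batch_time(frames: int, inference_ms: float, cpu_ms: float) -> float:
    # In a simple two-stage pipeline the slower stage dominates.
    return frames * max(inference_ms, cpu_ms) / 1000.0

# Hypothetical: the Coral infers faster but sits behind a weaker CPU.
print("Coral:", batch_time(200, inference_ms=40, cpu_ms=48), "s")
print("Nano: ", batch_time(200, inference_ms=48, cpu_ms=40), "s")
# Both print 9.6 s: the bottleneck stage, not the accelerator alone, decides.
```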

So as you can see here, I used a higher threshold, in this case 61%, and of course we are getting a lower estimate for the data that would have been discarded on board. So yeah, this was just a very quick demo to show you the principles behind cloudy_CHARLES and how they actually convert into a saving in the amount of data that is downlinked, and so in the costs related to this activity. And these are not the only three platforms we can run cloudy_CHARLES on.

So at this stage, I’m going to stop sharing my screen again and switch back over to the presentation, so that we can talk a bit more about the… can you see the screen?

Yeah.

Okay, excellent.

So, I was saying that we were going to talk a bit about the specifications needed in order to have cloudy_CHARLES running. As I said, it’s COTS compatible. We have tested it so far with ARM architectures, both 32 and 64 bit, and also with x86 architectures. In terms of the storage needed to host the model, by using the float32 model we get a size of about 30 MB; that’s a conservative estimate, it’s slightly less than that. CHARLES runs on Linux-based operating systems, and when it comes to memory, that’s the value that is, let’s say, most variable, because it depends on the libraries that are available: cloudy_CHARLES would need from 20 up to 320 megabytes of RAM available. Concerning the AI inference, so the actual operation of detecting the clouds, cloudy_CHARLES needs a dedicated accelerator, and it would need TensorFlow as the training framework.
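For reference, the requirements just listed can be summarized in a small sanity-check snippet; the values come from the talk, while the dictionary layout and the checking function itself are purely illustrative.

```python
# A small sketch capturing the requirements just listed (values taken
# from the talk; the checking function itself is hypothetical).
REQUIREMENTS = {
    "cpu_architectures": ("arm32", "arm64", "x86"),
    "storage_mb": 30,            # float32 model, conservative estimate
    "os": "linux",
    "ram_mb": (20, 320),         # depends on available libraries
    "training_framework": "tensorflow",
}

def platform_ok(arch: str, free_storage_mb: int, os: str, free_ram_mb: int) -> bool:
    lo, hi = REQUIREMENTS["ram_mb"]
    return (arch in REQUIREMENTS["cpu_architectures"]
            and free_storage_mb >= REQUIREMENTS["storage_mb"]
            and os == REQUIREMENTS["os"]
            and free_ram_mb >= lo)

print(platform_ok("arm64", free_storage_mb=64, os="linux", free_ram_mb=256))  # True
```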

Also, the deep learning operators that are needed are the most common ones, so this is not, let’s say, a specific requirement, but the list is quite long, so we can provide it separately. And of course, if you’re interested in using CHARLES, we will go through this list of operators that are needed.

So when it comes to compatibility, we have tested it on the following hardware accelerators and on the following CPUs, but this list is currently evolving. So if the platform that you intend to use is not here, do not worry, because either we have tested it already and it’s currently undergoing final benchmarking, or, if we haven’t tested it on your board yet, we can also consider exploring the compatibility of CHARLES with your hardware of choice.

So, concerning the activities needed from when you start to be interested in cloudy_CHARLES until you actually use it in orbit: the first thing to do is to get some data from your system, from your platform, to train cloudy_CHARLES. As I said, the product per se is platform agnostic, so we can fine-tune it to your platform, but to do so we would need to get some training data from you. Our engineers will take care of this fine-tuning. And on the other hand, by knowing the computational platform, so the hardware platform that we are going to run the software on, our software engineers will also be able to tailor the model and the product for the specific computational architecture. So these are the preliminary activities.

When it comes to the deployment, we are going to run some tests either on the ground or in flight, right after the deployment; of course, we are going to have a first commissioning period. And I’d like to note again that the software can be deployed either before or after the launch has taken place.

Last phase: after the software has been commissioned, the nominal use phase starts, but of course we offer software support to the users, and also periodic checks on performance, so that the fine-tuning can continue and the best performance is guaranteed; to do so, we provide retraining of the model as needed.

So at this stage, I’ve lost track of the time, so, Hywel, sorry if I was too wordy on a few points, but we’ve come to the end of this live demo today.

So these are the five take away points that I’d like to highlight and I’d like you to bring back with you after today’s event.

Some 30 to 40% of the data in optical missions is actually discarded after downlink, and this problem is going to cause quite a large waste of money over the next few years, as it already has. For this, we at Aiko developed cloudy_CHARLES, which allows you to filter out the data that turns out to be useless, and to do so directly on board, which is quite different from what is done today, as we saw.

I’d like to remark again that CHARLES is completely a software product, so it’s not tied to any hardware platform, and it is by design compatible with commercial off-the-shelf platforms. And just to close, I remind you that we are going to start an early access program in February; so if you’re interested, again, I invite you to reach out to us, because that’s the best time to approach this new technology. And if you have any questions that pop up after this event, feel free to reach out to us at Aiko through this email, or through Narayan and Hywel; you can get to me and the CHARLES team quite easily.

So thank you very much for being here today and for this opportunity to Hywel, Narayan and the SatSearch team.

Fantastic. Thank you very much Paolo, that was great. Really interesting to see the system in operation there. And I think everybody will have gained a lot from that.

So yeah, I wanted to thank you very much.

We do just have a couple of minutes.

Obviously, anybody who needs to go, please do. We respect your schedules, of course.

But yeah, if anybody can stick around and I just do have a few questions to put to Paolo if that’s okay.

So firstly, someone in the audience asked whether the product works on L0 raw data, or does it need, for example, reflectance data in order to operate?

So, we tested the product on a variety of data types and data products. It is meant to work on L0 and L1 data, usually. But again, the ultimate check needs to be done with the specific customer and platform, to see how the model performs before the fine-tuning and after the fine-tuning. So in this sense, I invite the anonymous attendee who asked this to contact me or the team, and we can investigate their needs.

Okay, great.

It relates a little bit to a question that we had, actually. We’re tracking approximately 150-something different OBC systems in the industry today. Given that there is such a wide variety of options for people, how do you work to ensure that cloudy_CHARLES can work with those different pieces of hardware?

So the goal for us is not to work with everything, I have to say, because, let’s say, if you don’t focus on something, you’re not going to do that thing well.

So far we have targeted, let’s say, the most used platforms, or at least the most promising platforms, and we intend also to hear the needs of the customers.

So at this stage, we are still keeping an eye open to new opportunities, but we’re not running on every platform on the market, because that’s not the point. We intend to be compatible, to provide flexibility, but we are going to evaluate this flexibility on a, let’s say, customer-need basis. Sorry for interrupting you, but just to make this point clearer: there are the most used platforms, which we chose to run CHARLES on first and to test it on. But if there’s maybe some niche device that we didn’t consider, that’s not in our list, we are probably not going to reach out to that device on our own. If somebody comes to us and asks whether CHARLES can work on it, though, we can make it work, we can tailor it for that device. That’s, of course, not 100% sure until we test it, but that’s the plan.

And then we’ve had a question from Marsha Walker about the early access program just asking for more information. We did discuss before the event that we can send out some more information to everybody who attended on that. But is there anything else you wanted to share about the program Paolo at this point?

So we are going to release something, so you will be getting some news in that sense. Concerning the requirements: the requirement is to have some kind of data product that you want to run CHARLES on. That’s the only requirement.

So if you’re interested in doing some testing, let’s say, five years from now, to see how CHARLES works with your system, this is probably not the best occasion, because the early access program is meant to start over the next few weeks and should last a few months. But if you have something to run CHARLES on in a shorter time scale, this is probably the best time to, again, pioneer this technology, I’d say. I don’t know if this answers the question from the Q and A, but as I said, we are going to be making an announcement over the next few weeks, so if you look at our channels or our website, you will be hearing something about this.

Excellent.

And yeah, as mentioned, we will send a follow-up message to all the attendees and registrants of this event, which will include some more information, or at least how you can contact Paolo.

That’s great.

I think, you know, we are pretty much at time, so a good final question, perhaps, to finish on is another one from Marsha.

Why the name cloudy_CHARLES? Obviously “cloudy”, but why “Charles”?

Yeah. So, I can imagine that the UK audience may see some connection with recent developments in the monarchy, but it’s totally unrelated; actually, cloudy_CHARLES had this name before King Charles took the throne. It’s called cloudy_CHARLES because we like to think of our products as people.

Aiko works with artificial intelligence, and we like to see these tools as being as intelligent as people can be. So all of our products have a human name.

So we have cloudy_CHARLES, we have speedy_SKYLAR, and we have orbital_OLIVER, which are already on the market. We have a few upcoming products that will be released, but I’m not going to spoil their names today.

So that’s the funny, let’s say background about this choice.

Excellent.

Well, I think that’s a great place to finish up. As mentioned, we will be in touch further with everybody. And, for example, Marcy, in the Q and A you’ve shared some of your quantities, etcetera; we will definitely follow up with you. And yeah, thank you very much to Paolo for sharing all those insights and giving us the live demo of cloudy_CHARLES in action there.

And I wanted to say thank you as well to everybody who was able to attend today. We’re very grateful that people were willing to attend this sort of event early in January, and hopefully you got something out of it. I know we certainly did; we learned a lot about what can be done today in terms of onboard data processing for earth observation missions, in this really important area of the industry.

So thank you again, Paolo.

Thank you very much for organizing this and thanks to everybody who came.

Fantastic. Thank you, everybody. Have a great rest of the day.

Bye.

data processing
demo
earth observation
on board computers
space software
