Spotlight: the power of visual feedback in space – with Infinity Avionics

In this article we discuss the applications, opportunities, and risks of using smart visual feedback systems in space. You can also find examples of missions that have used the technology and instances that could have benefited from it, had it been available.

This article is produced in collaboration with Australian space sensor solution provider Infinity Avionics (IA), a paying participant in the satsearch membership program.


Introduction to visual feedback in space

Footage of the International Space Station (ISS), satellites, landers, and rovers, taken in orbit or on other celestial bodies, is visual proof of the billions of dollars spent in the space industry. But acquiring such data about in-space processes or assets has historically been a complex and expensive task.

When you see a video, recorded and sent back by a rover, of its own landing on Martian soil, the impact is compelling. Likewise, footage of an astronaut on a spacewalk outside the ISS, the huge facility orbiting Earth, can be inspirational to every person on the planet. Such footage can also be very useful at the same time.

For example, below is a video posted by NASA, recorded by astronaut Peggy Whitson during a spacewalk on the ISS on Thursday, March 30, 2017. The video has around 4 million views and 40,000 likes on YouTube, making it a great communications asset for NASA.

But if you watch carefully, at around the six-minute mark you will see one of the shields slip away from the astronaut during installation.

This footage is also a critical visual aid for understanding and analyzing what went wrong during the installation and why the shield slipped away. Had the event not been recorded, or had it involved an autonomous system rather than a person, the exact cause of the issue might never have been known.

A single piece of footage, or even a series of still images, taken in space can be powerful and have multiple uses. The growing integration of, and reliance on, advanced space services and subsystems, combined with their miniaturization in today’s missions, makes it possible to obtain such visual data of in-orbit assets. 

There are many risks and opportunities in space that can be addressed by such visual feedback. In the next section we take a close look at some of these.


The areas of application

Currently, there are only limited methods available for visually monitoring assets in space. And such assets are subjected to varying radiation and thermal loads, physical pressures, and structural changes throughout launch, deployment, and operational phases.

These technical challenges demand systems with a variety of in-built health monitoring capabilities.

Therefore, visually assessing technologies and events such as deployable solar panels, radiators, and cameras; satellite deployers and dispensers; rocket fairing separations; battery jettisons; and other such processes can provide mission operators with a wealth of information.

Below is a screenshot from video footage of Expedition 68, showing NASA's SpaceX Crew-5 launch on October 5, 2022. You can see how the engine nozzle's glow changes as the vehicle's velocity increases. This kind of visualization can help engineers understand how a product behaves in space over time, and how it is affected by the environment, changing mission parameters, and other factors.


Image credit: Expedition 68 – NASA’s SpaceX Crew-5 Flight Day 1 Highlights – Oct. 5, 2022

Such footage and imagery also has a role in marketing. Being able to demonstrate mission success and create engaging content for use online helps space companies tell more sophisticated and compelling stories about their work.

Space applications are fundamentally quite difficult to visualize and appreciate, particularly when smaller assets, such as CubeSats, are in use. The lack of light, the distances involved, the lack of moving parts, and the complexity of both the technologies in use and the ability to downlink information about their operation, all make it hard for non-specialists to fully grasp the scale of a mission.

In engineering and design processes this is addressed with simulation, but this too has its limits, and can be complemented significantly by visual feedback. For example, below are two screenshots from a video streamed by NASA of the Artemis I Close Flyby of the Moon. The first image is from the video recorded in space by an onboard camera and the second is a simulated animation driven by the spacecraft’s telemetry data.


Image credit: Artemis I Close Flyby of the Moon

Although the simulated version shows the primary details, the actual video feedback also shows the mechanical structures and external systems of the Orion spacecraft as they are actually deployed at the time of capture, illuminated by sunlight.

Employing visual imagery along with simulations gives an extra dimension to the amount and quality of information that can be captured for a mission in progress. Whether this is then used for marketing, education and training, troubleshooting, historical record, or other purposes is down to the creativity and strategic objectives of the mission designers.

In the next section, we discuss how visual feedback has been used in post-failure analysis of a mission, as well as potential areas where it could be used in the future.


Reducing risks

As has been shown, visual footage and imagery provide real-time (subject to downlink windows) and reliable preliminary information about the status of a system, which can be used to fast-track time-consuming troubleshooting or post-mission analysis.

Throughout the history of the space industry there have been many incidents in which a satellite or other space asset has stopped working, partially or fully, due to technical error. For smaller satellites in particular, the rate of systems that never return an operable signal following launch (known as being ‘dead on arrival’) is often quoted as around 15-25%, and in many of those cases (potentially up to a third) the reason for the failure is never determined.

In virtually all such cases it is, of course, not possible to retrieve, process, or analyze telemetry or monitoring data following the failure. And even when partial data is available, it can take weeks or months to understand the cause, requiring significant manpower and resources.

In some of these failure analyses, visual feedback could supply preliminary information, possibly even before other telemetry data is analyzed. That preliminary information then provides a starting point for a more focused analysis of telemetry and other system data.

For example, Astra’s two-stage Launch Vehicle 0008 (LV0008) lifted off on February 10, 2022, but suffered a failure that cost two customer satellites. Footage from a camera mounted on the rocket showed the upper stage beginning to tumble; shortly afterwards, the vehicle was lost. The video footage can be found on Astra’s website.


Another example of a situation where video feedback would be highly beneficial is in space debris monitoring around a specific space asset for collision avoidance or other mitigation processes.

In March 2021, the Yunhai 1-02 satellite broke up in orbit, but it was not until a series of intensive analyses had been carried out that the cause was identified as a collision with space debris. Imagine how helpful it would be in such scenarios if the satellite had been equipped with an external camera to monitor its vicinity.

The concept of monitoring the health of a system through visual inspection is not new, and it is being extensively researched, developed, and adopted in the aerospace industry. Combined with advances in computer vision and AI, it has given rise to developing fields such as intelligent Prognostics and Health Management (PHM) and the creation of digital twins of physical assets.

These analysis concepts combine visual feedback and multi-sensor inputs to monitor areas such as the health status of physical assets, the quality control of manufactured products, the wear and tear of materials and components, and process control. The space industry can take a lot of cues from developments in aerospace (and vice versa!) in order to improve satellite status assessment.

But how can you actually put these ideas into practice in a space system? Next, let’s take a look at examples of the actual hardware that would enable you to capture visual feedback, if integrated into your system.


The hardware

In all space applications, performance and functionality come down to technology choices and trade-offs. Where visual imagery and feedback systems are concerned, the priority is to integrate a device (or series of devices) that delivers usable imagery within the mission's size, power, and data constraints.

For example, the Infinity Avionics SelfieCam Engineering Camera is suitable for deployment monitoring, spacecraft orientation identification, and even low-resolution Earth imaging. These cameras have been used by Infinity Avionics’ customers to monitor the deployment of antennas, solar panels, solar sails, and other components. The cameras can produce highly compressed image data to ease downlink bandwidth requirements.

The SelfieCam Engineering Camera has flight heritage dating back to 2018, features Single Event Upset (SEU)-immune MRAM, offers a resolution of 1024 x 768 pixels, and is qualified to NASA GEVS standards. The cameras have also been tested up to 30 krad TID (Total Ionizing Dose) without functional degradation.

The SelfieCam Engineering Camera can capture JPEG-compressed images at a rate of up to 12 frames per second. It can save up to 500 images on board and can support three different lens options with different fields-of-view (FoV). The camera is suitable for spacecraft deployment monitoring, health monitoring, and spacecraft orientation verification. 
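
To make those figures concrete, below is a minimal sketch of how a host onboard computer might buffer a deployment event within the camera's stated limits of 12 frames per second and 500 stored images. This is illustrative Python only, not vendor code; the grab_jpeg() call is a hypothetical stand-in for the actual camera interface.

    # Sketch of buffering a deployment event within the stated limits
    # (<= 12 fps, <= 500 images onboard). grab_jpeg() is a hypothetical
    # stand-in for the real camera interface, which is not public here.
    import collections
    import time

    MAX_FPS = 12       # SelfieCam's stated maximum frame rate
    MAX_STORED = 500   # SelfieCam's stated onboard image capacity

    def record_deployment(camera, duration_s):
        frames = collections.deque(maxlen=MAX_STORED)  # oldest frames drop first
        period = 1.0 / MAX_FPS
        end = time.monotonic() + duration_s
        while time.monotonic() < end:
            frames.append(camera.grab_jpeg())  # compressed frame, light on storage
            time.sleep(period)
        return list(frames)  # up to 500 most-recent frames for later downlink

At the full 12 fps, 500 frames covers roughly 42 seconds of footage, which comfortably spans a typical antenna or solar panel deployment.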

The company’s new hardware products, the Lynx4MP series of cameras, have various FoV optics and provide high-resolution space asset monitoring capability.

Infinity Avionics is currently developing other modular camera products which allow its customers to mix and match between optics, sensors, storage, and processing.

In addition, Infinity Avionics is developing edge processing capabilities to deliver intelligent camera solutions for space asset monitoring. In the next section we discuss existing, and future, edge processing functions, as well as how higher downlink and processing bandwidth are enabling new capabilities.


Smarter imaging and monitoring

In order to make visual feedback technologies genuinely scalable, autonomous, and rapidly responsive, we need systems that do not rely solely on human operators. This is the case for a wide array of current and future applications.

Artificial Intelligence (AI) is an example of a technology that enables onboard cameras to operate intelligently – with new functionality, and more advanced and higher volume data processing. 

For terrestrial purposes, AI is already used in a variety of monitoring camera applications, such as facial recognition in law enforcement imaging, restricting security footage of a facility to a specific area of interest, or scaling up more focused analysis at certain times of the day.

Due to the limited downlink capacities from space assets to ground stations, cameras often need to be accompanied by an intelligent processing system to increase efficiency. 

An AI-based smart system can help autonomously turn the camera on and off when needed, identify patterns in the collected images, identify the usefulness of collected images, and perform more advanced processing, even combining images with other datasets. 

For example, in space debris monitoring, a number of images could be captured during every orbit of the Earth and the processing system could identify patterns that might indicate debris. It would then downlink this data only when it is needed to facilitate a collision warning or to command an avoidance maneuver.
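
As a rough illustration of that "capture, classify, downlink only on detection" logic, here is a short Python sketch. The capture(), detect_debris_candidates(), and queue_for_downlink() calls are hypothetical placeholders for mission-specific camera, model, and command-and-data-handling interfaces, and the threshold and cadence values are assumptions chosen for the example.

    # Onboard filtering loop: keep a frame only when the classifier flags it.
    DETECTION_THRESHOLD = 0.8   # assumed confidence above which a frame is kept
    FRAMES_PER_ORBIT = 20       # assumed imaging cadence per orbit

    def monitor_one_orbit(camera, model, downlink_queue):
        for _ in range(FRAMES_PER_ORBIT):
            frame = camera.capture()
            # Hypothetical classifier returning a debris-detection confidence.
            confidence = model.detect_debris_candidates(frame)
            if confidence >= DETECTION_THRESHOLD:
                # Only flagged frames (plus metadata) are queued for the ground,
                # so routine empty-sky images never consume downlink bandwidth.
                downlink_queue.queue_for_downlink(frame, confidence)
            # Unflagged frames are simply discarded onboard.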

Infinity Avionics is developing on-board storage solutions with customers, and is advising on and developing on-board processing as part of its cameras, so that only data that is useful to the end-user is organized, compressed, and downlinked.

Smart camera solutions with edge processing and AI capability will allow autonomous decision-making in space, which can be highly beneficial for space robotics and in-space manufacturing applications. A real-world example of the value that visual feedback in space missions can bring is discussed in the next section.


Operational images in the M2 CubeSat mission

The M2 CubeSat is the most complex Australian CubeSat mission to date. Launched as a single 12U spacecraft, the M2 performed a controlled separation in orbit into two separate 6U CubeSats, named M2-A and M2-B.

Both the M2-A and M2-B satellites have an Infinity Avionics SelfieCam on a deployable antenna arm. These were used to capture the moment of separation and the resultant video is available here.

The image sequence provided feedback on safe separation, separation velocity, physical condition and configuration of each spacecraft, all of which was invaluable for assessing the success of this unique maneuver.

Following the successful separation the M2-A and M2-B satellites proceeded to perform a number of formation flying maneuvers and communications experiments, as well as an EO imaging demonstration. Find out more about the mission here.

Visual feedback enabled by the SelfieCams, such as the image below captured during the Launch and Early Orbit Phase (LEOP), was used to confirm that:

  • the two satellites were still conjoined,
  • the systems’ antennas had deployed as intended, and
  • the solar panels powering each system had deployed.

In the images captured, the operators were even able to see the remnants of the burnwire used to deploy the solar panel, and to confirm that the main telescope cover was still stowed.

The M2 CubeSat in LEOP (image credit: UNSW Canberra Space).

Such innovative CubeSat missions are driving new approaches and experiments right across the space industry, and the application of visual feedback in this area will lead to new operational models in both commercial services and exploration. In the next section we take a closer look at some examples of the latter domain.


Visual feedback in autonomous deep space missions

Deep space missions are highly autonomous and could employ visual aids for navigation, monitoring, exploration, marketing/PR, and analysis in a number of ways.

The Mars Perseverance Rover is a perfect example of how visual feedback can be crucial in autonomous deep space applications. It carries 23 cameras in total, including engineering cameras used for navigation, risk reduction, and other purposes, alongside its science cameras.

The engineering cameras include the Entry, Descent, and Landing (EDL) cameras, the Lander Vision System Camera, the Hazard Avoidance Cameras (HazCams), the Navigation Cameras (Navcams), and the CacheCam.

Mars Perseverance Rover engineering camera dispersion (image credit: NASA).

The Lander Vision System is an interesting example in which the combination of AI and vision hardware enabled Perseverance to autonomously select the safest touchdown site within its landing area. Here is video footage of the descent and landing of the rover on the Martian surface.
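
NASA's actual Lander Vision System is far more sophisticated, but one of its core ingredients, map-relative localization, can be illustrated with a toy sketch: compare a patch of a descent image against an onboard reference map and take the best-matching position as the location estimate. The Python below uses brute-force normalized cross-correlation purely for illustration; it is not the flight algorithm.

    # Toy map-relative localization (NOT the flight LVS algorithm): slide a
    # descent-image patch across an onboard reference map and report where
    # normalized cross-correlation peaks.
    import numpy as np

    def locate(descent_patch, reference_map):
        ph, pw = descent_patch.shape
        mh, mw = reference_map.shape
        patch = (descent_patch - descent_patch.mean()) / (descent_patch.std() + 1e-9)
        best_score, best_pos = -np.inf, (0, 0)
        for y in range(mh - ph + 1):
            for x in range(mw - pw + 1):
                window = reference_map[y:y+ph, x:x+pw]
                win = (window - window.mean()) / (window.std() + 1e-9)
                score = float((patch * win).mean())  # normalized cross-correlation
                if score > best_score:
                    best_score, best_pos = score, (y, x)
        return best_pos, best_score  # map coordinates of the best match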

In the future, visual feedback will likely play a significant role in in-space manufacturing facilities. Footage and still imagery will enable or enhance various fabrication functions, such as cargo re-supply, the movement of materials to delivery points, and other operations where they are required.

In addition, it is likely that images and video would act as useful evidence for confirming process success for clients and for insuring space assets. With large-scale, complex systems, presenting telemetry data as proof that certain processes have been completed can be a huge undertaking, and visual feedback of the system actually working in space would be a useful alternative or complementary form of evidence for commercial, insurance, and legal compliance purposes.

So, having discussed the challenges and opportunities, how do you get started with designing a system that makes use of such hardware and processing capabilities? In the next section we consider some of the factors that mission designers need to take into account when bringing such capabilities onboard.


Getting started

As with all space systems, mission objectives and purposes need to be clearly defined before designing a smart visual system. The mission purpose then enables the definition of the following parameters:

  • image/video resolution, 
  • SWaP-C,
  • mounting location, 
  • memory storage, 
  • frames per second, 
  • timeframe of recording/capturing, 
  • field-of-view (FoV), 
  • processor power and requirements,
  • bands of operation,
  • downlink capacity,
  • radiation requirements, and
  • thermal insulation needs.

These parameters are important for selecting the right hardware for your application. The mission’s purpose and technical limitations will also determine the form of AI algorithms or processing protocols that need to be trained and used to enhance the cameras’ operations.
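
One quick sanity check that ties several of these parameters together is a downlink budget: the data a camera generates during an event of interest has to fit within the ground station passes available. Below is a minimal worked example in Python; every number is an assumption chosen purely for illustration, not a figure for any specific camera or link.

    # Illustrative downlink budget linking frame rate, image size, and link capacity.
    frame_size_kb = 60        # assumed compressed JPEG size per frame
    fps = 5                   # assumed capture rate during the event of interest
    event_duration_s = 40     # assumed deployment/monitoring window

    data_volume_mb = frame_size_kb * fps * event_duration_s / 1024
    print(f"Captured volume: {data_volume_mb:.1f} MB")   # ~11.7 MB

    downlink_kbps = 256       # assumed usable downlink rate
    pass_duration_s = 480     # assumed usable ground-station pass

    per_pass_mb = downlink_kbps * pass_duration_s / 8 / 1024
    print(f"Per-pass capacity: {per_pass_mb:.1f} MB")    # 15.0 MB

    print(f"Passes needed: {data_volume_mb / per_pass_mb:.2f}")  # ~0.78

If the captured volume exceeded the per-pass capacity, the designer would revisit resolution, compression, frame rate, or the use of onboard selection.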

Infinity Avionics offers different camera solutions to meet different user requirements. The SelfieCam provides low-resolution, compressed images that can be downlinked even over weaker RF links, and is ideally suited to spacecraft commissioning activities.

High-resolution cameras, such as the Lynx4MP, generate more data than SelfieCams, but also provide greater detail and accuracy. These cameras are well suited to space robotics and scientific experiments. Infinity Avionics’ cameras with edge computing enable autonomous decision-making and also reduce the amount of downlink needed, helping to cope with the higher data volumes.
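
A back-of-envelope pixel count shows why edge computing is paired with the higher-resolution product. The arithmetic below is illustrative only; real output sizes also depend on compression and bit depth, and the 4 MP figure is simply taken from the product's name.

    # Per-frame pixel comparison, illustrative numbers only.
    selfiecam_px = 1024 * 768     # ~0.8 MP (stated SelfieCam resolution)
    lynx_px = 4_000_000           # ~4 MP (Lynx4MP-class sensor, assumed)

    print(f"Ratio: {lynx_px / selfiecam_px:.1f}x")   # ~5.1x
    # Without onboard selection or extra compression, the same imaging plan
    # would need roughly five times the downlink, hence the emphasis on
    # edge processing for the high-resolution cameras.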

Infinity Avionics also offers unique and tailored solutions for customers and encourages early collaboration on proposals and projects to unlock the full potential of onboard imaging capabilities.


Conclusion

Ultimately, Infinity Avionics aims to be at the forefront of ensuring that engineers and scientists have visual data at their fingertips to achieve mission success.

In this era of exploration, where our reliance on technology is growing, space missions have a range of limitations that demand effective remote monitoring. This is particularly the case for commercial space stations, advanced in-orbit processes, and human spaceflight (including tourism). 

But such capabilities are available and valuable on missions of all sizes. Intelligent visual systems and video feedback can open up new revenue opportunities and aid mission assurance in a wide variety of applications.

If a picture is worth a thousand words, a video in space might just be worth a million!

If you are developing an autonomous spacecraft or if you are interested in a visual system for a specific application, please feel free to reach out to Infinity Avionics for more information on their visual feedback solutions.
