Information Systems

A year of machine learning demonstrations and validations for aerospace applications

The Intelligent Systems Technical Committee works to advance the application of computational problem-solving technologies and methods to aerospace systems.

This year, there were several significant advances to enable machine learning on air and space systems. Early in the year, engineers with the Aerospace Corp. demonstrated a pose estimation system aboard ExoRomper, a reprogrammable machine vision testbed hosted on Aerospace’s Slingshot-1 satellite in low-Earth orbit. ExoRomper has a visible camera pointed at a maneuverable miniature spacecraft. The payload captured upwards of 1,000 images of the spacecraft in a variety of poses and processed a subset of the images onboard to estimate the spacecraft’s pose, using a combination of machine learning and Perspective-n-Point algorithms. The remaining images were used to train new machine learning models to be uplinked, with the goals of improving pose estimation accuracy and raising the technology readiness level of the approach to support satellite proximity operations.
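The hybrid approach described above pairs a learned front end with a classical geometric solver: a neural network locates known features of the target spacecraft in the image, and a Perspective-n-Point solver recovers the relative pose from those 2D-3D correspondences. ExoRomper's actual models and solver are not public; the sketch below is an assumption for illustration only, using a plain Direct Linear Transform on noise-free, calibrated (normalized) image coordinates rather than any flight code.

```python
import numpy as np

def dlt_pnp(object_pts, image_pts):
    """Recover camera pose from 2D-3D correspondences with the Direct
    Linear Transform, assuming normalized (calibrated) image coordinates.
    Returns R, t such that each image point x ~ [R | t] X.
    Needs at least 6 non-coplanar points; a flight system would instead
    use a robust PnP solver (e.g., RANSAC-wrapped) on noisy detections."""
    A = []
    for (X, Y, Z), (u, v) in zip(object_pts, image_pts):
        # Each correspondence contributes two linear constraints on the
        # 12 entries of the 3x4 projection matrix P.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The right singular vector with the smallest singular value
    # minimizes ||A p|| subject to ||p|| = 1.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)
    # Fix overall scale and sign so the left 3x3 block has determinant +1.
    P /= np.cbrt(np.linalg.det(P[:, :3]))
    # Project the left block onto the nearest rotation matrix.
    U, _, Vt2 = np.linalg.svd(P[:, :3])
    return U @ Vt2, P[:, 3]

if __name__ == "__main__":
    # Synthetic check: project the corners of a unit cube (a stand-in
    # for keypoints an ML detector would output) with a known pose,
    # then recover that pose.
    cz, sz = np.cos(0.3), np.sin(0.3)
    cx, sx = np.cos(0.2), np.sin(0.2)
    R_true = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]) @ \
             np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    t_true = np.array([0.1, -0.2, 4.0])
    pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1],
                    [1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]], float)
    img = []
    for X in pts:
        Xc = R_true @ X + t_true
        img.append([Xc[0] / Xc[2], Xc[1] / Xc[2]])
    R_est, t_est = dlt_pnp(pts, img)
    print(np.allclose(R_est, R_true, atol=1e-6),
          np.allclose(t_est, t_true, atol=1e-6))
```

In an onboard pipeline, the synthetic projections would be replaced by keypoints regressed by the trained network, and the known cube corners by a 3D model of the target spacecraft.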

In February, DARPA announced two machine learning aviation firsts demonstrated by its $72 million Assured Autonomy program, which is developing design-time and run-time technologies to ensure safe operation of neural networks in flight software. Program milestones included the first-ever demonstration of autonomous airfield taxi operations, in which the aircraft followed a “follow me” automobile and another airplane at a commercial airport, and the first use of neural networks to reroute a general-aviation-sized aircraft in the National Airspace System. Both demonstrations used flight software, assured with the program’s technologies, on a Boeing-Cessna Grand Caravan autonomous demonstrator.

In April, NASA’s Jet Propulsion Laboratory in California, Ubotica of Ireland and Hewlett Packard Enterprise of Texas completed their two-year validation of 60 applications on Snapdragon 855 Mobile Hardware Development Kit and Intel Myriad edge processors and on Spaceborne Computer-2 rack computers onboard the International Space Station. These applications included instrument data processing, such as synthetic aperture radar image formation; image and data analysis, including machine learning classifiers; and resource scheduling and dynamic targeting. The validation exercised general-purpose computing as well as graphics processing unit, digital signal processing and neuromorphic (convolutional neural network) hardware to establish benchmarks against traditional flight processors such as the RAD750 and LEON4.

In May, a Stanford University research group developing robotic locomotion for extreme environments conducted a field test in a lava tube in the Mojave Desert in California. The robot, ReachBot, is an ongoing research project in the Autonomous Systems Lab, Biomimetics and Dexterous Manipulation Lab, and Earth and Planetary Surface Processes group at Stanford. ReachBot is designed to explore Martian lava tubes by using extendable booms instead of traditional robot arms or legs to climb vertical, overhanging and obstacle-laden terrain. At the end of each boom is a microspine gripper, a claw-like appendage containing several arrays of needles that catch on asperities of rough surfaces, and a stereo camera for autonomous grasp identification. For the project’s debut field test, a partial ReachBot prototype consisting of an extendable boom, stereo camera and microspine gripper identified and grasped targets in the lava tube.

In September, the Vehicle Systems Manager and Autonomous System Management Architecture for NASA’s Lunar Gateway, a planned moon-orbiting space station, completed a critical design review at NASA’s Johnson Space Center in Texas. The VSM is the vehicle-level integrated control system that will support tactical functions such as fault management, resource optimization and management, mission management, and timeline execution for Gateway. The VSM is expected to enable a significant reduction in ground operations contact hours with Gateway, particularly during extended uncrewed periods, by bringing unprecedented autonomous systems capabilities onboard the crewed spacecraft.

Contributors: Steve A. Chien, Alonzo E. Lopez, Stephanie N. Newdick, Jim Paunicka and Julia Badger