Information Systems

UH-1 Huey flies autonomous resupply mission in test

The Intelligent Systems Technical Committee works to advance the application of computational problem-solving technologies and methods to aerospace systems.

This year yielded many innovations in the field of intelligent systems, including unmanned aerial systems, or UAS. Aurora’s Autonomous Aerial Cargo Utility System, or AACUS, participated in U.S. Marine Corps Integrated Training Exercise 3-18 at the Marine Corps Air Ground Combat Center at Twentynine Palms, California, in May. As part of the Marine Corps Warfighting Laboratory’s Expeditionary Hybrid Logistics experiment, conducted in conjunction with the exercise, an AACUS-enabled UH-1 Huey helicopter completed the first autonomous point-to-point cargo resupply mission. AACUS, developed for the Office of Naval Research in response to a U.S. Marine Corps Cargo UAS Urgent Needs Statement, is a rotary-wing mission system kit that enables intelligent autonomous tactical flight missions into and out of unprepared landing zones. The program had earlier performed cargo and utility missions at its capstone demonstration at Marine Corps Base Quantico, Virginia, in December 2017.

Also during the year, the Vehicle Systems & Control Laboratory at Texas A&M University flight-demonstrated a machine learning algorithm that enables UAS to visually track stationary and moving ground targets. Flight testing began in December 2017 and continued in July and August in College Station, Texas, with test cases covering stationary targets, randomly moving targets, and randomly moving targets in unstructured environments. Visual tracking of ground targets from a UAS is challenging when the camera is strapped down or fixed to the airframe without pan-and-tilt capability, rather than gimbaled, because the entire vehicle must be steered to orient the camera’s field of view; tracking becomes more difficult still when the target follows an unpredictable path. The tracking algorithm is based on Q-learning: the agent learns a control policy for vehicle orientation and flight path such that a target can be kept in the camera’s image frame without operator input.
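The Q-learning idea described above can be illustrated with a minimal sketch. This is not the Texas A&M implementation: the state here is a hand-discretized bearing error of the target relative to the camera boresight, the actions are coarse yaw commands, and the single-axis kinematics are invented for illustration.

```python
import random

# Hypothetical sketch: tabular Q-learning for steering a fixed-camera
# vehicle so a ground target stays centered in the image frame.
# States: discretized bearing-error bins (bin 5 = target centered);
# actions: yaw left, hold, yaw right. All details are illustrative.

N_STATES = 11          # bearing-error bins
ACTIONS = (-1, 0, 1)   # yaw command, expressed as bins shifted per step
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

def train(episodes=500, steps=50, seed=0):
    rng = random.Random(seed)
    q = [[0.0] * len(ACTIONS) for _ in range(N_STATES)]
    for _ in range(episodes):
        state = rng.randrange(N_STATES)      # random initial bearing error
        for _ in range(steps):
            # epsilon-greedy action selection
            if rng.random() < EPS:
                a = rng.randrange(len(ACTIONS))
            else:
                a = max(range(len(ACTIONS)), key=lambda i: q[state][i])
            # toy kinematics: the yaw command shifts the bearing-error bin
            nxt = min(N_STATES - 1, max(0, state + ACTIONS[a]))
            reward = -abs(nxt - N_STATES // 2)   # penalize off-center target
            q[state][a] += ALPHA * (reward + GAMMA * max(q[nxt]) - q[state][a])
            state = nxt
    return q

def policy(q, state):
    """Greedy yaw command for a given bearing-error bin."""
    return ACTIONS[max(range(len(ACTIONS)), key=lambda i: q[state][i])]
```

After training, the learned policy yaws toward the target when it drifts off-center and holds when it is centered, which is the essence of steering the whole airframe to keep a strapped-down camera on target.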

This year has seen significant accomplishments in intelligent systems for robotic space science and exploration. NASA’s Jet Propulsion Laboratory deployed artificial intelligence and machine learning capabilities in support of solar system and Earth science missions. In one prominent application, which began in December 2017 and continued through this year, convolutional neural networks were trained to classify images of Mars by their content. One classifier detects craters, dark slope streaks and sand dunes in orbital images; another classifier recognizes different parts of the Mars Science Laboratory’s Curiosity rover in its own images. The classifications provide the first content-based search capability for NASA images, enabling users to quickly drill down to images of interest (e.g., monitoring rover wheel condition). The content-based search technology was deployed on the Planetary Data System Atlas website in December 2017.
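The search capability rests on a simple pattern: a trained classifier tags each image with content labels, and an inverted index maps labels back to images. The sketch below illustrates that pattern only; the stub `classify` function and its threshold "features" are stand-ins, not JPL's convolutional-network classifiers.

```python
from collections import defaultdict

# Hypothetical sketch of content-based image search: a classifier tags
# each image with content labels, and an inverted index maps labels to
# image IDs so users can query by content (e.g., "crater", "dune").

def classify(image_features):
    # Stub: a real system would run a convolutional neural network on
    # pixels; here we threshold invented "features" to assign labels.
    labels = []
    if image_features.get("circularity", 0) > 0.8:
        labels.append("crater")
    if image_features.get("ripple_density", 0) > 0.5:
        labels.append("dune")
    return labels

def build_index(images):
    """Map each content label to the list of image IDs carrying it."""
    index = defaultdict(list)
    for image_id, features in images.items():
        for label in classify(features):
            index[label].append(image_id)
    return index

def search(index, label):
    """Return all image IDs tagged with the requested content label."""
    return index.get(label, [])
```

A query against the index then returns only images whose content matches, which is what lets users drill down directly to, say, rover-wheel images for condition monitoring.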

Similarly, artificial intelligence is being used to automatically schedule science observations for the ECOsystem Spaceborne Thermal Radiometer Experiment on Space Station, or ECOSTRESS. The scheduler generates instrument command sequences from science campaign priorities and predicted ephemeris without human input, incorporating visibility, data volume, timing uncertainty and South Atlantic Anomaly operations constraints. ECOSTRESS launched in June, and the AI scheduler became operational in August.
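A priority-driven scheduler of this kind can be sketched in a few lines. The greedy loop below is illustrative only, with invented request fields and constraint checks; it is not the ECOSTRESS scheduler, but it shows how priorities, visibility windows, a data-volume budget and a keep-out zone (such as the South Atlantic Anomaly) interact.

```python
# Hypothetical sketch of a greedy, priority-driven observation
# scheduler with visibility, data-volume and keep-out constraints.
# Request fields and the single-instrument model are illustrative.

def schedule(requests, data_budget, keepout_windows):
    """Accept requests by descending priority, then earliest start.

    Each request is a dict with name, priority, visibility window
    (start, end), duration and data volume. Returns the accepted
    plan as (name, start, end) tuples.
    """
    plan, used, busy_until = [], 0, 0
    for req in sorted(requests, key=lambda r: (-r["priority"], r["start"])):
        start = max(req["start"], busy_until)
        end = start + req["duration"]
        if end > req["end"]:                       # misses visibility window
            continue
        if used + req["volume"] > data_budget:     # exceeds data budget
            continue
        if any(s < end and start < e for s, e in keepout_windows):
            continue                               # overlaps keep-out zone
        plan.append((req["name"], start, end))
        used += req["volume"]
        busy_until = end
    return plan
```

Greedy acceptance in priority order is a common simplification; an operational scheduler would also model timing uncertainty and revisit rejected requests, but the constraint checks above mirror the categories the article lists.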

Contributors: Lisa Guerra, John Valasek, Mitch Ingham, Kiri Wagstaff and Steven Chien

Photo: A U.S. Marine operates an Autonomous Aerial Cargo Utility System tablet to request an autonomous helicopter delivery during a test. Credit: Aurora Flight Sciences
