Smart satellites for fast action
By Keith Button | January 2025
Scientists, first responders and environmental watchdogs are smart, but not smart enough to always predict where a volcano will erupt, a wildfire will break out, an algae bloom will explode or a ship will illegally flush its bilge. By the time any of them puts in a request to a satellite operator and receives images of the scene, hours to days have passed.
“Not only have you provided something that’s of no use to the end client, you’ve also wasted the valuable compute and power budget of the satellite,” says Fintan Buckley, CEO of Dublin-based Ubotica, a satellite software developer that’s among those aspiring to arm satellites with artificial intelligence to solve this problem.
Under a $632,000 contract with NASA’s Jet Propulsion Laboratory in California, Ubotica plans to demonstrate dynamic targeting: a technique in which a satellite’s camera or cameras look ahead along the spacecraft’s ground track so that onboard software can automatically detect an unfolding event and focus on it as the spacecraft passes over. According to Steve Chien, co-head of the AI group at JPL, the only satellite publicly known to have done this is GOSAT-2, Japan’s second Greenhouse Gases Observing Satellite.
Starting at a date to be decided this year, two of JPL’s AI agents aboard CogniSAT-6, a 6-unit cubesat launched in March, will attempt to autonomously spot “thermal anomalies” that indicate volcanic activity or wildfires and redirect the satellite to take high-resolution images of them. If they succeed, the feat will mark a step beyond experiments conducted in October, November and December, in which the same agents, along with additional ones from JPL and Ubotica, recognized specific events in images, including flooding in Spain and ships at sea, but made no attempt to redirect the satellite. For its part, Ubotica’s AI identified 142 ships outside the port of Khor Fakkan in the United Arab Emirates “within minutes” by analyzing a single image.
Here’s how the coming trials will work: CogniSAT-6, operated by U.K. company Open Cosmos, will fly toward general target areas, such as a volcanically active region, to see whether the JPL agents can detect the activity. The satellite’s single full-light-spectrum camera will look ahead and take images, and JPL has trained each of its AI agents to look for specific features in them, such as visible and infrared light emanating from a volcanic area. During the trials, the agents will run on Ubotica’s SPACE:AI software and hardware to analyze the lookahead images.
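To make that detect-and-retarget loop concrete, here is a minimal sketch in Python of how an onboard agent might gate a retargeting decision on a lookahead frame. Every specific in it, the thresholds, the image scale, the helper names, is hypothetical and chosen for illustration; this is not Ubotica’s SPACE:AI code or JPL’s actual agent logic, which the article does not describe at this level of detail.

```python
import numpy as np

HOT_PIXEL_THRESHOLD = 0.85   # hypothetical normalized-radiance cutoff for "hot"
MIN_HOT_PIXELS = 25          # hypothetical floor to reject sensor noise

def detect_thermal_anomaly(lookahead_image: np.ndarray):
    """Flag a candidate thermal anomaly in a normalized lookahead frame.

    Returns the (row, col) centroid of hot pixels, or None if the frame
    does not clear the detection floor.
    """
    hot = lookahead_image >= HOT_PIXEL_THRESHOLD
    if hot.sum() < MIN_HOT_PIXELS:
        return None
    rows, cols = np.nonzero(hot)
    return rows.mean(), cols.mean()

def plan_retarget(centroid, image_extent_km=50.0, image_px=512):
    """Convert a pixel centroid into a crude along/cross-track offset in km."""
    row, col = centroid
    km_per_px = image_extent_km / image_px
    along_track = (image_px / 2 - row) * km_per_px
    cross_track = (col - image_px / 2) * km_per_px
    return along_track, cross_track

# Simulated lookahead frame: dim background plus one bright hotspot.
rng = np.random.default_rng(0)
frame = rng.uniform(0.0, 0.4, size=(512, 512))
frame[300:310, 200:212] = 0.95   # synthetic volcanic hotspot

centroid = detect_thermal_anomaly(frame)
if centroid is not None:
    offsets = plan_retarget(centroid)
    print(f"Anomaly at pixel {centroid}; slew offsets (km): {offsets}")
```

In practice the detection step would be a trained model rather than a brightness threshold, but the structure is the same: analyze the lookahead frame, and only spend a slew and a high-resolution exposure when a candidate clears the bar.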
If an agent finds the feature it is looking for, it will have 50 seconds from the moment the lookahead image is shot to redirect the camera and capture a sharper, more tightly framed image as the satellite passes over the target at 7.5 kilometers per second. That 50-second goal would beat the 2024 trials, in which the agents needed 10 minutes to analyze the images.
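For scale, a quick back-of-the-envelope calculation, not a figure stated by JPL or Ubotica, shows what that 50-second budget means on the ground. At the quoted speed, the spacecraft covers

\[ d = v \, t = 7.5\ \mathrm{km/s} \times 50\ \mathrm{s} = 375\ \mathrm{km} \]

of ground track between the lookahead shot and the overhead pass, so the agent’s detection, decision and camera slew must all finish within that roughly 375-kilometer window.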
JPL is planning additional trials with the other AI agents, also at a date to be determined. Among them: two agents trained to spot clouds and identify storms will be tasked with determining whether clouds are obscuring the satellite’s field of view to the point that photographs should not be taken. These agents will also be tested on their ability to locate storms and take high-resolution photos of them from the satellite’s 500-kilometer altitude, which allows for more detailed photos than those from NOAA’s geostationary satellites, locked in orbits above the equator at roughly 35,800 kilometers.
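The cloud-gating decision is conceptually simpler: estimate how much of the scene is cloud and skip the shot when the fraction is too high, preserving the compute and power budget Buckley mentions. Here is a minimal sketch, again with hypothetical thresholds and a naive brightness test standing in for the trained agents:

```python
import numpy as np

CLOUD_REFLECTANCE_CUTOFF = 0.6   # hypothetical brightness above which a pixel reads as cloud
MAX_CLOUD_FRACTION = 0.4         # hypothetical go/no-go threshold for imaging

def cloud_fraction(lookahead_image: np.ndarray) -> float:
    """Fraction of pixels brighter than the cloud cutoff in a normalized frame."""
    return float((lookahead_image >= CLOUD_REFLECTANCE_CUTOFF).mean())

def should_image(lookahead_image: np.ndarray) -> bool:
    """Go/no-go decision: image only when the scene is clear enough to be useful."""
    return cloud_fraction(lookahead_image) <= MAX_CLOUD_FRACTION

rng = np.random.default_rng(1)
clear_scene = rng.uniform(0.0, 0.5, size=(256, 256))    # mostly dark surface
cloudy_scene = rng.uniform(0.5, 1.0, size=(256, 256))   # mostly bright cloud

print(should_image(clear_scene))   # True  -> spend the power budget on a photo
print(should_image(cloudy_scene))  # False -> skip the shot, save compute and power
```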
All of the lookahead and overhead images will be saved so JPL’s scientists can evaluate the agents.
Future iterations of dynamic targeting could employ satellites with a second, low-power lookahead camera, or could analyze images captured by other satellites, such as NOAA satellites in the storm-imaging scenario, Chien says.