This technology, called Dynamic Targeting, was tested by NASA’s Jet Propulsion Laboratory (JPL) earlier this month. It ran on a briefcase-sized satellite operated by UK-based startup Open Cosmos, using a machine learning processor developed by Dublin-based firm Ubotica.
In the test, the satellite tilted forward to scan 500 kilometers ahead of its orbit and snapped a preview image. Ubotica’s AI quickly analyzed the scene for cloud cover. If skies were clear, the satellite tilted back to take a detailed photo of the surface. If clouds obscured the view, it skipped the shot—saving time, storage, and bandwidth.
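For readers who want the gist of that decision loop in code, here is a minimal, purely illustrative sketch in Python. The Satellite and CloudModel classes, their methods, and the 30 percent cloud-cover threshold are hypothetical stand-ins for this article, not the actual Dynamic Targeting flight software.

```python
import random

# Assumed threshold: skip the shot if more than 30% of the preview is cloud.
CLOUD_COVER_THRESHOLD = 0.3

class Satellite:
    """Stand-in for the spacecraft's pointing and imaging interface (hypothetical)."""
    def point_ahead(self, distance_km):
        print(f"Tilting forward to preview ~{distance_km} km ahead on the ground track")

    def capture_preview(self):
        return "low-resolution preview frame"

    def point_nadir(self):
        print("Tilting back to image the target directly below")

    def capture_image(self):
        return "full-resolution surface image"

class CloudModel:
    """Stand-in for the onboard machine learning cloud-detection model (hypothetical)."""
    def estimate_cloud_cover(self, preview):
        return random.random()  # fraction of the scene covered by cloud

def dynamic_targeting_pass(sat, model):
    """One imaging opportunity: preview ahead, check clouds, then image or skip."""
    sat.point_ahead(distance_km=500)
    preview = sat.capture_preview()
    cloud_fraction = model.estimate_cloud_cover(preview)

    if cloud_fraction < CLOUD_COVER_THRESHOLD:
        sat.point_nadir()
        return sat.capture_image()  # clear skies: take the detailed photo
    return None                     # too cloudy: skip, saving time, storage, and bandwidth

if __name__ == "__main__":
    result = dynamic_targeting_pass(Satellite(), CloudModel())
    print("Captured:", result if result else "skipped (cloudy)")
```

The point of the sketch is simply the ordering: a cheap look-ahead preview and an onboard cloud check happen before the spacecraft commits to the expensive full-resolution image and downlink.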
“Being smart about what you’re taking pictures of means only imaging the ground and skipping the clouds,” said Ben Smith from JPL, which funds the Dynamic Targeting work. “This technology will help scientists get a much higher proportion of usable data.”
Brian Quinn, chief strategy officer at Ubotica, noted that until now, satellites have merely acted as passive data collectors, imaging whatever happens to be beneath them and sending all that data—useful or not—to Earth for scientists to sort through later.
“Post-processing could take days,” said Quinn in an article published on NASA’s website earlier this year. “It takes post-processing to say, ‘Hey, there was a fire. Hey, there was a harmful algal bloom.’”
NASA, Ubotica, and Open Cosmos say the system could also be expanded to spot wildfires, volcanic eruptions, and severe storms from space faster than ever before.
The recent test builds on earlier work involving the three organizations. In 2021, Ubotica demonstrated real-time AI cloud detection aboard the International Space Station (ISS) as part of a broader research collaboration with JPL. Then, in 2024, Open Cosmos launched HAMMER, an AI-powered satellite equipped with a hyperspectral camera and Ubotica’s machine learning processor.