How can artificial intelligence (AI) help astronauts on long-term space missions? This is what a recent study presented at the 2024 International Astronautical Congress in Milan, Italy, hopes to address, as an international team of researchers led by the German Aerospace Center introduces enhancements to the Mars Exploration Telemetry-Driven Information System (METIS) and explains how these could help future astronauts on Mars mitigate the communication delays between Earth and Mars, which can reach up to 24 minutes one way depending on the orbits. This study holds the potential to develop more efficient technology for long-term space missions beyond Earth, specifically to the Moon and Mars.
Here, Universe Today discusses this incredible research with Oliver Bensch, a PhD student at the German Aerospace Center, regarding the motivation behind the study, the most significant results and follow-up studies, the significance of using specific tools to enhance METIS, and the importance of using AI-based technology on future crewed missions. Therefore, what was the motivation behind this study regarding AI assistants for future space missions?
“Current astronauts rely heavily on ground support, especially during unexpected situations,” Bensch tells Universe Today. “Our project aims to explore new ways to support astronauts, making them more autonomous during missions. Our focus was to make the great amount of multimodal data, like documents or sensor data, easily and, most importantly, reliably available to astronauts in natural language. This is especially relevant when we think about future long-duration space missions, e.g., to Mars, where there is a significant communication latency.”
For the study, the researchers improved upon current METIS algorithms, since current Generative Pre-trained Transformer (GPT) models are known for producing errors in the specific environments where they are deployed. To combat this, the researchers incorporated GPTs, Retrieval-Augmented Generation (RAG), Knowledge Graphs (KGs), and Augmented Reality (AR) with the goal of enabling more autonomy for future astronauts without the need for constant communication with Earth ground stations.
The goal of the study was to develop a system that can improve astronaut autonomy, safety, and efficiency in conducting mission objectives on long-duration space missions to either the Moon or Mars. As noted, communication delays between Earth and Mars can be as high as 24 minutes, so astronauts being able to make on-the-spot decisions could mean the difference between life and death. Therefore, what were the most significant results from this study?
“In our project we aim to integrate documents, like procedures, with live sensor data and other additional information into our Knowledge Graph,” Bensch tells Universe Today. “The stored and live updated information is then displayed in an intuitive way using augmented reality cues and natural language voice interaction, enhancing the autonomy of the astronauts. Reliable answers are ensured by backlinks to the Knowledge Graph, enabling astronauts to verify the information, something that is not possible when just relying on large language model-based assistants as they are prone to generating inaccurate or fabricated information.”
Regarding follow-up studies, Bensch tells Universe Today the team is currently working with the MIT Media Lab Space Exploration Initiative and aspires to work with astronauts at the European Space Agency’s European Astronaut Centre sometime in 2025.
As noted, the researchers integrated Generative Pre-trained Transformer (GPT) models, Retrieval-Augmented Generation (RAG), Knowledge Graphs (KGs), and Augmented Reality (AR) with the goal of enabling more autonomy for astronauts on future long-term space missions. GPTs are designed to serve as a framework for generative artificial intelligence, and the architecture was first introduced by OpenAI in 2018.
RAG helps enhance generative artificial intelligence by enabling the algorithm to incorporate outside data and documentation supplied by the user, and it comprises four stages: indexing, retrieval, augmentation, and generation. KGs are knowledge bases that enhance data by storing interconnected datasets; the term was first used by Austrian linguist Edgar W. Schneider in 1972. AR is a display interface that combines elements of the virtual and real worlds, immersing the user in a virtual environment while still maintaining the real-world surroundings. Therefore, what was the significance of combining RAG, KGs, and AR to produce this new system?
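To make the four RAG stages concrete, here is a minimal sketch in Python using only the standard library. Real systems score retrieval with vector embeddings and pass the augmented prompt to a language model; here, simple keyword overlap stands in for retrieval, and the generation step just returns the prompt a model would see. The document names and contents are invented for illustration and are not from METIS.

```python
import re

# Hypothetical mission documents.
documents = {
    "procedure_airlock": "Steps to cycle the airlock: verify pressure, seal hatch.",
    "procedure_solar": "Steps to inspect solar panels: check voltage, clean sensors.",
}

def tokens(text: str) -> set[str]:
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"\w+", text.lower()))

# 1. Indexing: turn each document into a set of searchable terms.
index = {doc_id: tokens(text) for doc_id, text in documents.items()}

def retrieve(query: str) -> str:
    # 2. Retrieval: rank documents by keyword overlap with the query.
    terms = tokens(query)
    return max(index, key=lambda doc_id: len(index[doc_id] & terms))

def answer(query: str) -> str:
    doc_id = retrieve(query)
    # 3. Augmentation: prepend the retrieved text to the query as context.
    prompt = f"Context: {documents[doc_id]}\nQuestion: {query}"
    # 4. Generation: a language model would produce the reply from this
    #    prompt; here we return the prompt to show what the model receives.
    return prompt

print(answer("How do I cycle the airlock?"))
```

Grounding the generation step in retrieved documents, rather than relying on the model's internal knowledge alone, is what reduces the fabricated answers Bensch describes.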
“Traditional RAG systems typically retrieve and generate responses based on a single matching document,” Bensch tells Universe Today. “However, the challenges of space exploration often involve processing distributed and multimodal data, ranging from procedural manuals and sensor data to images and live telemetry, such as temperatures or pressures. By integrating KGs, we address these challenges by organizing data into an interconnected, updatable structure that can accommodate live data and provide contextually relevant responses. KGs act as a backbone, linking disparate sources of information and enabling astronauts to access cohesive and accurate insights across multiple documents or data types.”
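The knowledge-graph idea Bensch describes can be sketched as follows: documents and live telemetry stored as linked nodes, so an answer can pull in connected live data and carry a backlink that lets the astronaut verify its source. The node names, schema, and sensor values below are invented for illustration; this is not the actual METIS data model.

```python
# Hypothetical knowledge graph: nodes hold text or live sensor values,
# and "links" connect procedures to the telemetry they depend on.
graph = {
    "procedure:airlock_cycle": {
        "text": "Cycle the airlock only if cabin pressure exceeds 95 kPa.",
        "links": ["sensor:cabin_pressure"],
    },
    "sensor:cabin_pressure": {"value": 101.3, "unit": "kPa", "links": []},
}

def update_sensor(node_id: str, value: float) -> None:
    # Live telemetry updates the graph in place, so answers stay current.
    graph[node_id]["value"] = value

def answer_with_backlink(query_node: str) -> dict:
    node = graph[query_node]
    # Follow links to gather the connected live data for this procedure.
    context = {link: graph[link] for link in node["links"]}
    return {
        "answer": node["text"],
        "live_data": context,
        "backlink": query_node,  # lets the astronaut trace the source
    }

result = answer_with_backlink("procedure:airlock_cycle")
print(result["backlink"])
print(result["live_data"]["sensor:cabin_pressure"]["value"])
```

The backlink is the key design choice: because every answer names the graph node it came from, an astronaut can check the underlying procedure or reading directly, which a free-standing language model cannot offer.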
Bensch continues, “AR enhances this system by offering intuitive, hands-free interfaces. By overlaying procedures, sensor readings, or warnings directly onto the astronaut’s field of view, AR minimizes cognitive load and reduces the need to shift focus between devices. Additionally, voice control capabilities allow astronauts to query and interact with the system naturally, further streamlining task execution. Although each technology provides some benefit individually, their combined use offers significantly greater value to astronauts, especially during long-duration space missions where astronauts need to operate more autonomously.”
While this study addresses how AI could help astronauts on future space missions, AI is already being used on current space missions, specifically on the International Space Station (ISS), with applications including generative AI, AI robots, machine learning, and embedded processors. For AI robots, the ISS uses three 12.5-inch cube-shaped robots named Honey, Queen, and Bumble as part of NASA’s Astrobee program, designed to assist ISS astronauts with their daily tasks. All three robots were launched to the ISS across two missions in 2019, with Honey briefly returning to Earth for maintenance shortly after arriving at the orbiting outpost and not returning to the ISS until 2023.
Each propelled by electric fans, the three robots perform tasks like cargo movement, experiment documentation, and inventory management, and each possesses a perching arm that grips handrails to conserve energy. The long-term goal of the program is to help mature this technology for use on crewed lunar missions and the Lunar Gateway. But how important is it to incorporate artificial intelligence into future crewed missions, specifically to Mars?
“Astronauts are currently supported by a team during training and their missions,” Bensch tells Universe Today. “Mars missions involve significant delays, which makes ground support difficult during time-critical situations. AI assistants that provide quick, reliable access to procedures and live data via voice and AR are essential for overcoming these challenges.”
How will AI assistants help astronauts on long-term space missions in the coming years and decades? Only time will tell, and this is why we science!
As always, keep doing science & keep looking up!
The post Astronauts on Long Missions Will Need Personal AI Assistants appeared first on Universe Today.