NASA Develops Pod to Help Autonomous Aircraft Operators 

The NASA Airborne Instrumentation for Real-world Video of Urban Environments (AIRVUE) sensor pod is attached to the base of a NASA helicopter at NASA’s Kennedy Space Center in Cape Canaveral, Florida, in April 2024, before a flight to test the pod’s cameras and sensors. The AIRVUE pod will be used to collect data for autonomous aircraft like air taxis, drones, or other Advanced Air Mobility aircraft.
NASA/Isaac Watson

For self-flying aircraft to take to the skies, they need to learn about their environments to avoid hazards. NASA aeronautics researchers recently developed a camera pod with sensors to help with this challenge by advancing computer vision for autonomous aviation.  

This pod is called the Airborne Instrumentation for Real-world Video of Urban Environments (AIRVUE). It was developed and built at NASA’s Armstrong Flight Research Center in Edwards, California. Researchers recently flew it on a piloted helicopter at NASA’s Kennedy Space Center in Cape Canaveral, Florida, for initial testing.

The team hopes to use the pod to collect large, diverse, and accessible visual datasets of weather and other obstacles. They will then use that information to create a cloud-based data repository that manufacturers of self-flying air taxis, drones, and other similar aircraft can access. Developers can use this data to evaluate how well their aircraft can “see” the complex world around them.

NASA researchers Elizabeth Nail (foreground) and A.J. Jaffe (background) prepare the NASA Airborne Instrumentation for Real-world Video of Urban Environments (AIRVUE) sensor pod for testing at NASA’s Kennedy Space Center in Cape Canaveral, Florida, in April 2024.
NASA/Isaac Watson

“Data is the fuel for machine learning,” said Nelson Brown, lead NASA researcher for the AIRVUE project. “We hope to inspire innovation by providing the computer vision community with realistic flight scenarios. Accessible datasets have been essential to advances in driver aids and self-driving cars, but so far, we haven’t seen open datasets like this in aviation.” 

The computer algorithms that will enable aircraft to sense their environment must be reliable and proven to work across many flight conditions. NASA data promises that fidelity, making the pod an important resource for industry. When a company collects data on its own, it is unlikely to share that data with other manufacturers. NASA’s role is to make this dataset accessible to all companies in the Advanced Air Mobility industry, helping ensure the United States stays at the forefront of innovation.

Once the design is refined through evaluation and additional testing, the team hopes to build more pods that can ride along on various types of aircraft, collecting more visuals and growing the digital repository of data.
