Bat-inspired AI model gives drones sonar vision for navigating disaster zones.
In a groundbreaking development, scientists at the University of Michigan have created an artificial intelligence (AI)-powered ultrasonic echolocation system for robots and drones. The technology, funded by the US Army Research Office and Ground Vehicle Systems Center, mimics biological echolocation by firing high-frequency sound pulses and analyzing how they bounce back from nearby objects [1].
The system emits ultrasonic pulses, much as bats and dolphins use biosonar. These pulses travel through the environment, bounce off nearby objects, and return as echoes. The AI then analyzes the echoes' scattering patterns and return times to construct a spatial map of the surroundings and identify objects, even in complete darkness or through visual obstructions like smoke or dust [1].
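The article does not include the team's signal-processing code, but the underlying time-of-flight idea is straightforward to illustrate. The sketch below handles a simplified single-echo case: it cross-correlates the emitted pulse with the received signal to find the echo delay, then converts that round-trip time into a distance. The pulse frequency, sample rate, and all variable names are illustrative assumptions, not details from the study.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C (assumed conditions)
SAMPLE_RATE = 200_000   # 200 kHz sampling, fast enough for ultrasonic pulses

def emit_pulse(freq_hz=40_000, duration_s=0.001):
    """Generate a short ultrasonic sine burst (hypothetical 40 kHz pulse)."""
    t = np.arange(0, duration_s, 1 / SAMPLE_RATE)
    return np.sin(2 * np.pi * freq_hz * t)

def estimate_distance(emitted, received):
    """Estimate object distance from the echo delay via cross-correlation."""
    corr = np.correlate(received, emitted, mode="full")
    # The peak of the cross-correlation marks the echo's lag in samples.
    lag = np.argmax(corr) - (len(emitted) - 1)
    delay_s = lag / SAMPLE_RATE
    # Sound travels to the object and back, so halve the round-trip time.
    return SPEED_OF_SOUND * delay_s / 2

# Toy demo: an echo arriving 5 ms after emission (~0.86 m away).
pulse = emit_pulse()
echo = np.concatenate([np.zeros(1000), 0.3 * pulse, np.zeros(1000)])
echo += 0.01 * np.random.randn(len(echo))  # simulated measurement noise
print(f"Estimated distance: {estimate_distance(pulse, echo):.2f} m")
```

A real system would extend this to many overlapping echoes and exploit their scattering patterns, as the researchers describe, but the delay-to-distance conversion is the common foundation.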
To ensure the system's robustness in complex scenarios, the researchers deliberately chose object sizes and materials that produce confusingly similar reflections. The AI was trained on thousands of simulated echo patterns, augmented to reflect real-world variation in material, angle, and noise, and learned how different object shapes reflect sound from various angles [1].
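The study's simulation pipeline is not published in this summary, but the augmentation strategy it describes can be sketched. The example below, entirely illustrative, perturbs one clean simulated echo in three ways loosely corresponding to material (amplitude scaling), angle or distance (time shift), and sensor noise; the ranges and function names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def augment_echo(echo, rng):
    """Apply hypothetical augmentations mimicking real-world variation."""
    # Material: different absorption changes the echo's amplitude.
    scaled = echo * rng.uniform(0.5, 1.0)
    # Angle/distance: small random time shift of the return.
    shifted = np.roll(scaled, rng.integers(-20, 21))
    # Sensor and environment noise.
    return shifted + rng.normal(0, 0.02, size=echo.shape)

# Build an augmented training set from one clean simulated echo.
clean_echo = np.sin(2 * np.pi * np.linspace(0, 8, 512)) * np.hanning(512)
training_set = np.stack([augment_echo(clean_echo, rng) for _ in range(1000)])
print(training_set.shape)  # (1000, 512): 1000 varied copies of one echo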
The researchers developed a distinctive AI model built on an ensemble of convolutional neural networks (CNNs) to decode these echoes. Each CNN specializes in a different object type, learning to identify subtle shape-based differences in the echoes. This approach enables robust detection and navigation in environments where traditional visual sensors fail [1].
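The exact architecture is not given in this summary, so the following is only a plausible reading of "one CNN per object type": a one-vs-rest ensemble of small 1D CNNs over raw echo waveforms, where the specialist with the highest score wins. The class names, layer sizes, and scoring rule are all assumptions.

```python
import torch
import torch.nn as nn

class EchoCNN(nn.Module):
    """1D CNN scoring how well an echo matches one object type (assumed design)."""
    def __init__(self, input_len=512):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.head = nn.Linear(32 * (input_len // 16), 1)

    def forward(self, x):                  # x: (batch, 1, input_len)
        h = self.features(x).flatten(1)
        return self.head(h)                # raw logit: "is this my object type?"

# One specialist network per object class, combined one-vs-rest.
object_types = ["sphere", "cube", "cylinder"]  # hypothetical classes
ensemble = {name: EchoCNN() for name in object_types}

def classify(echo):
    """Pick the object type whose specialist gives the highest score."""
    x = echo.view(1, 1, -1)
    scores = {name: net(x).item() for name, net in ensemble.items()}
    return max(scores, key=scores.get)

print(classify(torch.randn(512)))  # untrained demo: output is arbitrary
```

A design like this lets each network concentrate on the echo signature of a single shape rather than forcing one model to separate all classes at once, which fits the article's description of CNNs specialized by object type.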
The potential applications of this ultrasonic echolocation system extend beyond defense and traditional robotics. For instance, it could be used in autonomous vehicles and drones to enable navigation in low-visibility conditions such as night, fog, or smoke, improving safety and operational capability [1][3]. Underwater exploration is another promising field, where similar principles could enable underwater drones to navigate and interact with environments opaque to light [2].
Environmental monitoring and cleanup, industrial automation, assistive technologies for visually impaired people, and search and rescue operations are other areas where this technology could be beneficial [1][3]. In these scenarios, the system could detect and map pollutants or obstacles in challenging or dynamic environments, give visually impaired users enhanced spatial awareness, or guide navigation through hazardous areas where visibility is compromised.
In summary, the ultrasonic echolocation system uses AI to interpret reflected sound pulses, building accurate spatial awareness in conditions unsuitable for optical sensors. The breakthrough holds promise well beyond military applications, for any field requiring precise navigation and object detection in low-visibility, visually obstructed, or complex environments [1][3]. The study has been published in the Journal of Sound and Vibration.
- This technology's potential applications extend to autonomous vehicles and drones, enabling safe navigation in low-visibility conditions like night, fog, or smoke.
- Robotics, especially underwater drones, could leverage similar principles for navigation and interaction within light-opaque environments, expanding their capabilities.
- Additionally, this ultrasonic echolocation system could benefit environmental monitoring, industrial automation, assistive technologies for visually impaired individuals, and search and rescue operations, thanks to its ability to detect and map pollutants or obstacles in complex or dynamic environments.