
Ultrasound Robotics Breakthrough: Reviving Ultrasound for Robust Autonomy in Degraded Environments

WPI's Saranga Project: Milliwatt Ultrasound Enables Palm-Sized Drones to Conquer Fog, Darkness, and More

Navigating the Unseen: The Dawn of Ultrasound-Powered Drone Autonomy

Palm-sized aerial robots have long promised revolutionary applications in search and rescue, environmental monitoring, and infrastructure inspection. Their potential, however, has been curtailed by a critical limitation: reliance on visual sensors like cameras and LiDAR, which falter in degraded environments such as thick fog, swirling dust, heavy snow, or pitch-black darkness. A groundbreaking research publication from Worcester Polytechnic Institute (WPI) is changing that narrative. Researchers have developed Saranga, a milliwatt ultrasound perception system that enables these tiny drones to autonomously navigate complex, low-visibility scenarios with remarkable success.

This innovation draws inspiration from bat echolocation, adapting low-power ultrasound sensors—long overlooked in modern robotics—to deliver robust, real-time obstacle detection and avoidance. Unlike resource-intensive alternatives, Saranga operates at just 1.2 milliwatts, making it ideal for battery-constrained platforms. The work, detailed in a recent Science Robotics paper, showcases success rates exceeding 70% across harsh conditions, opening new frontiers for autonomous systems in unpredictable real-world settings.

Why Degraded Environments Challenge Robotic Autonomy

Autonomous robots excel in controlled settings but struggle where perception fails. Visual cameras suffer motion blur in low light and complete signal loss in fog or smoke. LiDAR, while precise, scatters in particulate matter and misses transparent or thin obstacles like glass panes or wires. RADAR offers penetration but demands tens of watts of power and bulky hardware, and it struggles with non-metallic materials like the plastics common in indoor and urban clutter.

Degraded environments are not edge cases; they define critical missions. Consider disaster zones post-wildfire, where ash clouds obscure vision, or underground mines shrouded in dust. Statistics from robotics benchmarks highlight the gap: vision-based systems drop to near-zero performance below 1-meter visibility, while the payload and power budgets of palm-sized drones rarely exceed a few grams and a few watts. This research addresses that void, reviving ultrasound—a sensor modality dismissed for its noise susceptibility and short range—as a parsimonious solution.

A Brief History of Ultrasound in Robotics: From Promise to Pause

Ultrasound sensing, using sound waves above 20 kHz (inaudible to humans), has roots in robotics dating back decades. Early mobile robots of the 1970s and 1980s employed ultrasonic rangefinders for basic obstacle avoidance. Bats, with their sophisticated echolocation, inspired further exploration, leading to systems like the BatBot in the 2010s.

Yet, ultrasound faded from favor. Propeller noise on aerial platforms drowned weak echoes, yielding peak signal-to-noise ratios (PSNR) as low as -4.9 dB—barely detectable. Limited range (typically under 2 meters) and narrow field-of-view compounded issues. By the 2020s, deep learning and cheap cameras shifted focus to vision-dominant stacks. Recent commentaries in Science Robotics dub this 'the forgotten spectrum,' but advances in micro-electro-mechanical systems (MEMS) and AI denoising are staging a revival.

Image: PeARBat160 palm-sized quadrotor drone equipped with Saranga ultrasound sensors, navigating through dense fog.

The Saranga Project at WPI: Engineering Ultra-Low-Power Echolocation

At WPI's Perception & Aerial Robotics (PeAR) lab, a team led by Manoj Velmurugan, alongside Phillip Brush, Colin Balfour, R. J. Prasher from InvenSense, and advisor N. J. Sanket, engineered Saranga for the PeARBat160 quadrotor. This custom drone spans a 160-millimeter wheelbase, weighs 460 grams all-up, costs around $400 to build, and flies for 5 minutes on an 850 mAh LiPo battery.

Core hardware includes dual front-facing TDK InvenSense ICU-30201 chirp sensors—MEMS devices emitting frequency-modulated sweeps at ~53 kHz with a 140° horizontal by 57° vertical field-of-view (FOV). A third downward sensor handles altitude. Processing runs on a Google Coral Mini Edge TPU at 16 Hz, integrating with ROS 2 and ArduCopter firmware. Total sensing power: a mere 1.2 milliwatts, 2,833 times lower than comparable RADAR.
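
For readers curious how such a stack wires together, below is a minimal sketch of a 16 Hz perception-to-command loop written as a ROS 2 node in Python. The `read_echo_image`, `denoise`, and `reactive_command` helpers are hypothetical stubs standing in for the sensor driver, the Edge TPU inference call, and the planner; this is an illustration of the architecture described above, not the team's released code.

```python
import numpy as np
import rclpy
from geometry_msgs.msg import Twist
from rclpy.node import Node


def read_echo_image() -> np.ndarray:
    """Hypothetical sensor-driver stub: one 32x512 echo image."""
    return np.zeros((32, 512), dtype=np.float32)


def denoise(echo_image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for the quantized U-Net denoiser."""
    return echo_image


def reactive_command(echo_image: np.ndarray):
    """Hypothetical stand-in for bilateration plus the potential-field planner."""
    return 0.5, 0.0  # (forward, lateral) velocity in m/s


class SarangaPerceptionNode(Node):
    """Minimal ultrasound perception-to-command loop."""

    def __init__(self):
        super().__init__('saranga_perception')
        self.cmd_pub = self.create_publisher(Twist, 'cmd_vel', 10)
        # 16 Hz matches the processing rate reported for the pipeline.
        self.create_timer(1.0 / 16.0, self.step)

    def step(self):
        clean = denoise(read_echo_image())
        vx, vy = reactive_command(clean)
        cmd = Twist()
        cmd.linear.x, cmd.linear.y = vx, vy
        self.cmd_pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(SarangaPerceptionNode())


if __name__ == '__main__':
    main()
```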

How Saranga Works: From Noisy Echoes to Clean Navigation

The pipeline unfolds step by step. One sensor transmits a chirp; both receive echoes, forming an 'echo image'—a stacked history of 32 cycles (0.82 seconds) of 512 samples each, spanning a range of 1.66 meters.
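
To make that buffer concrete, here is a minimal NumPy sketch of how such an echo image could be assembled. The 32 × 512 shape and 1.66 m span come from the figures above; the envelope input and helper names are assumptions.

```python
from collections import deque

import numpy as np

N_SAMPLES = 512     # range samples per receive cycle (from the paper)
N_CYCLES = 32       # cycles stacked into one echo image (~0.82 s of history)
MAX_RANGE_M = 1.66  # distance spanned by the 512 samples

_history = deque(maxlen=N_CYCLES)


def push_cycle(echo_envelope: np.ndarray):
    """Append one receive cycle; return a (32, 512) echo image once the
    buffer is full, else None. Rows are time history, columns are range bins."""
    assert echo_envelope.shape == (N_SAMPLES,)
    _history.append(echo_envelope)
    if len(_history) < N_CYCLES:
        return None
    return np.stack(_history)


def bin_to_range(i: int) -> float:
    """Convert a column index into an approximate range in meters."""
    return i * MAX_RANGE_M / N_SAMPLES
```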

Noise is the nemesis: propellers generate acoustic interference. Solution one: a lightweight polystyrene foam shield (63.5 mm high) blocks downwash, doubling range to 2 meters. Solution two: Saranga, a U-Net deep neural network with 1.24 million parameters (quantized to 0.5 MB). Trained on 40,000 synthetic pairs—ideal echoes convolved with real responses, augmented by recorded propeller and environmental noise—it denoises inputs in 14.57 milliseconds, consuming 10.6 millijoules.
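
The synthetic training recipe might look roughly like the following sketch: a clean target built by convolving an ideal echo with a measured sensor response, plus a randomly scaled slice of recorded noise as the noisy input. The gain range, clip slicing, and function name are illustrative assumptions, not the published data pipeline.

```python
import numpy as np


def make_training_pair(ideal_echo, sensor_response, noise_clips, rng):
    """Build one (noisy input, clean target) pair. Assumes each noise clip
    is at least as long as the convolved echo."""
    # Clean target: ideal echo shaped by the real sensor's impulse response.
    clean = np.convolve(ideal_echo, sensor_response, mode='same')
    # Noisy input: add a random slice of recorded propeller/environment noise.
    clip = noise_clips[rng.integers(len(noise_clips))]
    start = rng.integers(len(clip) - len(clean) + 1)
    noise = clip[start:start + len(clean)]
    gain = rng.uniform(0.5, 2.0)  # assumed augmentation range
    return (clean + gain * noise).astype(np.float32), clean.astype(np.float32)


# Usage sketch: rng = np.random.default_rng(0); repeat ~40,000 times for the set.
```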

Post-denoising, bilateration localizes obstacles via interaural time differences (ITD) between the two sensors, yielding 2D (range, azimuth) maps. A co-occurrence matrix matches echoes across receivers, with median filtering for robustness. For 3D, a vertical sensor enables trilateration. Navigation employs a reactive potential field: forward thrust is reduced near obstacles, while lateral and vertical unit-repulsion vectors steer the drone away, with hysteresis to avoid chattering.
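
Under a far-field approximation, bilateration and the repulsion rule reduce to a few lines, sketched below: the path-length difference between the two receivers gives the azimuth, the mean of the two ranges gives the distance, and a hysteresis band keeps the avoidance command from chattering. The 5 cm baseline and the threshold values are illustrative assumptions, not figures from the paper.

```python
import numpy as np

BASELINE_M = 0.05  # assumed spacing between the two front sensors


def bilaterate(r1: float, r2: float):
    """Far-field bilateration: (range_m, azimuth_rad) from per-receiver ranges.
    The range difference plays the role of the interaural time difference."""
    rng = 0.5 * (r1 + r2)
    s = np.clip((r1 - r2) / BASELINE_M, -1.0, 1.0)
    return rng, float(np.arcsin(s))


def repulsion_command(rng, azimuth, engaged, d_on=1.0, d_off=1.2, v_max=1.0):
    """Reactive potential field with hysteresis: engage repulsion inside d_on,
    release only beyond d_off so the command does not chatter at the boundary."""
    if rng < d_on:
        engaged = True
    elif rng > d_off:
        engaged = False
    vx = v_max * min(1.0, rng / d_off)  # forward thrust shrinks near obstacles
    vy = -float(np.sign(azimuth)) * v_max if engaged else 0.0  # unit lateral repulsion
    return vx, vy, engaged
```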


Impressive Results: Thriving Where Others Fail

Rigorous testing in an 11 × 4.5 × 3.65-meter netted arena and outdoor forests validated Saranga. Indoor composite trials (transparent films <0.02 mm thick, PVC/aluminum poles 1.5-6 cm, boxes) yielded 69.57-100% success over 20+ randomized runs per scenario:

  • Dense fog (<0.75 m visibility): 90%
  • Darkness (0.2 lux): 100%
  • Snow: 75%
  • Thin/transparent obstacles: 77-81%
  • 3D mixed degraded: 72.7%

Outdoor forests (low/medium/high tree density): 77-91% success. Speed tests hit 100% at 1 m/s, 73% at 2 m/s. Saranga outperformed baselines like BatDeck (13/17 vs. 1/17 successes) and classical denoising (superior PSNR/SSIM/MSE).

Environment              Success Rate    Trials
Dense Fog                90%             20
Low Light                100%            20
Snow                     75%             20
Forest (High Density)    85.7%           Varied

Superiority Over Conventional Sensors

Saranga shines in a performance matrix. Cameras fail utterly in fog and darkness; LiDAR scatters in particulates and ignores glass; RADAR misses dielectrics like plastic walls. Ultrasound penetrates fog and snow unaffected, detects sub-millimeter transparent obstacles, and localizes at centimeter resolution—all at millisecond latencies and milliwatt power.

For instance, in fog machines simulating disaster smoke, vision accuracy plummeted to 0%, while Saranga held 89.3% average across conditions. This parsimony—minimal sensors, onboard compute—slashes SWAP (size, weight, and power), vital for swarms or prolonged missions.

Read the full Science Robotics paper for in-depth metrics and code on Zenodo.

Image: Performance comparison chart of Saranga ultrasound vs. vision, LiDAR, and RADAR in degraded environments.

Implications for Real-World Applications and University Research

This breakthrough extends to search-and-rescue (smoke-filled buildings), precision agriculture (dusty fields), and cave exploration. Palm-sized form factors enable access to tight spaces denied to larger drones. At universities like WPI, it fuels interdisciplinary robotics programs, blending aerospace, AI, and bio-mimicry.

Student theses, like Phillip Brush's MS defense on Saranga, underscore hands-on training. As demands on autonomy grow, such innovations position university labs like PeAR as incubators for robotics talent. Broader impacts include safer autonomous vehicles and humanoids, per the accompanying commentary.

Challenges Overcome and Lessons for Future Designs

Key hurdles—noise mitigation and generalization—were tackled via a hybrid of physics and AI: shields on the hardware side, synthetic data on the software side. Ablations confirmed each choice: without the shield, range halved; without the Saranga denoiser, detection failed. Future iterations target swarms, longer ranges via sensor arrays, or Crazyflie nano-drones (<50 mm).

Ethical considerations include wildlife impact (bat frequencies avoided) and regulatory nods for low-power emissions. Open-sourcing code democratizes access, spurring global university collaborations.

The Road Ahead: Scaling Ultrasound Autonomy

The authors envision adding multi-sensor fusion only sparingly, emphasizing ultrasound's standalone robustness. Integration with edge LLMs could enable semantic mapping (e.g., 'person vs. debris'). With climate-driven disasters on the rise—flood fog, wildfire smoke—Saranga-like technology is timely.

For academia, it highlights careers in resilient robotics: from PhD pursuits to faculty roles pioneering bio-inspired systems. WPI's success exemplifies how targeted university labs drive industry transformation.

Stakeholder Perspectives: From Labs to Industry

PeAR advisor N. J. Sanket notes, 'This proves tiny robots needn't sacrifice capability for size.' InvenSense's Prasher brings commercial viability. Press coverage, including EurekAlert, frames the work as bats inspiring an aerial advance.

Challenges persist, from scaling to mmWave hybrids to regulatory hurdles. Yet, with 89.3% average accuracy across conditions, ultrasound's revival promises more reliable autonomy.

Prof. Marcus Blackwell

Contributing Writer

Shaping the future of academia with expertise in research methodologies and innovation.


Frequently Asked Questions

🦇What is the Saranga ultrasound system?

Saranga is a low-power (1.2 mW) ultrasound perception stack developed at WPI for palm-sized drones, enabling navigation in fog, darkness, and more using dual chirp sensors and AI denoising.

🌫️How does ultrasound outperform cameras in degraded environments?

Ultrasound penetrates fog, snow, and darkness unaffected, detecting thin/transparent obstacles cameras miss, with 90-100% success vs. vision's 0% in low visibility.

🚁What are the key specs of the PeARBat160 drone?

160 mm wheelbase, 460 g weight, $400 cost, 5-min flight time, equipped with ICU-30201 sensors at 53 kHz, processed on Coral Mini TPU.

🔇How does Saranga denoise propeller noise?

Combines a physical foam shield (63.5 mm) that blocks downwash, doubling range to 2 m, with a U-Net neural network trained on synthetic data that lifts PSNR well above the -4.9 dB baseline.

📊What success rates did experiments show?

70-100% across fog (90%), darkness (100%), snow (75%), and forests (85.7%), outperforming LiDAR/RADAR on plastics and transparent obstacles.

📡Why was ultrasound 'forgotten' in robotics?

Noise issues, short range, and vision dominance sidelined it; AI denoising and MEMS sensors revive it for robust, low-SWAP autonomy.

🔍What applications benefit from this technology?

Search-and-rescue in smoke, cave exploration, dusty agriculture, infrastructure checks—any GPS-denied, low-visibility scenario.

🗺️How does navigation work in Saranga?

Reactive potential fields with bilateration/trilateration for 2D/3D obstacle maps; forward velocity with lateral repulsion, up to 2 m/s.

🎓Implications for university robotics research?

Boosts bio-inspired, resilient systems; opportunities in PhDs, faculty roles at labs like WPI PeAR for aerial autonomy careers.

🔗Where to access the research and code?

Full paper in Science Robotics; code/data on Zenodo; project at PeAR WPI.

🚀Future developments for ultrasound robotics?

Swarms, longer ranges, nano-drone integration, LLM fusion for semantics—scaling bio-mimicry for wild environments.