
How Do You Define Autonomy Level 4B?

Written by Exyn Technologies | Sep 23, 2024 9:00:00 AM

In 2021, we pioneered the first industrial-grade autonomous aerial robot to achieve Autonomy Level 4A, capable of free-flight exploration of complex spaces, determining its own flight path at speeds of 2 meters per second, and collecting higher-quality data across larger volumes.

Unlike previous industry standards of aerial autonomy, our proprietary SLAM algorithm, ExynAI, is completely self-reliant during open-ended exploration and requires no human interaction in flight to complete its mission. This was a major step up from the previous Level 3 standard, in which a human operator or driver had to be present and ready to take control of the system at any time.

But once you enable a robot to autonomously explore an unknown environment, it can sometimes get itself into trouble. It will fearlessly explore until the mission is complete or the battery is spent, unlike a human operator, who flies cautiously in an underground or otherwise unknown environment, especially beyond visual line of sight.

Environmental Factors That Affect Online SLAM

One of the major factors affecting aerial robots in underground and industrial environments is dust. As soon as surveyors began bringing LiDAR scanners underground, they realized the sensors picked up dust particles, and even water vapor, the same way they would a thin wire or other small obstacle.

Examples of dust that Nexys encounters during autonomous flights underground

And while this sensor noise can be successfully filtered out of the final point cloud in post-processing, a real-time online SLAM algorithm needs to decide, in the moment, whether an obstacle is actually in front of it. That decision is the difference between a routine mission and a costly replacement, plus a possible delay to ongoing operations.
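To make the post-processing side concrete, here is a minimal, generic sketch of the kind of statistical outlier filter surveyors run offline. This is not Exyn's pipeline; the k and std_ratio parameters are illustrative assumptions. It exploits the fact that airborne dust tends to produce sparse, isolated returns, while solid surfaces produce dense, coherent neighborhoods.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_sparse_returns(points: np.ndarray, k: int = 8,
                          std_ratio: float = 2.0) -> np.ndarray:
    """Statistical outlier removal for a LiDAR point cloud.

    points: (N, 3) array of x, y, z returns.
    Drops points whose mean distance to their k nearest neighbors
    is unusually large, a typical signature of airborne dust.
    """
    tree = cKDTree(points)
    # Query k+1 neighbors: the nearest "neighbor" is the point itself.
    dists, _ = tree.query(points, k=k + 1)
    mean_dist = dists[:, 1:].mean(axis=1)
    # Keep points whose neighborhood distance is within std_ratio
    # standard deviations of the global mean neighborhood distance.
    threshold = mean_dist.mean() + std_ratio * mean_dist.std()
    return points[mean_dist < threshold]
```

The catch is latency: a filter like this runs comfortably over a finished scan, but an online SLAM system has to make the dust-or-obstacle call within a single planning cycle, using only the scans it has seen so far.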

And even if you avoid a crash, the robot just gets stuck – like a mime trapped in a box of its own making. 

As more and more customers sent the ExynAero into increasingly challenging environments, we began to study how dust affected our online SLAM and how to make it more robust so that future platforms could confidently explore deep into the unknown.

We're excited to announce that ExynAI is now capable of Autonomy Level 4B navigation, enabling Nexys to pilot aerial robots into dusty, dangerous environments beyond visual line of sight and capture accurate 3D models while interpreting and overcoming the phantom obstacles that dust presents.

How We Define Autonomy Level 4B

When we achieved Autonomy Level 4, we set out to create a model that could be universal throughout the autonomous aerial robotics industry, basing it on a proven model for unmanned vehicles: the SAE levels for driverless cars. However, after consulting with experts in the field of aerial autonomy and writing a whitepaper to explain the endeavor, we realized that Level 4 autonomy was more complex than a single designation. There was a level of understanding the robot would need to achieve before coming anywhere close to Level 5 autonomy: it needs to understand and reason about its environment.

But rather than ranking the environmental factors a UAV must overcome (light, dust, wind, etc.) individually, which could easily be cherry-picked for a specific use case, we chose to rank complexity by how the autonomous system interprets and overcomes the obstacles these environments present. This is how we currently designate our sublevels of Autonomous Understanding and Reasoning (AUR):

  • AUR A – The UAV can sense and avoid dynamic obstacles.
  • AUR B – The UAV can sense and navigate around obstacles in its environment, but can also make determinations about perceived obstacles and how to approach them (see the sketch after this list).
    • Ex: The UAV can delineate "phantom" obstacles induced by dust while still avoiding real ones.
  • AUR C – The UAV can sense and navigate around obstacles in its environment, and uses that information to make determinations about how to adjust its mission objectives.
    • Ex: The UAV can identify people, doorways, and windows through dust and smoke, and uses that information to execute its mission.
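To make the AUR B example concrete, below is a minimal, generic sketch of one common way to delineate phantom obstacles: temporal persistence. Exyn has not published ExynAI's internals, so this is emphatically not their method, and the voxel_size and min_hits parameters are assumptions chosen for illustration. The intuition is that a solid obstacle reappears in the same place scan after scan, while drifting dust does not.

```python
import numpy as np
from collections import defaultdict

class PersistenceFilter:
    """Flag a voxel as a solid obstacle only after it has been
    occupied in several consecutive scans. Dust clouds drift and
    thin out between scans, so their voxels rarely stay occupied
    long enough to cross the persistence threshold."""

    def __init__(self, voxel_size: float = 0.2, min_hits: int = 3):
        self.voxel_size = voxel_size      # meters per voxel (assumed)
        self.min_hits = min_hits          # consecutive scans required
        self.hits = defaultdict(int)      # voxel index -> consecutive hits

    def update(self, points: np.ndarray) -> set:
        """Ingest one (N, 3) scan; return voxel indices considered solid."""
        occupied = {tuple(v) for v in
                    np.floor(points / self.voxel_size).astype(int)}
        # Increment counters for voxels seen this scan; counters for
        # voxels absent from this scan are implicitly reset to zero.
        self.hits = defaultdict(int, {v: self.hits[v] + 1 for v in occupied})
        return {v for v, n in self.hits.items() if n >= self.min_hits}
```

In a planner built around a filter like this, only the voxels it reports as solid would be treated as hard no-fly geometry, while the transient returns could still be logged and cleaned out of the final point cloud later.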

As you can see, the rungs of the autonomy ladder get taller with each step up. It's our belief that this framework will contribute to a future where robust autonomous robotic systems can tackle unknown obstacles and unleash human potential, rather than trapping us in the false promise of highly automated systems.

ExynAI interpreting and overcoming dust in a man-sized shaft

And now that the online autonomy stack inside every Nexys can perform in Autonomy Level 4B environments, we're confident our robots can help operators boldly send them where previous robots have struggled to go: deep into dusty mines to capture business-critical data sets for safety and industry compliance; beyond line of sight in the wilderness to inspect critical infrastructure, keeping environments safe and resources flowing; and through active construction projects, reliably capturing accurate, actionable, high-density 3D point clouds.

We understand and accept that these levels will continue to evolve as the landscape of autonomous robotics advances, and we encourage our fellow roboticists to help us craft these models for a future of responsible, reliable, and collaborative autonomous robots.

Book a personalized demonstration of Nexys in the field and experience the confidence of Level 4B Autonomy for yourself.