As any operator on a job site can tell you, dust gets into everything. It breaks down parts, interferes with sensors, and is ever present no matter how hard you try to get rid of it. And now that 3D mapping drones are making their way onto construction sites and other industrial environments, you can bet their propellers will kick up quite a bit of dust too. Which causes problems for onboard navigational sensors. Or ... at least it used to.
Robots equipped with a Nexys can autonomously navigate using an online SLAM algorithm (you can read all about how SLAM works here). Essentially, if too many dust particles are picked up by the LiDAR sensor, SLAM will interpret them as a solid object, causing robots to get stuck in dust clouds, leading to wasted battery cycles or, worse, a crash.
A Nexys & DJI M300 in the field after a few dusty flights
This isn't just an Exyn problem, by the way. It's a major hurdle for SLAM-powered navigational platforms in general. Researchers have been working for years to help these platforms "see" dust in real time. However, when you tune the LiDAR data to filter out the dust, the robot then has a hard time seeing thin wires and smaller obstacles, which is potentially catastrophic!
However, at Exyn we’ve developed a proprietary machine learning algorithm that can now accurately detect dust clouds in real time and adjust automatically to ensure the area is mapped safely and accurately. And this Autonomy Level 4B (read more about the standardized levels of autonomy here) is now baked into every autonomy-enabled Nexys for all our customers.
You can think of a dust filter as a software version of polarized sunglasses you might wear during a road trip.
When driving toward the setting sun, glare can obscure the fine details of the road in front of you. Road signs and even other cars are often partially obscured by the glare. A dust cloud has a similar effect on how SLAM might interpret LiDAR data.
In a car, you would put on a pair of polarized sunglasses and suddenly those obscured details would become visible, giving you a clear image of the road ahead.
Nexys flying in rugged, dusty environments.
A dust filter for a LiDAR system is similar, but instead of a physical lens or filter like a pair of sunglasses, it's a computer algorithm that removes the "glare" by enabling ExynAI -- the package of proprietary SLAM algorithms that powers Nexys' autonomy & mapping -- to parse millions of point cloud data points and distinguish real objects from mere dust particles.
However, not all dust filters are created equal, and our new ExynAI dust filter has a few tricks up its sleeve that allow you to conduct digital mapping in more challenging environments.
We knew that our real-time dust filtering algorithm couldn't consume a ton of computational resources, or it would bog down the autonomy. We needed it to work quickly, let the robot navigate confidently, and, most importantly, leave the accuracy of the point cloud untouched once it's captured and processed.
So to get started, we took years of previous flights in dusty environments and fed them into a machine learning pipeline to teach ExynAI what dust looks like. This flight data helped us train a model that can better detect dust in real time.
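The post doesn't describe the model ExynAI actually uses, but the idea of learning a dust classifier from labeled flight data can be sketched with a minimal logistic-regression trainer. Everything here (the function name, features, labels, and hyperparameters) is a hypothetical stand-in for illustration only:

```python
import numpy as np

def train_dust_classifier(X, y, lr=0.5, epochs=3000):
    """Minimal logistic-regression trainer (illustrative stand-in
    for whatever model is actually used in production).

    X: (N, D) per-cluster features extracted from past flights
    y: (N,) labels, 1.0 = dust, 0.0 = solid structure
    Returns learned weights w and bias b.
    """
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid probabilities
        grad = p - y                             # gradient of log-loss
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b
```

In practice the interesting work is in the features and the labeled data, not the classifier itself; any lightweight model that evaluates quickly onboard would fit the same role.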
The red points are "dust" that Nexys is parsing in real time; the green cubes are voxels defining occupied space.
This efficiency allows for real-time dust filtering during autonomous exploration of mines and other hazardous environments that would pose significant safety and accuracy problems for less capable dust filtering systems.
The result is that you can now capture more accurate mapping data in dusty and obscured environments – saving time, money, and most importantly, keeping operators out of harm's way.
Our dust filter works in three phases to make it ultra-efficient while also providing the accuracy needed for autonomous exploration in difficult environments.
As ExynAI ingests LiDAR data during flight, it broadly identifies points that might be dust based on simple metrics. This has been a fairly standard approach in other navigational systems. And as the robot navigates new environments, researchers can tune this filter stage to better detect dust in novel conditions, but on its own this stage still filters out too much for the robot to navigate safely.
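The post doesn't specify which "simple metrics" this first stage uses, but a common heuristic in LiDAR processing is that dust returns tend to be close to the sensor and weakly reflective. A minimal sketch of such a broad first pass, with made-up threshold values, might look like this:

```python
import numpy as np

def flag_dust_candidates(points, intensity, max_range=2.0, intensity_thresh=0.15):
    """Broad first-pass dust flagging (illustrative thresholds only).

    points: (N, 3) array of LiDAR returns in the sensor frame
    intensity: (N,) normalized return intensity in [0, 1]

    Dust returns are typically near the sensor and dim, so flag
    points that are both close and weakly reflective as candidates.
    """
    ranges = np.linalg.norm(points, axis=1)
    return (ranges < max_range) & (intensity < intensity_thresh)
```

Note that this stage only nominates candidates; deciding which candidates are actually dust is left to the later stages, exactly because simple thresholds like these also catch thin wires and other small real features.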
At this point the robot knows what a dust particle might look like, but it doesn't know what the shape of a dust cloud could look like. That's where stage two comes in.
You can think of these stages like sifting sand through a screen. Stage one was broad and pulled in a lot of potential dust particles; now in stage two we feed them through the machine learning model we developed to show the robot what the "shape" of dust looks like.
This is the crucial step, because without this "shape" ExynAI would filter out thin wires or other small features, which could cause a crash or leave the robot stuck. This has been a historic hurdle for LiDAR-based SLAM platforms operating autonomously in dusty environments.
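One way to capture the "shape" of a candidate cluster, common in point-cloud literature though not necessarily what ExynAI does, is to look at the eigenvalues of the cluster's covariance: a diffuse dust puff scatters in all directions, while a thin wire is strongly linear. A sketch of such shape descriptors:

```python
import numpy as np

def shape_features(cluster):
    """Eigenvalue-based 'shape' descriptors for a candidate cluster.

    cluster: (N, 3) array of points flagged as potential dust.
    Returns [linearity, sphericity]; a dust puff is roughly
    spherical, while a thin wire is strongly linear, letting a
    learned model keep wires even when their individual points
    look dust-like.
    """
    cov = np.cov(cluster.T)
    evals = np.sort(np.linalg.eigvalsh(cov))[::-1]   # l1 >= l2 >= l3
    l1, l2, l3 = np.maximum(evals, 1e-12)            # guard against zeros
    linearity = (l1 - l2) / l1
    sphericity = l3 / l1
    return np.array([linearity, sphericity])
```

Features like these, fed to a trained classifier, are what let a filter reject a dust cloud as a whole without also erasing the wire running through it.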
The last step in the filtering process is how we enable the robot to do all of this immense calculation in the middle of an autonomous flight. While in flight, ExynAI takes point cloud data and interprets it through an occupancy mapping pipeline. This converts a group of points into a 3D box, or voxel (like a Minecraft brick), to define an obstacle and help the robot determine a safe flight corridor.
This adds further robustness to Nexys' autonomy because the occupancy pipeline runs thousands of times per second to determine each voxel's classification, helping eliminate spurious dust mis-detections that can cause a robot to get stuck in place.
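The voxel step above can be sketched in a few lines. This is a toy version with invented parameters, not Exyn's pipeline: points are binned into cubic voxels, and a voxel only counts as occupied once enough returns land in it, so a handful of stray dust hits never block a flight corridor on their own.

```python
import numpy as np

def voxelize(points, voxel_size=0.2, min_hits=3):
    """Toy occupancy voxelization (illustrative parameters).

    points: (N, 3) array of filtered LiDAR returns in meters.
    Returns the set of integer (i, j, k) voxel keys that received
    at least min_hits returns, i.e. voxels treated as obstacles.
    """
    keys = np.floor(points / voxel_size).astype(np.int64)
    uniq, counts = np.unique(keys, axis=0, return_counts=True)
    return {tuple(k) for k, c in zip(uniq, counts) if c >= min_hits}
```

Requiring multiple hits per voxel is one simple way a high-rate occupancy pipeline can stay robust to the occasional dust point that slips past the earlier filter stages.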
While testing this filter in the field and with our customers, we saw operators become increasingly confident sending Nexys on missions well beyond visual line of sight, knowing the robot would return safely. This level of confidence in autonomous robots in rugged environments will further drive adoption as surveying professionals experience the reliability and robustness firsthand.
So if you're tired of dust determining what works and what doesn't on your job site, the new advanced filtering included in Nexys autonomy could be the solution you've been waiting for.
Contact us today to learn more about the Nexys portable mapping and modular autonomy ecosystem and book a personalized demo for yourself and your team.