November 11, 2024

Multi-Sensor Fusion: Supercharging Online & Offline SLAM

The development of Simultaneous Localization & Mapping (SLAM) technology for autonomous robots is comparable to the invention of the transistor in the early days of electronics and computing. Both were groundbreaking technologies that opened up a world of possibilities previously confined to the realm of science fiction.

SLAM technology allows a robotic platform to understand its spatial location as it moves around an environment and then recreate a precise 3D rendering of that environment once it's done exploring. From driverless cars to our autonomous Nexys payload, SLAM is one of the key breakthroughs that make these technologies possible.

As SLAM technology improves, it’s paving the way for countless new use cases that further expand what’s possible with autonomous exploration. Below, we’ll look at how it's possible to incorporate additional sensors into Nexys to create 3D renderings that provide engineers, architects, and project managers with more critical data than ever before.

Adding Data Layers To Feature-Rich 3D Models

Most mobile mapping systems contain Light Detection and Ranging (LiDAR) sensors, which are used to digitally visualize the surrounding environment. Each sensor emits a set of laser beams that continually sweep the scene and return points wherever they strike a surface. On our Nexys platform, the sensor itself is also rotated to give the robot a 360° view of its environment while it navigates. All of this information is then post-processed to create a precise 3D rendering of the mapped environment, also referred to as a point cloud.
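To make that concrete, here is a minimal sketch (in Python with NumPy) of how raw spinning-LiDAR returns can be turned into world-frame points. The spherical-coordinate model and the `lidar_returns_to_points` helper are illustrative assumptions, not the actual ExynAI pipeline.

```python
import numpy as np

def lidar_returns_to_points(ranges, azimuths, elevations, sensor_pose):
    """Convert raw spherical LiDAR returns into world-frame XYZ points.

    ranges, azimuths, elevations: 1-D arrays of equal length (meters, radians).
    sensor_pose: 4x4 homogeneous transform from the sensor frame to the world frame,
        e.g. the pose estimated by SLAM at the moment the sweep was captured.
    """
    # Spherical -> Cartesian in the sensor frame
    x = ranges * np.cos(elevations) * np.cos(azimuths)
    y = ranges * np.cos(elevations) * np.sin(azimuths)
    z = ranges * np.sin(elevations)
    points_sensor = np.stack([x, y, z, np.ones_like(x)], axis=1)  # N x 4 homogeneous

    # Transform into the world frame
    points_world = (sensor_pose @ points_sensor.T).T
    return points_world[:, :3]

# Example: three returns from a sensor sitting 1 m above the world origin
pose = np.eye(4)
pose[2, 3] = 1.0
cloud = lidar_returns_to_points(
    ranges=np.array([5.0, 5.2, 4.8]),
    azimuths=np.radians([0.0, 120.0, 240.0]),
    elevations=np.radians([0.0, 5.0, -5.0]),
    sensor_pose=pose,
)
print(cloud)
```

Accumulating transformed sweeps like this, frame after frame, is what builds up the dense point cloud.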

And point cloud maps don't just end with LiDAR data. Nexys is able to ingest a wide variety of sensor information and store it alongside feature-rich point cloud data. We've successfully flown missions where ExynAI displays gas sensor readings at precise X, Y, Z locations in a point cloud. We like to call this combination of 3D sensor data multi-sensor fusion.
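As a rough illustration of what storing a reading alongside the point cloud could look like, here is a hypothetical `FusedReading` record that tags each gas measurement with the SLAM pose at the moment it was taken. The field names and units are assumptions made for this sketch.

```python
from dataclasses import dataclass

@dataclass
class FusedReading:
    """A sensor measurement tagged with the 3D position where it was taken."""
    x: float
    y: float
    z: float
    ppm: float          # gas concentration, parts per million
    timestamp: float    # seconds since mission start

def fuse_reading(pose_xyz, concentration_ppm, timestamp):
    """Attach a gas reading to the SLAM pose estimate at measurement time."""
    x, y, z = pose_xyz
    return FusedReading(x, y, z, concentration_ppm, timestamp)

# Example: a few readings taken along a flight path
readings = [
    fuse_reading((12.4, 3.1, 1.8), concentration_ppm=35.0, timestamp=10.2),
    fuse_reading((14.0, 3.3, 1.9), concentration_ppm=210.0, timestamp=11.7),
    fuse_reading((15.7, 3.6, 2.0), concentration_ppm=90.0, timestamp=13.1),
]
```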

Take, for example, an oil refinery in Texas experiencing an emergency: harmful gasses are beginning to leak, but the exact extent and location of the leak are not known. This presents a serious hazard for first responders as well as the public in the surrounding area.

Using an autonomous aerial drone or ground-based Spot robot equipped with our Nexys system, a sensor can be fitted to detect harmful gas concentrations at exact locations throughout the facility.
This data can then be overlaid on the 3D map of the refinery property, giving first responders precise information about gas concentrations in various areas. This helps them safely plan their movements as well as locate the origin of the leak.
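Continuing the hypothetical `FusedReading` sketch from above, a responder's tooling might then rank the fused readings to narrow down where a leak originates. The threshold and helper name below are illustrative, and a real workflow would interpolate concentrations across the whole cloud rather than just sorting point samples.

```python
def leak_candidates(readings, threshold_ppm=100.0):
    """Return fused readings above a hazard threshold, highest concentration first."""
    hot = [r for r in readings if r.ppm >= threshold_ppm]
    return sorted(hot, key=lambda r: r.ppm, reverse=True)

for r in leak_candidates(readings):
    print(f"Possible leak near ({r.x:.1f}, {r.y:.1f}, {r.z:.1f}) at {r.ppm:.0f} ppm")
```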

This is just one example, but it shows the flexibility of combining highly accurate 3D point clouds, such as those created by our Nexys platform, with data from other sensors.

GPS Flight For Featureless Environments

Increasing the number of sensors and data streams that our Nexys payload has access to also allows ExynAI to choose between them during online SLAM for more robust, safer, and continuous autonomous navigation.

Every autonomous robotic SLAM platform requires feature-rich surroundings to "maintain state", which is essentially how the robot knows where it is in its environment. In areas with poorly defined features, such as large open spaces or confined spaces with smooth walls, SLAM can get lost, "lose state", and most likely crash. And even if you capture SLAM data on foot or from a vehicle, the loss of state can leave the resulting point cloud too inaccurate to use.

Example of a robot losing state while in flight; the map has been re-aligned using GPS data to remove the fuzz.

In areas like this, when additional sensors are equipped, our ExynAI technology stack can intelligently switch to live GPS data. No operator intervention is needed; the system makes this choice on its own to obtain the most accurate map possible while on mission.
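The selection logic itself is ExynAI's, but a toy version of the idea might look like the sketch below: trust SLAM while the front end is tracking plenty of geometric features, and fall back to GPS when the scene goes featureless but the fix is strong. The feature-count metric, thresholds, and function name are assumptions for illustration only.

```python
def choose_pose_source(slam_feature_count, gps_fix_quality,
                       min_features=150, min_gps_quality=0.8):
    """Pick which localization stream to trust for the current update.

    slam_feature_count: number of geometric features the SLAM front end is tracking.
    gps_fix_quality: 0.0-1.0 confidence in the current GPS fix.
    Thresholds are illustrative; a real system would weigh covariance estimates.
    """
    if slam_feature_count >= min_features:
        return "slam"            # rich geometry: trust LiDAR SLAM
    if gps_fix_quality >= min_gps_quality:
        return "gps"             # feature-poor but open sky: fall back to GPS
    return "slam_degraded"       # neither is strong: flag reduced confidence

# Example: an open field with few LiDAR features but a solid GPS fix
print(choose_pose_source(slam_feature_count=40, gps_fix_quality=0.95))  # -> "gps"
```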

You can think of this internal Nexys decision-making as similar to researching a topic on Google. You look at several results and use logic and experience to determine which sources are the most reliable. By choosing the best sources, you reach your goal faster and ensure accuracy.

Our ExynAI system does the same thing with navigational data: it determines which sources will provide the most accurate point cloud in any situation. This can take place in real time or during post-processing.

This is perfect when the Nexys system is surveying a forest or wooded environment where vast open clearings sit between densely packed stands of trees. The Nexys system is capable of navigating both, seamlessly switching between LiDAR/SLAM and GPS sensor streams as it moves between landscapes.

Multi-Sensor Fusion for Search & Rescue

Multi-sensor fusion also allows for extended use cases such as search & rescue operations.

A thermal sensor can be equipped onto a Nexys-powered aerial drone or ground-based Spot robot. The thermal imagery captured during the search can then be precisely overlaid with geospatial data for a specific area, giving first responders more situational awareness.
If a thermal signature is found, first responders can focus on that specific area instead of conducting tedious grid searches. The result is that victims are rescued faster and first responders are put in less danger.
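As a simplified illustration of that overlay, the sketch below places a single thermal hotspot into the shared map frame using the platform's SLAM pose, a viewing direction, and a range (which could, for instance, be read off the LiDAR point cloud). All names and numbers here are hypothetical.

```python
import numpy as np

def georeference_hotspot(drone_position, bearing_vector, range_m):
    """Place a thermal detection into the shared map frame.

    drone_position: XYZ of the platform in the map frame (from SLAM).
    bearing_vector: direction from the camera toward the hotspot,
        already rotated into the map frame (normalized below).
    range_m: distance to the hotspot, e.g. taken from the LiDAR map.
    """
    bearing = np.asarray(bearing_vector, dtype=float)
    bearing /= np.linalg.norm(bearing)
    return np.asarray(drone_position, dtype=float) + range_m * bearing

# Example: hotspot spotted about 12 m ahead of and slightly below the drone
print(georeference_hotspot(
    drone_position=[40.0, -7.5, 15.0],
    bearing_vector=[1.0, 0.0, -0.2],
    range_m=12.0,
))
```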

Whatever the mission is, multi-sensor fusion technology is all about giving decision-makers the most accurate and comprehensive data possible.  

Experience Multi-Sensor Fusion With Nexys

At Exyn, we understand how hard it can be to discern fact from fiction, which is why we believe in demonstrating the power of our robotic autonomy in person for all our customers.
Seeing the Nexys system in action as it precisely and autonomously maps an area of interest is truly something straight out of a sci-fi movie. Once the mapping is complete, ExynAI post-processing takes over and seamlessly integrates multi-sensor input to create accurate, colorized, feature-rich 3D models for geospatial professionals across the construction, inspection, and mining industries.

Contact us today for your personalized Nexys demo and learn how multi-sensor fusion can change the way your business surveys in virtually any environment.
