SLAM: Redefining precise localization and mapping for autonomous systems

Date: August 16, 2018

Background

Previously known as SMAL, and now more pleasingly termed SLAM, it is a method of identifying the precise location of an object in an environment while simultaneously mapping that environment. The past decade has seen a lot of traction in robotics and autonomous systems, which find application in numerous fields such as medical, automotive, military, space, and entertainment. In these applications, autonomous systems and robots are expected to complete complicated tasks like navigation, both indoor and outdoor, under different climatic conditions. To perform these tasks, having precise location information is a major requirement, and a variety of past and ongoing research efforts address this localization problem.

“Most existing devices which require location details use gyroscopes and odometry to identify the location in real time. Location-dependent applications like autonomous cars, robots, and AR/VR devices look for an alternative which is more dependable and accurate.”

Odometry – the most prevalent mechanism to find real-time location

Odometry is the use of data from encoders and actuators to identify or estimate a change in location. The major drawback of this approach is that it is error-prone: mistakes in velocity estimation, data collection, and system calibration accumulate, and all three must be handled carefully to use the information effectively when identifying a new location.

Some common Actuators and encoders

  • Actuators - systems that generate motion, such as electric motors.
  • Rotary encoders - encoders which convert the rotary motion of a shaft or axle into digital output signals (based on their working principle, they are further classified into mechanical and optical).

Illustration

Consider a robot which moves forward and backward. Each rotary motor has a wheel encoder which identifies whether a rotation is complete and in which direction. If the left-side wheels rotate while the right ones stay stationary, the robot traces a right arc (and similarly for a left arc). The position of the wheels after tracing the arc is identified, the new coordinates are compared with the initial coordinates (the start position), and the new location is mapped by the system.
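The wheel-encoder illustration above can be sketched as dead reckoning for a differential-drive robot. All parameter values below are illustrative assumptions, not measurements from any particular robot:

```python
import math

# Hypothetical parameters for a differential-drive robot.
WHEEL_RADIUS = 0.05   # metres
WHEEL_BASE = 0.30     # distance between left and right wheels, metres
TICKS_PER_REV = 360   # encoder ticks per full wheel revolution

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Dead-reckon a new pose (x, y, heading) from encoder tick counts."""
    dist_per_tick = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = left_ticks * dist_per_tick
    d_right = right_ticks * dist_per_tick
    d_center = (d_left + d_right) / 2          # forward distance of robot centre
    d_theta = (d_right - d_left) / WHEEL_BASE  # change in heading, radians
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    theta += d_theta
    return x, y, theta

# Right wheels turning while the left stay still traces a left arc:
pose = update_pose(0.0, 0.0, 0.0, 0, 180)
```

Note how every update builds on the previous estimate; any error in the wheel parameters or tick counts compounds step after step, which is exactly the odometry drift problem described above.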

What is SLAM?

Hugh Durrant-Whyte and John J. Leonard developed SLAM, an acronym for simultaneous localization and mapping. SLAM addresses the issues that arise from using odometry alone, making the mapping of unknown environments and the identification of location more accurate and precise.

Components of SLAM

  • Sensory input
  • Landmark extraction
  • Data association
  • State estimation
  • Landmark update
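Read as stages of one loop, the components above can be sketched in a toy one-dimensional form. Every function, threshold, and number here is an illustrative stand-in, not a real SLAM API:

```python
# Toy pass through the SLAM components in a 1-D world.
# Scan entries are (range, signal_strength) pairs.

def extract_landmarks(scan):
    """Landmark extraction: keep only strong reflections from the scan."""
    return [r for r, strength in scan if strength > 0.5]

def associate(observed, known, tolerance=0.5):
    """Data association: match observations to already-mapped landmarks."""
    matches, fresh = [], []
    for obs in observed:
        close = [k for k in known if abs(k - obs) < tolerance]
        (matches if close else fresh).append(obs)
    return matches, fresh

def slam_step(pose, known, scan, odometry):
    pose += odometry                        # state estimation (plain dead reckoning here)
    ranges = extract_landmarks(scan)        # sensory input -> landmark extraction
    observed = [pose + r for r in ranges]   # landmark positions in world frame
    matches, fresh = associate(observed, known)
    known.extend(fresh)                     # landmark update: map newly seen landmarks
    return pose, known

pose, known = slam_step(0.0, [2.0], [(2.1, 0.9), (5.0, 0.8), (3.0, 0.2)], 0.0)
```

In a real system the state-estimation stage would fuse the matched landmarks back into the pose estimate (for example with an EKF, as described in the next section) rather than trusting odometry alone.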

Working of SLAM

The extended Kalman filter variant, EKF SLAM, is predominantly used because of its feature-based working, and the same is used for the illustration here. The flow chart below explains the flow of data from sensor input to new observation updates. Sensors are the first in line in any system that uses SLAM; there are many types of sensors, as listed later in this post. The EKF (extended Kalman filter) is used to update the location estimate.

Fig. Data flow in EKF SLAM

For illustration purposes, consider a laser-based sensor on a robot placed indoors. The laser initially scans the environment for objects and identifies the position of the robot. Odometry changes are noted and the EKF updates the estimate accordingly. The robot then identifies its position based on the mapping done previously; if a landmark has not been observed before, the new scan is added to the map. At any point in time, the EKF maintains a clear picture of the position of the robot.

Fig. Working of EKF SLAM in a closed environment
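As a minimal sketch of the predict/update cycle at the heart of EKF SLAM, the following reduces the problem to one dimension with a single already-mapped landmark. All variances, positions, and names are illustrative assumptions, not a full implementation:

```python
# 1-D EKF cycle: odometry drives the prediction, a range measurement
# to a known landmark corrects it. All values are illustrative.
LANDMARK = 10.0   # mapped landmark position, metres
ODOM_VAR = 0.04   # variance added by each odometry step
MEAS_VAR = 0.01   # variance of the range sensor

def ekf_step(x, p, odom_delta, measured_range):
    # Predict: apply odometry; uncertainty grows.
    x_pred = x + odom_delta
    p_pred = p + ODOM_VAR
    # Update: compare predicted range to the landmark with the measurement.
    predicted_range = LANDMARK - x_pred
    innovation = measured_range - predicted_range
    h = -1.0                          # Jacobian of h(x) = LANDMARK - x
    s = h * p_pred * h + MEAS_VAR     # innovation covariance
    k = p_pred * h / s                # Kalman gain
    x_new = x_pred + k * innovation
    p_new = (1 - k * h) * p_pred
    return x_new, p_new

# The robot believes it moved 1.0 m, but the sensor says the landmark
# is 8.9 m away, implying the robot is actually nearer to x = 1.1.
x, p = ekf_step(0.0, 0.09, 1.0, 8.9)
```

The corrected estimate lands between the odometry prediction and the sensor's implied position, weighted by their uncertainties, and the position variance shrinks after the update. This is the mechanism by which the EKF keeps a "clear picture" of the robot's position.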

SLAM VS GPS

GPS - initially developed for military use - is now found across various consumer products and everyday technology. However, GPS technology has significant disadvantages.

“GPS doesn’t work well indoors. Even outdoors, it is accurate only up to a few meters. This kind of accuracy is too low for autonomous applications.”

To address these shortcomings, SLAM is increasingly applied in such applications, where it outperforms GPS in several respects:

  • SLAM works with many types of sensors, such as laser and camera (the visual variant is also termed V-SLAM)
  • Indoor mapping and location identification are easier with SLAM
  • SLAM can produce 3D images of the surroundings

SLAM techniques/ sensors

It is established that SLAM is better than GPS technology for precise localization and mapping. Application of SLAM varies depending on the type of sensor it uses to map. Some of them are -

  • Acoustic sensors - used for underwater applications, where laser and visual sensors lack accuracy. Sonar sensors provide better resolution in subsea environments, but sonar depth information is harder to interpret. Ultrasonic sensors are among the cheapest alternatives and work with most types of surfaces, as long as the surface reflects sound. However, ultrasonic sensors are sensitive to environmental factors and have a slow response rate, which limits their use.
  • Laser - the most sought-after sensor for SLAM. Laser sensors provide high accuracy both indoors and outdoors, and their high speed and precision yield highly accurate distance measurements.
  • Stereo vision sensors - vision/camera-based sensors used to obtain a 3D picture of a location. Stereo cameras give the ability to obtain textured, depth-bearing images that monocular cameras cannot easily provide: they recover 3D information from pairs of 2D images. Multiple stereo cameras, or a single stereo camera mounted on a rotating platform, can be used to achieve this.
  • RGB-D sensors - sensors which combine RGB image sensing with depth/distance measurement. RGB-D sensors that obtain depth through stereo image matching rather than a purely infrared approach are termed active stereo. These sensors can be less accurate indoors because they rely more on matching images to recover depth.
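The stereo-vision entry above relies on triangulation: depth is recovered from the horizontal disparity between the two camera images, via Z = f·B/d. A minimal sketch, with purely illustrative camera parameters:

```python
# Depth from stereo disparity: Z = f * B / d, where f is the focal
# length in pixels, B the baseline between the two cameras, and d the
# disparity (pixel shift of the same point between the left and right
# images). Parameter values below are illustrative assumptions.
FOCAL_PX = 700.0   # focal length, pixels
BASELINE = 0.12    # camera separation, metres

def depth_from_disparity(disparity_px):
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity or bad match")
    return FOCAL_PX * BASELINE / disparity_px

# A nearby object shifts more between the two images than a far one:
near = depth_from_disparity(84.0)   # 1.0 m
far = depth_from_disparity(8.4)     # 10.0 m
```

Because depth is inversely proportional to disparity, accuracy degrades quadratically with distance, which is one reason stereo SLAM works best at close to medium range.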

Application of SLAM

Robotics-industrial

On production lines, humans work closely with robots and autonomous vehicles that assist in production activities. However, most of these robots follow marked pathways and stop if their sensors detect an object in the path. The same robots are used in warehouse operations, travelling across the warehouse stacking goods. All of this changes when a robot is equipped with SLAM: instead of being confined to a marked path, it can travel freely and interact with humans.

Autonomous car

One of the major hurdles for autonomous cars is location identification. Currently, most cars rely on GPS and on sensing nearby objects to predict collisions. Using SLAM in place of a GPS-based system not only increases the accuracy of identifying the car's position but also that of the surrounding objects.

AR/VR

Understanding the surroundings is one of the major factors in an immersive AR/VR experience, and SLAM is used to build that understanding. First-generation AR headsets used marker-based solutions, so they could map only the floor rather than the complete surroundings. More recent SLAM technology such as Google Tango uses feature points to create a mesh of the surroundings, so that not only the floor but also the side walls are identified.

UAV

A drone equipped with SLAM can map its environment, indoors and outdoors, and navigate the free space without GPS guidance. As a result, there is no need for a dedicated flight path, and the UAV can concentrate on other tasks such as surveillance and mapping, understanding its environment and flying freely with little human interaction.

Planetary rovers

As early as 1997, the Mars Pathfinder expedition used SLAM techniques to manoeuvre on the surface of Mars. Planetary rovers rely on vision-based SLAM to identify the surface and surroundings of the planetary body and plan their movement accordingly.

Problems in implementation of SLAM

  • Dynamic object identification - SLAM estimates position in an environment based on what is assumed to be static or stationary. But what if a moving, non-stationary object interferes with the motion? Identifying such objects is important, even if they are excluded from localization.
  • Weather and climatic conditions - this drawback depends mainly on the type of sensor used, each of which has its own operating limitations.
  • Multiple algorithm implementations needed for indoor versus outdoor use.

Conclusion

SLAM is not a new technique still being proven; it is a practised technique with demonstrated accuracy and reliability. With the advent of new and sophisticated sensors and sensor fusion, as well as the improved horsepower of today's chipsets, the application of SLAM grows wider by the day, from autonomous cars to planetary rovers.

Further reading

SLAM

SLAM in planetary rovers

Extended Kalman filter (EKF)

ADAS

Computer vision
