Top common ways to localize a mobile robot

Here are the most popular devices you can use for robot localization.

May 17, 2023

by

Aleksandra Szczepaniak

GPS-generated map
The red mark is our office in Wrocław, Poland ;)

In today’s robotics, robot localization is a process crucial for many applications, and it can be performed with a wide range of methods. In this article, we’re going to zero in on the most typical tools used for this purpose. Let’s dive right in.

Robot localization in short

Localization can be divided into two types: local and global. Local methods require the starting location of the robot to be approximately known and, normally, they’re unable to recover if they lose track of the robot’s position. Devices used for local localization include IMUs, encoders, LiDARs, ultrasonic sensors, and onboard cameras.

Global techniques, on the other hand, can locate a robot without prior knowledge of its position. This makes them more powerful than local methods, as they can handle situations in which the robot encounters serious positioning errors. Global localization of a robot can be performed with tools such as GPS, beacons, radars, or off-board cameras.

Now, let’s take a look at each of these devices.

IMU

One of the devices used for robot localization and navigation is the Inertial Measurement Unit, aka IMU. IMUs usually incorporate gyroscopes and accelerometers, and sometimes also barometers and magnetometers. An IMU allows you to obtain parameters of the robot such as its pose, angular and linear velocity, and angular and linear acceleration. However, IMUs are prone to biases, drift, and other errors.

LeoCore controller board
IMU (marked in red) in the LeoCore controller board

An IMU is fundamental in almost any robotic system and delivers a vast range of information about the robot’s motion, which makes it highly useful for navigation. A typical commercially available IMU provides a high update rate (200 Hz+), but the accelerometer is usually too noisy to be useful for position estimation on its own. You can still use it for tight, short-term pose correction by integrating the readings over a brief period.
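To give you a feel for why the integration window has to be short, here’s a minimal dead-reckoning sketch in Python. The readings, frame conventions, and sample rate are hypothetical, and this is not Leo Rover firmware code:

```python
import math

def integrate_imu(pose, vel, readings, dt):
    """Dead-reckon a planar pose from IMU samples.

    pose = (x, y, yaw) and vel = (vx, vy) in the world frame;
    readings = list of (ax_body, ay_body, yaw_rate) samples taken every dt seconds.
    """
    x, y, yaw = pose
    vx, vy = vel
    for ax_b, ay_b, yaw_rate in readings:
        yaw += yaw_rate * dt                       # integrate gyro -> heading
        # rotate body-frame acceleration into the world frame
        ax = ax_b * math.cos(yaw) - ay_b * math.sin(yaw)
        ay = ax_b * math.sin(yaw) + ay_b * math.cos(yaw)
        vx += ax * dt                              # integrate acceleration -> velocity
        vy += ay * dt
        x += vx * dt                               # integrate velocity -> position
        y += vy * dt
    return (x, y, yaw), (vx, vy)
```

Because any accelerometer bias gets integrated twice, the position error grows quickly with time, which is exactly why IMU data is normally fused with other sensors rather than used alone.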

Encoders

Another source of information crucial for robot localization is wheel encoders, typically used for odometry. Encoders estimate the distance the robot has traveled by counting the exact number of rotations of the robot’s wheels. The data they gather is continuous and usually available at high frequency.

Leo Rover's motor with encoder
The Leo Rover’s wheel motor with a magnetic encoder

This method raises many challenges, though, as the data might not always be reliable. For example, there’s always some tire slippage with wheeled rovers, which needs to be taken into account. As a result, the data gathered through the wheel encoders may be far from the robot’s actual motion, making the position estimates useless, so it’s best to combine them with other sensors. Wheel odometry also performs poorly with differential steering when the robot makes zero-radius turns.
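If you’re curious what the tick-to-pose conversion looks like in practice, here’s a minimal differential-drive odometry sketch in Python. The encoder resolution, wheel radius, and track width below are illustrative placeholders, not the actual Leo Rover parameters:

```python
import math

TICKS_PER_REV = 4096          # hypothetical encoder resolution
WHEEL_RADIUS = 0.0625         # meters, hypothetical
WHEEL_SEPARATION = 0.36       # meters, hypothetical

def update_pose(pose, left_ticks, right_ticks):
    """pose = (x, y, yaw); tick counts accumulated since the last update."""
    x, y, yaw = pose
    d_left = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d_center = (d_left + d_right) / 2.0            # forward travel of the robot center
    d_yaw = (d_right - d_left) / WHEEL_SEPARATION  # change in heading
    x += d_center * math.cos(yaw + d_yaw / 2.0)
    y += d_center * math.sin(yaw + d_yaw / 2.0)
    return (x, y, yaw + d_yaw)
```

Note that slippage isn’t modeled at all here, which is precisely why pure odometry drifts.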

GPS

A GPS module on a Leo Rover mobile robot

Another device commonly used for robot localization is GPS. The information GPS provides includes longitude, latitude, and altitude. Off-the-shelf GPS sensors report position with an accuracy of one to twenty meters. The accuracy of the Global Positioning System has notably improved over the years, though, and there are also very precise modes such as RTK (Real-Time Kinematic). RTK corrections greatly enhance accuracy to the decimeter level. However, setting up an RTK base station or ensuring constant internet connectivity for NTRIP (a protocol for delivering RTK corrections over the Internet) is not always feasible.

Areas such as tunnels, indoor spaces, and the like pose a serious obstacle for GPS. Still, GPS is highly affordable, enables perception-less robotic navigation outdoors, and yields an accurate periodic position reference.
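As a rough illustration of how a GPS fix becomes usable for a planar robot, here’s a sketch that converts latitude/longitude into local metric offsets relative to a reference point (for instance, where the rover was switched on). It uses a flat-Earth approximation, which is fine over a few hundred meters; the coordinates are illustrative:

```python
import math

EARTH_RADIUS = 6371000.0  # mean Earth radius in meters

def gps_to_local(lat, lon, ref_lat, ref_lon):
    """Return (east, north) offsets in meters from the reference point."""
    lat_r, lon_r = math.radians(lat), math.radians(lon)
    ref_lat_r, ref_lon_r = math.radians(ref_lat), math.radians(ref_lon)
    x = EARTH_RADIUS * (lon_r - ref_lon_r) * math.cos(ref_lat_r)  # east
    y = EARTH_RADIUS * (lat_r - ref_lat_r)                        # north
    return x, y

# Example: a fix roughly 100 m north-east of the reference point
print(gps_to_local(51.1088, 17.0397, 51.1079, 17.0384))
```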

You might want to look into the GPS module we provide in our online store ;).

Beacon-based localization systems

Beacon-based localization system scheme

Robot localization can also be performed with beacon-based localization systems that rely on, well, beacons that are strategically planted within a particular space and send signals to your device.

Beacons are typically used in Indoor Positioning Systems (IPS), allowing robots to be located and navigated in areas where GPS can’t reach, such as indoors. Beacon-based systems handy in IPS include, for example, WiFi beacons and Bluetooth Low Energy (BLE) ones.
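To show the idea behind beacon ranging, here’s a simple 2D trilateration sketch. The beacon layout and measured distances are made up for illustration, and real IPS solutions (such as Marvelmind’s) do considerably more filtering and use more beacons:

```python
def trilaterate(b1, b2, b3, r1, r2, r3):
    """b* = (x, y) known beacon positions, r* = measured distances to each beacon."""
    (x1, y1), (x2, y2), (x3, y3) = b1, b2, b3
    # subtracting the circle equations pairwise yields two linear equations in x, y
    a1, b1_, c1 = 2*(x2 - x1), 2*(y2 - y1), r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2_, c2 = 2*(x3 - x2), 2*(y3 - y2), r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2_ - a2 * b1_
    x = (c1 * b2_ - c2 * b1_) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Example: beacons in three corners of a 10 x 10 m shed, robot near the center
print(trilaterate((0, 0), (10, 0), (0, 10), 7.07, 7.07, 7.07))  # -> roughly (5.0, 5.0)
```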

A beacon-based localization system, specifically Marvelmind’s IPS, is used by FLOX in poultry farms to localize a Leo Rover that moves through a chicken shed. Find more details on the mobile robot applied in poultry farms in this article.

Range finder sensors

LiDARs

Most of the time, localization of a mobile robot is performed with the use of laser sensors.

A popular example here is LiDAR, about which you can read in more detail in this post. Also, check out our online store to see the LiDAR we offer.

A LiDAR on the Leo Rover robotic platform

Depending on the environment the robot operates in, that is, indoor or outdoor, as well as the robot’s speed, laser sensors can vary widely in performance, price, range, weight, and durability. The majority of such devices are based on the time-of-flight principle in which the distance between a sensor and an object is determined by measuring the time needed for the signal to travel from the source to the object and back.

Signal processing is carried out to output points as range readings at fixed angular increments. For this task, both 2D and 3D lasers come in handy. The sensors stream a lot of data, one range reading per laser point, so considerable computing power is needed to make the most of them, yet it’s still less demanding than stereo vision.
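As a small illustration, here’s what the very first processing step typically looks like: converting a 2D scan of ranges at fixed angular increments into Cartesian points in the robot frame. The scan values below are made up:

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, range_max):
    """Convert a list of range readings into (x, y) points in the sensor frame."""
    points = []
    for i, r in enumerate(ranges):
        if 0.0 < r < range_max:                  # drop invalid or out-of-range readings
            angle = angle_min + i * angle_increment
            points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Example: a fake 4-beam scan covering -45 to +45 degrees; the last beam is out of range
print(scan_to_points([1.0, 1.2, 1.1, 5.0], -math.pi / 4, math.pi / 6, 4.0))
```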

Ultrasonic sensors

Ultrasonic sensors are another kind of device that comes in handy in robot localization and navigation. If there’s an object within the range of an ultrasonic sensor’s pulse, some or all of the pulse will be bounced back to the transmitter as an echo and detected by the receiver.

Commercially available ultrasonic sensors perform measurements through the aforementioned time-of-flight principle. Measuring the echo’s round-trip time is enough to calculate the distance to the object. However, this kind of information isn’t easy to interpret. The reason is the wide beam of the emitted signal (from 20° to 50°), which makes it hard to tell exactly where the echo came from.

A common example of an ultrasonic ranging system is sonar.
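The arithmetic behind the measurement is simple; here’s a sketch with illustrative numbers:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def echo_time_to_distance(round_trip_s):
    """Distance to the object: half the round trip times the speed of sound."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

# A 5.8 ms echo corresponds to an obstacle roughly 1 m away
print(echo_time_to_distance(0.0058))  # ~0.99 m
```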

Radars

A tool that can also be useful in robot localization is radar. The name “radar” is an acronym for Radio Detection And Ranging, and it’s a way of using a directed radio wave to establish the distance to a given object, how fast it’s moving, and its approximate size and shape. Radars reflect radio waves off the robot’s surroundings to measure the distance to any radar-reflective object they’re aimed at.

Unlike most other sensors, radars can operate in adverse weather conditions, including fog, rain, snow, and changes in lighting. Consequently, this type of sensor can effectively detect obstacles and image the area around the moving robot in scenarios where, for example, LiDARs would fail. Their resolution, however, is inferior to that of laser sensors.
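For illustration, here’s the basic arithmetic a radar relies on: range from the echo delay and radial velocity from the Doppler shift. The carrier frequency and timings below are illustrative:

```python
SPEED_OF_LIGHT = 3.0e8  # m/s

def radar_range(round_trip_s):
    """Distance to the target from the echo's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

def doppler_velocity(freq_shift_hz, carrier_hz):
    """Radial velocity from the Doppler shift; positive means the target is approaching."""
    return SPEED_OF_LIGHT * freq_shift_hz / (2.0 * carrier_hz)

print(radar_range(2.0e-7))               # 30 m away
print(doppler_velocity(533.0, 24.0e9))   # ~3.3 m/s toward a 24 GHz radar
```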

Radars are commonly used in military applications to detect and track distant objects before they pose a threat.

Vision sensors

The Leo Rover’s front camera

Crucial in robot localization and navigation are also vision sensors. They include all kinds of cameras, both 2D and 3D, as well as depth cameras. We’re going to focus on stereo cameras, RGBD cameras, and optical flow.

Stereo-vision and RGBD cameras

Another solution for robot localization is a stereo-vision camera. It’s equipped with at least two image sensors (lenses) that allow it to mimic human binocular vision, giving it depth perception. Such cameras can gather 3D coverage data at high frame rates, as well as capture textures and colors that laser scanners can’t fully detect. Stereo-vision cameras tend to be among the least sensitive to object material, background, or different lighting conditions, which makes them great for outdoor localization and mapping applications.
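To make the depth-perception part concrete, here’s a sketch of the depth-from-disparity relation a stereo camera relies on: a feature seen at slightly different horizontal pixel positions in the two images maps to depth Z = f * B / disparity. The focal length and baseline are hypothetical, not values from any specific camera:

```python
FOCAL_PX = 700.0     # focal length in pixels (hypothetical)
BASELINE_M = 0.075   # distance between the two lenses in meters (hypothetical)

def disparity_to_depth(x_left, x_right):
    """Depth of a feature from its horizontal pixel positions in the left/right images."""
    disparity = x_left - x_right
    if disparity <= 0:
        return float("inf")          # feature at infinity or a matching error
    return FOCAL_PX * BASELINE_M / disparity

print(disparity_to_depth(412.0, 377.0))  # 35 px disparity -> ~1.5 m
```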

As for an RGBD camera, it’s a type of camera that enhances the effectiveness of depth-sensing camera systems by allowing for object recognition. It provides both color (RGB) and depth (D) data as real-time output. Depth information can be obtained from a depth map/image created by a 3D depth sensor, such as a time-of-flight or stereo sensor. An RGBD camera combines pixel-by-pixel RGB data and depth information, delivering both in one frame.

Adding RGB data to the point cloud or depth map of a 3D depth-sensing camera makes locating objects through pattern recognition or detection more effective (FYI, object detection is one of the three new software features in the stock Leo Rover, about which you can read here). This comes in handy especially in applications that need to determine the nature and type of objects in the scene and measure their depth.
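As a rough sketch of how the two data streams get merged, here’s a pinhole-model back-projection that turns a depth image plus an RGB image into a colored point cloud. The intrinsics are placeholders; a real camera driver provides them in its calibration data:

```python
def rgbd_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    """depth: 2D list of meters (0 = no data); rgb: 2D list of (r, g, b) tuples.

    fx, fy are focal lengths in pixels, cx, cy is the principal point.
    Returns a list of (x, y, z, r, g, b) points in the camera frame.
    """
    cloud = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z > 0:
                x = (u - cx) * z / fx
                y = (v - cy) * z / fy
                cloud.append((x, y, z) + rgb[v][u])
    return cloud
```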

Check out this article on top affordable cameras you can use with your rover.

Optical flow

Another way to localize a robot with visual sensors is optical flow. In short, this approach involves tracking characteristic features, of known size or at a known distance, across consecutive images and estimating the camera’s shift relative to the motionless surroundings.
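If you’d like to experiment with this yourself, here’s a minimal sketch using OpenCV’s Lucas-Kanade tracker to estimate the average pixel shift between two consecutive grayscale frames. The frame variables are placeholders, and scaling the pixel shift to meters additionally requires knowing the camera height or the distance to the tracked surface:

```python
import cv2
import numpy as np

def mean_flow(prev_gray, next_gray):
    """Average pixel displacement of tracked features between two grayscale frames."""
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=7)
    if prev_pts is None:
        return None
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray,
                                                   prev_pts, None)
    good = status.flatten() == 1
    if not good.any():
        return None
    # average displacement of successfully tracked features, in pixels
    return np.mean(next_pts[good] - prev_pts[good], axis=0)
```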

Localization through optical flow is commonly used in drones. Interestingly, it’s nothing out of the ordinary to use mouse sensors in drones for this purpose.

To gain more insight on optical flow for robot localization, check out this article. Also, you might want to read this paper to learn about a specific optical flow application.

Localize your mobile robot

There you have it: the most common ways to localize a mobile robot. But bear in mind that, since nothing is perfect, it’s best to combine some of these tools so that they complement one another and you end up with the most reliable data. For example, check out our tutorial on how to execute SLAM and autonomous navigation on your Leo Rover with IMU and LiDAR sensors.

Want to see more posts like that?

Subscribe to stay informed.
