I've been having fun playing around with the Gazebo simulator for autonomous robot exploration. One thing I've noticed is that the odometry data provided by Gazebo is so perfect that it sometimes makes the simulation less realistic. I used a Turtlebot model as the robot in my simulations. Googling around, I didn't find any solution for adding noise to this robot's odometry data (via the URDF file), so I decided to develop a dedicated ROS node that adds random noise to Gazebo's odometry data.

First things first, we need to understand the robot motion model. There are many motion models, but in the scope of this article we focus only on the odometry motion model. Odometry is usually obtained by integrating sensor readings from wheel encoders; it measures the relative motion of the robot between times \(t-1\) and \(t\), i.e. over the interval \((t-1,t]\). In a 2D environment, a robot pose is represented by a point \((x,y)\) and an orientation (rotation angle) \(\theta\), so the robot poses at times \(t-1\) and \(t\) are denoted by:

$$p_{t-1}=(x_{t-1},y_{t-1},\theta_{t-1})$$

$$p_{t}=(x_t,y_t,\theta_t)$$

The robot's motion is shown in the following figure:

(image source: Internet)

The motion consists of a first rotation \(\delta_{rot1}\) at the previous pose, a translation \(\delta_{trans}\), and a second rotation \(\delta_{rot2}\) at the new pose. The motion is denoted by:

$$u_t=(\delta_{rot1},\delta_{trans},\delta_{rot2})$$

In the ideal case where the odometry data is perfect, these values can be calculated as:

$$\delta_{rot1}=\operatorname{atan2}(y_t-y_{t-1},\ x_t-x_{t-1})-\theta_{t-1}$$

$$\delta_{trans}=\sqrt{( x_t-x_{t-1})^2+( y_t-y_{t-1})^2}$$

$$\delta_{rot2}=\theta_{t}-\theta_{t-1}-\delta_{rot1}$$
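To make these formulas concrete, here is a small sketch (separate from the ROS node later in this post; the function names are my own) that decomposes the motion between two poses into \((\delta_{rot1},\delta_{trans},\delta_{rot2})\) and checks that re-applying the deltas recovers the new pose:

```
from math import atan2, sqrt, sin, cos, pi

def decompose_motion(p_prev, p_new):
    """Split the motion from p_prev to p_new (each (x, y, theta))
    into (rot1, trans, rot2) per the odometry motion model."""
    x0, y0, th0 = p_prev
    x1, y1, th1 = p_new
    rot1 = atan2(y1 - y0, x1 - x0) - th0
    trans = sqrt((x1 - x0)**2 + (y1 - y0)**2)
    rot2 = th1 - th0 - rot1
    return rot1, trans, rot2

def apply_motion(p_prev, rot1, trans, rot2):
    """Apply (rot1, trans, rot2) to p_prev to get the new pose."""
    x0, y0, th0 = p_prev
    return (x0 + trans*cos(th0 + rot1),
            y0 + trans*sin(th0 + rot1),
            th0 + rot1 + rot2)

# round-trip check: decomposing and re-applying recovers the new pose
p_prev = (1.0, 2.0, 30.0*pi/180.0)
p_new = (1.2, 2.35, 10.0*pi/180.0)
r1, t, r2 = decompose_motion(p_prev, p_new)
print(apply_motion(p_prev, r1, t, r2))  # ~ (1.2, 2.35, 0.1745)
```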

However, in real situations there is always noise in these rotations and the translation. This noise can be modeled as additive Gaussian (normally distributed) noise. Let \(N(\mu,\sigma^2)\) be the normal distribution with mean \(\mu\) and variance \(\sigma^2\); we can add noise to each component of the motion as follows:

$$\hat{\delta}_{rot1}=\delta_{rot1}+N(0,\sigma_{rot1}^2)$$

$$\hat{\delta}_{rot2}=\delta_{rot2}+N(0,\sigma_{rot2}^2)$$

$$\hat{\delta}_{trans}=\delta_{trans}+N(0,\sigma_{trans}^2)$$

with the standard deviation of each noise term given by:

$$\sigma_{rot1}=\alpha_1|\delta_{rot1}|+\alpha_2\delta_{trans}$$

$$\sigma_{rot2}=\alpha_1|\delta_{rot2}|+\alpha_2\delta_{trans}$$

$$\sigma_{trans}=\alpha_3\delta_{trans}+\alpha_4(|\delta_{rot1}|+|\delta_{rot2}|)$$

Here \(\alpha_1,\alpha_2,\alpha_3,\alpha_4\) are tuning parameters controlling how strongly the noise in each component affects the motion; they represent error in *degree/degree*, *degree/meter*, *meter/meter* and *meter/degree*, respectively. With the added noise, the new pose of the robot \(\hat{p}_t=(\hat{x}_t,\hat{y}_t,\hat{\theta}_t)\) can be calculated as:

$$\hat{x}_t=x_{t-1}+\hat{\delta}_{trans}\cos(\theta_{t-1}+\hat{\delta}_{rot1})$$

$$\hat{y}_t=y_{t-1}+\hat{\delta}_{trans}\sin(\theta_{t-1}+\hat{\delta}_{rot1})$$

$$\hat{\theta}_t=\theta_{t-1}+\hat{\delta}_{rot1}+\hat{\delta}_{rot2}$$
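Putting the decomposition, the noise, and the pose update together, a single noisy motion step can be sketched as a small function (a minimal illustration; the function name and signature are my own, not from any library):

```
from math import atan2, sqrt, sin, cos
import numpy as np

def sample_motion(p_prev, p_new, alphas, rng=np.random):
    """Sample a noisy pose from the odometry motion model.
    p_prev, p_new: ground-truth poses (x, y, theta);
    alphas: the (a1, a2, a3, a4) noise parameters."""
    a1, a2, a3, a4 = alphas
    x0, y0, th0 = p_prev
    x1, y1, th1 = p_new
    rot1 = atan2(y1 - y0, x1 - x0) - th0
    trans = sqrt((x1 - x0)**2 + (y1 - y0)**2)
    rot2 = th1 - th0 - rot1
    # standard deviations of the three noise terms
    sd_rot1 = a1*abs(rot1) + a2*trans
    sd_rot2 = a1*abs(rot2) + a2*trans
    sd_trans = a3*trans + a4*(abs(rot1) + abs(rot2))
    # np.random.normal takes the standard deviation as its scale argument
    rot1 += rng.normal(0, sd_rot1)
    rot2 += rng.normal(0, sd_rot2)
    trans += rng.normal(0, sd_trans)
    return (x0 + trans*cos(th0 + rot1),
            y0 + trans*sin(th0 + rot1),
            th0 + rot1 + rot2)
```

As a sanity check, setting all four \(\alpha\) parameters to zero makes the noise vanish, and the function returns exactly the ground-truth new pose.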

This simple Python snippet shows what the distribution of the robot pose under the noisy motion model looks like:

```
from math import sqrt, atan2, pi, cos, sin
import numpy as np
import matplotlib.pyplot as plt

# translation distance
dx = 0.2
dy = 0.35
trans = sqrt(dx*dx + dy*dy)
# previous heading theta_{t-1} = 30 degrees
theta1 = 30.0*pi/180.0
# new heading theta_t = 10 degrees
theta2 = 10.0*pi/180.0
rot1 = atan2(dy, dx) - theta1
rot2 = theta2 - theta1 - rot1
# the alpha parameters
a1 = 0.05
a2 = 15.0*pi/180.0
a3 = 0.05
a4 = 0.01
# standard deviations of the three noise terms
sd_rot1 = a1*abs(rot1) + a2*trans
sd_rot2 = a1*abs(rot2) + a2*trans
sd_trans = a3*trans + a4*(abs(rot1) + abs(rot2))
x = []
y = []
for i in range(1000):
    # np.random.normal takes the standard deviation as its second argument
    t = trans + np.random.normal(0, sd_trans)
    r1 = rot1 + np.random.normal(0, sd_rot1)
    r2 = rot2 + np.random.normal(0, sd_rot2)
    x.append(t*cos(theta1 + r1))
    y.append(t*sin(theta1 + r1))
fig, ax = plt.subplots()
ax.scatter(x, y)
ax.set_title('Gaussian noise of motion model')
plt.show()
```

The following figure shows 1000 random possible points for the new robot pose when the robot performs a motion with \((dx=0.2,\ dy=0.35,\ \theta_{t-1}=30^\circ,\ \theta_t=10^\circ)\). With this noise model we typically obtain a banana-shaped distribution of points.

With this model, it is simple to add noise to the perfect Gazebo odometry. The idea is to take the Gazebo odometry as input and treat it as ground-truth data (since it is already perfect). The motion model is then applied to this data to compute a new, noisy odometry. As in the real world, the odometry noise accumulates over time.
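As a rough, ROS-independent sketch of this accumulation (the helper function and parameter values here are illustrative, not part of the node below), applying the noisy motion model step by step along a ground-truth path makes the estimated pose drift:

```
from math import atan2, sqrt, sin, cos, pi
import numpy as np

def noisy_step(noisy_pose, gt_prev, gt_new, alphas, rng):
    """Advance the accumulated noisy pose by the ground-truth motion
    from gt_prev to gt_new, corrupted per the odometry motion model."""
    a1, a2, a3, a4 = alphas
    x0, y0, th0 = gt_prev
    x1, y1, th1 = gt_new
    rot1 = atan2(y1 - y0, x1 - x0) - th0
    trans = sqrt((x1 - x0)**2 + (y1 - y0)**2)
    rot2 = th1 - th0 - rot1
    sd_rot1 = a1*abs(rot1) + a2*trans
    sd_rot2 = a1*abs(rot2) + a2*trans
    sd_trans = a3*trans + a4*(abs(rot1) + abs(rot2))
    rot1 += rng.normal(0, sd_rot1)
    rot2 += rng.normal(0, sd_rot2)
    trans += rng.normal(0, sd_trans)
    nx, ny, nth = noisy_pose
    return (nx + trans*cos(th0 + rot1),   # mirror the node below: use the
            ny + trans*sin(th0 + rot1),   # ground-truth heading th0
            th0 + rot1 + rot2)

# ground-truth path: a straight line along x with heading 0
rng = np.random.default_rng(42)
alphas = (0.05, 15.0*pi/180.0, 0.05, 0.01)
gt = [(0.1*i, 0.0, 0.0) for i in range(101)]
noisy = gt[0]
for prev, new in zip(gt, gt[1:]):
    noisy = noisy_step(noisy, prev, new, alphas, rng)
print(noisy)  # drifts away from the true final pose (10.0, 0.0, 0.0)
```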

The following Python snippet is an implementation of a ROS node that computes the noisy odometry using the motion model above and broadcasts it as a tf transform:

```
#! /usr/bin/env python
import rospy
from math import sqrt, atan2, pi, cos, sin
import numpy as np
from nav_msgs.msg import Odometry
import tf

last_odom = None
pose = [0.0, 0.0, 0.0]
a1 = 0.0
a2 = 0.0
a3 = 0.0
a4 = 0.0
base_frame = ""

# callback for the Gazebo odometry data
def callback(data):
    global last_odom, base_frame, pose, a1, a2, a3, a4
    if last_odom is None:
        # first message: initialize the noisy pose from the ground truth
        last_odom = data
        pose[0] = data.pose.pose.position.x
        pose[1] = data.pose.pose.position.y
        q = [ data.pose.pose.orientation.x,
              data.pose.pose.orientation.y,
              data.pose.pose.orientation.z,
              data.pose.pose.orientation.w ]
        (r, p, y) = tf.transformations.euler_from_quaternion(q)
        pose[2] = y
    else:
        # calculate the new odometry pose using the motion model
        dx = data.pose.pose.position.x - last_odom.pose.pose.position.x
        dy = data.pose.pose.position.y - last_odom.pose.pose.position.y
        trans = sqrt(dx*dx + dy*dy)
        q = [ last_odom.pose.pose.orientation.x,
              last_odom.pose.pose.orientation.y,
              last_odom.pose.pose.orientation.z,
              last_odom.pose.pose.orientation.w ]
        (r, p, theta1) = tf.transformations.euler_from_quaternion(q)
        q = [ data.pose.pose.orientation.x,
              data.pose.pose.orientation.y,
              data.pose.pose.orientation.z,
              data.pose.pose.orientation.w ]
        (r, p, theta2) = tf.transformations.euler_from_quaternion(q)
        rot1 = atan2(dy, dx) - theta1
        rot2 = theta2 - theta1 - rot1
        sd_rot1 = a1*abs(rot1) + a2*trans
        sd_rot2 = a1*abs(rot2) + a2*trans
        sd_trans = a3*trans + a4*(abs(rot1) + abs(rot2))
        # np.random.normal takes the standard deviation as its second argument
        trans += np.random.normal(0, sd_trans)
        rot1 += np.random.normal(0, sd_rot1)
        rot2 += np.random.normal(0, sd_rot2)
        pose[0] += trans*cos(theta1 + rot1)
        pose[1] += trans*sin(theta1 + rot1)
        pose[2] = theta1 + rot1 + rot2
        last_odom = data
    # broadcast the noisy pose as a tf transform
    br = tf.TransformBroadcaster()
    br.sendTransform((pose[0], pose[1], 0),
                     tf.transformations.quaternion_from_euler(0, 0, pose[2]),
                     data.header.stamp,
                     base_frame,
                     "odom")

if __name__ == '__main__':
    rospy.init_node('noisy_odometry', anonymous=True)
    # alpha1 is degree/degree
    if rospy.has_param("~alpha1"):
        a1 = rospy.get_param("~alpha1")
    else:
        rospy.logwarn("alpha1 is set to default")
        a1 = 15.0*pi/180.0
    # alpha2 is degree/meter
    if rospy.has_param("~alpha2"):
        a2 = rospy.get_param("~alpha2")
    else:
        rospy.logwarn("alpha2 is set to default")
        a2 = 15.0*pi/180.0
    # alpha3 is meter/meter
    if rospy.has_param("~alpha3"):
        a3 = rospy.get_param("~alpha3")
    else:
        rospy.logwarn("alpha3 is set to default")
        a3 = 0.2
    # alpha4 is meter/degree
    if rospy.has_param("~alpha4"):
        a4 = rospy.get_param("~alpha4")
    else:
        rospy.logwarn("alpha4 is set to default")
        a4 = 0.01
    # get the input odometry topic
    if rospy.has_param("~old_odom_topic"):
        odom_topic = rospy.get_param("~old_odom_topic")
    else:
        odom_topic = "/odom"
    # get the base frame
    if rospy.has_param("~base_frame"):
        base_frame = rospy.get_param("~base_frame")
    else:
        base_frame = "/base_footprint"
    rospy.Subscriber(odom_topic, Odometry, callback)
    rospy.spin()
```


Powered by antd server, (c) 2017 - 2020 Xuan Sang LE