Intelligent Robot (37): IMU Sensor

1. setup robot
3. od_node
5. laser_node
7. imu_node

1. about imu sensor
2. about imu usage
3. about imu code
3.1 odom-node.cpp
3.2 imu-node.cpp
4. about imu debug
REF:
https://answers.ros.org/question/200480/imu-message-definition/
https://github.com/jeskesen/i2c_imu/blob/master/src/i2c_imu_node.cpp
http://codegist.net/code/flora-9dof-accelerometer%2Bgyroscope%2Bmagnetometer—lsm9ds0/

(If running inside Oracle VM VirtualBox: install the Guest Additions, set the network interface to bridged mode, map the COM port to a host device (host device COM1), then install Ubuntu 14.04 AMD64 and ROS, set up the environment, initialize catkin_ws, and start the master.)

1. setup robot
$ roslaunch funbot r_rbt.launch

.. …

This launch file loads the serial-port driver for odometry, the Ethernet driver for the laser, and the USB driver for the IMU.
It also statically publishes a tf transform from base_link to base_laser,
and a static tf transform from base_link to imu_link.
If a URDF model of the robot has already been created, these static transforms should not be published again.

3. od_node

5. laser_node

7. imu_node

1. about imu sensor

A major source of confusion, for both ROS maintainers and developers, is the message format of the IMU versus odometry.
The odometry message nav_msgs/Odometry names two frames, so there is no ambiguity about the tf.
The IMU message sensor_msgs/Imu, however, carries only a single frame_id, which is ambiguous:
Q:
I am confused about the IMU though: its heading estimate should be in the map frame, as it is absolute, but its motion estimates should be in the base_link frame because they are relative to the robot? The IMU message only has a single frame_id, so which should I set it to?
Q:
It seems clear that nav_msgs::Odometry.pose.pose.orientation goes from header.frame_id to child_frame_id (or parent to child), but there is no child_frame_id in the sensor_msgs::Imu message, and I do not see one defined. The two obvious choices are map and odom, but I don't know if one is preferred over the other.

The key is to understand that there are, in effect, still two frames involved:
the sensor's linear acceleration and angular velocity are expressed in the sensor frame, e.g. imu_link;
the sensor's orientation, however, is the sensor's attitude relative to its fixed parent frame, i.e. a world-fixed frame such as odom or map.

For example, when writing an IMU driver, define three frames: imu_frame_id, tf_parent_frame_id, and tf_frame_id:

Typically:
**1. imu_frame_id would be something like imu_link; the accelerations and angular velocities are measured in this frame.
**2. the orientation is between a fixed inertial frame and the sensor; the "fixed inertial frame" is implicitly assumed to be oriented the same way as odom, map, or whatever top-level fixed frame your application uses.
**3.1. the orientation is calculated by the user's code using data from both the accelerometer (pitch and roll) and the magnetometer (yaw), and possibly others …
**3.2. packages like robot_pose_ekf do not use the orientation vector in an absolute way; they actually calculate changes in heading.

So:
You should populate the frame_id of the sensor_msgs/Imu message with the frame in which the sensor is mounted.
If the IMU is aligned with the base link, you can simply fill the frame with base_link, or publish a static tf from base_link to imu_link.
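As a concrete sketch, an IMU driver node could expose the three frames described above as parameters. The parameter names follow the text; the default values and node name are assumptions, not taken from any particular driver:

#include <ros/ros.h>
#include <string>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "imu_node");
  ros::NodeHandle pnh("~");  // private handle, so frames are configured per node
  std::string imu_frame_id, tf_parent_frame_id, tf_frame_id;
  pnh.param<std::string>("imu_frame_id",       imu_frame_id,       "imu_link");  // frame of accel/gyro data
  pnh.param<std::string>("tf_parent_frame_id", tf_parent_frame_id, "odom");      // world-fixed frame the orientation refers to
  pnh.param<std::string>("tf_frame_id",        tf_frame_id,        "imu_link");  // child frame of the optional tf broadcast
  // the driver would then fill sensor_msgs/Imu header.frame_id with imu_frame_id
  return 0;
}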

2. about imu usage

Approach 1 below requires setting -1 to disable the sensor's orientation data. Approach 2 is the safe practice: robots like TurtleBot and algorithms like robot_pose_ekf treat the IMU orientation only as a relative value (differentiating it), never as an absolute value (integrating it). Approach 3 is the common one; even when a magnetometer is added, it cannot satisfy demanding requirements such as keeping the frame strictly level, and this still has to be dealt with.

*1. Orientation not provided
This is just gyros and accelerometers, no fusion yet.
In this case, 'yaw' or 'heading' still drifts, since the gravity component for yaw is 0,
so you cannot determine absolute heading from those 6 measurements alone.
We can set msg.orientation_covariance[0] = -1,
which tells other software not to use the orientation.
For example, TurtleBot and robot_pose_ekf use the IMU only for the change in orientation, not the absolute orientation.
This is ultimately the safest, as any orientation can be fused in this manner.
Unfortunately we lose the ability to remove drift, so more complicated approaches exist.
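A minimal sketch of this convention follows; the function name and the gx/gy/gz, ax/ay/az inputs are assumptions standing in for real driver readings:

#include <ros/ros.h>
#include <sensor_msgs/Imu.h>

// Publish raw gyro/accel data and mark the orientation as unusable.
void publishRawImu(ros::Publisher& imu_pub, double gx, double gy, double gz,
                   double ax, double ay, double az)
{
  sensor_msgs::Imu msg;
  msg.header.stamp = ros::Time::now();
  msg.header.frame_id = "imu_link";
  msg.orientation_covariance[0] = -1.0;  // convention: tells consumers (e.g. robot_pose_ekf) to ignore msg.orientation
  msg.angular_velocity.x = gx;
  msg.angular_velocity.y = gy;
  msg.angular_velocity.z = gz;
  msg.linear_acceleration.x = ax;
  msg.linear_acceleration.y = ay;
  msg.linear_acceleration.z = az;
  imu_pub.publish(msg);
}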

*2. Orientation integrated, with drift from where the IMU started. This kind of IMU does not guarantee a known reference orientation. It is possible for the ROS software to start up after the IMU has calibrated, so the reference orientation is completely unknown due to drift. These IMUs are uncommon; essentially they only integrate the gyros for orientation and do not use a gravity reference. This is a purely inertial output, and the best you could do in empty space. The issue with using this type in ROS is that the message itself does not let you know that the orientation is relative rather than absolute.
This is the case that ROS handles today, and it is a form of dead reckoning / odometry. TurtleBot and robot_pose_ekf use the IMU only for the change in orientation, not the absolute orientation. This is ultimately the safest, as any orientation can be fused in this manner. Unfortunately you lose the ability to remove drift, so more complicated approaches exist.

*3. Orientation integrated, with drift, but with pitch and roll referenced to gravity. This IMU is the most common. It uses the gyros to determine the angle from startup, but references pitch and roll to gravity. The assumption here is that you will re-reference before you travel far enough that the Earth's gravity vector is no longer aligned with "down" in your Cartesian frame. In this case, 'yaw' or 'heading' still drifts, since the gravity component for yaw is 0, so you cannot determine absolute heading from those 6 measurements alone. Where this becomes a problem depends on the specific application.

*4. Orientation referenced to gravity and magnetic north
It is very common to add magnetometers to reference the heading to magnetic north.
The problem with this method is that the frame must still be tangential to the Earth's surface,
and that the magnetic field has a 'declination' that varies with where the IMU is located on the Earth's surface.
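For illustration, correcting a magnetometer-derived yaw for local declination might look like the sketch below. The declination value here is an assumption; the real value depends on where and when the IMU is used:

#include <cmath>

// Correct a magnetometer yaw (radians) for local magnetic declination.
// Convention: true heading = magnetic heading + declination (east declination positive).
double correct_yaw(double magnetic_yaw)
{
  const double declination = -6.3 * M_PI / 180.0;  // assumed local declination, placeholder value
  double true_yaw = magnetic_yaw + declination;
  // normalize to (-pi, pi]
  while (true_yaw > M_PI)   true_yaw -= 2.0 * M_PI;
  while (true_yaw <= -M_PI) true_yaw += 2.0 * M_PI;
  return true_yaw;
}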

*5. Orientation fused from a more complicated system (GPS INS).
This can be the output of a full sensor fusion system, but we highly recommend NOT using this method.
If GPS, odometry, INS, and more are being fused into a globally corrected x, y, z and orientation,
the output should primarily be a nav_msgs/Odometry.
However, if acceleration is required, the IMU message can be output as well, but we would recommend that it carry the 'raw' sensor values.

What can we do to clarify this?
** Add an orientation_reference_frame_id. This is not a good idea, since that frame is unknown and undefined for (1), (2), and (3); those cases do not specify a repeatable reference. In the cases where you could provide it, (4) and (5), it still does not really solve the problem, since the gravity and magnetometer reference frames are not Cartesian. So in order to accommodate ALL use cases, we would have to reference something like this for gravity, but also factor in the magnetic field. Oh, and did I mention this varies over time?
** Let the sensor fusion stage handle it. This is ultimately the best solution. Since the sensor fusion stage has to handle custom orientation parameters anyway, you might as well also define an imu_type parameter that can be changed to determine how best to use the IMU.
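A minimal sketch of that idea, assuming a hypothetical imu_type parameter read by the fusion node (the parameter name and its values are assumptions, not an existing API):

#include <ros/ros.h>
#include <string>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "fusion_node");
  ros::NodeHandle pnh("~");
  std::string imu_type;
  pnh.param<std::string>("imu_type", imu_type, "gyro_accel");  // hypothetical values: gyro_accel, gyro_accel_mag, gps_ins
  // Only trust absolute yaw when the IMU type actually provides an absolute heading reference.
  const bool use_absolute_yaw = (imu_type == "gyro_accel_mag" || imu_type == "gps_ins");
  ROS_INFO("imu_type=%s, use_absolute_yaw=%d", imu_type.c_str(), use_absolute_yaw);
  return 0;
}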

3. about imu code

3.1 In the implementation of odom-node.cpp, the pose is in the odom frame, the twist is in the base_link frame, and the tf is odom -> base_link.

geometry_msgs::TransformStamped odom_trans;
double angle_radi = od->position.positionr * 3.1415926 / 180.0;
geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(angle_radi);

odom_trans.header.stamp = current_time;
odom_trans.header.frame_id = "odom";
odom_trans.child_frame_id = "base_link";
odom_trans.transform.translation.x = od->position.positionx;
odom_trans.transform.translation.y = od->position.positiony;
odom_trans.transform.translation.z = 0;
odom_trans.transform.rotation = odom_quat;
// send the transform
if (tf_broadcast)
    odom_broadcaster->sendTransform(odom_trans);
----------------------------------------
nav_msgs::Odometry odom;
odom.header.stamp = current_time;
odom.header.frame_id = "odom";
// set the position (expressed in the odom frame)
odom.pose.pose.position.x = od->position.positionx;
odom.pose.pose.position.y = od->position.positiony;
odom.pose.pose.position.z = 0.0;
odom.pose.pose.orientation = odom_quat;
// set the velocity (expressed in the base_link frame)
odom.child_frame_id = "base_link";
odom.twist.twist.linear.x = od->speed.speedx;
odom.twist.twist.linear.y = od->speed.speedy;
odom.twist.twist.angular.z = od->speed.speedr;
// publish the message
odom_pub->publish(odom);

3.2 In the implementation of imu-node.cpp, angular_velocity and linear_acceleration are in the imu_link frame, the orientation should be relative to the odom or map frame, and the tf is odom/map -> base_link.
tf::Quaternion orientation(xf, yf, zf, wf);
// rotation relative to the stored reference orientation (matrix_orientation)
delta_rotation = matrix_orientation.inverse() * orientation;
imu.header.stamp = measurement_time;
imu.header.frame_id = frame_id;  // imu_link
quaternionTFToMsg(delta_rotation, imu.orientation);
imu.angular_velocity.x = gxf;
imu.angular_velocity.y = gyf;
imu.angular_velocity.z = gzf;
imu.linear_acceleration.x = axf;
imu.linear_acceleration.y = ayf;
imu.linear_acceleration.z = azf;
// publish imu message
imu_pub.publish(imu);
----------------------------------------
// publish tf transform
if (tf_broadcast) {
    transform.setOrigin(tf::Vector3(0, 0, 0));
    transform.setRotation(delta_rotation);
    tf_br.sendTransform(tf::StampedTransform(transform, measurement_time, tf_parent_frame_id, tf_frame_id));
}

4. about imu debug

4.1 ERROR: Covariance specified for measurement on topic xxx is zero
Each measurement processed by robot_pose_ekf needs to have a covariance associated with it. The diagonal elements of the covariance matrix cannot be zero. This error is shown when one of the diagonal elements is zero. Messages with an invalid covariance will not be used to update the filter.
Creating the covariance matrices for individual sensors:
For the IMU and odometry, the covariance matrix can be formed from the datasheet.
For visual odometry, the covariance matrix may be obtained from the measurement equation that relates the measured variables to the pose coordinates [1].
*None of the covariance matrices should be zero; they should all have some value, but I am not sure how that gets passed.
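A minimal sketch of filling the IMU covariances from datasheet noise figures (the numeric values below are placeholders, not taken from any specific datasheet):

#include <sensor_msgs/Imu.h>

// Fill diagonal covariances from assumed 1-sigma noise figures.
void fillImuCovariance(sensor_msgs::Imu& imu)
{
  const double gyro_sigma  = 0.02;  // rad/s, assumed from the gyro datasheet
  const double accel_sigma = 0.04;  // m/s^2, assumed from the accelerometer datasheet
  // 3x3 row-major matrices: diagonal entries are indices 0, 4, 8 and hold the variances (sigma^2).
  for (int i = 0; i < 9; ++i) {
    imu.angular_velocity_covariance[i]    = 0.0;
    imu.linear_acceleration_covariance[i] = 0.0;
  }
  imu.angular_velocity_covariance[0] = imu.angular_velocity_covariance[4] =
      imu.angular_velocity_covariance[8] = gyro_sigma * gyro_sigma;
  imu.linear_acceleration_covariance[0] = imu.linear_acceleration_covariance[4] =
      imu.linear_acceleration_covariance[8] = accel_sigma * accel_sigma;
}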

4.2
The odometry covariance is a float[36], i.e. a 6×6 matrix. The diagonal terms express the trust you have in your sensor for each DoF. There are 6 DoF: position (x, y, z) and orientation (rotation about x, y, z).

Estimate your sensor's or algorithm's accuracy experimentally. If you find your data is good to 1 cm in translation and 0.1 rad in rotation, you can use this matrix:

[0.01 0.0 0.0 0.0 0.0 0.0,
0.0 0.01 0.0 0.0 0.0 0.0,
0.0 0.0 0.01 0.0 0.0 0.0,
0.0 0.0 0.0 0.1 0.0 0.0,
0.0 0.0 0.0 0.0 0.1 0.0,
0.0 0.0 0.0 0.0 0.0 0.1]

If you have no information for one DoF, you can put a huge value.

For example, segway_rmp:
this->odom_msg.pose.covariance[0]  = 0.00001;          // x
this->odom_msg.pose.covariance[7]  = 0.00001;          // y
this->odom_msg.pose.covariance[14] = 1000000000000.0;  // z (not measured, huge value)
this->odom_msg.pose.covariance[21] = 1000000000000.0;  // rotation about x (not measured)
this->odom_msg.pose.covariance[28] = 1000000000000.0;  // rotation about y (not measured)
this->odom_msg.pose.covariance[35] = 0.001;            // rotation about z (yaw)
