Hi all,
I have a GPS system (base station + rover) with sub-inch accuracy, and I would like to use it on my mobile robot for ground navigation.
I have recorded a set of GPS points and would like my robot to follow them, so that it can track a specific path using GPS coordinates.
Is there any ROS package or solution I can use as a reference?
I think my robot should load all the coordinates, compare them with its current location, and correct its course whenever they differ.
The problem is: how do I make the robot move in the correct direction?
I hope you can help me!
Thank you!
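To make the "correct direction" part concrete, this is the kind of calculation I have in mind (a sketch only; the function names are made up): compute the bearing from the current fix to the next recorded waypoint, and steer to drive the heading error to zero.

```python
import math

def bearing_to_waypoint(lat1, lon1, lat2, lon2):
    """Initial bearing (radians; 0 = north, increasing clockwise) from
    the current fix to the next recorded waypoint."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return math.atan2(x, y)

def heading_error(current_heading, target_bearing):
    """Signed error to feed a steering controller, wrapped to [-pi, pi]."""
    err = target_bearing - current_heading
    return math.atan2(math.sin(err), math.cos(err))
```

A controller would then turn toward the sign of `heading_error` and advance to the next waypoint once within some acceptance radius.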
↧
GPS navigation with mobile robot
↧
Issues using robot_localization with gps and imu
I only have two sensors, a GPS and an IMU, that I am trying to integrate using robot_localization. I've created a launch file based on [this question](http://answers.ros.org/question/206043/ekf_localization-with-gps-and-imu/), pasted at the end.
There are a couple of issues. When I run the launch file I get warnings about a missing base_link->map transform. The node handles the transform between odom and base_link; do I need to create a transform from odom to map myself? Nothing in the linked answer indicated that, so I assumed not, but I seem to be missing something. The warnings are pasted below.
[ WARN] [1467926395.059514047]: Could not obtain base_link->map transform. Will not remove offset of navsat device from robot's origin.
[ WARN] [1467926397.126389602]: Could not obtain transform from map to base_link. Error was "map" passed to lookupTransform argument source_frame does not exist.
Second, the output of /odometry/gps is stuck at 0 for all values despite my GPS data coming in fine, presumably because of the map-to-base_link transform issue. Any help figuring out where I went wrong in my setup would be greatly appreciated.
EDIT- After rereading the robot_localization wiki I removed the map_frame line from the file as suggested but get the same results. Why is it still looking for a transform for the map_frame if I'm not using one?
(The launch-file XML did not survive formatting; the surviving rosparam arrays are, in order, the GPS-odometry sensor config, the IMU sensor config, the process_noise_covariance, and the initial_estimate_covariance.)

    [true, true, false,
     false, false, false,
     false, false, false,
     false, false, false,
     false, false, false]

    [false, false, false,
     true, true, true,
     false, false, false,
     true, true, true,
     true, true, true]

    [0.05, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.03, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.03, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.025, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.025, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.04, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.02, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.015]

    [1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9]
EDIT 2-
IMU

    header:
      seq: 1522
      stamp:
        secs: 1468000016
        nsecs: 240786098
      frame_id: imu_link
    orientation:
      x: 0.00976581033319
      y: -0.0116279097274
      z: 0.194468542933
      w: 0.979070544243
    orientation_covariance: [0.0025, 0.0, 0.0, 0.0, 0.0025, 0.0, 0.0, 0.0, 0.0025]
    angular_velocity:
      x: 0.0
      y: 0.0
      z: 0.0
    angular_velocity_covariance: [0.02, 0.0, 0.0, 0.0, 0.02, 0.0, 0.0, 0.0, 0.02]
    linear_acceleration:
      x: 0.287402331829
      y: 0.0
      z: 9.9489107132
    linear_acceleration_covariance: [0.04, 0.0, 0.0, 0.0, 0.04, 0.0, 0.0, 0.0, 0.04]

GPS

    header:
      seq: 1738
      stamp:
        secs: 1468000156
        nsecs: 872385898
      frame_id: gps
    status:
      status: 0
      service: 1
    latitude: 37.773469084
    longitude: -122.417696017
    altitude: 5.37939040617
    position_covariance: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
    position_covariance_type: 0
↧
Can the robot_localization package be used with GPS-only?
I have a project where I am using GPS data to localize a robot. Unfortunately, I do not have a reliable IMU/magnetometer to get orientation. Can the robot_localization package be used with only GPS data? Can I get the robot orientation from the differential measurements? I have been unsuccessful in getting it to work. I am only running in 2D mode, so yaw is what I really want to get.
In other words, is there a way to decouple the navsat_transform node from using IMU to get orientation?
Thanks!
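To make "orientation from differential measurements" concrete, what I have in mind is something like this (a sketch with made-up coordinates, not working code from my setup):

```python
import math

def yaw_from_fixes(x_prev, y_prev, x_curr, y_curr):
    """Heading estimated from two consecutive planar GPS positions
    (e.g. UTM coordinates). Only meaningful while the robot is moving,
    and noisy when the fixes are close together."""
    return math.atan2(y_curr - y_prev, x_curr - x_prev)

# Robot moved north-east between two fixes -> yaw of 45 degrees.
yaw = yaw_from_fixes(0.0, 0.0, 1.0, 1.0)
```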
↧
Example code for using GPS libraries
Currently the follow tutorial is missing the code - Writing a Subscriber for gpsd_client (C++) - http://wiki.ros.org/gpsd_client/Tutorials/Writing%20a%20Subscriber%20for%20gpsd_client%20%28C%2B%2B%29
If anyone has any example code for the use of GPS systems in ROS that would be very helpful!
Thanks for any help!
↧
REP-105 and robot_localization
Hello,
I have a series of questions about REP-105 and the robot_localization package.
What is confusing to me is the utility of the odom reference frame and it is possible that I don't understand its definition. REP-105 describes the odom reference frame as a world-fixed reference frame. But it also says that the pose of a robot can drift in the odom reference frame without any bounds.
I understand the problem. A vehicle whose position is estimated from dead-reckoning alone will accrue position estimate errors that will grow unbounded unless checked by an absolute position measurement.
So is the idea that one can track the dead-reckoned position estimate and the error between that and the true position separately? Perhaps this is done in the transform between the map frame and the odom frame?
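That is my reading of REP-105; a small numeric sketch of the idea (made-up poses, SE(2) only): the map->odom transform is whatever reconciles the two estimates of base_link, so it absorbs exactly the accumulated dead-reckoning drift.

```python
import math

def compose(a, b):
    """SE(2) composition: apply b in a's frame. Pose = (x, y, theta)."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

def inverse(a):
    """SE(2) inverse: (R, p) -> (R^T, -R^T p)."""
    ax, ay, at = a
    c, s = math.cos(at), math.sin(at)
    return (-(c * ax + s * ay), s * ax - c * ay, -at)

# Two estimates of the same robot: a drifted dead-reckoned pose in odom,
# and a globally corrected pose in map.
T_odom_base = (5.0, 0.0, 0.0)   # dead-reckoned, has drifted
T_map_base  = (4.2, 0.3, 0.05)  # absolute-measurement-corrected pose
# map->odom absorbs exactly the drift between the two estimates:
T_map_odom = compose(T_map_base, inverse(T_odom_base))
# Composing it back with the odom estimate recovers the map pose.
recovered = compose(T_map_odom, T_odom_base)
```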
What is the utility of the odom reference frame vs the map reference frame? Why is having a continuous "no-jumps" reference frame better than having a globally accurate one?
If one is using an EKF and there are jumps in the position due to GPS data, does that not mean that the measurement covariance (or possibly the process covariance) is not set properly? It seems to me one is then relying on the GPS too much. Maybe it means the IMU is of too poor a quality to adequately filter the GPS?
I guess what I'm getting at is that it seems to me that a properly modeled and tuned EKF should (barring resets) not have jumps and the odom frame might be unnecessary.
I can't wait to hear your thoughts!
↧
Fusing IMU + GPS with the robot_localization package
Currently, I am trying to implement localization using the robot_localization package with a GPS and an IMU. The setup follows the one in this link: http://answers.ros.org/question/200071/how-to-fuse-imu-gps-using-robot_localization/ . I just copied it here:
ekf_localization_node
Inputs
IMU (type: sensor_msgs/Imu; topic: /imu; frame_id: base_imu_link)
Transformed GPS data as an odometry message (navsat_transform_node output: topic: /odometry/gps)
Outputs
Odometry message (this is what you want to use as your state estimate for your robot; topic: /odometry/filtered)
navsat_transform_node
Inputs
IMU (type: sensor_msgs/Imu; topic: /imu; frame_id: base_imu_link)
Raw GPS (type: NavSatFix; topic: /fix; frame_id: gps_reddot)
Odometry (output of ekf_localization_node: topic: /odometry/filtered)
Outputs
Transformed GPS data as odometry message (topic: /odometry/gps)
The image of my hardware setting is quite simple:
https://drive.google.com/file/d/0BwCt69n0gpFbc0c0ek03TGd0aEk/view?usp=sharing
My current launch file is:
(The launch-file XML did not survive formatting; the surviving rosparam arrays are, in order, the GPS-odometry sensor config, the IMU sensor config, the process_noise_covariance, and the initial_estimate_covariance.)

    [true, true, true,
     false, false, false,
     false, false, false,
     false, false, false,
     false, false, false]

    [false, false, false,
     true, true, true,
     false, false, false,
     true, true, true,
     true, true, true]

    [0.05, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.03, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.03, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.025, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.025, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.04, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.02, 0.0, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.01, 0.0,
     0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.015]

    [1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0,
     0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9]
Current problems:
1. If I do not include the node [` `], all the outputs of /odometry/gps are zeros. If this tf node is included, the output of /odometry/gps is as follows:
    header:
      seq: 5
      stamp:
        secs: 1470557701
        nsecs: 931571006
      frame_id: odom
    child_frame_id: ''
    pose:
      pose:
        position:
          x: 0.484485645778
          y: 0.158553831367
          z: -1.06257850911
        orientation:
          x: 0.0
          y: 0.0
          z: 0.0
          w: 1.0
      covariance: [1.6656841125133872, -0.020322677227241386, -0.5022024606421389, 0.0, 0.0, 0.0, -0.020322677227241386, 1.6207245363996199, 0.19335550081983582, 0.0, 0.0, 0.0, -0.5022024606421389, 0.19335550081983582, 6.390991351086994, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
    twist:
      twist:
        linear:
          x: 0.0
          y: 0.0
          z: 0.0
        angular:
          x: 0.0
          y: 0.0
          z: 0.0
      covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
Only the position of the Pose message is nonzero; the orientation and the entire Twist message are zeros.
Is there something wrong with my setup?
If this node is not included, the output of /odometry/filtered stays close to zero while I walk around outdoors, because /odometry/gps from navsat_transform_node outputs zeros.
In addition, it seems from the link I followed that the previous author did not include this node.
2. While including the tf node for the GPS, I ran an experiment.
image link: https://drive.google.com/file/d/0BwCt69n0gpFbeVpnamlKYUxxTm8/view?usp=sharing
The data recorded from /odometry/filtered are plotted against Google Maps and are roughly accurate. However, there is still some location error. The IMU runs at 20 Hz and the GPS at 1 Hz; could the low GPS rate be the cause? The recording frequency is 2 Hz. You can see that there are jumps in some of the segments.
Thanks very much for your help.
↧
Why does the accuracy of navsat_transform change with heading?
## The Problem (in short) ##
I am using the excellent robot_localization package to localize my robot in an outdoor environment. I'm trying to make the localization as accurate as possible; it's currently accurate to <25 cm (my GPS is accurate to ~20 mm, so that's what I'm aiming for). However, this last 25 cm depends on the heading of the robot. That is, the error changes with the direction the robot is facing, as demonstrated when I simply rotate 360 degrees on the spot:

I am trying to find out why this is happening and how I can fix the issue so that accuracy is independent of heading. The error is periodic and consistent across several tests, so I think it is being caused by a systematic error in navsat_transform or in the way the frame transforms are handled. All info and data are given under "System Information" at the bottom of this page.
## The Problem (in full) ##
I have three sensor inputs: wheel odometry, IMU data, and GPS data. I am using one instance of the ekf_localisation node to fuse wheels and IMU data ("local" odometry) and a second instance to fuse everything ("global" odometry). I am also using an instance of navsat_transform to convert my GPS data to a GPS odometry stream ("gps" odometry) which is what is being used in the global EKF node. When I rotate the robot on the spot, this is the odometry reported for each odometry stream:

Keep in mind that the local odometry is reported in the local /odom reference frame while the global and GPS odometry are both reported in the global /map frame. This leads me to believe that the problem is related to the navsat_transform node or one of the frame transforms that it uses.
I think there is actually a positive feedback loop in here since the GPS odometry is fused to produce the global odometry, but the position of the GPS odometry is partially determined by the global EKF which determines the location of the /map frame that the GPS odometry is relative to.
Now I know that my GPS is not the problem, because when I directly calculate the GPS UTM coordinates of the center of the robot (/base_link) it is different to the UTM coordinates obtained via sensor fusion (I'm effectively just looking at the origin of /base_link relative to the /utm frame which is broadcast by navsat_transform). This is what I get:

So the directly calculated data indicates that the robot is turning on the spot (as expected) however the UTM coordinates of /base_link relative to /utm match the previous graph. Also, when I directly calculate the UTM coordinates of /base_link I'm using the exact same WGS84->UTM calculations used by robot_localisation and I'm using the exact same magnetic declination as what I've specified in the launch file for navsat_transform.
Now when I watch the reference frames in RViz, the /base_link and /odom do not move relative to one another - the /base_link frame only rotates (as you would expect since it turns with the robot). The /map frame however moves relative to both. So this erroneous movement is really being caused by /map moving relative to /base_link rather than the other way around.
I currently have two working theories on the cause of this problem:
1. **There is a bug in navsat_transform.** This could distort the GPS odometry, in turn distorting the global EKF, in turn distorting the position of the /map frame, which in turn further distorts the GPS odometry. If this were the case, I think it might be due to a change between v2.2.2 and v2.2.3 because I didn't experience this issue prior to updating to v2.2.3.
2. **My reference frames are set up incorrectly** If this were the case, it won't be navsat_transform itself that's distorting the GPS odometry but one of the transforms it's using. I've triple-checked everything and according to the /tf topic all reference frames appear to be in the correct locations.
If the error were random or otherwise evenly distributed, I would have chalked this up to natural error in the system, but since it has a fixed relationship with the heading of the robot, it must be a systematic error. If anyone else has experienced this or otherwise has any advice, I'd be very appreciative.
## System Information ##
- **OS:** Ubuntu 14.04 (laptop), ROS Indigo Igloo (robot)
- **Robot:** Clearpath Robotics Jackal UGV (real, not simulated)
- **Sensors:** IMU, wheel encoders, high accuracy RTK GPS
- **Localization:** package used is [robot_localization](http://wiki.ros.org/robot_localization) (v2.2.3). IMU and encoder data is fused in the /odom frame (local odometry). IMU, encoder and GPS data is fused in the /map frame (global odometry).
- **Launch file:** [ekf_and_navsat.launch](https://drive.google.com/open?id=0B1KZT92BcdVNSE1rZ0dOVmR6d0E)
- **Frame tree:** [tree](https://drive.google.com/open?id=0B1KZT92BcdVNYlZTT1JUc0VBYW8)
- **Example bagfile:** [spin_test_bagfile](https://drive.google.com/open?id=0B1KZT92BcdVNZnUxLUNzajVFYjQ)
- **Example sensor inputs:** [wheel_encoder_input.csv](https://drive.google.com/open?id=0B1KZT92BcdVNVlhrcEY5SG5rdWs), [imu_input.csv](https://drive.google.com/open?id=0B1KZT92BcdVNNDgteEs5QTBxaHM), [gps_input.csv](https://drive.google.com/open?id=0B1KZT92BcdVNVVJEbUIzVFFCU3c)
- **Example odometry outputs:** [local_odom.csv](https://drive.google.com/open?id=0B1KZT92BcdVNQ01LeWlsU1dEcjg), [global_odom.csv](https://drive.google.com/open?id=0B1KZT92BcdVNbURuYi1VRl9SdlU), [gps_odom.csv](https://drive.google.com/open?id=0B1KZT92BcdVNVVJiblE3dGJINmc)
### UPDATE 1 ###
At @asimay_y's suggestion, I reduced the number of sensor inputs fusing yaw velocity to just the IMU. Unfortunately it hasn't significantly affected the global output, as shown here:

In hindsight this makes sense because the only difference between the local and global fusion is the GPS data, so if the yaw velocity was causing issues, I would have expected to see similar behaviour in the local odometry output.
Regarding the movement of /map relative to /odom in RViz, my understanding is that RViz converts the odometry from each source into the reference frame you are currently viewing (for me it was /odom at the time). If the position estimates of the robot in the /odom and /map frames differ, the odometry will reflect that. But the locations of these frames are effectively calculated relative to the robot's current position, so rather than the robot drifting and appearing in two places at once, the robot stays consistent and the origins of the reference frames drift relative to each other. You'll only ever see /odom and /map matching if the odometry in both frames is exactly the same.
In addition, I've also collected some more data by making the robot spin on the spot, move a short distance, spin again, and so on. The odometry in each frame looks something like this:

Notice how the local data at each corner (when the robot spins) doesn't deviate, but the GPS and global data do, just as when the robot spins on the spot. The UTM coordinates calculated directly and via sensor fusion mirror this. This path produces the same periodic error at each corner where the robot stops and spins, as shown:

And to make things more interesting, the relationship between error and heading seems to drift:

Again, the same periodic relationship is observed, but it seems to drift sideways over time, perhaps as the IMU drifts or error builds up in the fusion estimate.
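One geometry worth keeping in mind when reading the above (an assumption about the mechanism, not something confirmed for this setup): if a GPS antenna sits at a lever-arm offset from base_link and that offset is not compensated, rotating on the spot sweeps the antenna around a circle, which shows up as exactly this kind of periodic, heading-dependent position error.

```python
import math

# Hypothetical antenna mounted 0.25 m ahead of base_link.
OFFSET = (0.25, 0.0)

def antenna_position(heading):
    """World position of the antenna while base_link sits at the origin
    and only rotates: the fix traces a circle of radius |OFFSET|."""
    return (OFFSET[0] * math.cos(heading) - OFFSET[1] * math.sin(heading),
            OFFSET[0] * math.sin(heading) + OFFSET[1] * math.cos(heading))

# Spinning on the spot sweeps a full sine/cosine cycle of apparent
# "position" error even though the robot never translates.
errors = [antenna_position(math.radians(d)) for d in range(0, 360, 90)]
```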
↧
Zigzag ground track from figure-eight flight path
Hi all,
I tried using the ukf template together with the navsat_transform node, following http://answers.ros.org/question/200071/how-to-fuse-imu-gps-using-robot_localization/
With my figure-eight flight path, the ground track comes out zigzag, as shown in the link below.
This is my setup launch file.
and these are my parameters:
frequency: 30
sensor_timeout: 0.1
transform_time_offset: 0.0
two_d_mode: false
print_diagnostics: true
debug: false
debug_out_file: /path/to/debug/file.txt
# map_frame: map
odom_frame: odom
base_link_frame: fcu
world_frame: odom
use_control: false
stamped_control: false
control_timeout: 0.2
control_config: [true, true, true, true, true, true]
acceleration_limits: [1.3, 1.3, 1.3, 3.4, 3.4, 3.4]
deceleration_limits: [1.3, 1.3, 1.3, 4.5, 4.5, 4.5]
acceleration_gains: [0.8, 0.8, 0.8, 0.9, 0.9, 0.9]
deceleration_gains: [1.0, 1.0, 0.0, 1.0, 1.0, 1.0]
alpha: 0.001
kappa: 0
beta: 2
odom0: /odometry/gps
odom0_config: [true,true,true,
false, false, false,
false, false, false,
false, false, false,
false, false, false]
odom0_differential: false
odom0_relative: false
odom0_queue_size: 2
odom0_pose_rejection_threshold: 100
odom0_twist_rejection_threshold: 100
odom0_nodelay: false
# odom1: example/another_odom
# odom1_config: [false, false, true,
# false, false, false,
# false, false, false,
# false, false, true,
# false, false, false]
# odom1_differential: false
# odom1_relative: true
# odom1_queue_size: 2
# odom1_pose_rejection_threshold: 2
# odom1_twist_rejection_threshold: 0.2
# odom1_nodelay: false
# pose0: example/pose
# pose0_config: [true, true, false,
# false, false, false,
# false, false, false,
# false, false, false,
# false, false, false]
# pose0_differential: true
# pose0_relative: false
# pose0_queue_size: 5
# pose0_rejection_threshold: 2
# pose0_nodelay: false
# twist0: example/twist
# twist0_config: [false, false, false,
# false, false, false,
# true, true, true,
# false, false, false,
# false, false, false]
# twist0_queue_size: 3
# twist0_rejection_threshold: 2
# twist0_nodelay: false
imu0: /mavros/imu/data
imu0_config: [false, false, false,
true, true, true,
false, false, false,
true, true, true,
true, true, true]
imu0_nodelay: false
# odom1_differential: false
# odom1_relative: true
imu0_queue_size: 5
imu0_pose_rejection_threshold: 0.8
imu0_twist_rejection_threshold: 0.8
imu0_linear_acceleration_rejection_threshold: 0.8
process_noise_covariance: [0.05, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0.05, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0.06, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0.03, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0.03, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0.06, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0.025, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0.025, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0.04, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.02, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.01, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0.015]
initial_estimate_covariance: [1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9, 0,
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1e-9]
and this is my dataset
https://1drv.ms/u/s!Amsv72jI4yJwgS-R1PYE6r0e72F3
It starts at 180 s, so this command will help:
rosbag play px4_eight_football_2016-07-22-14-48-23.bag --clock -s 180
Thanks in advance.
↧
I am using robot_pose_ekf, but I am getting incorrect output, and I'm not sure why.
Hello,
I'm trying to get robot_pose_ekf working in ROS Indigo with a model I have in Gazebo, but I can't seem to get it to output the correct values. I have searched through documentation and various help threads for a while, and I can't seem to narrow down my problem.
I have a model in Gazebo with a differential drive controller using the plugin "libgazebo_ros_diff_drive.so", a GPS sensor using "libhector_gazebo_ros_gps.so", and an IMU sensor using "libgazebo_ros_imu.so".
As far as I can tell, all of these sensors are producing correct values on their topics, so this led me to believe that my problem lies somewhere in my ROS code rather than anything with Gazebo.
Per this thread (http://answers.ros.org/question/229722/how-to-stop-gazebo-publishing-tf/), I have set the output of robot_pose_ekf to the head of my tf tree and remapped the gazebo transform output to a different topic. Here is my current frame tree with the base_link, IMU, GPS, and my two laser sensors. The odom at the head is the odometry output from robot_pose_ekf:

For my GPS sensor, per this page (http://wiki.ros.org/robot_pose_ekf/Tutorials/AddingGpsSensor) I remapped the input to the robot_pose_ekf vo topic. I have also written a node that converts the GPS coordinates to the Gazebo simulator coordinates in the local area of the Gazebo world I am using. So, if my robot model state is at (x=0,y=0), then the GPS coordinates are transformed and sent as (x=0,y=0). I plan to add some noise to that, but currently, it is very small in order to test.
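For reference, a GPS-to-local-coordinates conversion of the kind described can be sketched as follows (an illustrative flat-earth approximation, not the actual node; names and the anchor point are made up):

```python
import math

EARTH_RADIUS = 6371000.0  # metres, mean radius

def gps_to_local(lat, lon, lat0, lon0):
    """Equirectangular projection of a GPS fix into a local (x, y) frame
    anchored at (lat0, lon0); adequate over a Gazebo-sized area."""
    x = math.radians(lon - lon0) * EARTH_RADIUS * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_RADIUS
    return x, y

# A fix slightly north of the anchor maps a short distance up the y axis.
x, y = gps_to_local(0.0001, 0.0, 0.0, 0.0)
```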
Here is my launch file section that launches robot_pose_ekf:
I have tested several scenarios described below where I switch vo_used, imu_used, and odom_used to true/false to hopefully give a better indication of the problem.
After I had odometry (with robot_pose_ekf subscribed to the odometry topic from the differential controller), IMU, and GPS set up, I performed several tests. For the first test, I ran robot_pose_ekf with only odometry and IMU. As the robot moved, I got good output until the robot hit an obstacle and stopped. At that point, the robot_pose_ekf output shows the robot still moving forward. I also get the message "Robot pose ekf diagnostics discovered a potential problem". I have looked into that per the description on the ROS robot_pose_ekf page, but I don't see a problem. The two sensors have nearly identical timestamps, and I assume this falls under the category of "very different information about robot pose", since the wheels keep spinning, which makes the odometry show the robot moving while the IMU shows no motion or acceleration.
The following are the (x,y) plots of the robot position according to the differential drive odometry (red), the odometry combined of robot_pose_ekf (green), the GPS coordinates translated to (x,y) position from the node I wrote (blue), and the actual model position in Gazebo using get_model_state (pink). Due to currently having little noise, the model state and GPS are overlapped on all the graphs.

As can be seen in the graph, the green line (robot_pose_ekf) continues after the model stops in the simulator (purple and blue lines), while the odometry line (red) also continues due to the wheels spinning against the obstacle.
The next test I performed was with odometry and GPS; the IMU was deactivated. I set the GPS covariance to have higher confidence than the odometry. This caused a new problem: now the robot_pose_ekf output shows a fairly straight line with minimal change when the robot turns. However, if the robot encounters an obstacle and stops, robot_pose_ekf registers this and the output stops at that point.

As can be seen in the graph, robot_pose_ekf (green) gives the wrong robot location, but when the robot hits the obstacle, the green line stops as it should, while the red odometry line keeps moving.
Next I tried equal confidence in the sensors, which produced a result roughly halfway between the previous two. The robot_pose_ekf output doesn't stop when the robot hits an obstacle, but it only slightly registers the position changes when the robot turns, so the drift is extremely large too.

In this graph we see that the robot_pose_ekf output (green) follows the turns slightly better; however, it continues plotting once the robot hits an obstacle, as can be seen from where the purple line stops.
I hope I have provided all of the relevant information. I'm not sure where I am going wrong with this, can anyone offer any suggestions?
Thank you.
↧
Aligning odometry coordinates with ENU (East = 0 deg)?
Hello readers,
I'm still slightly new to ROS and have gotten myself a bit far down a rabbit hole with my Husky. From what I've seen discussed, it seems there is a way to make the orientation angles of a robot (in the 2D plane) follow the ENU convention.
To expand on this: I'm trying to avoid having my ROS coordinate frame initialize with 0 degrees yaw pointing in whatever direction the Husky happens to be facing at startup. I'd prefer a yaw of 0 degrees to correspond to East, 90 to North, etc.
I feel that I may not be understanding fully how to configure navsat_transform to accomplish this, or perhaps just the robot_localization ekf settings themselves?
For more info, what I'm trying to do is have my Husky navigate to a GPS coordinate. I'm currently attempting to do so by using the bearing between the current and goal GPS points, the distance between them, and the current position in the ROS coordinate frame. I then calculate the x and y coordinates of the goal point in the ROS coordinate frame and publish it to the move_base node. This would work much better if I didn't have to make sure the robot is initialized facing East. I would much prefer the ROS coordinate frame to simply abide by the ENU convention.
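The calculation I'm describing looks something like this (a sketch with made-up names; it assumes the local frame really is ENU-aligned, which is exactly the property I'm trying to guarantee):

```python
import math

def goal_in_enu(x_robot, y_robot, bearing_deg, distance):
    """Goal coordinates in an ENU-aligned frame (yaw 0 = East, CCW
    positive), given a compass bearing (0 = North, clockwise) and a
    range to the goal GPS point."""
    yaw_enu = math.radians(90.0 - bearing_deg)  # compass -> ENU angle
    return (x_robot + distance * math.cos(yaw_enu),
            y_robot + distance * math.sin(yaw_enu))

# A goal 10 m due East of the robot lands on the +x axis.
gx, gy = goal_in_enu(0.0, 0.0, 90.0, 10.0)
```

The (x, y) result is what would then be published as a move_base goal.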
If there is anything wrong with my question or my understanding of things at this point, please do not hesitate to correct me. Thanks in advance!
↧
NameError: global name 'NAME' is not defined
Hello, I'm trying to publish my GPS data, but I get this error:
NameError: global name 'hora' is not defined
Anybody help me please :)
My code is this:
#!/usr/bin/env python
import rospy
import serial
import pynmea2
import roslib; roslib.load_manifest('ceres_pkg')
from std_msgs.msg import String
from geometry_msgs.msg import Point

serialStream = serial.Serial("/dev/ttyAMA0", 9600, timeout=0.5)

class GPS(object):
    def __init__(self):
        self.pub_ = rospy.Publisher("/dat_gps", Point, queue_size=1)
        return

    def start(self):
        sentence = serialStream.readline()
        global hora, lat, lon
        if sentence.find('GGA') > 0:
            data = pynmea2.parse(sentence)
            hora = data.timestamp
            lat = data.latitude
            lon = data.longitude
        pos_msg = Point(hora, lat, lon)
        loop_rate = rospy.Rate(1)
        while not rospy.is_shutdown():
            self.pub_.publish(pos_msg)
            loop_rate.sleep()
        return

if __name__ == '__main__':
    rospy.init_node('GPS_py', log_level=rospy.INFO)
    rospy.loginfo("%s: starting GPS node", rospy.get_name())
    Gps = GPS()
    Gps.start()
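For context, the error arises because `hora` is only assigned when the sentence contains `GGA`; if the first line read from the serial port is any other NMEA sentence, `hora` is undefined when `Point(hora, lat, lon)` runs. A standalone sketch of the failure and a guard (hand-parsed here so it runs without pynmea2 or ROS; the function name is made up):

```python
def read_fix(sentence):
    """Return (time, lat, lon) fields from a GGA sentence, else None.

    The original code only assigns hora/lat/lon when the sentence
    contains 'GGA'; any other sentence leaves them undefined, which
    raises NameError on first use. Returning None makes the
    missing-fix case explicit so the caller can skip publishing.
    """
    if 'GGA' not in sentence:
        return None
    # $GPGGA,hhmmss,lat,N/S,lon,E/W,...
    fields = sentence.split(',')
    return fields[1], fields[2], fields[4]

# A non-GGA sentence no longer crashes:
result = read_fix('$GPRMC,123519,A,4807.038,N')
fix = read_fix('$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M')
```

The same pattern applied to the node means: read and parse inside the publishing loop, and only build and publish the `Point` when a GGA sentence actually arrived.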
↧
Hector GPS fix status not setting
Working with http://wiki.ros.org/hector_gazebo_plugins?distro=kinetic#GazeboRosGps.
Everything is working fine except for the fix status, and I can't make heads or tails of it. I have the plugin configured like so:
50 base_link fix vel REMOVED REMOVED REMOVED 2 8
With these settings I'm getting an msg of
    header:
      seq: 7633
      stamp:
        secs: 152
        nsecs: 780000000
      frame_id: /world
    status:
      status: 50
      service: 8
    latitude:
    longitude:
    altitude:
    position_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
    position_covariance_type: 2
I don't know how the `50` came about. What is odd is that if I set the config to `1 ` I get a different output:
---
header:
seq: 212
stamp:
secs: 4
nsecs: 560000000
frame_id: /world
status:
status: 49
service: 8
latitude:
longitude:
altitude:
position_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
Thinking it was some type of casting issue, I set `2.0` and got `status: 0`.
I have looked over the source and found where the param is set: https://github.com/tu-darmstadt-ros-pkg/hector_gazebo/blob/indigo-devel/hector_gazebo_plugins/src/gazebo_ros_gps.cpp#L127. I also found some other operations, but nothing that would offset the value by 48.
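One assumption that would explain an offset of exactly 48: the `<status>` element is read into `sensor_msgs::NavSatStatus::_status_type`, which is an `int8` (a `char` in C++), so a lexical-cast-style conversion stores the character code of the digit rather than parsing a number. A tiny Python model of that assumed behaviour:

```python
def sdf_char_param(text, default=0):
    """Model of a lexical_cast-style conversion of an SDF string into
    an int8/char parameter (an assumption about the plugin's code path,
    not its actual source): a single character converts to its
    character code, anything longer fails and yields the default."""
    return ord(text) if len(text) == 1 else default
```

Since `'0'` has code 48, `'1'` gives 49 and `'2'` gives 50, matching the outputs above, while a multi-character string like `2.0` would fail the conversion and fall back to the default `status: 0`.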
↧
I have errors trying to catkinize MIDG II package
I am trying to catkinize a package I found on Github to run my "Microrobotics MIDG II INS/GPS" sensor on ROS.
I used "catkinize" and followed the instructions to edit my `CMakeLists.txt` and package files.
After finishing all the steps I am getting this error message, and I couldn't find any help to fix it.
...
midgPacket.cpp:(.text+0xa15): undefined reference to `ros::console::setLogLocationLevel(ros::console::LogLocation*, ros::console::levels::Level)'
midgPacket.cpp:(.text+0xa1f): undefined reference to `ros::console::checkLogLocationEnabled(ros::console::LogLocation*)'
midgPacket.cpp:(.text+0xa6a): undefined reference to `ros::console::print(ros::console::FilterBase*, void*, ros::console::levels::Level, char const*, int, char const*, char const*, ...)'
collect2: error: ld returned 1 exit status
make[2]: *** [/home/renawi/catkin_ws/devel/lib/midg/Midg] Error 1
make[1]: *** [midg/CMakeFiles/Midg.dir/all] Error 2
make: *** [all] Error 2
Invoking "make -j8 -l8" failed
My CMakeLists.txt file:
cmake_minimum_required(VERSION 2.8.3)
project(midg)
find_package(catkin REQUIRED COMPONENTS roscpp geometry_msgs std_msgs sensor_msgs message_generation)
include_directories(include ${Boost_INCLUDE_DIR} ${catkin_INCLUDE_DIRS} ${GSTREAMER_INCLUDE_DIRS})
# Set the build type. Options are:
# Coverage : w/ debug symbols, w/o optimization, w/ code-coverage
# Debug : w/ debug symbols, w/o optimization
# Release : w/o debug symbols, w/ optimization
# RelWithDebInfo : w/ debug symbols, w/ optimization
# MinSizeRel : w/o debug symbols, w/ optimization, stripped binaries
#set(ROS_BUILD_TYPE RelWithDebInfo)
#set the default path for built executables to the "bin" directory
set(EXECUTABLE_OUTPUT_PATH ${PROJECT_SOURCE_DIR}/include)
#set the default path for built libraries to the "lib" directory
set(LIBRARY_OUTPUT_PATH ${PROJECT_SOURCE_DIR}/lib)
#uncomment if you have defined messages
add_message_files(
DIRECTORY msg
FILES
IMU.msg
)
#uncomment if you have defined services
#add_service_files(
# FILES
# IMU.srv
#)
#common commands for building c++ executables and libraries
#add_library(${PROJECT_NAME} src/Midg_II.cpp)
#target_link_libraries(${PROJECT_NAME} Midg)
#target_link_libraries(${midg}-test ${midg})
#add_executable(midg midg/midg.cpp)
#target_link_libraries(example ${PROJECT_NAME})
#rosbuild_find_ros_package(mst_common)
#INCLUDE_DIRECTORIES(${mst_common_PACKAGE_PATH}/include)
## Generate added messages and services with any dependencies listed here
generate_messages(DEPENDENCIES geometry_msgs std_msgs)
#generate_messages()
# catkin_package parameters:
#http://ros.org/doc/groovy/api/catkin/html/dev_guide/generated_cmake_api.html
#catkin-package
# TODO: fill in what other packages will need to use this package
catkin_package(CATKIN_DEPENDS geometry_msgs std_msgs message_runtime roscpp sensor_msgs)
#install(DIRECTORY include/${PROJECT_NAME}/
# DESTINATION ${CATKIN_PACKAGE_INCLUDE_DESTINATION}
# FILES_MATCHING PATTERN "*.h"
# )
# PATTERN ".svn" EXCLUDE
add_executable(Midg src/Midg_II.cpp src/drivers/midgPacket.cpp)
add_dependencies(Midg ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS})
End of CMakeLists.txt
There are many more "undefined reference to ..." messages showing up before the final error message.
Any help?
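The `undefined reference to ros::console::...` lines are a linker error rather than a compiler one: the `Midg` executable is compiled but never linked against the catkin libraries. Assuming the target name from the `CMakeLists.txt` above, the likely fix is one extra line after `add_executable`:

```cmake
# Link against roscpp and everything else found by find_package(catkin ...)
target_link_libraries(Midg ${catkin_LIBRARIES})
```

This is a sketch of the usual fix for this symptom; the commented-out `target_link_libraries` lines in the file above never link the `Midg` target itself.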
↧
↧
Kalman: different equations for same states
Hi guys :)
I've got a very theoretical Kalman-filter-related question, and I hope that someone has an idea how to solve it:
**How is it possible to use 'different' state equations for the same states?**
In detail: I have an RC car and I can use a Kalman filter to fuse GPS and IMU data into an estimate of position, velocity and heading. However, the basic Kalman equations do not account for the car dynamics! Instead, they only integrate the IMU acceleration and find an optimal estimate together with orientation and GPS.
Since I know the car's dynamics (e.g. it can only move longitudinally) I would like to use this information as well. Problem: I'm getting multiple equations for the same states.
It would be awesome if you have any ideas! I can't be the only one dealing with this issue but I can't find anything about it in literature.
Please also let me know if my problem is not clear enough and I'll be glad to give some more information.
Thank you very much in advance for any help!
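One textbook way around this (a suggestion, not something from the post): keep the single IMU-driven process model and inject the known dynamics as *pseudo-measurements*, e.g. measure the body-frame lateral velocity as zero with a small variance, instead of writing a second, conflicting state equation. A minimal scalar-update sketch in plain Python (state layout and noise value are illustrative assumptions):

```python
def lateral_zero_update(v, P, var_meas=1e-4):
    """Kalman measurement update for the pseudo-measurement z = 0 of
    the body-frame lateral velocity.  v = [v_long, v_lat] and P is its
    2x2 covariance (nested lists); H = [0, 1] selects v_lat."""
    S = P[1][1] + var_meas                    # innovation variance
    K = [P[0][1] / S, P[1][1] / S]            # Kalman gain
    innov = 0.0 - v[1]                        # z - H v
    v_new = [v[0] + K[0] * innov, v[1] + K[1] * innov]
    # P_new = (I - K H) P
    P_new = [[P[0][0] - K[0] * P[1][0], P[0][1] - K[0] * P[1][1]],
             [(1.0 - K[1]) * P[1][0],  (1.0 - K[1]) * P[1][1]]]
    return v_new, P_new
```

The smaller `var_meas`, the harder the constraint. This sidesteps the "multiple equations for the same states" problem because the car dynamics enter the filter the same way real sensors do: as additional measurements, not as a competing process model.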
↧
Hi, How can I path plan a robot in ROS+gazebo using indoor GPS and odometry data?
↧
navsat_transform_node has zero orientation output?
I'm working with a robot that has no wheel odometry measurements, and have IMU, GPS and motor commands to fuse. Can anyone clarify how the setup should work for this?
At the moment I have GPS, IMU and motor command odometry going into a navsat_transform_node, which outputs odometry messages. I'm planning to fuse them with a second copy of the motor commands with an ekf_localization node.
My problem is: the odometry/gps messages coming out of the navsat_transform_node contain zeros for all orientations. I thought the IMU data was being used to set these (and I've checked that the incoming IMU messages are non-zero).
Am I doing something wrong, or misinterpreting what navsat_transform_node is doing? Perhaps the IMU is used, like the motor command odometry, to set only the initial state, then ignored the rest of the time? Even then, it would be unlikely for the orientation to be exactly 0 at the start. Do I still need to fuse the odometry/gps with the IMU again to get the desired effect, and if so, why?
Thanks for any insights!
Sample odometry/gps message below:
    position:
      x: 0.806593844667
      y: -4.15517381765
      z: -0.674990490428
    orientation:
      x: 0.0
      y: 0.0
      z: 0.0
      w: 1.0
    covariance: [1.259293318748, 0.03294127204511593, 0.41883145581259457, 0.0, 0.0, 0.0, 0.03294127204511593, 1.232013681194764, 0.2798927172566257, 0.0, 0.0, 0.0, 0.41883145581259457, 0.2798927172566257, 4.768693000057238, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
    twist:
      twist:
        linear:
          x: 0.0
          y: 0.0
          z: 0.0
        angular:
          x: 0.0
          y: 0.0
          z: 0.0
      covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
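For what it's worth, the identity orientation is expected: navsat_transform_node uses the IMU internally to compute its transform, and the pose it outputs is position-only, so the IMU does still need to be fused in the EKF itself. A sketch of the relevant `ekf_localization_node` parameters (topic names are assumptions):

```yaml
odom0: /odometry/gps          # from navsat_transform_node
odom0_config: [true,  true,  true,    # fuse x, y, z only ...
               false, false, false,   # ... its orientation is never filled in
               false, false, false,
               false, false, false,
               false, false, false]
imu0: /imu/data
imu0_config: [false, false, false,
              true,  true,  true,     # orientation comes from the IMU here
              false, false, false,
              true,  true,  true,
              true,  true,  true]
```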
↧
Novatel GPS Smart6 with ROS
Hi to all,
I have a Novatel GPS system composed of a [Smart6](http://www.novatel.com/products/smart-antennas/smart6/) base station and a rover station capable of providing sub-inch accuracy.
I would like to mount the rover station on my wheeled robot and record GPS data, in order to use these data with [navsat_odometry](http://wiki.ros.org/navsat_odometry) and [robot_localization](http://docs.ros.org/indigo/api/robot_localization/html/navsat_transform_node.html).
What do I need to do in order to read and record data coming from my Smart6 rover station?
Is the [novatel_span_driver](http://wiki.ros.org/novatel_span_driver) the only package I need to read data from my Novatel device?
It describes a connection over the ethernet port, but the Smart6 device only provides a *serial interface*.
Can you help me, please?
↧
↧
Integrate a GPS sensor with robot_localization and use move_base node
Hello,
I’m trying to integrate a `GPS` sensor to my mobile robot with the `robot_localization_node`. I have an IMU sensor added to my `robot_localization_node` but now I have some theoretical questions about how to integrate the `GPS`.
I would like to use the output of the `robot_localization` node to feed the `move_base` node, in order to set some waypoints in GPS coordinates (they can be in UTM) and have the robot follow them.
I’m a bit lost about how to proceed. This is what I think:
I launch `ekf_localization_node` with these inputs: `IMU` and `wheel_odom`, and I remap the output topic to `local_odometry`. It'll be the local odometry in the odom frame. (I must use this topic as input to `move_base` if I want to navigate in the `odom` frame.)
Then I launch `navsat_transform_node` with `IMU`, `wheel_odom`, and the GPS `NavSatFix`; the output topic, in the world frame, is `/odometry/gps`. But this topic only publishes when the GPS signal is received, so...
I launch another `ekf_localization_node` with `IMU`, `wheel_odom` and `odom1` (`/odometry/gps` from `navsat_transform_node`). The output of this node (`/odometry/filtered`) is in the world frame and is what I have to feed into `move_base`, right?
With this configuration I don’t need to use `gps_common`, right?
When I launch everything, do I have to call the `set_pose` service in order to give some estimated pose, in UTM coordinates, of `/odom` relative to `/world`?
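The three-node layout described in the steps above can be sketched as a launch file like this (node names and the omitted parameter blocks are placeholders, not taken from the post):

```xml
<launch>
  <!-- 1st EKF: IMU + wheel_odom, odom-frame estimate for move_base -->
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_local">
    <remap from="odometry/filtered" to="local_odometry"/>
  </node>

  <!-- NavSatFix -> odometry in the world frame (/odometry/gps) -->
  <node pkg="robot_localization" type="navsat_transform_node" name="navsat_transform"/>

  <!-- 2nd EKF: IMU + wheel_odom + /odometry/gps, world-frame estimate -->
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_global"/>
</launch>
```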
**UPDATE 1:**
Thanks Tom. I've taken into account all your comments and here's my `robot_localization` [launch file](https://www.dropbox.com/s/k7pzj248pjdxwma/robot_localiz.launch?dl=0) and my [tf tree](https://www.dropbox.com/s/1tcikqswf0db1pb/frames.pdf?dl=0).
1. I feed the `move_base` odometry with the `local_odometry` returned by my first instance of `ekf_localization_node`.
2. As my waypoints are in UTM coordinates, they should be in the `utm` frame. I can't see the tf of the utm frame being published, but in the source code I see it's `/utm`.
3. I think the `global planner` of `move_base_node` needs its global frame set to the same frame as the waypoints, so it should be `/utm`. But when I launch this configuration, `move_base_node` says this:
> [ WARN] [1422876635.213009322]: Waiting on transform from summit_a/base_footprint to utm to become available before running costmap, tf error:
I understand that `move_base` can't find the tf between the `utm` and `base_footprint` frames. I don't know if it's because the tf isn't visible to `move_base_node`, or because I'm doing something wrong.
- (Just to test.) If I set the global frame of the `global_planner` of `move_base_node` to `summit_a/world`, it reports this:
> [ERROR] [1422873579.326459610]: The goal pose passed to this planner must be in the summit_a/world frame. It is instead in the summit_a/utm frame.
- (other test) Setting the global frame of `global_planner` to `summit_a/odom`:
> [ERROR] [1422876188.622192440]: The goal pose passed to this planner must be in the summit_a/odom frame. It is instead in the utm frame.
So the `global_planner` frame of `move_base_node` needs to be the same as the frame of the goals: `/utm` in this case.
- (Another idea) Considering how `move_base_node` works, at first I thought the best way to integrate `GPS` was for the world frame to represent real-world coordinates (UTM) and be the parent of the `/odom` frame. Thus `move_base_node` would receive waypoints in UTM with respect to the world frame. I understand that, for easy viewing in rviz and to avoid continuity problems, `robot_localization_node` publishes the `utm` transform as a child of `/odom`, but I think this can be a small problem for navigation with `move_base_node`.
**UPDATE 2:**
Thanks Tom.
1. Now I get the tf between `/world` and `/utm`. The point was that `navsat_transform_node` doesn't publish the tf until the GPS signal is nonzero.
2. The topic `/odometry/gps` without a `child_frame_id` defined is meaningless for my purpose, right? Because it isn't published constantly if the GPS signal is lost.
3. Now my waypoints are in base to the `utm` frame.
4. Which topic must I use to feed the `move_base` node? I think the best option is the `/odometry/filtered` topic from my second instance of `ekf_localization_node`, isn't it? It takes into account the output of `navsat_transform_node` and therefore the GPS position.
5. I don't really understand the problem with the heading of the `IMU`, I'm sorry. My `IMU` hasn't got a compass and doesn't provide any kind of orientation data, just angular velocity and linear acceleration. In the `imu0_config` param, just the angular velocity in Z is set to `true`, and the estimation of the heading is enough. Is that a problem if I want to integrate the `GPS`?
Any ideas, corrections or suggestions would be appreciated.
Best regards and thanks in advance.
↧
Adding two robots in gazebo with different gps coordinates
OK, I have a problem: I have two robots with a GPS plugin in Gazebo, and they have the same reference latitude and longitude (for example 0,0), so they show the same result, 0,0, on the gps topic. It's therefore assumed that they are in the same place, but of course they aren't. What should I do to separate these reference coordinates and make the two robots independent, so that, for example, the position of one robot is 0,0 and the other is 21,31?
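One way to get the independent fixes described, assuming the `hector_gazebo_plugins` GPS plugin is being used: give each robot's plugin instance its own topic and its own reference origin. A sketch for the second robot (all values made up):

```xml
<plugin name="gps_robot2" filename="libhector_gazebo_ros_gps.so">
  <updateRate>4.0</updateRate>
  <bodyName>base_link</bodyName>
  <topicName>robot2/fix</topicName>
  <referenceLatitude>21.0</referenceLatitude>
  <referenceLongitude>31.0</referenceLongitude>
</plugin>
```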
↧
robot_localization fails due to transform warnings
Hello,
I'm trying to use the robot_localization package for sensor fusion. I've got three sensors: IMU, odometry and GPS.
I was first trying to work with the configuration I've found here:
[link to other question](http://answers.ros.org/question/215904/robot_localization-failed-to-transform-warning/?comment=252296#post-id-252296)
I downloaded the files from [here](https://drive.google.com/file/d/0BzZeubT6hZReNGthQ2NRRjBaVjQ/view)
Unfortunately if I execute the launch file I first got these errors:
[ WARN] [1484753899.869852461, 1440023596.548412649]: Could not obtain transform from /gps to base_footprint. Error was Invalid argument "/gps" passed to lookupTransform argument source_frame in tf2 frame_ids cannot start with a '/' like:
As suggested by @tom-moore in an [answer](http://answers.ros.org/question/215904/robot_localization-failed-to-transform-warning/?comment=252296#post-id-252296), I changed the frame_id from /gps to gps.
But only the error message changed:
[ WARN] [1484754315.316805588, 1440023620.550000278]: Could not obtain transform from gps to base_footprint. Error was "gps" passed to lookupTransform argument source_frame does not exist.
EDIT:
I can reproduce the error on different computers. I set up Ubuntu with a clean ROS installation, without any success; I get the same errors.
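The lookupTransform error means nothing is publishing a `base_footprint` → `gps` transform at all. If the GPS antenna's mounting offset on the robot is known, a static transform publisher in the launch file would satisfy it (zero offsets here are placeholders for the real antenna position):

```xml
<node pkg="tf2_ros" type="static_transform_publisher" name="gps_tf"
      args="0 0 0 0 0 0 base_footprint gps"/>
```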
↧