robot_localization: navsat transform node does not publish
Following the tutorial on GPS integration and the use of navsat_transform_node, I feed the node an IMU with absolute heading values, the odometry output of the second EKF instance, and a NavSatFix message.
The problem is that the node doesn't publish any data on /odometry/gps. I checked the covariances of the IMU and odometry messages. I also checked the latitude/longitude/altitude values of the GPS message, because I have read that if one of them is NaN, navsat_transform_node produces no output.
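For reference, that NaN condition is easy to check with a throwaway subscriber; a minimal sketch, assuming the NavSatFix arrives on /gps/fix:

    #!/usr/bin/env python
    # Flag NaN fields in the incoming NavSatFix; navsat_transform_node
    # stays silent if latitude, longitude or altitude is NaN.
    import math
    import rospy
    from sensor_msgs.msg import NavSatFix

    def fix_cb(msg):
        for name, value in [("latitude", msg.latitude),
                            ("longitude", msg.longitude),
                            ("altitude", msg.altitude)]:
            if math.isnan(value):
                rospy.logwarn("NavSatFix field %s is NaN", name)

    rospy.init_node("fix_nan_check")
    rospy.Subscriber("/gps/fix", NavSatFix, fix_cb)
    rospy.spin()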
So, what am I doing wrong?
UPDATE 16-03-2016
Bagfile: https://drive.google.com/file/d/0B_1fqLQnQO6oVWszVnM2SWs3OE0/view?usp=sharing
Roslaunch navsat_node: https://drive.google.com/file/d/0B_1fqLQnQO6oZ0JGUlQwbllqUjA/view?usp=sharing
UPDATE 23-01-2017
Roslaunch navsat_node: https://github.com/NikoGiovannini/nostop_launch_files/blob/c9294d3a64973a4a272f546d417d3989e876d10a/launch/arduino/PC_robot_green_yellow.launch
Problem while integrating GPS data into robot_localization
Hi to Tom and everybody,
I'm currently running some experiments with robot_localization, and I'm having problems integrating GPS data for the global localization of a non-holonomic mobile robot constrained to a 2D plane (no Z coordinate and no roll/pitch).
Here is an overview of my odometry setup, which is very similar to the one described in [this section](http://docs.ros.org/kinetic/api/robot_localization/html/integrating_gps.html#notes-on-fusing-gps-data) of the wiki, where the dual-EKF arrangement is discussed.

- X and Y are bidimensional coordinates
- Vx and Vy are linear velocities
- Ḣ, Ẍ and Ÿ are the angular velocity and the linear accelerations, respectively
- The topic name is inside parentheses
- The provided TF transformation is inside square brackets
This is my **launch file** (only the part relevant to replay the bag file):
# first EKF instance: odometry input (vx, vy)
[ false, false, false,
  false, false, false,
  true,  true,  false,
  false, false, false,
  false, false, false ]

# first EKF instance: IMU input (yaw velocity, x/y acceleration)
[ false, false, false,
  false, false, false,
  false, false, false,
  false, false, true,
  true,  true,  false ]

# navsat_transform_node datum (latitude, longitude, heading)
[ 43.542883, 11.571365, 0 ]

# second EKF instance: odometry input (vx, vy)
[ false, false, false,
  false, false, false,
  true,  true,  false,
  false, false, false,
  false, false, false ]

# second EKF instance: GPS odometry input (x, y)
[ true,  true,  false,
  false, false, false,
  false, false, false,
  false, false, false,
  false, false, false ]

# second EKF instance: IMU input (yaw velocity, x/y acceleration)
[ false, false, false,
  false, false, false,
  false, false, false,
  false, false, true,
  true,  true,  false ]
And below is the output of `rqt_graph`, which I suppose is correct, except that I don't know why "navsat_transform" is not subscribed to the topic "/odometry/imu", even though I specified it with a remap.
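To verify whether the remap actually took effect, the master can be asked directly who subscribes to the topic; a minimal sketch using the same API that rqt_graph queries:

    #!/usr/bin/env python
    # Print which nodes the ROS master thinks are subscribed to /odometry/imu.
    import rosgraph

    master = rosgraph.Master("/subscriber_check")
    publishers, subscribers, services = master.getSystemState()
    for topic, nodes in subscribers:
        if topic == "/odometry/imu":
            print("subscribers of %s: %s" % (topic, nodes))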

[This](https://www.dropbox.com/s/bljxrfje7xgmp52/data.bag?dl=0) is the link to download the **bag file** that can be used with the above launch file.
When launched, this is the TF tree I'm seeing with `rosrun rqt_tf_tree rqt_tf_tree`, and it makes sense to me.

[Here](https://www.dropbox.com/s/6yguf9f9779inja/odometry.rviz?dl=0) you can download the RVIZ config file to obtain the same visualization I'm using.
While replaying the bag file, the `odom_link-->base_link` transformation is correctly calculated by the Kalman filter (I'm having pretty good accuracy, too!), but it seems there are problems with the `map-->odom` transformation, because I see these warnings multiple times:
[ WARN] [1485356716.681560393]: Transform from odom_link to map_link was unavailable for the time requested. Using latest instead.
[ WARN] [1485356718.683456544]: Transform from map_link to base_link was unavailable for the time requested. Using latest instead.
Other than that, the estimated global pose starts diverging after a few seconds and the coordinates become huge. It is not so clear in the following image, but this is what I see in RVIZ after setting "map" as the fixed frame.

I would like to obtain something similar to what was presented by Tom here, but the GPS data is giving really unpredictable behavior.
Is anybody else having this kind of issue with a similar setup?
Many thanks,
Guido
Navsat_transform diverges from GPS points
Hi, I'm trying to use the navsat_transform_node as per the example provided in the robot_localization package. However, the output of the node always diverges (quite significantly) from the original GPS points (see image below: odom.csv is the odometry without GPS, odom_fused.csv incorporates the GPS, odom_gps.csv is the output of the navsat_transform_node, gps.csv is the input GPS points, and gps_filtered.csv is the filtered GPS output from navsat_transform_node).

I'm pretty sure that the cause of the error is that the navsat_transform_node is using a heading that is slightly off to initialise itself. However, I have no idea how to go about fixing this. Any suggestions? I have uploaded the bagfile, launch and parameter files here - https://drive.google.com/drive/folders/0Bx8pkplHLcVUYVA1M2RGOWd6V2M?usp=sharing
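One way to quantify that initial heading error is to compare the IMU yaw with the course made good between GPS fixes while the robot drives roughly straight; a rough sketch, assuming topics /imu/data and /gps/fix and an ENU (zero-yaw-east) IMU:

    #!/usr/bin/env python
    # Compare IMU yaw with the course computed from consecutive GPS fixes.
    import math
    import rospy
    from sensor_msgs.msg import Imu, NavSatFix
    from tf.transformations import euler_from_quaternion

    EARTH_RADIUS = 6378137.0  # metres
    state = {"fix": None, "yaw": None}

    def imu_cb(msg):
        q = msg.orientation
        state["yaw"] = euler_from_quaternion([q.x, q.y, q.z, q.w])[2]

    def fix_cb(msg):
        prev = state["fix"]
        if prev is None:
            state["fix"] = msg
            return
        # Equirectangular approximation; fine over a few metres.
        d_north = math.radians(msg.latitude - prev.latitude) * EARTH_RADIUS
        d_east = (math.radians(msg.longitude - prev.longitude)
                  * EARTH_RADIUS * math.cos(math.radians(msg.latitude)))
        if math.hypot(d_east, d_north) < 1.0:
            return  # not enough motion to define a course
        if state["yaw"] is not None:
            course = math.atan2(d_north, d_east)  # ENU convention: zero is east
            rospy.loginfo("GPS course %.1f deg, IMU yaw %.1f deg, difference %.1f deg",
                          math.degrees(course), math.degrees(state["yaw"]),
                          math.degrees(course - state["yaw"]))
        state["fix"] = msg

    rospy.init_node("heading_check")
    rospy.Subscriber("/imu/data", Imu, imu_cb)
    rospy.Subscriber("/gps/fix", NavSatFix, fix_cb)
    rospy.spin()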
Thanks in advance for your help!
robot_localization map to odom transform seems to be incorrect
I'm using two robot_localization nodes, one for continuous data (wheels' odom + imu) and the other including gps for pose correction.
For instance, ekf_localization and gps_ekf_localization give exactly the same output when viewed in the odom frame, which is normal. But when looking at their output in the map frame, I would expect to see a drift between them; instead they are still identical, and the whole trajectory keeps the same shape with a slight deformation, but rotated by a large angle and translated. This is an example of the published transforms:
transform:
  translation:
    x: 1.62237404993
    y: -1.66250429176
    z: 0.0
  rotation:
    x: 0.0
    y: 0.0
    z: 0.864963620982
    w: 0.501834568736
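For reference, that rotation corresponds to a yaw of roughly 120 degrees, which is easy to confirm:

    # Yaw angle of the map->odom rotation quoted above:
    import math
    from tf.transformations import euler_from_quaternion
    yaw = euler_from_quaternion([0.0, 0.0, 0.864963620982, 0.501834568736])[2]
    print(math.degrees(yaw))  # ~119.8 degrees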
This made me think that the map-to-odom transform is wrong for some reason. It also made me wonder: is the GPS data really being used for pose correction? With the same configuration, I removed the GPS data and still got the same output.
The weird thing is that everything seems to work sometimes: the robot maintains its trajectory and seems to be very aware of its position and orientation. But with a more complex trajectory, and in some other areas, I get the same behavior described in this [paper](http://docs.ros.org/indigo/api/robot_localization/html/_downloads/robot_localization_ias13_revised.pdf) when fusing only odometry and IMU.
The launch file I'm using is very similar to the one provided as an answer to this [question](http://answers.ros.org/question/215904/robot_localization-failed-to-transform-warning/?comment=252396#post-id-252396).
Does this have something to do with the IMU parameters?
I've been reading the documentation and almost every published issue and question about the package, but I still have many questions in my head, since I'm new to all this magic.
I will try to provide some bag files as soon as possible, but hopefully this short description is enough for you to give me some points to check or correct.
Thank you in advance.
ROS Navstack for boat
Hello all,
To my knowledge, the navstack is *intended* for use with wheeled robots. However, I am currently working with a system that is a 16-foot catamaran vessel (one motor on each pontoon). We currently use ROS on the system for control, but we are looking to use the navstack for scripting autonomous tasks such as GPS waypoint navigation.
The only implementation requirement I am a bit curious about is the odometry information. I see that an odometry message consists of just position, orientation, linear velocity, and angular velocity. I'm guessing in the past, some of the estimates for this information came from assuming wheel-based motion with no slip.
My question is this:
If I have a sensor which gives me *very* accurate GPS location, orientations, linear velocities, and angular velocities, will the navstack work okay for a boat? The sensor we are using is a VectorNav VN-300, in case anyone is curious.
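For what it's worth, the navstack does not care where the odometry comes from; it only needs a nav_msgs/Odometry message and the matching odom->base_link transform. A minimal bridge sketch, where /ins/odometry stands in for whatever the VN-300 driver actually publishes (hypothetical topic name):

    #!/usr/bin/env python
    # Republish an INS pose/velocity estimate as nav_msgs/Odometry plus
    # the odom->base_link transform that the navstack expects.
    import rospy
    import tf
    from nav_msgs.msg import Odometry

    def ins_cb(msg):
        msg.header.frame_id = "odom"
        msg.child_frame_id = "base_link"
        pub.publish(msg)
        p = msg.pose.pose.position
        q = msg.pose.pose.orientation
        broadcaster.sendTransform((p.x, p.y, p.z), (q.x, q.y, q.z, q.w),
                                  msg.header.stamp, "base_link", "odom")

    rospy.init_node("ins_odometry_bridge")
    pub = rospy.Publisher("odom", Odometry, queue_size=10)
    broadcaster = tf.TransformBroadcaster()
    rospy.Subscriber("/ins/odometry", Odometry, ins_cb)
    rospy.spin()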
Thanks in advance for any insight and/or advice.
Xsens MTi-G 710 and robot_localization
Hey guys,
I am trying to integrate an Xsens INS device with the robot_localization package so that the robot can navigate by GPS coordinates. However, I cannot get any data out of the navsat_transform node, nor out of the ekf node in the robot_localization package. I did find a few suggestions about what to do, but it seems other people are further along than me, so I got stuck at this point.
# IMU input (roll/pitch orientation, all angular velocities, all accelerations)
[ false, false, false,
  true,  true,  false,
  false, false, false,
  true,  true,  true,
  true,  true,  true ]

# GPS odometry input (x, y)
[ true,  true,  false,
  false, false, false,
  false, false, false,
  false, false, false,
  false, false, false ]
I have to mention that I do not have odometry at the moment, as the robot has no wheel encoders. I also attach the configuration of the Xsens INS sensor:
rosrun xsens_driver mtdevice.py --configure=pl400fe,pa400fe,oq400fe,aa400fe,wr400fe,gd4fe --output-settings=tqMAG
For this sensor I use the ethzasl_xsens_driver package, as it seems to work properly with the sensor.
What I am trying to achieve at the moment is to get data out of the robot_localization topics, but nothing is published, even though the nodes do subscribe to the Xsens topics /fix and /imu/data. Please let me know what I am doing wrong and what extra info I should attach.
**EDIT**
This is the data I receive when echoing. From topic /fix:
header:
  seq: 10809
  stamp:
    secs: 1488376624
    nsecs: 581118106
  frame_id: /imu
status:
  status: 0
  service: 0
latitude: 57.0143623352
longitude: 9.98601722717
altitude: 62.4811706543
position_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
position_covariance_type: 0
---
From topic /imu/data
header:
  seq: 1592301
  stamp:
    secs: 1488380578
    nsecs: 149617910
  frame_id: /imu
orientation:
  x: -0.0012584940996
  y: -0.0031715943478
  z: -0.214321956038
  w: -0.976757109165
orientation_covariance: [0.017453292519943295, 0.0, 0.0, 0.0, 0.017453292519943295, 0.0, 0.0, 0.0, 0.15707963267948966]
angular_velocity:
  x: -0.000515580235515
  y: 0.00211000442505
  z: 0.000423193065217
angular_velocity_covariance: [0.0004363323129985824, 0.0, 0.0, 0.0, 0.0004363323129985824, 0.0, 0.0, 0.0, 0.0004363323129985824]
linear_acceleration:
  x: -0.035540368408
  y: -0.0115282256156
  z: 9.81090068817
linear_acceleration_covariance: [0.0004, 0.0, 0.0, 0.0, 0.0004, 0.0, 0.0, 0.0, 0.0004]
---
And when launching navsat_transform.launch, I only get these INFO messages:
[ INFO] [1488380635.870630401]: Datum (latitude, longitude, altitude) is (57.014286, 9.985973, 78.685829)
[ INFO] [1488380635.870718003]: Datum UTM coordinate is (559871.633204, 6319408.306693)
[ INFO] [1488380635.970500574]: Datum (latitude, longitude, altitude) is (57.014286, 9.985973, 78.685822)
[ INFO] [1488380635.970558908]: Datum UTM coordinate is (559871.633204, 6319408.306693)
However, when I echo /odometry/filtered, /odometry/gps or /gps/filtered, I receive no data stream at all.
Is my IMU using ENU convention? (for robot_localization)
Hi to all,
I'm using a Razor IMU from Sparkfun and it is the old model SEN-10736 which is compatible with ROS.
I followed all the steps reported on [razor_imu_9dof](http://wiki.ros.org/razor_imu_9dof) tutorial in order to upload the new firmware and make the IMU work with the `razor_imu_9dof` ROS driver.
Since I need to use it with the `robot_localization` package, for [GPS integration](http://docs.ros.org/kinetic/api/robot_localization/html/integrating_gps.html), I would like to know if my IMU heading data starts with its zero point facing east, as assumed by the robot_localization tutorial:
> Since version 2.2.1, navsat_transform_node has moved to a standard wherein all heading data is assumed to start with its zero point facing east. If your IMU does not conform to this standard and instead reports zero when facing north, you can still use the yaw_offset parameter to correct this. In this case, the value for yaw_offset would be π/2 (approximately 1.5707963).
How can I know if my IMU is already using the correct orientation or not?
I took two pictures when facing EAST and when facing NORTH.


So, I think my IMU reports 0 when facing NORTH.
In this case, should I use:

    <param name="yaw_offset" value="1.5707963"/>

in my launch file in order to have the heading interpreted with its zero point facing EAST?
Is that correct?
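Independently of the pictures, the convention can be checked numerically by echoing the yaw angle while pointing the sensor east and then north; a small sketch, assuming the driver publishes on /imu:

    #!/usr/bin/env python
    # Echo the IMU yaw in degrees; an ENU-compliant (zero-east) IMU should
    # read ~0 when facing east and ~90 when facing north.
    import math
    import rospy
    from sensor_msgs.msg import Imu
    from tf.transformations import euler_from_quaternion

    def imu_cb(msg):
        q = msg.orientation
        yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])[2]
        rospy.loginfo("yaw: %.1f deg", math.degrees(yaw))

    rospy.init_node("yaw_echo")
    rospy.Subscriber("/imu", Imu, imu_cb)
    rospy.spin()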
How to get Down Velocity in NED frame from Adafruit ultimate GPS using NMEA messages?
I am using Adafruit Ultimate GPS v3 with the nmea_navsat_driver ROS package. I am able to use the nmea_serial_driver node to publish /fix and /vel topics.
However, I don't see Z-axis velocity being published (it is always zero). Upon searching, I found that the Ultimate GPS uses GlobalTop Technology Inc. FGPMMOPA6H GPS Standalone Module and it gives out only the following NMEA messages:
GGA, GSA, GSV, RMC, VTG
And these NMEA messages do not give out Z-axis or Down velocity (NED). So my question is, how do I get this velocity from NMEA messages? Is there another ROS package which does post processing on NMEA sentences and gives out this velocity?
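Lacking a sentence that carries vertical velocity, one crude workaround is to differentiate the altitude already published on /fix; a sketch only, and the estimate will be noisy with a consumer receiver:

    #!/usr/bin/env python
    # Estimate NED down velocity by differentiating the GGA altitude on /fix.
    import rospy
    from sensor_msgs.msg import NavSatFix

    prev = [None]  # (time in seconds, altitude in metres)

    def fix_cb(msg):
        t = msg.header.stamp.to_sec()
        if prev[0] is not None:
            dt = t - prev[0][0]
            if dt > 0:
                # NED: down is positive, so negate the altitude rate.
                v_down = -(msg.altitude - prev[0][1]) / dt
                rospy.loginfo("estimated down velocity: %.2f m/s", v_down)
        prev[0] = (t, msg.altitude)

    rospy.init_node("down_velocity_from_altitude")
    rospy.Subscriber("fix", NavSatFix, fix_cb)
    rospy.spin()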
extract data from bagfile
I am a new user of ROS. I have produced a bag file with data from a Velodyne VLP-16 LIDAR and a GPS. I used a Python script from https://www.clearpathrobotics.com/assets/guides/ros/Converting%20ROS%20bag%20to%20CSV.html to convert the bag file to CSV, but I can't understand the CSV file. Can someone explain to me which columns are x, y, z and intensity? Or is there another way to extract the LIDAR (timestamp, x, y, z, intensity), GPS (lat, long) and IMU (yaw, pitch, roll) data? I will be using the IMU later.
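One alternative to the CSV route is reading the bag directly with the rosbag Python API; a sketch, assuming topic names /velodyne_points, /fix and /imu/data (substitute whatever `rosbag info` lists for your bag):

    #!/usr/bin/env python
    # Pull LIDAR points, GPS fixes and IMU orientation out of a bag file.
    import rosbag
    import sensor_msgs.point_cloud2 as pc2
    from tf.transformations import euler_from_quaternion

    with rosbag.Bag("your.bag") as bag:
        for topic, msg, t in bag.read_messages(["/velodyne_points", "/fix", "/imu/data"]):
            if topic == "/velodyne_points":
                # One line per point: timestamp, x, y, z, intensity.
                for x, y, z, i in pc2.read_points(msg, ("x", "y", "z", "intensity")):
                    print("%f %f %f %f %f" % (t.to_sec(), x, y, z, i))
            elif topic == "/fix":
                print("%f %f %f" % (t.to_sec(), msg.latitude, msg.longitude))
            else:
                q = msg.orientation
                roll, pitch, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])
                print("%f %f %f %f" % (t.to_sec(), yaw, pitch, roll))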
navsat_transform_node with /odometry/gps null
I'm trying to fuse GPS data in robot_localization node with IMU and wheel encoders.
I'm running a single instance of `ekf_localization_node`, which takes as inputs the IMU (`/imu`), the wheel encoders and my GPS (`/fix`).
Unfortunately, something is not working, since I always get null values on the `/odometry/gps` topic.
I hope you can help me, and I'm sorry for the long post!
I recorded a [bag file](http://www.skeetty.com/bags/gps.bag) with my acquisitions and these are some details.
When I try to run it in RVIZ, I get this error:
[ERROR] [1489693341.983362616, 7.022000000]: Could not obtain transform from odom->base_link
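Before anything else, it may be worth probing whether that transform is being published at all; a small sketch using the tf listener API (nothing here is specific to this setup):

    #!/usr/bin/env python
    # Periodically look up odom->base_link and report whether it exists.
    import rospy
    import tf

    rospy.init_node("tf_probe")
    listener = tf.TransformListener()
    rate = rospy.Rate(1.0)
    while not rospy.is_shutdown():
        try:
            trans, rot = listener.lookupTransform("odom", "base_link", rospy.Time(0))
            rospy.loginfo("odom->base_link: %s %s", trans, rot)
        except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException) as e:
            rospy.logwarn("transform unavailable: %s", e)
        rate.sleep()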
This is my [launch file](http://www.skeetty.com/control.launch):
and my [test.yaml](http://www.skeetty.com/test.yaml) file:
map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: map
two_d_mode: false
frequency: 50
odom0: husky_velocity_controller/odom
odom0_config: [true, true, false,
false, false, true,
true, false, false,
false, false, true,
false, false, false]
odom0_differential: false
odom0_queue_size: 10
imu0: imu
imu0_config: [false, false, false,
true, true, true,
false, false, false,
true, true, true,
true, true, true]
imu0_differential: false
imu0_queue_size: 10
imu0_remove_gravitational_acceleration: true
odom1: odometry/gps
odom1_config: [true, true, true,
false, false, false,
false, false, false,
false, false, false,
false, false, false]
odom1_differential: false
This is my [TF tree](http://www.skeetty.com/bags/frames.png).
This is the output of my `/imu` topic:
header:
  seq: 8942
  stamp:
    secs: 1489494054
    nsecs: 946403026
  frame_id: base_imu_link
orientation:
  x: 0.0261382685519
  y: 0.00569945633341
  z: -0.179503324187
  w: 0.983393544681
orientation_covariance: [0.0025, 0.0, 0.0, 0.0, 0.0025, 0.0, 0.0, 0.0, 0.0025]
angular_velocity:
  x: 0.05
  y: -0.11
  z: 0.13
angular_velocity_covariance: [0.02, 0.0, 0.0, 0.0, 0.02, 0.0, 0.0, 0.0, 0.02]
linear_acceleration:
  x: -0.35316921875
  y: 0.74540921875
  z: 8.35463539062
linear_acceleration_covariance: [0.04, 0.0, 0.0, 0.0, 0.04, 0.0, 0.0, 0.0, 0.04]
This is the output of my `/fix` topic:
header:
  seq: 196
  stamp:
    secs: 1489494103
    nsecs: 299462080
  frame_id: /gps
status:
  status: 2
  service: 1
latitude: 40.0600583333
longitude: 18.346335
altitude: 121.65
position_covariance: [1.2100000000000002, 0.0, 0.0, 0.0, 1.2100000000000002, 0.0, 0.0, 0.0, 4.840000000000001]
position_covariance_type: 1
This is the output of my `/odometry/gps` topic:
header:
  seq: 186
  stamp:
    secs: 1489494148
    nsecs: 299743890
  frame_id: odom
child_frame_id: ''
pose:
  pose:
    position:
      x: 0.0
      y: 0.0
      z: 0.0
    orientation:
      x: 0.0
      y: 0.0
      z: 0.0
      w: 1.0
  covariance: [1.2153049291059825, 0.002373923604452718, -0.1386474481143354, 0.0, 0.0, 0.0, 0.0023739236044527045, 1.2110623164169005, -0.062043892236860874, 0.0, 0.0, 0.0, -0.13864744811433538, -0.06204389223686086, 4.833632754477118, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
twist:
  twist:
    linear:
      x: 0.0
      y: 0.0
      z: 0.0
    angular:
      x: 0.0
      y: 0.0
      z: 0.0
  covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
This is my `tf monitor` output:
RESULTS: for all Frames
Frames:
Frame: /base_imu_link published by unknown_publisher Average Delay: -0.102813 Max Delay: 0
Frame: /camera_color_optical_frame published by unknown_publisher Average Delay: -0.101398 Max Delay: 0
Frame: /camera_depth_optical_frame published by unknown_publisher Average Delay: -0.101403 Max Delay: 0
Frame: /gps published by unknown_publisher Average Delay: -0.101396 Max Delay: 0
Frame: /laser published by unknown_publisher Average Delay: -0.10139 Max Delay: 0
Frame: /realsense_frame published by unknown_publisher Average Delay: -0.103258 Max Delay: 0
Frame: base_footprint published by unknown_publisher Average Delay: 0.00276577 Max Delay: 0.0756417
Frame: base_link published by unknown_publisher Average Delay: 0.00892311 Max Delay: 0.0500184
Frame: front_bumper_link published by unknown_publisher Average Delay: 0.00276188 Max Delay: 0.0756394
Frame: front_left_wheel_link published by unknown_publisher Average Delay: 0.00317014 Max Delay: 0.0633538
Frame: front_right_wheel_link published by unknown_publisher Average Delay: 0.00317014 Max Delay: 0.0633538
Frame: imu_link published by unknown_publisher Average Delay: 0.00275932 Max Delay: 0.075638
Frame: inertial_link published by unknown_publisher Average Delay: 0.00275697 Max Delay: 0.0756368
Frame: rear_bumper_link published by unknown_publisher Average Delay: 0.00275588 Max Delay: 0.0756362
Frame: rear_left_wheel_link published by unknown_publisher Average Delay: 0.00317014 Max Delay: 0.0633538
Frame: rear_right_wheel_link published by unknown_publisher Average Delay: 0.00317014 Max Delay: 0.0633538
Frame: top_chassis_link published by unknown_publisher Average Delay: 0.00275492 Max Delay: 0.0756353
Frame: top_plate_link published by unknown_publisher Average Delay: 0.00275446 Max Delay: 0.0756347
Frame: user_rail_link published by unknown_publisher Average Delay: 0.00275379 Max Delay: 0.0756341
All Broadcasters:
Node: unknown_publisher 169.66 Hz, Average Delay: -0.032634 Max Delay: 0.075637
Localization problem: costmap, move_base, robot_localization, gps, hector_mapping
So I have been trying to run robot_localization together with move_base and hector_mapping, but I have run into several problems and I am not sure of their source.

When I launch my dual-EKF robot_localization setup (two ekf_localization instances plus a navsat_transform instance), I convert my GPS to UTM and end up with three different odometries: odom_ekf from the first EKF instance, odom_gps published by navsat_transform, and odom_navsat, which is also published by navsat_transform but used by the second EKF instance. The move_base package subscribes to the odom_ekf topic. Hector_mapping publishes odometry based on scan-matching data, so it is a kind of visual odometry.

When I launch robot_localization, move_base and hector_mapping together, I get some weird readings on the global and local costmaps. The global costmap seems to slide around with the odometry frames published by the robot_localization instances, instead of staying with base_link, and it also shows obstacles where the odometry frame is located at that moment (see the image http://i.imgur.com/mM0QOve.png). I know this is somehow related to robot_localization, because move_base + hector_mapping alone does not show this behavior.

Below I post my launch files and the costmap yaml files. I also receive this warning; I have tried changing the costmap size and other suggestions, but nothing fixed the issue:
[ WARN] [1489754764.299968883]: The origin for the sensor at (0.69, -0.11) is out of map bounds. So, the costmap cannot raytrace for it.
**robot_localization launch file**
# first EKF instance: odometry input (vx, vy)
[ false, false, false,
  false, false, false,
  true,  true,  false,
  false, false, false,
  false, false, false ]

# first EKF instance: IMU input (roll, pitch, yaw velocity)
[ false, false, false,
  true,  true,  false,
  false, false, false,
  false, false, true,
  false, false, false ]

# second EKF instance: odometry input (vx, vy)
[ false, false, false,
  false, false, false,
  true,  true,  false,
  false, false, false,
  false, false, false ]

# second EKF instance: GPS odometry input (x, y)
[ true,  true,  false,
  false, false, false,
  false, false, false,
  false, false, false,
  false, false, false ]

# second EKF instance: IMU input (roll, pitch, yaw velocity)
[ false, false, false,
  true,  true,  false,
  false, false, false,
  false, false, true,
  false, false, false ]
**move_base launch file**
**global_costmap_params.yaml**
global_costmap:
  map_type: costmap
  global_frame: /map
  robot_base_frame: /base_link
  update_frequency: 4.0
  publish_frequency: 2.0
  static_map: false
  rolling_window: true
  width: 30.0
  height: 20.0
  resolution: 0.025
  inflation_radius: 0.5
**local_costmap_params.yaml**
local_costmap:
  map_type: costmap
  global_frame: /map
  robot_base_frame: /base_link
  update_frequency: 5.0
  publish_frequency: 2.0
  static_map: false
  rolling_window: true
  width: 2.0
  height: 2.0
  resolution: 0.015
  inflation_radius: 0.35
**hector_mapping.launch**
using GPS error mtdef.MTException: fixed point precision not supported
Hi, I appreciate your fast response on this.
I am using the Xsens MTi-G-710 with all outputs enabled.
Inside the building, without a GPS fix, everything is fine. Once I go outside to open sky and get a fix, the driver fails with this message:
Traceback (most recent call last):
  File "/home/renawi/catkin_ws/src/ethzasl_xsens_driver/nodes/mtnode.py", line 717, in <module>
    main()
  File "/home/renawi/catkin_ws/src/ethzasl_xsens_driver/nodes/mtnode.py", line 713, in main
    driver.spin()
  File "/home/renawi/catkin_ws/src/ethzasl_xsens_driver/nodes/mtnode.py", line 122, in spin
    self.spin_once()
  File "/home/renawi/catkin_ws/src/ethzasl_xsens_driver/nodes/mtnode.py", line 629, in spin_once
    data = self.mt.read_measurement()
  File "/home/renawi/catkin_ws/src/ethzasl_xsens_driver/nodes/mtdevice.py", line 632, in read_measurement
    return self.parse_MTData2(data)
  File "/home/renawi/catkin_ws/src/ethzasl_xsens_driver/nodes/mtdevice.py", line 912, in parse_MTData2
    raise MTException("fixed point precision not supported.")
mtdef.MTException: fixed point precision not supported.
Any help?
gps time synchronization with pps?
I'd like to receive a GPS time message and be able to convert any ros time stamp into GPS time for consumption by other systems that have GPS receivers but aren't synchronized to my ros system clock.
It sounds like I would also want to make use of a gps receiver that sends a pulse when it sends the message with the timestamp, so that latency with receiving the serial message can be eliminated, and the timestamp could be made good to within a few milliseconds.
Is there software available for this, and perhaps recommended hardware for receiving the sync pulse? I can imagine a microcontroller that receives the pulse and the GPS message and does something NTP-like to work out the ROS time conversion for the host computer (or, skipping the MCU, connecting the pulse to an audio input with low-latency audio drivers), but naturally I would prefer to use something existing rather than build this from scratch.
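For the serial-only (no PPS) part, nmea_navsat_driver already publishes a sensor_msgs/TimeReference carrying the GPS timestamp next to the ROS receive time, so the offset bookkeeping can look like the sketch below; the same arithmetic applies once the receive stamps are PPS-disciplined:

    #!/usr/bin/env python
    # Track the ROS-time -> GPS-time offset from TimeReference messages.
    # Without PPS the offset is only as good as the serial latency.
    import rospy
    from sensor_msgs.msg import TimeReference

    offset = [0.0]

    def time_ref_cb(msg):
        # time_ref is the GPS timestamp; header.stamp is the ROS receive time.
        offset[0] = msg.time_ref.to_sec() - msg.header.stamp.to_sec()

    def to_gps_time(ros_stamp):
        """Convert any ROS timestamp to GPS time using the latest offset."""
        return ros_stamp.to_sec() + offset[0]

    rospy.init_node("gps_time_offset")
    rospy.Subscriber("time_reference", TimeReference, time_ref_cb)
    rospy.spin()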
Using mapviz gpsfix vs navsatfix
Mapviz apparently uses GPSFix. It seems like most GPS nodes still publish NavSatFix, which is a subset of the items in GPSFix (a minimal relay sketch follows the questions below).
- Is there a mapping node between these two message types for Indigo?
- In general is everything moving to GPSFix over NavSatFix?
- Does mapviz somehow support NavSatFix in Indigo?
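A stock relay may not exist, but the shared fields are straightforward to copy; a minimal sketch that skips the status field (GPSStatus and NavSatStatus differ):

    #!/usr/bin/env python
    # Relay sensor_msgs/NavSatFix to gps_common/GPSFix for mapviz,
    # copying only the fields the two messages share.
    import rospy
    from sensor_msgs.msg import NavSatFix
    from gps_common.msg import GPSFix

    def fix_cb(msg):
        out = GPSFix()
        out.header = msg.header
        out.latitude = msg.latitude
        out.longitude = msg.longitude
        out.altitude = msg.altitude
        out.position_covariance = msg.position_covariance
        out.position_covariance_type = msg.position_covariance_type
        pub.publish(out)

    rospy.init_node("navsatfix_to_gpsfix")
    pub = rospy.Publisher("gps", GPSFix, queue_size=10)
    rospy.Subscriber("fix", NavSatFix, fix_cb)
    rospy.spin()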
SLAM Algorithm to combine IMU measurements with 2D image or Point Cloud?
Hi all.
We are a research group at Aalborg University currently investigating the use of SLAM for indoor positioning of drones.
We have decided to use ROS to test out different implementations and develop our own algorithm+controller.
The short version of our question is:
**Which SLAM algorithm do you recommend for localization using a 2D camera or point cloud data from a depth sensor (RealSense) while also including other sensor measurements (IMU + GPS)**
----------
The detailed explanation and reason for this question can be seen below.
In our search for previous work with SLAM, where the primary goal is real-time localization (pose estimation), we have not been able to find that much.
For real-time localization, a lot of work has been put into EKF-SLAM and FastSLAM, but mainly focused on 2D/planar navigation using LiDAR sensors.
Otherwise it seems like a lot of research, especially when using other sensors such as cameras and RGB-D sensors, goes into the mapping portion of SLAM.
In our case we want to focus on the pose estimation and want a position estimate that is as reliable and robust as possible.
We would also like to include other sensor information, such as attitude estimates (roll, pitch, yaw) from the drone flight controller, in the SLAM problem to enhance the positioning.
Furthermore, we have an indoor positioning system capable of delivering a position measurement at a slow rate (~1 Hz), but with quite noisy measurements and with a sensor that is likely to drop out in certain areas of the building. We would like to incorporate these measurements as well, but not rely on them exclusively, which is why we are investigating SLAM in the first place.
**Can any of you suggest any previous SLAM work, or any useful paths for us to investigate?**
At the moment our plan is defined as follows:
- Extend FastSLAM to support features in 3D space and estimate 6D pose
- Use either a 2D camera or the point cloud output of a 3D depth sensor (RealSense, similar to Kinect).
- Investigate how other sensor information can be incorporated into FastSLAM
But is this a reasonable plan at all?
----------
Best regards,
Thomas Jespersen
Pixhawk not publishing the local position of my Quadrotor
I am on Ubuntu 14.04 (ROS Indigo). I run "roslaunch mavros px4.launch" and am able to successfully connect to my quadrotor. I have a working GPS signal and also could see the data being published on "/mavros/global_position/local". However, no data is being published on "/mavros/local_position/pose". When I run the gazebo simulation, data is being published on "/mavros/local_position/pose". What could be the possible solution to this?
Thanks in advance
Integrate UM7 IMU and Adafruit GPS with robot_localization
I'm trying to estimate the position of a robot using the robot_localization ROS package. Currently I use a UM7 Orientation Sensor, which has its own ROS package (um7_driver), and an Adafruit Ultimate GPS Breakout version 3.
To read the GPS data I use the nmea_serial_driver. I have read a lot of answers on this site, as well as documents such as REP-103 and REP-105, but I'm still not able to configure the nodes and topics correctly to get output on the /odometry/filtered or /odometry/gps topics.
I'm using the navsat_transform_node to convert the raw GPS data (fix) into an odometry/gps message. This odometry/gps message is an input for the ekf_localization_node. In addition, the odometry output of the ekf_localization_node is an input for the navsat_transform_node, in a circular arrangement, as described in: [How to fuse IMU & GPS using robot_localization](http://answers.ros.org/question/200071/how-to-fuse-imu-gps-using-robot_localization/#200087)
I think my IMU configuration is ENU compliant. My launch files are described below, with samples of the gps/fix and imu/data messages; unfortunately I can't upload an image of my rqt_graph. Any comments and suggestions will be appreciated!
IMU Message Sample:
---
header:
  seq: 352
  stamp:
    secs: 1491248090
    nsecs: 477437036
  frame_id: imu
orientation:
  x: 0.0580077504
  y: 0.0024169896
  z: 0.9721333587
  w: -0.2270291759
orientation_covariance: [0.002741552146694444, 0.0, 0.0, 0.0, 0.002741552146694444, 0.0, 0.0, 0.0, 0.007615422629706791]
angular_velocity:
  x: 8.9495899038e-05
  y: -0.000212560517745
  z: 0.000418925544915
angular_velocity_covariance: [1.0966208586777776e-06, 0.0, 0.0, 0.0, 1.0966208586777776e-06, 0.0, 0.0, 0.0, 1.0966208586777776e-06]
linear_acceleration:
  x: -4.19586237482
  y: 1.07622469897
  z: 8.44690478507
linear_acceleration_covariance: [0.0015387262937311438, 0.0, 0.0, 0.0, 0.0015387262937311438, 0.0, 0.0, 0.0, 0.0015387262937311438]
---
GPS Message Sample:
---
header:
  seq: 54
  stamp:
    secs: 1491248749
    nsecs: 594259977
  frame_id: /gps
status:
  status: 0
  service: 1
latitude: -33.0408566667
longitude: -71.6299133333
altitude: 132.4
position_covariance: [3.9204, 0.0, 0.0, 0.0, 3.9204, 0.0, 0.0, 0.0, 15.6816]
position_covariance_type: 1
---
Using the navsat transfrom for position information
Hi,
I am simulating the movement of a jackal robot in gazebo. It comes with a gps sensor. I wanted to use the navsat/fix topic to get the position information of the robot information.
Do I need to use the navsat transform to transform the frames from the gps frame to the base_link frame?
Can I directly subscribe to the navsat/fix topic to get the information?
I am not very clear on the navsat transform and its use. Is there a good reference where I can read about it?
Regards,
rsmitha.
GPS waypoint navigation with Jackal
I have the UGV "Jackal" from Clearpath Robotics. My task is to make the robot follow a predetermined path (given in UTM coordinates). The robot already has GPS, IMU and odometry implemented. The `robot_localization` package is also already installed and preconfigured.
To make the robot navigate autonomously, I would like to use the `move_base` package. The goals (`move_base/goal` or `move_base_simple/goal`) therefore have to be "in the robot's world frame, or a frame that can be transformed to the world frame" (see the answer to this [question](http://answers.ros.org/question/251347/gps-waypoint-navigation-with-robot_localization-and-navsat_transform_node/)).
The `robot_localization` package should provide a transform from **world_frame->utm**. But when I start the robot outside, I can't see this transformation in `tf`. Because of that, I think I can't send goals with `frame_id` **utm** to the `move_base` node.
My Questions are now:
1. Where can I check that the `navsat_transform_node` provides the **world_frame -> utm** transformation?
2. How should I configure the `move_base` node so that it accepts goals in the utm frame? (I know this is a very open question, but I am glad for every tip.)
Thank you for your help and support.
GPS Mission Planning in ROS?
I'm working on my entry for Sparkfun's [Autonomous Vehicle Competition](https://avc.sparkfun.com/), and my current mission planning software is non-existent.
Can anyone recommend a good mission-planning software that would allow me to create a series of GPS waypoints on a map? Bonus points if it integrates well with ROS and rviz, and can be used offline.