Running RTABMAP on Robot #1196
For rtabmap launch, you should use
Thank you for your reply, @matlabbe.
Is the camera upside down? Maybe there is a missing rotation in the URDF.
Thank you for your reply. I fixed this by deleting the static transform in my robot_base launch file. My next plan is to use rtabmap for autonomous navigation. I already followed https://wiki.ros.org/rtabmap_ros/Tutorials/MappingAndNavigationOnTurtlebot, but it didn't work for me. Can you kindly help me with this?
The navigation demo didn't work? Which step failed?
Actually, after creating the map, I am running rtabmap in localization mode like
and then I am running the move_base node and giving the goal using RViz, but the robot is not moving. I am not sure whether I am doing it the right way or not. Can you help me? Here is my move_base launch file:
@matlabbe Can you kindly help me? I am still facing issues with navigation.
That should be mapped to an odometry topic (in your case it seems the visual odometry topic). The
You may try the turtlebot3 example on https://wiki.ros.org/rtabmap_ros/Tutorials/MappingAndNavigationOnTurtlebot to see how topics are connected, and look also at the TF tree. cheers,
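As a sketch of what that connection can look like (hypothetical launch fragment; the topic names assume rtabmap's default outputs and must be adjusted to your setup), move_base's odometry and map inputs can be remapped like this:

```xml
<!-- Hypothetical sketch: point move_base at rtabmap's visual odometry and
     occupancy grid. Topic names assume rtabmap's defaults; adjust to yours. -->
<node pkg="move_base" type="move_base" name="move_base" output="screen">
  <remap from="odom" to="/rtabmap/odom"/>
  <remap from="map"  to="/rtabmap/grid_map"/>
</node>
```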
@matlabbe, Thank You
Can you share the database? With stereo indoor, the default
Here is the link of my database : |
Thank you for your reply.
Should I set these parameters in my rtabmap.launch file? I already set frame_id=base_footprint for rtabmap, but for the camera the base frame is:
<arg name="base_frame" default="oak-d_frame" />
<arg name="parent_frame" default="oak-d-base-frame" />
and then oak-d-base-frame is attached to my base_link using a static transform.
For the odometry, I am using the visual odometry from rtabmap's rgbd_odometry node. I am also trying to improve my odometry, but I don't have any idea how. Can you guide me on how to do this?
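For reference, a static transform like that is usually published in a launch file along these lines (hypothetical sketch; the x y z yaw pitch roll values are placeholders and must match the camera's actual mounting pose):

```xml
<!-- Hypothetical sketch: fixed transform from base_link to the camera mount
     frame. Replace the 0.1 0 0.3 0 0 0 placeholders with the real offsets. -->
<node pkg="tf" type="static_transform_publisher" name="base_to_oak"
      args="0.1 0 0.3 0 0 0 base_link oak-d-base-frame 100"/>
```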
What is the transform between
For better visual odometry, I think you may have to re-calibrate your camera to remove any distortion, then feed the rectified images to rtabmap.
Actually, I don't have a direct transform between base_footprint and oak-d-base-frame; I have 3 different launch files. Is this correct, or do I need a direct transformation between base_footprint and the camera base frame as well? For visual odometry, I recalibrated my camera and am checking again.
@matlabbe, I again created an outdoor map with artificial trees in a 5-row, 3-column setup, but this time I attached my camera directly to base_link. The map looks fine, but I don't know what happened when I turned from the first row: the map didn't turn, it kept going straight and stopped updating:
The link of the database is the one of the image in your previous post. For TF, no need to have a direct TF link between base_footprint and the camera link; the TF tree looks fine. In the URDF, the camera should have been
Hello @matlabbe,
Thank You @matlabbe,
For the stereo mode, you can try this tutorial following the depthai instructions. For the baseline error, the camera_info may not be correctly generated (inverted sign on the baseline?). The
For a ground robot moving on relatively flat ground, cheers,
Thank you for the detailed answer. I tried both the right depth IR and stereo (left/right). With the right IR, the odometry is lost every time I move my camera, but stereo (left/right) is working well. In both cases, however, I am getting this error:

Overwriting previous data! Make sure IMU is published faster than data rate. (last image stamp buffered=1727833812.942230 and new one is 1727833813.008896, last imu stamp received=1727833812.823021)

How do I resolve this error? Secondly, in your previous reply you plotted a graph of visual odometry processing. I tried to plot that graph from rtabmap, but I didn't get the data for it; all I get is the odometry trajectory from rtabmap. Can you tell me how to get that data? Here is my new map with stereo: Regards,
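The stamps in that warning illustrate the condition being reported: a new image arrived while the newest IMU message was still older than the previously buffered image, so the buffered frame gets overwritten. An illustrative sketch (hypothetical helper, not rtabmap's actual code) using the stamps from the log:

```python
# Illustrative sketch (not rtabmap's actual code): the warning fires when a
# new image arrives but no IMU message newer than the buffered image's stamp
# has been received yet, forcing the buffered image to be overwritten.
def imu_too_slow(buffered_image_stamp, new_image_stamp, last_imu_stamp):
    return (last_imu_stamp < buffered_image_stamp
            and new_image_stamp > buffered_image_stamp)

# Stamps taken from the warning message above:
print(imu_too_slow(1727833812.942230, 1727833813.008896, 1727833812.823021))  # True
```

The usual fix is to make the IMU publish faster than the image rate (or reduce the image rate), so an IMU stamp newer than each buffered image always arrives in time.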
Hello, I have already set approx_sync_max_interval:=0.001, and after testing the mapping again, I noticed an improvement. The left and right images now appear much better than before, and the odometry time estimation has decreased to less than 55 ms. Here is the link to the database: I am not sure how to compare the disparity between the left and right stereo images. Could you kindly explain that to me? I apologize if this is a basic question, as I am still new to working with visual SLAM with rtabmap. Regarding the IMU publishing rate, I am working on resolving that as soon as possible. Thank you for your ongoing support and help. |
Thank you. What I aim to do is assess the error and perform a comparison. Do you have any suggestions for a better approach?
It is possible; the only thing you need is a TF representing the ground truth. You can then set the fixed and moving frames with these parameters under the rtabmap node:
<param name="ground_truth_frame_id" type="string" value="world"/>
<param name="ground_truth_base_frame_id" type="string" value="base_link_gt"/>
An example of usage can be found in rtabmap_ros/rtabmap_examples/launch/rgbdslam_datasets.launch, lines 57 to 58 at commit c3f7cfe.
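Once such a ground truth TF is available, the estimated trajectory can be compared to it with an absolute-trajectory RMSE. A toy sketch (hypothetical helper, not rtabmap code) over timestamp-aligned positions:

```python
import math

# Hypothetical sketch: absolute trajectory error (RMSE) between timestamp-
# aligned ground-truth and estimated positions. Both lists must have the
# same length and ordering.
def ate_rmse(gt, est):
    assert len(gt) == len(est) and len(gt) > 0
    sq = [sum((a - b) ** 2 for a, b in zip(p, q)) for p, q in zip(gt, est)]
    return math.sqrt(sum(sq) / len(sq))

gt  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
est = [(0.0, 0.0, 0.0), (1.0, 0.1, 0.0), (2.0, 0.2, 0.0)]
print(round(ate_rmse(gt, est), 4))  # 0.1291
```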
I followed your instructions and used an RTK GPS as the ground truth, but there is significant error, likely due to the inherent inaccuracies of the GPS system. Is there another method to obtain more accurate ground truth? I also didn't fully understand the example you provided earlier, specifically what type of ground truth was used. (I installed the GPS antenna on the camera.) Here are the frames:
Hello, The ground truth is wrong: it should be relative to a local frame, in meters/rad (starting at 0,0,0), and have all 6DoF estimated if you are comparing to 6DoF poses like this. The IMU TF frame looks wrong; in yellow below is the gravity estimated from that frame, and it should be more like the red arrow that I added. It makes the whole map fold at a strange roll/pitch value. I remember having to fix the IMU rotation in https://github.com/luxonis/depthai-ros/blob/6c07c27324201013befb751139fe7252ee9a93e9/depthai_descriptions/urdf/include/base_macro.urdf.xacro#L70-L97 for a depthai camera because it was inverted (or make sure you use the right model). In rtabmap-databaseViewer, when setting
I don't know if the GPS values are super accurate or not, but if the VSLAM path here is wrong, maybe try not to tilt too much toward the sky; features in the sky are very far and would make position estimation difficult (ideally, point horizontally or slightly down). cheers,
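One way to reason about an inverted IMU mount like the one described above is to rotate the gravity vector by the mount quaternion and check its sign. A self-contained sketch (hypothetical helper, not depthai or rtabmap code):

```python
# Hypothetical sketch: rotate gravity by an IMU mounting quaternion (w,x,y,z)
# and check it still points "down" in the base frame. A 180-degree roll
# (an inverted mount) flips the z sign, which is the symptom described above.
def rotate(q, v):
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * cross(q.xyz, v)
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    # v' = v + w*t + cross(q.xyz, t)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

identity = (1.0, 0.0, 0.0, 0.0)
roll_180 = (0.0, 1.0, 0.0, 0.0)   # 180 degrees about x: inverted mount
g = (0.0, 0.0, -9.81)
print(rotate(identity, g))  # gravity unchanged
print(rotate(roll_180, g))  # z sign flipped: wrong IMU orientation
```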
I will look into the IMU transform problem again, but currently the problem with my system is that the GPS data is not very accurate. This makes it hard to compare the trajectory with the ground truth based on the GPS data. Are there any other techniques for calculating the ground truth that are more accurate and work with RTAB-Map? Also, can you tell me how I should set the ground truth? For now, I just followed your guidance by setting the parameters in the rtabmap.launch file, setting a static transform between world and map in the rtabmap.launch file, and also setting the static transform between world and base_link_gt. Additionally, I set the GPS code to broadcast world to base_link_gt. My GPS code is below:
[hada_rtk_gps.zip](https://github.com/user-attachments/files/17343883/hada_rtk_gps.zip)
You may use code like this https://gist.github.com/MikeK4y/1d99b93f806e7d535021b15afd5bb04f to convert GPS values to a local ENU system (the format that should be sent on TF). The ground truth should look better afterwards (to compute valid RMSE values). Do you use a D-GPS? With a calibrated base station, you could get more accurate GPS values (<1 cm error). This would only work outside though, and the GPS signal should be fairly stable.
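For illustration, a minimal flat-earth approximation of that GPS-to-local-ENU conversion (a hedged sketch; the linked gist or a proper geodesy library should be preferred for real use):

```python
import math

# Hypothetical flat-earth sketch: convert lat/lon (degrees) to local ENU
# x/y (meters) relative to the first fix. Adequate over a field-sized area;
# for rigorous work use a proper geodetic conversion (e.g. the gist above).
EARTH_RADIUS = 6378137.0  # WGS84 equatorial radius, meters

def to_local_enu(lat, lon, origin_lat, origin_lon):
    dlat = math.radians(lat - origin_lat)
    dlon = math.radians(lon - origin_lon)
    x = EARTH_RADIUS * dlon * math.cos(math.radians(origin_lat))  # East
    y = EARTH_RADIUS * dlat                                       # North
    return x, y

# One arc-second of latitude is roughly 30.9 m:
x, y = to_local_enu(0.0 + 1 / 3600.0, 0.0, 0.0, 0.0)
print(round(y, 1))
```

Publishing the resulting x/y (plus altitude relative to the first fix) as the world -> base_link_gt transform gives a ground truth that starts near 0,0,0, as requested above.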
The IMU issue has been resolved, but I am still facing problems with the ground truth. I might be confused about which topic to publish the ENU values to. Should I publish them on the /gps/fix topic, and will that be sent to the transform as well? Here are the steps I have taken so far. Based on the example you shared, I added the following transform in my rtabmap.launch file:
Additionally, I configured the following parameters in the same launch file:
However, I am still unsure how to correctly set up the GPS node so that it works properly with RTAB-Map. Do you have any examples or suggestions that might help? For the GPS, I am using a calibrated RTK GPS, but the solution is float rather than fixed, which could be causing an error of around 2-3 cm; I am also working on that, but I need to resolve the ground truth first. Here is the link to my new database, but I am still not receiving ground truth information: GPS_Code: Thanks for helping,
The
Thank you for your response. If I understand correctly, you're suggesting that I use the /gps/fix topic and not rely on any ground truth data. Then, I can compare the GPS trajectory with the visual robot trajectory. However, I'm unsure how to align both graphs to the same axes as my robot's.

Additionally, I have another question. I tested localization indoors in an orchard-like environment (since my focus is on orchards). During mapping, when I mapped the environment in just one straight pass, I successfully achieved loop closure detection along the path I followed. However, in the opposite direction, I didn't detect any loop closure. I then tried mapping the entire environment by covering all areas from different directions, but many loops were rejected, and I only managed to achieve 20 successful loop closures despite covering all areas. I've also encountered some errors, which I've attached in the output file below. Could you suggest any improvements to enhance the mapping process, so the robot can detect loop closures consistently from all angles?

Database with Simple Way Mapping: Database with full Area: Thank you for helping and for your contribution.
Thank you for your response. I made further attempts in both indoor and outdoor environments, successfully detecting loop closures during the mapping process while driving the robot on both sides. However, I am still encountering challenges with ground truth verification, as the GPS data remains inaccurate. I have also researched the possibility of using the KITTI dataset as a ground truth reference. Could you please provide guidance on how to utilize the KITTI dataset for evaluating the performance of Visual Odometry? |
DIY RTK:
Commercial RTK: |
I am using a depthai camera (Model: OAK-D PRO POE-W) with rtabmap for creating a map, but I am facing a problem with TF. As I am new to this field, I don't know what the problem is. I am following the steps below:
1: roslaunch robot_bringup robot_base_2.launch (this launch file includes the robot.urdf.xacro file, robot_state_publisher, and joint_state_publisher, along with a cansend_message node which sends CAN control messages to the robot for movement)
2: roslaunch depthai_examples stereo_inertial_node.launch
3: rosrun imu_filter_madgwick imu_filter_node imu/data_raw:=/stereo_inertial_publisher/imu imu/data:=/stereo_inertial_publisher/imu/data _use_mag:=false _publish_tf:=false
4: roslaunch rtabmap_launch rtabmap.launch args:="--delete_db_on_start" rgb_topic:=/stereo_inertial_publisher/color/image depth_topic:=/stereo_inertial_publisher/stereo/depth camera_info_topic:=/stereo_inertial_publisher/color/camera_info imu_topic:=/stereo_inertial_publisher/imu/data frame_id:=oak-d_frame approx_sync:=true wait_imu_to_init:=true
After this, my robot's TF frames are broken and I get the below in RViz.
And when I set the fixed frame argument to base_footprint, the camera and IMU links are broken. I am not sure about this; can anyone help me with it?
I am also uploading the frames PDFs:
frame_id_oak_d_frame.pdf
frame_id_base_footprint.pdf
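Elsewhere in this thread, frame_id is set to base_footprint rather than the camera frame; a hedged variant of step 4 under that assumption (all other arguments unchanged):

```shell
# Hypothetical variant of step 4: use the robot base frame as frame_id so
# rtabmap's map->odom transform sits on top of the existing robot TF tree
# instead of re-rooting it at the camera frame.
roslaunch rtabmap_launch rtabmap.launch \
    args:="--delete_db_on_start" \
    rgb_topic:=/stereo_inertial_publisher/color/image \
    depth_topic:=/stereo_inertial_publisher/stereo/depth \
    camera_info_topic:=/stereo_inertial_publisher/color/camera_info \
    imu_topic:=/stereo_inertial_publisher/imu/data \
    frame_id:=base_footprint \
    approx_sync:=true wait_imu_to_init:=true
```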