
Incorrect SLAM in simulation using RGBD sensor #1199

Open
Shivam7Sharma opened this issue Aug 20, 2024 · 1 comment
Shivam7Sharma commented Aug 20, 2024

Hi,

I am getting an incorrect point cloud when I start moving the robot forward. Initially, while only rotating the robot, the map is okay. I am using a Gazebo simulation and the following command to run rtabmap in RGBD mode:

roslaunch rtabmap_launch rtabmap.launch \
    rtabmap_args:="--delete_db_on_start \
        --RGBD/LoopClosureReextractFeatures:=true \
        --RGBD/OptimizeFromGraphEnd:=true \
        --RGBD/OptimizeMaxError:=0.05 \
        --Grid/GlobalFullUpdate:=true \
        --Vis/MinInliers:=10 \
        --Vis/InlierDistance:=0.1 \
        --Viz/EstimationType:=1 \
        --Vis/MaxFeatures:=1000 \
        --Vis/CorType:=1 \
        --Odom/Strategy:=1 \
        --RGBD/NeighborLinkRefining:=true \
        --Optimizer/Iterations:=200 \
        --Optimizer/Robust:=true" \
    rgb_topic:=/camera/color/image_raw \
    depth_topic:=/custom_depth_registered/image_rect \
    camera_info_topic:=/camera/color/camera_info \
    depth_camera_info_topic:=/custom_depth_registered/camera_info \
    visual_odometry:=true \
    wait_imu_to_init:=false \
    frame_id:=base_link \
    approx_sync:=true \
    approx_rgbd_sync:=false

The following are images of the point cloud and the simulation environment:
Screenshot from 2024-08-20 17-57-19
Screenshot from 2024-08-20 17-56-53

What am I missing? Why are the same objects being repeated at different locations even though I am using the right arguments? The problem is not as bad when I use a different robot with the same command and environment.

This is how another robot (with the same sensor) mapped it:

Screenshot from 2024-08-20 18-02-05

The map above has imperfections but is better; I don't know why the other robot gives a worse SLAM result. I am using quadrupeds, so there are some shaky camera movements on both robots.

matlabbe (Member) commented Sep 5, 2024

Can you share the database of the first map (or both)? Is the camera point of view the same between the robots? When you say the same sensor, do you mean exactly the same simulated camera? Bad depth registration can be a cause. Your simulated environment seems to have varied textures, though depending on where the camera is looking, poor texture or repetitive patterns can cause large odometry drift. If one robot makes the camera shake more, that could also be an issue.

For the parameters:

  • RGBD/OptimizeMaxError below 1 was maybe taken from an old example; it should now be > 1 (the default is 3).
  • RGBD/LoopClosureReextractFeatures: if taken from an old example, this is not needed anymore.
  • Grid/GlobalFullUpdate doesn't exist anymore; you may have seen a warning/error log about this.
  • Vis/InlierDistance is ignored if Vis/EstimationType=1.
  • There may be a typo: Viz/EstimationType -> Vis/EstimationType (the default is 1 anyway).
  • I would not use RGBD/NeighborLinkRefining if you don't use a lidar.
  • Optimizer/Iterations=200 is quite high; the default is 20.
  • If you want to use Optimizer/Robust, you should set RGBD/OptimizeMaxError to 0.
  • depth_camera_info_topic is ignored.
  • To make sure everything is synchronized, set approx_sync:=false.
  • approx_rgbd_sync is ignored unless you also use rgbd_sync:=true.
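
Putting those suggestions together, a cleaned-up version of the original command might look like the sketch below. This is an assumption-laden example, not a verified configuration: it keeps the topic names and remaining parameters from the original report, drops the arguments flagged as obsolete or ignored, and passes rtabmap parameters as `--Name value` pairs.

```shell
# Sketch of the corrected launch command (assumptions: same robot,
# same topics as in the original report; parameter values per the
# advice above, with removed/ignored arguments dropped).
roslaunch rtabmap_launch rtabmap.launch \
    rtabmap_args:="--delete_db_on_start \
        --RGBD/OptimizeFromGraphEnd true \
        --RGBD/OptimizeMaxError 3 \
        --Vis/MinInliers 10 \
        --Vis/EstimationType 1 \
        --Vis/MaxFeatures 1000 \
        --Vis/CorType 1 \
        --Odom/Strategy 1" \
    rgb_topic:=/camera/color/image_raw \
    depth_topic:=/custom_depth_registered/image_rect \
    camera_info_topic:=/camera/color/camera_info \
    visual_odometry:=true \
    wait_imu_to_init:=false \
    frame_id:=base_link \
    approx_sync:=false
```

If you do want graph-level outlier rejection, an alternative per the note above would be to add `--Optimizer/Robust true --RGBD/OptimizeMaxError 0` instead of the default RGBD/OptimizeMaxError.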
