
Problem localizing in an octomap using humanoid_localization with 3D LiDAR PointCloud2, IMU, and odometry data #18

Open
weisongwen opened this issue May 8, 2017 · 4 comments


weisongwen commented May 8, 2017

Firstly, thanks for your excellent work on octomap and humanoid_navigation.

These days I have been working with the humanoid_localization package. Could you please provide bag files containing the topics and tf needed by humanoid_localization, such as the IMU, tf (carrying the odometry information), and PointCloud2?

I am now working on localization for a robot operating in an outdoor environment. I built an octomap of the environment via SLAM using a Velodyne-64 LiDAR, IMU, and odometry. Now I want to localize within that environment using the existing octomap, and humanoid_localization looks like a good method for this, but I ran into some problems:

My system:

 Ubuntu 14.04
 ROS Indigo

I got this error:

ERROR: Raycasting in direction (0,0,0) is not possible!

My input data for the humanoid_localization node are as follows:

1. **octomap** (.bt format)
2. **LiDAR data** from the Velodyne-64 as PointCloud2; topic name **velodyne_points**, frame_id **velodyne_link**.
3. **IMU data** from an RT2000 sensor as sensor_msgs/Imu; topic name **imu**, frame_id **imu_link**.
4. **odometry data** from encoders, published as the tf **odom → torso**. I am not sure whether this is right; this tf carries the odometry information to **humanoid_localization**.

This is my launch file:

```xml
<node pkg="humanoid_localization" type="localization_node" name="humanoid_localization" output="screen">
    <param name="odom_frame_id" value="/odom" />
    <param name="base_frame_id" value="/torso" />
    <param name="global_frame_id" value="/map" />
    <param name="num_particles" value="500" />
    <param name="use_raycasting" value="true" />
    <param name="use_imu" value="true" />
</node>
```

I changed the topic names in the humanoid_localization package so that the node subscribes to three topics: tf, imu, and velodyne_points. But there was no output except for the error: `Raycasting in direction (0,0,0) is not possible!`
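As an aside, topic names can usually be redirected without editing the package source, via `<remap>` in the launch file. A sketch, assuming the node subscribes under the name `point_cloud` (an assumption; check the actual subscriber name in the node's source):

```xml
<node pkg="humanoid_localization" type="localization_node" name="humanoid_localization" output="screen">
    <!-- "point_cloud" is an assumed subscriber name; verify in the node source -->
    <remap from="point_cloud" to="velodyne_points" />
</node>
```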

Besides that error (`ERROR: Raycasting in direction (0,0,0) is not possible!`), I also have several other questions:

1. Does the tf odom → torso send odometry information to the humanoid_localization node?
2. What is the purpose of the frame_id **base_footprint**? Is it just a frame fixed on the robot?
3. I am not sure how the humanoid_localization node obtains and transforms the odometry data.

Thanks a lot !

@weisongwen weisongwen changed the title could you provide bag files for 6D localization? problem about localization in an octomap using hmanoid_localization with 3D lidar pointcloud2,IMU,Odometry datas May 8, 2017

ahornung commented Jun 2, 2017

1.) Yes, this is a required tf transform, used as the odometry source.
2.) It's the base link projected onto the ground plane, see http://answers.ros.org/question/12770/base_link-to-base_footprint-transform/
3.) See 1.
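To make point 2 concrete: a base_footprint pose keeps x, y, and yaw from base_link while zeroing z, roll, and pitch. A minimal sketch with a hypothetical `Pose` type (illustration only; tf itself uses `tf::Transform` with quaternions):

```cpp
#include <cassert>

// Hypothetical pose type as (translation, roll-pitch-yaw), for illustration.
struct Pose { double x, y, z, roll, pitch, yaw; };

// base_footprint is base_link projected onto the ground plane:
// x, y and yaw are kept; z, roll and pitch are zeroed.
Pose toFootprint(const Pose& base_link) {
  return Pose{base_link.x, base_link.y, 0.0, 0.0, 0.0, base_link.yaw};
}
```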

The raycasting error could occur due to invalid transforms or invalid points in your point cloud.
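One concrete instance of "invalid points": some LiDAR drivers report invalid returns at the sensor origin, which yields a zero-length ray direction. A minimal, self-contained sketch (hypothetical `Point` type, not PCL) of filtering such points before they reach the raycaster:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Hypothetical point type standing in for pcl::PointXYZ.
struct Point { float x, y, z; };

// True if the ray from origin to p has (near-)zero length, which is
// what triggers "Raycasting in direction (0,0,0) is not possible!".
bool zeroDirection(const Point& origin, const Point& p, float eps = 1e-6f) {
  float dx = p.x - origin.x, dy = p.y - origin.y, dz = p.z - origin.z;
  return std::sqrt(dx * dx + dy * dy + dz * dz) < eps;
}

// Drop points that coincide with the sensor origin (e.g. invalid
// returns reported as (0,0,0) in the sensor frame).
std::vector<Point> dropZeroRays(const Point& origin, const std::vector<Point>& pc) {
  std::vector<Point> out;
  for (const Point& p : pc)
    if (!zeroDirection(origin, p)) out.push_back(p);
  return out;
}
```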

@weisongwen
Author

Hi, @ahornung

Thanks for your reply and your great work on humanoid_navigation package.

My launch file is nao_localization_laser.launch, which is provided in the config folder.

My robot is a mobile car, with encoders measuring the odometry and a Velodyne-64 LiDAR providing the PointCloud2 topic. The LiDAR is mounted on top of the car, about 2 m above the ground.

In my tf tree, I attached base_link to velodyne_0 (the frame_id of the Velodyne-64 sensor) with a fixed transform. I did not provide the imu topic, because the orientation information is already contained in the odometry.

The octomap used for localization was built by the SLAM method with a resolution of 1 cm.

  1. I provide the odometry information in the tf from odom to base_link. The odometry is quite accurate, with an error of less than 5 cm.

  2. The exact tf tree when I ran the humanoid_localization node is shown below.
    [screenshot: tf tree]

  3. When I run humanoid_localization, the localization is not good at all; one MCL observation update takes about 4 seconds. The localization result is shown below.
    [screenshots: localization result]

  1. I have checked the humanoid_localization node and found that the raycasting loop consumes a lot of time:

```cpp
// iterate over beams:
PointCloud::const_iterator pc_it = pc_transformed.begin();
std::vector<float>::const_iterator ranges_it = ranges.begin();
for ( ; pc_it != pc_transformed.end(); ++pc_it, ++ranges_it){
    // ...
}
```

This loop iterates about 4500 times per integrateMeasurement, because there are ~4500 measurements to raycast. As a result, the localization is not as good as expected in terms of real-time computation and accuracy.
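A quick mitigation is to raycast only every k-th beam: with ~4500 beams and k = 9, about 500 rays remain per integrateMeasurement. (The package's own mechanism for thinning the cloud is voxel-based subsampling via m_sensorSampleDist; the decimation below is only an illustrative sketch, not the package's code.)

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Keep every k-th beam, reducing the per-scan raycasting cost by ~k.
template <typename PointT>
std::vector<PointT> subsampleBeams(const std::vector<PointT>& pc, std::size_t k) {
  std::vector<PointT> out;
  out.reserve(pc.size() / k + 1);
  for (std::size_t i = 0; i < pc.size(); i += k)
    out.push_back(pc[i]);
  return out;
}
```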

Some of the tf transforms, obtained with tf_echo, are shown below:

```
rosrun tf tf_echo /odom /base_link
rosrun tf tf_echo /base_footprint /base_link
rosrun tf tf_echo /base_link /velodyne_0
```

[screenshots: tf_echo output]

I checked the source of the humanoid_localization node; there are three steps to filter the PointCloud2:

1. Pass-through filtering on range:

```cpp
pcl::PassThrough<pcl::PointXYZ> pass;
pcl::PointCloud<pcl::PointXYZ>::Ptr pcd_tmp(new pcl::PointCloud<pcl::PointXYZ>());
#if PCL_VERSION_COMPARE(>=,1,7,0)
pcl::PCLPointCloud2 pcd2_tmp;
pcl_conversions::toPCL(*msg, pcd2_tmp);
pcl::fromPCLPointCloud2(pcd2_tmp, *pcd_tmp);
#else
pcl::fromROSMsg(*msg, *pcd_tmp);
#endif
pass.setInputCloud (pcd_tmp);
pass.setFilterFieldName ("z");
pass.setFilterLimits (m_filterMinRange, m_filterMaxRange);
pass.filter (pc);
```

2. Ground-plane filtering in the base_footprint frame:

```cpp
Eigen::Matrix4f matSensorToBaseFootprint, matBaseFootprintToSensor;
pcl_ros::transformAsMatrix(sensorToBaseFootprint, matSensorToBaseFootprint);
pcl_ros::transformAsMatrix(sensorToBaseFootprint.inverse(), matBaseFootprintToSensor);
// TODO: Why transform the point cloud and not just the normal vector?
pcl::transformPointCloud(pc, pc, matSensorToBaseFootprint);
filterGroundPlane(pc, ground, nonground, m_groundFilterDistance, m_groundFilterAngle, m_groundFilterPlaneDistance);
```

3. Voxel-grid sampling of the ground points:

```cpp
pcl::transformPointCloud(ground, ground, matBaseFootprintToSensor);
voxelGridSampling(ground, sampledIndices, m_sensorSampleDist*m_sensorSampleDistGroundFactor);
pcl::copyPointCloud(ground, sampledIndices.points, pc);
numFloorPoints = sampledIndices.size();
```

Even after these three steps, the point cloud used for raycasting is still too large, which causes the heavy time consumption.
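If the cloud is still too dense after the built-in filtering, a coarser voxel grid can thin it further; enlarging the leaf size (cf. m_sensorSampleDist above) trades detail for speed. A minimal hash-based sketch with a hypothetical `Point` type (illustrative only; PCL's `pcl::VoxelGrid` is the production equivalent):

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <unordered_map>
#include <vector>

// Hypothetical point type standing in for pcl::PointXYZ.
struct Point { float x, y, z; };

// Simple voxel-grid downsampling: keep one point per cubic voxel of
// side `leaf`. Larger leaves mean fewer rays to cast per scan.
std::vector<Point> voxelDownsample(const std::vector<Point>& pc, float leaf) {
  std::unordered_map<int64_t, Point> voxels;
  for (const Point& p : pc) {
    int64_t ix = (int64_t)std::floor(p.x / leaf);
    int64_t iy = (int64_t)std::floor(p.y / leaf);
    int64_t iz = (int64_t)std::floor(p.z / leaf);
    int64_t key = (ix * 73856093) ^ (iy * 19349663) ^ (iz * 83492791);
    voxels.emplace(key, p);  // first point in a voxel wins
  }
  std::vector<Point> out;
  out.reserve(voxels.size());
  for (const auto& kv : voxels) out.push_back(kv.second);
  return out;
}
```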

could you give me some advice? Thanks a lot!

@hanhan119

@weisongwen
Hello, I am working on the humanoid_localization package now and have the same questions as you. My particles can't converge when using PointCloud2. Could you please give me some advice?

@weisongwen
Author

Hi @hanhan119, I do not work on this package any more.
Best,
Welson
