Pose and IMU data fusion, adapting e2e example. #9
@antithing thank you for your question and apologies for the delay. It almost seems as if the Aruco pose measurement is not processed. Can you assure that this section of the code is executed?

```cpp
if (frame == true)
{
  mars::BufferDataType dataP;
  ReadLock w_lock(grabberLock);
  dataP = dataPose;
  w_lock.unlock();
  frame = false;
  core_logic.ProcessMeasurement(pose_sensor_sptr, timestamp, dataP);
  dataGot = true;
}
```

Should this be the case, can you print the residual values of the pose update and check if they are unreasonably high? Also, can you send us a plot that shows the Aruco pose and the MaRS pose? The position in X, Y, Z and the orientation in Euler angles should give us a good idea about what's going on.
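For plotting the orientation, Euler angles can be obtained from an Eigen quaternion directly; a minimal generic sketch (not MaRS-specific, the helper name is ours):

```cpp
#include <Eigen/Dense>
#include <cmath>
#include <iostream>

// Convert a pose quaternion to roll/pitch/yaw (XYZ convention) for plotting.
void PrintEuler(const Eigen::Quaterniond& q)
{
  const Eigen::Vector3d rpy = q.toRotationMatrix().eulerAngles(0, 1, 2);  // radians
  std::cout << (rpy * (180.0 / M_PI)).transpose() << std::endl;           // degrees
}
```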
Hi @Chris-Bee @mascheiber, thank you for getting back to me. I can confirm that the code is hit and that the pose data is populated. I am trying this to print the residual, but it doesn't seem to work; is there a trick here?
I also see:
Are there tools in MaRS to plot? Or will I need a third-party function?
Hi @antithing, I appreciate the confirmation; however, I have a couple of questions to help you further:
Concerning your questions:
Doing something like `std::cout << pose_sensor_sptr->residual_.transpose() << std::endl;` should be sufficient for what we need right now, no need to format.
Within the MaRS lib, there are no plotting tools provided. However, we tend to store the results in CSV files and then plot the data using external tools such as Matlab or Python scripts. An example is given here: mars_lib/source/examples/mars_thl/mars_thl_example.cpp, lines 122 to 172 in 18e9a92.
If you adhere to the convention of the CSV writing (lines 157 and 167), the CSV files would be enough at the moment to help you. We might consider including scripts for plotting in a future release!
Thank you! I have checked through the code and stabilised the POSE output. I now see:
The blank line SHOULD be printing the residual, but it does not. Yes, I am using the ZED camera, and yes, the IMU data and image are both from that sensor. I am attaching the full code here again in case you are able to take a look. I feel like I am close! But definitely missing something. Thank you so much for your time.
Apologies for the late response, and thank you for checking the POSE output. So I did take a look at your code and saw a couple of things:

```cpp
mars::PoseSensorStateType last_state = pose_sensor_sptr->get_state(latest_result.data_.sensor_);
std::cout << last_state.p_ip_.x() << " FILTER " << std::endl;
```

With this code you are printing the calibration state of your pose sensor, i.e. the translation between the IMU and one of your cameras. That is the reason why the value is so small, as they are expected to be close together. To print the filter's core state you should rather do something like:

```cpp
core_logic_.buffer_.get_latest_state(&latest_state);
mars::CoreStateType latest_core_state = static_cast<mars::CoreType*>(latest_state.data_.core_.get())->state_;
std::cout << latest_core_state.p_wi_.x() << " FILTER " << std::endl;
```

In your code you are initializing the calibration states of your pose sensor (camera) and IMU to identity:

```cpp
pose_init_cal.state_.p_ip_ = Eigen::Vector3d::Zero();
pose_init_cal.state_.q_ip_ = Eigen::Quaterniond::Identity();
```

Although I have never worked with the ZED cam, I highly doubt that there is no translational and rotational offset between these two sensors. While in theory these states should become observable if you move, your estimation performance is better if they are initialized somewhat close to their real values. This, in turn, might again explain some of the differences between your aruco pose estimation and MaRS' pose.

Similarly, you are initializing the filter's core states to identity in orientation and 5 m height in translation:

```cpp
// Initialize the first time at which the propagation sensor occurs
if (imu_sensor_sptr == core_logic.core_states_->propagation_sensor_)
{
  Eigen::Vector3d p_wi_init(0, 0, 5);
  Eigen::Quaterniond q_wi_init = Eigen::Quaterniond::Identity();
  core_logic.Initialize(p_wi_init, q_wi_init);
}
```

But clearly, your first pose information is different. We do something similar within the MaRS ROS wrappers:

Concerning the residual printing, I just saw that this is a bug within the pose sensor class, where the value is only stored locally but not written to the `residual_` member. Thus, thank you for pointing this out; we will be providing a fix soon. For now, you could add the printing in the pose update here: mars_lib/source/mars/include/mars/sensors/pose/pose_sensor_class.h, lines 202 to 204 in 18e9a92
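For reference, a minimal sketch of such a temporary printout; the variable name `res` is an assumption, since the locally computed residual inside the update function may be named differently in your checkout:

```cpp
// Hypothetical workaround inside the pose sensor's update function,
// near the cited lines: print the locally computed residual until the
// fix that also writes it to the residual_ member is released.
std::cout << "pose residual: " << res.transpose() << std::endl;
```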
I do hope this helps; please let me know if you have any further questions on these points.
As discussed, fixed by @mascheiber in 365396b
Hi @Chris-Bee, thank you! I have made those changes. Now I see a better result when initialising:
However, when I start moving the camera, the values slip apart:
Does this section look correct now?
The IMU-camera transform is from the sensor SDK, and is listed as:
Do I need this line:
Additionally, if the markers are lost and re-found, I see this error:
Thank you again!
Hi @antithing, it's great to hear that the changes improved the initialization result! Before I go into detail on the values when moving the camera, two things:
1. Can you verify that you are using the latest version of MaRS? With the latest version, the startup output prints the version hash:

```
Created: [IMU] Sensor
Created: [Pose] Sensor
Created: Buffer (Size=300)
Created: Buffer (Size=100)
- Created: CoreLogic
+ Created: Core Logic - Using MaRS Version: <BRANCH_HASH>
```

See also here:
2. Once verified, can you redo this test, as this error should not occur in the latest version anymore?
Further, I thought the initial pose you set for MaRS is a bit far away (121 meters in z?). So, I double-checked your code and saw that you do the following:

```cpp
std::vector<cv::Vec3d> rvecs, tvecs;
cv::aruco::estimatePoseSingleMarkers(markerCorners, 17.2, K, D, rvecs, tvecs);
```

However, the output for the pose here is in the same unit as the marker length you pass in (17.2). Assuming the marker side length is given in centimeters, the returned translation is in centimeters as well, while MaRS works in meters. Thus, can you try to set the marker length in meters?
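A minimal sketch of that change, assuming the tag indeed measures 17.2 cm across:

```cpp
// marker side length now given in meters, so the returned tvecs are metric too
cv::aruco::estimatePoseSingleMarkers(markerCorners, 0.172, K, D, rvecs, tvecs);
```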
Hey @Chris-Bee, okay! Changes made, with a new pull from the main branch. The lost-marker error is indeed resolved, thanks! The new output still seems problematic though:
It is better! But there is still some 'drift' in the filtered values. Is this normal? Also, is it possible I have the IMU-camera transform inverted? Should it be IMU-to-camera or camera-to-IMU?
Thanks again!
@antithing I'm happy the solution to this issue is well on its way. The credit for this support goes to my colleague @mascheiber.
Ah, apologies for the incorrect tag! @mascheiber, thank you very much! Getting closer!
Hi @antithing, no worries, and thanks @Chris-Bee for pointing this out. @antithing, thanks so much for the detailed debug information, especially the pictures. This really helps to figure out the problem you are having.
This might be the remaining issue you are having. Let me explain this with the following figure:
MaRS generally estimates the IMU pose, denoted with frame `I`, with respect to its world frame `W`. Now, the aruco tag has a defined frame convention of its own, the vision world frame `V`, which is given in your pictures. However, since you placed the tag on the wall, there is a rotational offset between these two world frames. Given your setup, this corresponds to a 90-degree rotation around the x-axis. At this stage, you have two options:
```cpp
// setup additional sensors
// Pose sensor
std::shared_ptr<mars::VisionSensorClass> vision_sensor_sptr = std::make_shared<mars::VisionSensorClass>("Vision", core_states_sptr);
vision_sensor_sptr->const_ref_to_nav_ = true;  // TODO is set here for now but will be managed by core logic in later versions

// Define measurement noise
Eigen::Matrix<double, 6, 1> pose_meas_std;
pose_meas_std << 0.02, 0.02, 0.02, 2 * (M_PI / 180), 2 * (M_PI / 180), 2 * (M_PI / 180);
vision_sensor_sptr->R_ = pose_meas_std.cwiseProduct(pose_meas_std);

// Define initial calibration and covariance
mars::VisionSensorData vision_init_cal;
vision_init_cal.state_.p_ic_ = Eigen::Vector3d(imuLeftX, imuLeftY, imuLeftZ);
vision_init_cal.state_.q_ic_ = Eigen::Quaterniond(imuLeftQW, imuLeftQX, imuLeftQY, imuLeftQZ);

// Define initial offset between vision world and MaRS world
Eigen::Matrix3d R_wv;
R_wv << 1, 0, 0,
        0, 0, -1,
        0, 1, 0;
vision_init_cal.state_.p_vw_ = Eigen::Vector3d::Zero();
vision_init_cal.state_.q_vw_ = Eigen::Quaterniond(R_wv.transpose());  // invert R_wv to get R_vw

// set lambda (scale) to 1 - as output is metric
vision_init_cal.state_.lambda_ = 1;

// The covariance should enclose the initialization with a 3 Sigma bound
Eigen::Matrix<double, 13, 1> std;
std << 1e-3, 1e-3, 1e-3, 1e-3, 1e-3, 1e-3, 0.1, 0.1, 0.1, (10 * M_PI / 180), (10 * M_PI / 180), (10 * M_PI / 180), 1e-3;
vision_init_cal.sensor_cov_ = std.cwiseProduct(std).asDiagonal();

vision_sensor_sptr->set_initial_calib(std::make_shared<mars::VisionSensorData>(vision_init_cal));
```

Note that for the vision sensor, the frame naming convention is exactly as in the figure described above, though the state is inverted, i.e., the MaRS world `W` is expressed with respect to the aruco (vision) world `V` (`p_vw_`, `q_vw_`).

These are some examples of the changes you need to apply, though I might have forgotten something -- please refer to an example for the vision sensor class. Please let me know if this information helps you, and please do not hesitate to ask if you have any further questions regarding this!
What I forgot to mention above: regardless of your choice, you also need to initialize `p_wi_init` and `q_wi_init` accordingly:

```cpp
// when you first get a pose with tvecs and rvecs
Eigen::Matrix3d R_vc = rvecs;  // or however you convert this
Eigen::Vector3d p_vc = tvecs;  // or however you convert this

// try to retrieve the information stored in the vision_init_cal above
Eigen::Matrix3d R_ic = vision_init_cal.state_.q_ic_.toRotationMatrix();
Eigen::Vector3d p_ic = vision_init_cal.state_.p_ic_;
Eigen::Matrix3d R_vw = vision_init_cal.state_.q_vw_.toRotationMatrix();
Eigen::Vector3d p_vw = vision_init_cal.state_.p_vw_;

q_wi_init_ = Eigen::Quaterniond(R_vw.transpose() * R_vc * R_ic.transpose());
p_wi_init_ = R_vw.transpose() * (p_vc + R_vc * (-R_ic.transpose() * p_ic)) + p_vw;
```
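For intuition, these last two lines chain the rigid-body transforms from the measured camera pose in the vision world back to the IMU pose in the MaRS world; in our reading of the code above, with $R_{ab}$, $p_{ab}$ denoting frame $b$ expressed in frame $a$:

$$
R_{wi} = R_{vw}^{\top}\, R_{vc}\, R_{ic}^{\top}, \qquad
p_{wi} = R_{vw}^{\top}\left(p_{vc} - R_{vc}\, R_{ic}^{\top}\, p_{ic}\right) + p_{vw}
$$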
@mascheiber Thank you for the excellent explanation. This is amazing support, and it is much appreciated! Before I test the vision sensor changes above, in the interest of confirming there is nothing else strange happening, I have also tested with the pose sensor, putting the marker on the floor. Although I see a better alignment initially, as soon as the camera is moved, the points drift off in the wrong direction. Please see the video below:
Mars_poseSensor.mov
Is this likely a secondary issue to the previous one? I am adding the latest full code here again in case that helps. Thank you again!
Testing this on the floor first is a good idea! So given your code and what I see, can you try two things:

1. Lower the initial calibration covariance of the pose sensor; since the IMU-camera transform comes from the SDK, it can be treated as well known:

```cpp
// Line 351-354 in your main.cpp
// The covariance should enclose the initialization with a 3 Sigma bound
Eigen::Matrix<double, 6, 6> pose_cov;
pose_cov.setZero();
pose_cov.diagonal() << 1e-9, 1e-9, 1e-9, 1e-9, 1e-9, 1e-9;
pose_init_cal.sensor_cov_ = pose_cov;
```

2. For the visualization, transform the core state into the camera frame before comparing it with the aruco pose:

```cpp
// Line 453-460 in your main.cpp
// get latest core state
mars::BufferEntryType latest_result;
core_logic.buffer_.get_latest_state(&latest_result);
mars::CoreStateType latest_core_state = static_cast<mars::CoreType*>(latest_result.data_.core_.get())->state_;

// get latest sensor state
mars::BufferEntryType latest_sensor_result;
core_logic.buffer_.get_latest_sensor_handle_state(pose_sensor_sptr, &latest_sensor_result);
mars::PoseSensorStateType pose_sensor_state = pose_sensor_sptr.get()->get_state(latest_sensor_result.data_.sensor_);

Eigen::Vector3d p_wi(latest_core_state.p_wi_);
Eigen::Matrix3d R_wi(latest_core_state.q_wi_.toRotationMatrix());
Eigen::Vector3d p_ic(pose_sensor_state.p_ic_);
Eigen::Matrix3d R_ic(pose_sensor_state.q_ic_.toRotationMatrix());

Eigen::Vector3d position(R_wi * p_ic + p_wi);
Eigen::Quaterniond quaternion(R_wi * R_ic);

// continue as you did
```

If these two changes do not produce the desired result, could you please output one sample run to CSV files, so I can have a better understanding of what is going on using all state information: mars_lib/source/examples/mars_thl/mars_thl_example.cpp, lines 151 to 168 in 1c08c90
Remark on using the vision sensor
The 2nd change above would also be needed for the visualization when using the vision sensor, but the equations do change due to the additional aruco (vision) world frame.
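Inverting the initialization relation from earlier gives a sketch of what those visualization equations could look like (our own derivation following the notation above, not taken from the MaRS documentation); the camera pose expressed in the vision world becomes:

$$
R_{vc} = R_{vw}\, R_{wi}\, R_{ic}, \qquad
p_{vc} = R_{vw}\left(R_{wi}\, p_{ic} + p_{wi} - p_{vw}\right)
$$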
Copying that code gives me an error, should these lines be?
If so, please see the CSV attached, and the corresponding video below.
Mars_poseSensor_csv.mov
Thanks!
Ah yes, apologies for the typo there! Mixed up the variable naming between the pose and the vision sensor states (`p_ip_`/`q_ip_` instead of `p_ic_`/`q_ic_`).
Alright, so they are still moving (more than expected). Would it be possible to do another recording, and also include a CSV for the pose sensor states, as in:

```cpp
mars::BufferEntryType latest_result;
core_logic.buffer_.get_latest_sensor_handle_state(pose_sensor_sptr, &latest_result);
mars::PoseSensorStateType last_state = pose_sensor_sptr->get_state(latest_result.data_.sensor_);
ofile_pose << last_state.to_csv_string(latest_result.timestamp_.get_seconds()) << std::endl;
```
mars_core_state.csv
Absolutely, both CSVs attached! Let me know if you want a video as well, and I will get another dataset over to you. Thanks!
Thank you @antithing, this data really helps to see in more detail what is going on. In the meantime, I have noticed that the biases grow exceptionally large when you move. You do initialize the IMU measurement noise and random-walk behavior with the standard values, as far as I saw:

```cpp
// mars lib stuff:
std::vector<double> imu_n_w{ 0.013, 0.013, 0.013 };
std::vector<double> imu_n_bw{ 0.0013, 0.0013, 0.0013 };
std::vector<double> imu_n_a{ 0.083, 0.083, 0.083 };
std::vector<double> imu_n_ba{ 0.0083, 0.0083, 0.0083 };
```

However, it seems like the ZED cam also provides these values through its API. What is more, you are currently using the modified IMU data from the ZED cam, which by its doc should already compensate the bias. Could you maybe also give the uncalibrated data a try, to see if that changes the behavior? MaRS is most likely trying to perform the same bias estimation as the camera's API!
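In case it helps, a sketch of that mapping with hypothetical variables; `gyroNoise`, `gyroWalk`, `accelNoise`, and `accelWalk` stand for whatever your SDK reports, and the exact getters (and units) depend on the ZED SDK version:

```cpp
// Hypothetical: values as read once from the camera's API (verify the units!)
double gyroNoise = 0.0;   // gyroscope noise density
double gyroWalk = 0.0;    // gyroscope bias random walk
double accelNoise = 0.0;  // accelerometer noise density
double accelWalk = 0.0;   // accelerometer bias random walk

std::vector<double> imu_n_w{ gyroNoise, gyroNoise, gyroNoise };
std::vector<double> imu_n_bw{ gyroWalk, gyroWalk, gyroWalk };
std::vector<double> imu_n_a{ accelNoise, accelNoise, accelNoise };
std::vector<double> imu_n_ba{ accelWalk, accelWalk, accelWalk };
```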
Hi @mascheiber! Copy that. Just so I am clear, I have this:
Can you please double-confirm how these values map to the below matrix?
Thank you!
Yes, these values should map as you have written; use them for all three axes, respectively! For the angular velocity (gyroscope) values, mind the units: if your sensor reports them in degrees, they need to be converted to radians.
Excellent. Like this?
Printing:
gives me:
I assume this means I can remove the radians conversion, right?
These changes give the attached result; I am still seeing a lot of drift. Could it be the IMU-camera offset values as well? Would that cause this kind of drift? Thank you yet again!
mars_core_state.csv
2023-09-22_16-08-49.mov
Hi @antithing, apologies for the delayed response. Actually, the conversion should be like this:

```cpp
std::vector<double> imu_n_w{ gyroNoise * (M_PI / 180), gyroNoise * (M_PI / 180), gyroNoise * (M_PI / 180) };
std::vector<double> imu_n_bw{ gyroWalk * (M_PI / 180), gyroWalk * (M_PI / 180), gyroWalk * (M_PI / 180) };
std::vector<double> imu_n_a{ accelNoise, accelNoise, accelNoise };
std::vector<double> imu_n_ba{ accelWalk, accelWalk, accelWalk };
```

Which of these versions did you end up using in the latest video and CSVs above, the one with the degree-to-radian conversion or the one without? And would it be possible to print the noise/walk values once, just to know what the sensor's calibration is saying?
In general, yes, it could. However, the IMU-camera offset is not changing at all (see mars_pose_state.csv), which makes sense given the documentation of the ZED cam (identity orientation and a minor x offset). Therefore, I do not think this is the cause, also because the difference between the tag and the projected corners keeps growing over time while the calibration stays static at the same value (as it is supposed to).

We just updated the measurement storing in one of our latest commits. Thus, could you check out the latest development version with commit 64ace9a? With this version, you can now also store the measurements within a CSV:

```cpp
// setup filestream
std::ofstream ofile_meas_pose, ofile_meas_imu;
ofile_meas_pose.open("/tmp/mars_pose_meas.csv", std::ios::out);
ofile_meas_imu.open("/tmp/mars_imu_meas.csv", std::ios::out);
ofile_meas_pose << std::setprecision(17);
ofile_meas_imu << std::setprecision(17);

// thread for IMU callbacks and mars lib updates
// whenever an IMU measurement is processed
core_logic.ProcessMeasurement(imu_sensor_sptr, timestamp, data);

// Retrieve the last measurement and write it to the measurement file
mars::BufferEntryType latest_meas;
core_logic.buffer_.get_latest_sensor_handle_measurement(imu_sensor_sptr, &latest_meas);
mars::IMUMeasurementType* last_meas = static_cast<mars::IMUMeasurementType*>(latest_meas.data_.sensor_.get());
ofile_meas_imu << last_meas->to_csv_string(k.timestamp_.get_seconds()) << std::endl;

// whenever a pose measurement is processed
if (frame == true)
{
  // other stuff
  // ...
  core_logic.ProcessMeasurement(pose_sensor_sptr, timestamp, dataP);

  // Retrieve the last measurement and write it to the measurement file
  mars::BufferEntryType latest_meas;
  core_logic.buffer_.get_latest_sensor_handle_measurement(pose_sensor_sptr, &latest_meas);
  mars::PoseMeasurementType* last_meas = static_cast<mars::PoseMeasurementType*>(latest_meas.data_.sensor_.get());
  ofile_meas_pose << last_meas->to_csv_string(k.timestamp_.get_seconds()) << std::endl;
}
```

Could you possibly do another recording and generate these CSV files? Then I would have everything to reproduce your setup and scenario on my end. Further, would you mind sharing your latest main.cpp as well?
Hi @mascheiber, all attached here! Thank you yet again for your incredible support on this.
Printing:
gives me:
mars_core_state.csv
Mars_test_25092023.mov
Thank you for these files, I think (and hope) I have now figured out what is going on! Before I go into detail though, two important steps are still missing in your latest version for this to work:

1. This is still missing in your latest version; place it before your current definitions.
2. Further, the initialization steps mentioned in #9 (comment) also still need to be applied:

```cpp
// when you first get a pose with tvecs and rvecs
Eigen::Matrix3d R_vc;
Eigen::Vector3d p_vc;

ReadLock w_lock(grabberLock);
dataP = dataPose;
p_vc = pos;
R_vc = quat.toRotationMatrix();
frame2 = image;
w_lock.unlock();

// try to retrieve the information stored in the pose_init_cal above
Eigen::Matrix3d R_ic = pose_init_cal.state_.q_ip_.toRotationMatrix();
Eigen::Vector3d p_ic = pose_init_cal.state_.p_ip_;
Eigen::Matrix3d R_vw = Eigen::Matrix3d::Identity();  // vision: vision_init_cal.state_.q_vw_.toRotationMatrix();
Eigen::Vector3d p_vw = Eigen::Vector3d::Zero();      // vision: vision_init_cal.state_.p_vw_;

q_wi_init_ = Eigen::Quaterniond(R_vw.transpose() * R_vc * R_ic.transpose());
p_wi_init_ = R_vw.transpose() * (p_vc + R_vc * (-R_ic.transpose() * p_ic)) + p_vw;

if (!core_logic.core_is_initialized_) {
  // Initialize the first time at which the propagation sensor occurs
  if (imu_sensor_sptr == core_logic.core_states_->propagation_sensor_) {
    core_logic.Initialize(p_wi_init_, q_wi_init_);
  } else {
    continue;
  }
}
```

Do this within the corresponding lines of your main.cpp. With these changes, the reprojection error should come down quite a bit. Would be happy to see a video with these changes included!

If it is then still visible, it is due to the associated measurement noises of the IMU and the aruco detection. To force MaRS to follow the output of your aruco detection, you could lower the measurement noise of the pose sensor:

```cpp
Eigen::Matrix<double, 6, 1> pose_meas_std;
pose_meas_std << 0.02, 0.02, 0.02, 2 * (M_PI / 180), 2 * (M_PI / 180), 2 * (M_PI / 180);
pose_sensor_sptr->R_ = pose_meas_std.cwiseProduct(pose_meas_std);
```

However, by doing so you are limiting yourself and overruling the advantages of having a filtered output.
2023-09-25_18-17-27.mov
Thanks! Changes made (new main.cpp attached, as well as a video). Not much improvement, unfortunately. Lowering the measurement noise does help, but not enough to give a usable output. Is there anything else I could be doing wrong here?
I'm quite interested in this as I'm facing the same issue.
@golbog unfortunately not; despite the incredible support in this thread, I was unable to get this to a usable state. Close! But not perfect.
Hi, I am setting up a system to use pose data (from aruco markers) and fuse it with IMU data.
I have everything compiling and running, but when I print the filter result, it is very different from the aruco pose 'ground truth'.
Using this code to print the data:

```cpp
mars::BufferEntryType latest_result;
core_logic.buffer_.get_latest_sensor_handle_state(pose_sensor_sptr, &latest_result);
mars::PoseSensorStateType last_state = pose_sensor_sptr->get_state(latest_result.data_.sensor_);
std::cout << last_state.to_csv_string(latest_result.timestamp_.get_seconds()) << std::endl;
```

The results are:
The relevant code is:
// mars setup:
Then I am running a live camera and grabbing the camera pose from an aruco marker PnP solve.
I also grab the IMU data and run both through the filter.
I would expect the output from the filter and the pose to be similar; what am I doing wrong?
Thank you