MI-HGNN for contact estimation/classification on various robots

This repository implements a Morphology-Informed Heterogeneous Graph Neural Network (MI-HGNN) for estimating contact information on the feet of a quadruped robot. The architecture and connectivity of the MI-HGNN are constructed from the robot morphology.

Additionally, by providing a compatible URDF file, this software can convert a variety of robot structures to graph format for learning with the MI-HGNN. See #Applying-MI-HGNN-to-your-own-robot for more information.

(Figure 2 from our paper.)

For information on our method, see our project page and paper.

Installation

To get started, set up a Conda environment with Python 3.11:

conda create -n mi-hgnn python=3.11
conda activate mi-hgnn

Then, install the library (and dependencies) with the following command:

pip install .

Note: if you have any issues with setup, refer to environment_files/README.md to install the exact library versions we used.

URDF Download

The necessary URDF files are provided as git submodules in this repository, so run the following commands to download them:

git submodule init
git submodule update

Replicating Paper Experiments

To replicate the experiments referenced in our paper or access our trained model weights, see paper/README.md.

Applying MI-HGNN to your own robot

Although our paper only applies the MI-HGNN to quadruped robots for contact perception, it can also be applied to other multi-body dynamical systems. New URDF files can be added by following the instructions in urdf_files/README.md, and our software will automatically convert the URDF into a graph suitable for learning with the MI-HGNN.
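As a rough illustration of that workflow, here is a minimal sketch; the function name urdf_to_graph and the example path are placeholders introduced for illustration only, not the repository's actual API, so consult urdf_files/README.md and the source under src/mi_hgnn for the real interfaces.

```python
# Illustrative sketch only -- "urdf_to_graph" and the path below are
# placeholders, not the repository's real API.
# The idea: the URDF describing the robot's links and joints is parsed into a
# heterogeneous graph whose node and edge types mirror the robot morphology,
# and the MI-HGNN is then constructed over that graph.
from mi_hgnn import urdf_to_graph  # hypothetical import

graph = urdf_to_graph("urdf_files/your_robot/your_robot.urdf")  # hypothetical path
print(graph)  # inspect the node and edge types derived from the morphology
```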

Editing and Contributing

Datasets can be found in the src/mi_hgnn/datasets_py directory, and model definitions and training code can be found in the src/mi_hgnn/lightning_py directory. We encourage you to extend the library for your own applications. Please reference #Replicating-Paper-Experiments for examples on how to train and evaluate models with our repository.
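As a rough sketch of the PyTorch Lightning training pattern an extension might follow (every imported class name below, e.g. QuadrupedContactDataset and ContactMIHGNN, is a placeholder rather than the repository's actual API; see paper/README.md and the tests directory for the real entry points):

```python
# Illustrative sketch only -- the mi_hgnn class names below are placeholders,
# not the repository's real API. Substitute the actual dataset and model
# classes from src/mi_hgnn/datasets_py and src/mi_hgnn/lightning_py.
import pytorch_lightning as pl
from torch_geometric.loader import DataLoader  # graph-aware batching

from mi_hgnn.datasets_py import QuadrupedContactDataset  # hypothetical name
from mi_hgnn.lightning_py import ContactMIHGNN           # hypothetical name

train_set = QuadrupedContactDataset(split="train")  # hypothetical constructor
val_set = QuadrupedContactDataset(split="val")

model = ContactMIHGNN()  # hyperparameters omitted for brevity

trainer = pl.Trainer(max_epochs=30, accelerator="auto")
trainer.fit(
    model,
    DataLoader(train_set, batch_size=64, shuffle=True),
    DataLoader(val_set, batch_size=64),
)
```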

After making changes, rebuild the library by following the instructions in #Installation. To make sure your changes haven't broken critical functionality, run the test cases found in the tests directory.
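For example, assuming the test cases follow standard Python unittest discovery conventions (an assumption on our part; adjust if the repository uses a different runner), the whole suite can be run from the repository root with:

```python
# Run the test suite via standard unittest discovery from the repository root.
# Assumes test files in tests/ follow the usual test_*.py naming convention.
import unittest

suite = unittest.defaultTestLoader.discover("tests")
unittest.TextTestRunner(verbosity=2).run(suite)
```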

If you'd like to contribute to the repository, write sufficient and necessary test cases for your additions in the tests directory, and then open a pull request.

Citation

If you find our repository or our work useful, please cite the relevant publication:

@article{butterfield2024mi,
  title={MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network for Legged Robot Contact Perception},
  author={Butterfield, Daniel and Garimella, Sandilya Sai and Cheng, Nai-Jen and Gan, Lu},
  journal={arXiv preprint arXiv:2409.11146},
  year={2024},
  eprint={2409.11146},
  url={https://arxiv.org/abs/2409.11146},
}

Contact / Issues

For any issues with this repository, feel free to open an issue on GitHub. For other inquiries, please contact Daniel Butterfield ([email protected]) or the Lunar Lab (https://sites.gatech.edu/lunarlab/).
