Commit
Update doc
Skylark0924 committed Sep 19, 2022
1 parent 7433706 commit 8b52ec6
Showing 3 changed files with 63 additions and 1 deletion.
2 changes: 1 addition & 1 deletion README.md
@@ -5,7 +5,7 @@
The Rofunc package focuses on the **robotic Imitation Learning (IL) and Learning from Demonstration (LfD)** fields and provides valuable and convenient Python functions for robotics, including _demonstration collection, data pre-processing, LfD algorithms, planning, and control methods_. We also plan to provide an Isaac Gym-based robot simulator for evaluation. This package aims to advance the field by building a full-process toolkit and validation platform that simplifies and standardizes the pipeline of demonstration data collection, processing, learning, and deployment on robots.

![](./img/pipepline.png)
![](./img/pipeline.png)

### Installation
The installation is very easy:
3 changes: 3 additions & 0 deletions doc/source/index.rst
@@ -8,6 +8,8 @@ Rofunc: The Full Process Python Package for Robot Learning from Demonstration

Rofunc
----------------
:doc:`overview`

:doc:`devices/README`
How to record, process, visualize, and export multimodal demonstration data.
:doc:`lfd/README`
@@ -29,6 +31,7 @@
:caption: Rofunc
:hidden:

overview
devices/README
lfd/README
planning/README
59 changes: 59 additions & 0 deletions doc/source/overview.md
@@ -0,0 +1,59 @@
# Overview

The Rofunc package focuses on the **robotic Imitation Learning (IL) and Learning from Demonstration (LfD)** fields and provides valuable and convenient Python functions for robotics, including _demonstration collection, data pre-processing, LfD algorithms, planning, and control methods_. We also plan to provide an Isaac Gym-based robot simulator for evaluation. This package aims to advance the field by building a full-process toolkit and validation platform that simplifies and standardizes the pipeline of demonstration data collection, processing, learning, and deployment on robots.

![](../../img/pipeline.png)

## Installation
The installation is very easy:

```bash
pip install rofunc
```

And as you'll find later, it's just as easy to use:

```python
import rofunc as rf
```
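
As a first taste, here is a minimal sketch of what a device-processing call might look like. It leans on assumptions not documented in this overview: the file path is illustrative, and the exact signatures of `xsens.process` and `xsens.visualize` are guesses based only on the function table below.

```python
import rofunc as rf

# Illustrative path to a motion file recorded with rf.xsens.record;
# the file name is hypothetical.
mvnx_path = "./data/demo.mvnx"

# Assumed signatures: the table below only names these functions, so the
# arguments and return value here are guesses for illustration.
data = rf.xsens.process(mvnx_path)  # decode the .mvnx file
rf.xsens.visualize(data)            # show or save a GIF of the motion
```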

Thus, have fun in the robotics world!


## Available functions
Currently, we provide a simple document; please refer to [here](./rofunc/). A comprehensive version, in both English and Chinese, is built with [Read the Docs](https://rofunc.readthedocs.io/en/stable/).
The available functions and plans are listed below, followed by a short usage sketch.


| Classes                         | Types        | Functions               | Description                                                         | Status |
|---------------------------------|--------------|-------------------------|---------------------------------------------------------------------|--------|
| **Devices**                     | Xsens        | `xsens.record`          | Record human motion via network streaming                           |        |
|                                 |              | `xsens.process`         | Decode the .mvnx file                                               | ✅     |
|                                 |              | `xsens.visualize`       | Show or save a GIF of the motion                                    | ✅     |
|                                 | Optitrack    | `optitrack.record`      | Record marker motion via network streaming                          |        |
|                                 |              | `optitrack.process`     | Process the output .csv data                                        | ✅     |
|                                 |              | `optitrack.visualize`   | Show or save a GIF of the motion                                    |        |
|                                 | ZED          | `zed.record`            | Record with multiple cameras                                        | ✅     |
|                                 |              | `zed.playback`          | Play back the recording and save snapshots                          | ✅     |
|                                 |              | `zed.export`            | Export the recording to MP4                                         | ✅     |
|                                 | Multimodal   | `mmodal.record`         | Record multimodal demonstration data simultaneously                 |        |
|                                 |              | `mmodal.export`         | Export multimodal demonstration data in one line                    | ✅     |
| **Learning from Demonstration** | DMP          | `dmp.uni`               | DMP for one agent with one or several demonstrated trajectories     |        |
|                                 | GMR          | `gmr.uni`               | GMR for one agent with one or several demonstrated trajectories     | ✅     |
|                                 | TP-GMM       | `tpgmm.uni`             | TP-GMM for one agent with one or several demonstrated trajectories  | ✅     |
|                                 |              | `tpgmm.bi`              | TP-GMM for two agents with coordination learned from demonstration  | ✅     |
|                                 | TP-GMR       | `tpgmr.uni`             | TP-GMR for one agent with one or several demonstrated trajectories  | ✅     |
|                                 |              | `tpgmr.bi`              | TP-GMR for two agents with coordination learned from demonstration  | ✅     |
| **Planning**                    | LQT          | `lqt.uni`               | LQT for one agent with several via-points                           | ✅     |
|                                 |              | `lqt.bi`                | LQT for two agents with coordination constraints                    | ✅     |
|                                 |              | `lqt.recursive`         | Generate smooth trajectories for robot execution recursively        | ✅     |
| **Logger**                      |              | `logger.write`          | Custom TensorBoard-based logger                                     |        |
| **Coordinate**                  |              | `coord.custom_class`    | Define the custom `Pose` class                                      |        |
|                                 |              | `coord.transform`       | Useful functions for coordinate transformation                      | ✅     |
| **VisuaLab**                    | Trajectory   | `visualab.trajectory`   | 2D/3D trajectory visualization, with or without orientation         | ✅     |
|                                 | Distribution | `visualab.distribution` | 2D/3D distribution visualization                                    | ✅     |
|                                 | Ellipsoid    | `visualab.ellipsoid`    | 2D/3D ellipsoid visualization                                       | ✅     |
| **RoboLab**                     | Kinematics   | `robolab.kinematics`    | ...                                                                 | ✅     |
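
To make the table concrete, here is a minimal sketch composing the planning and visualization entries. The argument and return conventions of `lqt.uni` and `visualab.trajectory` are assumptions inferred from the descriptions above, not documented signatures.

```python
import numpy as np

import rofunc as rf

# Hypothetical via-points for a single agent: each row is an [x, y, z]
# position the trajectory should pass through (the shape is assumed).
via_points = np.array([
    [0.0, 0.0, 0.0],
    [0.5, 0.2, 0.1],
    [1.0, 0.0, 0.3],
])

# Assumed call: the table only says "LQT for one agent with several
# via-points"; the exact inputs and outputs are guesses for illustration.
traj = rf.lqt.uni(via_points)

# Assumed call: 2D/3D trajectory visualization, per the VisuaLab row.
rf.visualab.trajectory(traj)
```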
