
Mobile Robot Teleoperation

This project reads the orientation of the user's right arm as estimated by three different sensors: a smartwatch, a Kinect, and a Leap Motion. The controller collects these data and computes the linear and angular velocities to be sent to the robot in order to move it.

The System’s Architecture

The hardware consists of a smartwatch, a Kinect, a Leap Motion, and the robot we want to move. The software modules are written in C++ or Python.

Description of the Modules

The architecture comprises three sensors that capture information about the arm and interface with the PC through their respective drivers. Each adapter node receives orientation data from its sensor and converts it into RPY (roll-pitch-yaw) angles, which are easier to interpret than quaternions, before forwarding everything to the controller. The controller converts the RPY angles into linear and angular velocities, computes their weighted average, and sends the resulting velocity to the robot.

Kinect Module

This module takes as input the tf transforms computed from the Kinect data preprocessed by the openni_tracker module, and outputs the corresponding RPY (roll-pitch-yaw) data.
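
The orientation carried by a tf transform is a quaternion, so the adapter has to turn it into RPY angles at some point. A minimal sketch of that conversion is shown below; the actual node most likely uses a ROS helper for this, so treat the hand-rolled function as illustrative only.

```python
import math

def quaternion_to_rpy(x, y, z, w):
    """Convert a unit quaternion (x, y, z, w) to roll-pitch-yaw in radians."""
    # Roll: rotation about the X axis
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Pitch: rotation about the Y axis, clamped to avoid domain errors at the poles
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    # Yaw: rotation about the Z axis
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# Identity quaternion -> zero rotation
print(quaternion_to_rpy(0.0, 0.0, 0.0, 1.0))  # (0.0, 0.0, 0.0)
```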

Leap Motion Module

This module works on images captured by the device's two cameras, which are analyzed to reconstruct a 3D representation of what the device sees. Tracking algorithms interpret the 3D data and infer the pitch and roll even of partially occluded objects.
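
As a rough illustration of what the tracking layer exposes, pitch can be read off the hand's pointing direction and roll off the palm normal. The vector conventions below (y up, z toward the user) follow the Leap frame, but the function itself is a hypothetical sketch, not the SDK's API.

```python
import math

def hand_pitch_roll(direction, palm_normal):
    """Estimate pitch from the hand's pointing direction and roll from the
    palm normal. Vectors are (x, y, z) tuples in the Leap frame."""
    dx, dy, dz = direction
    nx, ny, nz = palm_normal
    pitch = math.atan2(dy, -dz)  # upward tilt of the pointing direction
    roll = math.atan2(nx, -ny)   # sideways tilt of the palm
    return pitch, roll

# Flat hand pointing away from the user: no pitch, no roll
print(hand_pitch_roll((0.0, 0.0, -1.0), (0.0, -1.0, 0.0)))  # (0.0, 0.0)
```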

Smartwatch Module

The smartwatch module takes as input the data sent by the smartwatch, filtered by the Complementary Filter node, and outputs the corresponding RPY (roll-pitch-yaw) data.
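
A complementary filter fuses the smartwatch's gyroscope and accelerometer: it trusts the integrated gyro rate on short timescales and the accelerometer-derived angle on long ones. The one-step sketch below uses an assumed blending factor of 0.98; the actual node's gains may differ.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One step of a complementary filter.

    angle:       previous filtered angle estimate (rad)
    gyro_rate:   angular rate from the gyroscope (rad/s)
    accel_angle: angle computed from the accelerometer's gravity vector (rad)
    dt:          time step (s)
    alpha:       blending factor; higher means more trust in the gyro
    """
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# With a silent gyro, repeated steps drift toward the accelerometer angle
angle = 0.0
for _ in range(500):
    angle = complementary_filter(angle, 0.0, 1.0, 0.01)
print(angle)  # close to 1.0
```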

Controller Module

The controller module takes as input the RPY data sent by the three sensor adapters and converts them into linear and angular velocities. It then computes the weighted average of the velocities available and outputs the actual velocity command that the robot has to follow.
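
The two controller steps can be sketched in isolation: a mapping from arm angles to velocities, and a weighted average over whichever sensors are currently publishing. The sign convention and gains in `rpy_to_velocity`, and the per-sensor weights, are assumptions for illustration, not the controller's actual tuning.

```python
def rpy_to_velocity(pitch, roll, lin_gain=1.0, ang_gain=1.0):
    """Hypothetical mapping: arm pitch drives forward speed, roll drives turning."""
    return -lin_gain * pitch, -ang_gain * roll

def fuse_velocities(readings):
    """Weighted average over (linear, angular, weight) triples from the
    sensor adapters that are currently available."""
    total = sum(w for _, _, w in readings)
    lin = sum(v * w for v, _, w in readings) / total
    ang = sum(v * w for _, v, w in readings) / total
    return lin, ang

# Smartwatch weighted twice as much as the Kinect in this sketch
print(fuse_velocities([(0.4, 0.1, 2.0), (0.1, 0.4, 1.0)]))  # approx (0.3, 0.2)
```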

Gazebo Simulation

During the test phase the simulator listens on the /cmd_vel topic, and a simulated hrp_automower moves in Gazebo according to the messages it receives.

Kinect-Unity-Oculus modules

The objective of these modules is to create a 3D point-cloud map from the images acquired by a Microsoft Kinect in a ROS environment on Linux, transmit it to a Windows-based Unity project that refines the map to make it more user-friendly, and finally send it to the Oculus headset worn by the user. The Kinect can even be mounted on a moving robot to create a real-time dynamic map of its surroundings.

Implementation

Prerequisites

Hardware prerequisites:

  1. LG G Watch R W110

  2. LG G6 H870

  3. Husqvarna Automower

Software prerequisites:

  1. ROS Kinetic; to install it, follow this guide.

  2. Imu Stream, a set of Android applications (mobile and wear) to stream IMU data from the smartwatch to an MQTT broker. For more features, follow this guide.

  3. Mosquitto on Ubuntu; to install it, follow this guide.

  4. The Gazebo robotic simulator for ROS; to install it, follow this guide.

How to run the project

  1. Clone this repository into your workspace with the command
    git clone

Kinect Setup

  1. Follow the README in the src/kinect_listener folder

Leap Motion Setup

  1. Follow the README in the src/leap_teleop folder

Smartwatch Setup

  1. Follow the README in the src/mqtt_ros_bridge folder

For the simulation on Gazebo (Optional)

  1. (Only if you want to simulate on Gazebo) For the simulation of the Husqvarna Automower on Gazebo, install all the dependencies (for more info about this part, look at this guide):

    sudo apt-get install ros-kinetic-gazebo-ros-control
    sudo apt-get install ros-kinetic-joint-state-controller
    sudo apt-get install ros-kinetic-hector-gazebo-plugins
    sudo apt-get install ros-kinetic-hector-gazebo
    sudo apt-get install python-pygame
  2. Set up the model path

    export GAZEBO_MODEL_PATH=[your path]/src/haro/am_gazebo/models:$GAZEBO_MODEL_PATH

Kinect-Unity-Oculus Setup

  1. Follow the README in the Unity branch

Note: the Kinect-Unity-Oculus interface section still lives on the dedicated "Unity" branch, since it must be cloned on a different machine than the one above. The complete README for it can be found on that branch.

Compilation and running

  1. Compile your workspace

    catkin_make
  2. Kinect:

    (Terminal 1)  roslaunch openni_launch openni.launch camera:=openni
  3. Smartwatch: check the Mosquitto broker status.

    sudo service mosquitto status
  4. Start the Mosquitto broker (skip this step if the broker is already active).

    (Terminal 2) mosquitto
  5. Leap Motion:

    (Terminal 3) LeapControlPanel
    (Terminal 4) roslaunch leap_motion sensor_sender.launch
  6. In another terminal tab, launch the controller and all the other nodes (inside the launch file you can comment out components that are not needed, for example those for the simulation)

    (Terminal 5) roslaunch controller controller.launch
  7. To start the simulation on Gazebo

    (Terminal 6) roslaunch am_gazebo am_gazebo_hrp.launch gui:=true
  8. To run the Kinect-Unity-Oculus side, follow the guide in the Unity branch

Results

The three sensors were fully tested, and we can conclude that the Husqvarna Automower is fully controllable through each of them. To watch the simulation recorded during the test phase, follow this link. All three Kinect-Unity-Oculus modules have also been thoroughly tested and shown to be fully working. The final implementation lets the user visualize the entire map in a realistic, dynamic way, while the virtual environment keeps expanding as the robot explores its surroundings. You can see the implemented Kinect-Unity-Oculus architecture working in the following videos:

Kinect - Unity - Oculus interface Video 1

Kinect - Unity - Oculus interface Video 2

Recommendations

During the test phase some issues arose. One of them is that the connection between smartwatch, smartphone, and computer introduces a considerable delay, which makes the robot's movement lag behind the smartwatch orientation.

Authors
