This repo is DEPRECATED. It has been replaced with a fork called the Stretch Teleop Interface. The original README is saved below.
This repository holds a distinct version of the original Stretch RE1 web interface that uses two cameras with ultra wide-angle lenses (fisheye lenses). One camera is attached to the robot's gripper and looks at the fingertips. The other camera is attached to the top of the robot and looks down at the robot's mobile base. These perspectives are useful for navigation and manipulation.
This interface specifically uses prototype fisheye camera accessories developed by Hello Robot. If you're interested in fisheye cameras for the Stretch RE1, please post to Hello Robot's forum or email Hello Robot at [email protected]. The cameras and web interface are based on early prototypes for remote teleoperation developed at Georgia Tech and Hello Robot prior to the Stretch RE1.
To install the fisheye web interface, first follow the instructions in the web-interface installation section below. Then, use the following commands to install the fisheye cameras with install_gripper_and_navigation_cameras.sh.
cd ~/catkin_ws/src/stretch_fisheye_web_interface/bash_scripts/
./install_gripper_and_navigation_cameras.sh
This script first installs udev rules for both cameras that create the /dev/hello-gripper-camera and /dev/hello-navigation-camera device symlinks. For these udev rules to work, no other devices should be plugged into the head and wrist USB ports, since the rules use the USB bus and port topology to distinguish the two cameras. The script then installs usb_cam, which is a ROS USB camera package. Finally, the script copies a configuration file to /etc/modprobe.d that configures the uvcvideo kernel module to work with the cameras. After running this installation script, you either need to unplug and replug the two cameras or reboot the Stretch RE1's NUC computer.
Currently, the camera settings need to be adjusted manually. We hope to improve this situation in the future. Without adjustment, the video streams can be unusually dark, bright, or otherwise problematic. Open the following two launch files to see the current camera settings. You will likely need to adjust the values for your specific cameras.
./launch/gripper_camera.launch
./launch/navigation_camera.launch
To determine how to set the parameters, we recommend you use guvcview. You can install it with the following command:
sudo apt install guvcview
Now, go to the "Show Applications" menu, search for guvcview, run the application, and connect it to one of the USB 2.0 cameras. Once guvcview is running, you can use it to see video from the camera while adjusting the camera's settings. Once you find settings you like, you can edit the appropriate launch file. Then, you can do the same thing with the other camera.
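For reference, the kind of settings involved look roughly like the following sketch of a usb_cam launch file. This is illustrative only; the node name, parameter set, and values in the repository's gripper_camera.launch and navigation_camera.launch will differ, so treat it as a guide to where the settings live rather than as working configuration.

<!-- Illustrative sketch only; parameter names follow the usb_cam ROS package. -->
<launch>
  <node name="gripper_camera" pkg="usb_cam" type="usb_cam_node" output="screen">
    <param name="video_device" value="/dev/hello-gripper-camera"/>
    <param name="image_width" value="1024"/>
    <param name="image_height" value="768"/>
    <param name="framerate" value="6"/>
    <param name="pixel_format" value="yuyv"/>
    <!-- Exposure and color settings like these are what typically need per-camera tuning. -->
    <param name="brightness" value="10"/>
    <param name="contrast" value="32"/>
    <param name="autofocus" value="false"/>
  </node>
</launch>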
The interface expects the fisheye gripper camera to be attached as far down the gripper as possible (i.e., as close to the fingers as it can be). It expects the navigation camera to be attached to the top of the head and positioned so that it points straight down at the ground and looks at the center front of the mobile base. The top of the navigation camera should be pointed in the direction that the telescoping arm extends.
The gripper camera should be plugged into the wrist USB port, and the navigation camera should be plugged into the head USB port. Nothing else should be plugged into these ports, since bandwidth is limited and the udev rules use the USB port topology to find and distinguish the gripper and navigation cameras.
When you run the interface later, it should look like the following screenshots. Adjust the poses of the cameras accordingly.
Manipulation Mode
Navigation Mode
The cameras have identical identifiers, so your robot's device symlinks (/dev/hello-gripper-camera and /dev/hello-navigation-camera) might point to the wrong cameras. The screenshots above show you where the two cameras should be in the interface. If the navigation camera video is showing where the gripper camera video should be showing, then you'll need to edit the following two files using sudo privileges:
/etc/udev/rules.d/88-hello-navigation-camera.rules
/etc/udev/rules.d/89-hello-gripper-camera.rules
You should only need to make the following changes and then reboot the robot's computer:
KERNELS=="1-1.3.*" changed to KERNELS=="1-1.2.*"
KERNELS=="1-1.2.*" changed to KERNELS=="1-1.3.*"
If this doesn't work, please let us know and we'll help you find a solution.
To run the fisheye web interface, you can use the original quick start instructions below.
The current fisheye camera system can only be used at a low resolution with a high frame rate or at a high resolution with a low frame rate. We chose to use 1024x768 resolution video from the gripper and navigation cameras at 6 frames per second. Trading frame rate for higher resolution appears to be worthwhile, since the robot moves slowly and haptic feedback (joint torque visualization) updates frequently with low latency. Rapid haptic feedback helps compensate for the low video rate during teleoperation. For example, the operator can notice contact and send a compensatory command to the robot without waiting for the video to update. The rate and latency of haptic feedback and commands to the robot are determined by the WebRTC real-time data channel, which is distinct from the WebRTC audio and video.
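As a rough sketch of this pattern (not the repository's actual code; the channel label and message formats below are assumptions), small JSON messages travel over an RTCDataChannel created alongside the audio and video streams:

// Illustrative sketch only: the channel label and message formats are assumptions.
const pc = new RTCPeerConnection();  // ICE servers omitted for brevity

// Low-latency channel for commands and haptic feedback, separate from the audio/video tracks.
const dataChannel = pc.createDataChannel('realtime');

dataChannel.onopen = () => {
  // Operator to robot: send a small JSON command.
  dataChannel.send(JSON.stringify({type: 'drive', velocity: 0.05}));
};

dataChannel.onmessage = (event) => {
  // Robot to operator: for example, joint effort used for the haptic (torque) visualization.
  const feedback = JSON.parse(event.data);
  console.log('haptic update', feedback);
};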
This repository holds code that enables a person (the operator) to remotely teleoperate a Stretch RE1 (the robot) through a recent Chrome/Chromium web browser on an Android mobile phone, laptop, or desktop. The Stretch RE1 is a mobile manipulator from Hello Robot Inc.
WARNING: This prototype code has been useful to the community, but is not well tested. There are also security issues, especially if you use the default credentials. Use this code at your own risk.
When we started Hello Robot Inc. back in 2017, part of our goal was to create a robot that could be intuitively teleoperated from afar. We took an iterative approach, building a series of 7 prototype robots before the Stretch RE1 that we sell today. In conjunction with these prototypes, we developed a series of web interfaces, so that we could control our robots via a web browser and test remote teleoperation. While we eventually deemphasized this aspect of the robot, we thought it could be useful to the community. With this goal in mind, we ported parts of our old web interface code to the Stretch RE1 and made them available in this repository back in June of 2020.
Since then, we've been gratified to learn of others working with this code. For example, the Human Centered Robotics Lab at the University of Washington has made impressive improvements to the code, which can be found in their repository. We've also learned that the Human Factors and Aging Laboratory at the University of Illinois at Urbana-Champaign has explored this interface as part of their impressive research to improve the lives of older adults.
This web interface works via Web Real-Time Communication (WebRTC). Code runs in a browser on the robot, in a browser on the operator's device (e.g., a mobile phone), and on a server. This is analogous to the robot and the operator video conferencing with one another, although they communicate via real-time data in addition to audio and video. By using web browsers, the robot and the operator make use of well-tested, high-performance implementations of WebRTC. This symmetry also simplifies development, since a developer can use the same browser-based developer tools on both sides of the communication. The robot's browser and the operator's browser first log in to the server, which helps connect them and provides them with the interface code.
The robot’s browser uses rosbridge to connect with ROS on the robot. Rosbridge translates JSON from the robot’s browser into ROS communications and vice versa. The JavaScript code used by the robot’s browser to connect with ROS can be found in ros_connect.js under the robot directory, which holds files made available to the robot's browser.
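As a rough illustration of this pattern, a browser-side connection to rosbridge using roslibjs (the standard JavaScript client for rosbridge) looks something like the following. This is a generic sketch, not the repository's ros_connect.js, and the topic name is an assumption:

// Generic roslibjs sketch; the WebSocket port is rosbridge's default and the topic name is an assumption.
var ros = new ROSLIB.Ros({ url: 'ws://localhost:9090' });

ros.on('connection', function () {
  console.log('Connected to rosbridge.');
});

// Example: command a base velocity by publishing a geometry_msgs/Twist message.
var cmdVel = new ROSLIB.Topic({
  ros: ros,
  name: '/stretch/cmd_vel',
  messageType: 'geometry_msgs/Twist'
});

cmdVel.publish(new ROSLIB.Message({
  linear: { x: 0.05, y: 0.0, z: 0.0 },
  angular: { x: 0.0, y: 0.0, z: 0.0 }
}));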
With puppeteer, the robot can automatically launch its browser and log in. For example, start_robot_browser.js uses puppeteer to launch the robot's browser and log in.
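A minimal sketch of this approach looks like the following. It is not the actual start_robot_browser.js; the URL, form selectors, and browser flags are assumptions:

// Minimal puppeteer sketch; the URL, selectors, and credentials are placeholders.
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({
    headless: false,
    ignoreHTTPSErrors: true,                    // accept the local self-signed certificate
    args: ['--use-fake-ui-for-media-stream']    // automatically allow camera and microphone access
  });
  const page = await browser.newPage();
  await page.goto('https://localhost');
  await page.type('#username', 'ROBOT-USERNAME');
  await page.type('#password', 'ROBOT-PASSWORD');
  await page.click('#login-button');
})();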
While the robot’s browser has access to most of the robot via ROS, the operator’s browser can only access the robot indirectly through the robot’s browser. The robotic commands available to the operator’s browser can be found in commands.js under the shared directory, which holds files available to both the operator's browser and the robot's browser. The operator's browser also has access to files in the operator directory.
In the example below, the server runs on the robot. In a production environment, you would use an external server, instead of the robot, to handle things like connecting robots and operators behind firewalls. In a later section, we provide an example of an external server that uses Amazon Lightsail. When used on a production server with proper certificates, this code uses HTTPS and avoids browser security warnings.
The web server uses the Express web framework with Pug templates. The server provides a WebRTC signaling service using socket.io. It uses Redis to store sessions.
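For orientation, a bare-bones socket.io signaling relay follows the pattern sketched below. This is a generic example, not the repository's server code; the event and room names are assumptions:

// Generic WebRTC signaling relay sketch (event and room names are assumptions).
const http = require('http');
const server = http.createServer();
const io = require('socket.io')(server);

io.on('connection', (socket) => {
  // A robot or an operator joins a room identified by the robot's name.
  socket.on('join', (room) => socket.join(room));

  // Relay SDP offers/answers and ICE candidates to the other peer in the room.
  socket.on('signal', (room, message) => {
    socket.to(room).emit('signal', message);
  });
});

server.listen(3000);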
passport provides authentication for the robot and the operator. mongoose and a MongoDB database store credentials for robots and operators. The stretch_fisheye_web_interface repository comes with default MongoDB content found at ./mongodb/ for testing behind a firewall. These default contents come with multiple robot and operator accounts. Make sure not to use these default database contents on a deployed system!
By default, send_recv_av.js uses a free STUN server provided by Google. The Amazon Lightsail example below uses coturn as a STUN and TURN server.
These installation instructions describe how to install both the server and relevant ROS code on the onboard computer of a Stretch RE1 robot. This is suitable for use on a trusted and secure local area network (LAN) behind a strong firewall.
The web interface depends on stretch_ros, which is used to control the robot. You should first make sure it is up-to-date and working properly on the robot.
Clone the stretch_fisheye_web_interface repository to ~/catkin_ws/src/ on the robot.
cd ~/catkin_ws/src/
git clone https://github.com/hello-robot/stretch_fisheye_web_interface.git
Run catkin_make.
cd ~/catkin_ws/
catkin_make
rospack profile
Run the installation script.
cd ~/catkin_ws/src/stretch_fisheye_web_interface/bash_scripts/
./web_interface_installation.sh
WARNING: The script uninstalls tornado using pip to avoid a rosbridge websocket immediate disconnection issue. This could break other software on your robot.
When running on a trusted and secure local area network (LAN) behind a strong firewall, you can use the following insecure method to more conveniently start the system.
First, make sure the robot is calibrated. For example, you can run the following command.
stretch_robot_home.py
Next, in a terminal, run the following command to start ROS. This will start ROS nodes on the robot for the D435i camera, the driver for Stretch RE1, and rosbridge. Rosbridge connects JavaScript running in the robot's browser to ROS using JSON.
roslaunch stretch_fisheye_web_interface web_interface.launch
In another terminal, run the following command to start the web server on the robot, launch the robot's browser, and log the robot into the browser. The convenience script calls start_robot_browser.js, which uses puppeteer to log the robot into its browser.
roscd stretch_fisheye_web_interface/bash_scripts/
./start_web_server_and_robot_browser.sh
Typically, this script can be exited with Ctrl+C and then restarted without issue.
WARNING: start_robot_browser.js contains the default robot credentials in plain text! This is only appropriate for simple testing on a local network behind a firewall. The username and password are public on the Internet, so this is not secure! Deployment would require new credentials and security measures.
You will now log in through a browser as the operator and connect to the robot. You can use a Chrome browser on a recent Android mobile phone or a recent Chrome/Chromium browser on a laptop or desktop.
Open the browser and go to the robot's IP address. You can use ifconfig on the robot to determine its IP address.
Select "Advanced" and then click on "Proceed to localhost (unsafe)".
Click on "Login" and use the following username and password.
username: o1
password: xXTgfdH8
WARNING: This is a default operator account provided for simple testing. Since this username and password are public on the Internet, this is not secure. You should only use this behind a firewall during development and testing. Deployment would require new credentials and security measures.
You should now see a screen like the following. Click on "no robot connected" and select the robot "r1" to connect to it.
You should now see video from the robot on your mobile phone or other device. Click in the designated regions to command the robot to move. You can also click on "Drive", "Arm" down, "Arm" up, "Hand" and "Look" to move different joints on the robot.
The following steps describe how to manually start the web server and the robot's browser on the robot, instead of using the convenience script described above.
First, make sure the robot is calibrated. For example, you can run the following command.
stretch_robot_home.py
Next, in a terminal, run the following command to start the ROS side of things. This will start ROS nodes on the robot for the D435i camera, the driver for Stretch RE1, and rosbridge. Rosbridge connects JavaScript running in the robot's browser to ROS using JSON.
roslaunch stretch_fisheye_web_interface web_interface.launch
In another terminal, run the following command to start the web server on the robot.
roscd stretch_fisheye_web_interface/bash_scripts/
./start_desktop_dev_env.sh
Open a Chromium browser on the robot and go to localhost. Select "Advanced" and then click on "Proceed to localhost (unsafe)".
Click on "Login" and use the following username and password.
username: r1
password: NQUeUb98
WARNING: This is a default robot account provided for simple testing. Since this username and password are public on the Internet, this is not secure. You should only use this behind a firewall during development and testing. Deployment would require new credentials and security measures.
You should now see video from the robot's camera in the browser window.
Please see the instructions above.
The server for the web interface typically runs on the robot's onboard computer or on a remote machine connected to the Internet.
Credentials for robots and operators are stored on the server using MongoDB.
On the server, you can view and edit the credentials using mongodb-compass, which is installed by default. First, use the following command in a terminal to start the application.
mongodb-compass
Next, use "Connect to Host" by typing localhost in the Hostname area at the top of the window and then clicking the green "CONNECT" button at the bottom right of the window. This should show you various databases. The node-auth database holds the web interface credentials. Clicking on node-auth will show a collection named users. Clicking on users will show the current credentials.
If you've only used the default development credentials in this repository, you should see entries for the following: three robots with the usernames r1, r2, and r3; three operators with the usernames o1, o2, and o3; and an administrator with the username admin. Each entry consists of encrypted password information (i.e., salt and hash), a username, a name, a role, a date, and a Boolean indicating whether or not the user has been approved. Without approval, the user should be denied access. The role indicates whether the entry is for a robot or an operator. You can click on the image below to see what this should look like.
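For reference, an individual document in the users collection looks roughly like the following. The field names and values shown here are placeholders based on the description above, not real credentials:

// Illustrative user document (values are placeholders).
{
  "username": "o1",
  "name": "Default Operator 1",
  "role": "operator",
  "salt": "…",
  "hash": "…",
  "date": "2020-06-01T00:00:00Z",
  "approved": true
}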
First, start the server. Next, go to the web page and click register. Now enter a username and a password. This process creates a new user entry in MongoDB.
You can now follow the instructions for viewing credentials above to view the new account you just created. In order for this account to function, you will need to edit the role to be operator or robot and edit approved to be true. You can do this by clicking on the elements with mongodb-compass.
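If you prefer a terminal, the same edit can also be made from the mongo shell with a standard update; the username below is just an example:

// Run in the mongo shell on the server; the username is an example.
db.getSiblingDB("node-auth").users.updateOne(
  { username: "NEW-USERNAME" },
  { $set: { role: "operator", approved: true } }
)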
Prior to testing anything on the Internet, you should delete all of the default credentials. The default credentials are solely for development on a secure local network behind a firewall.
On the server, you can back up credentials using a command like the following in a terminal. You should change ./ to match the directory into which you want to save the backup directory.
mongodump --db node-auth --out ./
You can restore backed-up credentials using the following command in a terminal. You'll need to change ./ to match the directory that holds the backup directory.
mongorestore -d node-auth ./node-auth/users.bson
Running the server on the robot is useful when the robot and the operator are both on the same local area network (LAN). For example, a person with disabilities might operate the robot in their home, or you might be developing new teleoperation code. In these situations, the robot, the operator's browser, and the server should all be behind a strong firewall, reducing security concerns.
Running the server on a remote machine can be useful when the robot and the operator are on separate LANs connected by the Internet. This can enable a person to operate the robot from across the world. In this section, we'll provide an example of setting up the server to run on an Amazon Lightsail instance. This is not a hardened server and is only intended to serve as a helpful example. It likely has significant security shortcomings and is very much a prototype. Use at your own risk.
One of the challenges for remote teleoperation is that browsers on different LANs can have difficulty connecting with one another. Peer-to-peer communication may not be achievable due to firewalls and other methods used to help secure networks. For example, home networks, university networks, and corporate networks can all have complex configurations that interfere with peer-to-peer communication. Running the server on a remote machine connected to the Internet helps the robot's browser and the operator's browser connect to one another using standard methods developed for WebRTC video conferencing over the Internet. The server performs a variety of roles, including the following: restricting access to authorized robots and operators; helping operators select from available robots; WebRTC signaling, Session Traversal Utilities for Network Address Translation (STUN), and Traversal Using Relays around Network Address Translation (TURN). Notably, when direct peer-to-peer connectivity fails, TURN relays video, audio, and data between the robot's browser and the operator's browser. Relaying data is robust to networking challenges, but can incur charges due to data usage.
By default, the Content Security Policy (CSP) for helmet is enabled. The CSP should be enabled for deployed systems on the Internet. Having it disabled is only appropriate for a secure local area network (LAN) with a strong firewall. For unknown reasons, the previous approach to disabling the CSP is not functioning. In the past, one could disable it by setting use_content_security_policy to false in app.js.
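For reference, the generic helmet mechanism for toggling the CSP in an Express app looks like the following; this is the library's standard usage in recent versions, not necessarily how app.js is wired up:

// Generic helmet usage in an Express app (not necessarily how app.js is structured).
const express = require('express');
const helmet = require('helmet');
const app = express();

// In recent helmet versions, this enables the Content Security Policy along with helmet's other protections.
app.use(helmet());

// Disabling the CSP is only appropriate on a trusted LAN behind a strong firewall:
// app.use(helmet({ contentSecurityPolicy: false }));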
This section describes the steps we used to create an Amazon Lightsail instance that runs the server for the web interface.
- Obtain a domain name to use for your server.
- Create a new Amazon Lightsail instance.
- We used an OS-only instance with Ubuntu 20.04, 512 MB RAM, 1 vCPU, 20 GB SSD.
- Create a static IP address and attach it to your instance.
- Connect to your new instance with SSH, so that you can access it.
- A command like the following can then be used to log in to your instance:
ssh -i /path/to/private-key.pem username@public-ip-address
- While logged into your instance:
- Run sudo apt-get update to avoid installation issues.
- Install the net-tools package, which will be used later. You might also want to install your preferred text editor, such as emacs.
sudo apt install net-tools
- Configure Git.
git config --global user.name "FIRST_NAME LAST_NAME"
git config --global user.email "[email protected]"
- Clone this GitHub repository.
cd
mkdir repos
cd repos
git clone https://github.com/hello-robot/stretch_fisheye_web_interface.git
- Use certbot from Let's Encrypt to obtain certificates so that your server can use Hypertext Transfer Protocol Secure (HTTPS). HTTPS is required to fully utilize WebRTC.
- You will need to first connect your domain name to the static IP address used by your instance.
- Follow certbot installation instructions for Ubuntu 20.04.
- Run the teleoperation server installation script
cd ~/repos/stretch_fisheye_web_interface/bash_scripts/
./web_server_installation.sh
- Initialize the database with secure credentials for at least one robot and one operator. For example, you can do the following.
- Create and export credentials from MongoDB by running a server on your robot.
- Use scp to copy the exported credentials database from your robot's computer to your Lightsail instance by running a command like the following on your robot's computer:
scp -ri ./LightsailDefaultKey-us-east-2.pem ./mongodb_credentials ubuntu@public-ip-address:./
on your robot's computer. - On your Lightsail instance, restore the credentials database with a command like
mongorestore -d node-auth ./mongodb_credentials/node-auth/users.bson
- While logged into your instance.
- Edit the coturn configuration file.
- Find your instance's private IP address by running ifconfig -a and looking at the inet IP address.
- Confirm your instance's public IP address and your domain name by running ping YOUR-DOMAIN-NAME and looking at the IP address.
- Add the following lines at appropriate locations in /etc/turnserver.conf.
.listening-ip=PRIVATE-IP-ADDRESS
relay-ip=PRIVATE-IP-ADDRESS
external-ip=PUBLIC-IP-ADDRESS
Verbose
lt-cred-mech
pkey=/etc/letsencrypt/live/YOUR-DOMAIN-NAME/privkey.pem
cert=/etc/letsencrypt/live/YOUR-DOMAIN-NAME/cert.pem
no-multicast-peers
secure-stun
mobility
realm=YOUR-DOMAIN-NAME
- Create TURN server accounts and credentials.
- Create an administrator account.
sudo turnadmin -A -u ADMIN-NAME -p ADMIN-PASSWORD
- Create a TURN user. In the next step, you will add these credentials to ./stretch_fisheye_web_interface/shared/send_recv_av.js.
sudo turnadmin -a -u TURN-USER-NAME -r YOUR-DOMAIN-NAME -p TURN-USER-PASSWORD
- Open ./stretch_fisheye_web_interface/shared/send_recv_av.js in an editor.
- Comment out the free STUN server.
- Uncomment the STUN and TURN servers and fill in the values using your domain name and the credentials you just created (i.e., YOUR-DOMAIN-NAME, TURN-USER-NAME, and TURN-USER-PASSWORD).
- The relevant code will look similar to the following:
var pcConfig = { iceServers: [ {urls: "stun:YOUR-DOMAIN-NAME", username: "TURN-USER-NAME", credential: "TURN-USER-PASSWORD"}, {urls: "turn:YOUR-DOMAIN-NAME", username: "TURN-USER-NAME", credential: "TURN-USER-PASSWORD"}]};
- Open the following ports for your Amazon Lightsail instance. These are standard ports for HTTPS, STUN, and TURN.
HTTPS TCP 443
Custom TCP 3478
Custom TCP 5349
Custom UDP 3478
Custom UDP 5349
- Reboot your instance.
- Login to your instance and run the following commands to start the server.
cd ~/repos/stretch_fisheye_web_interface/bash_scripts/
./start_server_production_env.sh
- Your server should now be running and you can test it by taking the following steps.
- Turn on and calibrate your robot.
- Use your robot's Chromium browser to visit your server's domain and login with the robot's credentials that you created.
- Open a Chrome or Chromium browser of your own on another computer or recent Android phone. Visit your server's domain and login with the operator credentials you created. If you want to test communication between distinct networks, you could turn off your phone's Wi-Fi and then either use your phone or tether to your phone to connect from the mobile phone network to your robot. Please note that this has only been tested with the Chrome browser on recent Android phones.
- Once you've logged in as an operator, you should be able to select your robot from the drop down list and begin controlling it.
- After trying it out, be sure to shut down your Amazon Lightsail instance. It is not hardened and likely has security vulnerabilities. It would be risky to leave it running for a significant length of time.
This software is intended for use with S T R E T C H (TM) RESEARCH EDITION, which is a robot produced and sold by Hello Robot Inc. For further information, including inquiries about dual licensing, please contact Hello Robot Inc.
For license details for this repository, see the LICENSE files, including TUTORIAL_LICENSE.md, WEBRTC_PROJECT_LICENSE.md, and LICENSE.md. Some other sources and licenses are described by comments found within the code.
The Apache 2.0 license applies to all code written by Hello Robot Inc. contained within this repository. We have attempted to note where code was derived from other sources and the governing licenses.