
DiffBot is an autonomous 2wd differential drive robot using ROS Noetic on a Raspberry Pi 4 B. With its SLAMTEC Lidar and the ROS Control hardware interface it's capable of navigating in an environment using the ROS Navigation stack and making use of SLAM algorithms to create maps of unknown environments.


DiffBot


DiffBot is an autonomous differential drive robot with two wheels. Its main processing unit is a Raspberry Pi 4 B running Ubuntu Mate 20.04 and the ROS 1 (ROS Noetic) middleware. This repository contains ROS driver packages, a ROS Control hardware interface for the real robot, and configurations for simulating DiffBot. The formatted documentation can be found at: https://ros-mobile-robots.com.

DiffBot Lidar SLAMTEC RPLidar A2

If you are looking for a 3D printable modular base, see the remo_description repository. You can use it directly with the software of this diffbot repository.

Remo Gazebo Simulation RViz

It provides mounts for different camera modules, such as the Raspi Cam v2, OAK-1, and OAK-D, and you can even design your own if you like. There is also support for different single board computers (Raspberry Pi and Nvidia Jetson Nano) through two interchangeable decks. You are again free to create your own.

Demonstration

SLAM and Navigation

Real robot Gazebo Simulation

📦 Package Overview

  • diffbot_base: ROS Control hardware interface including controller_manager control loop for the real robot
  • diffbot_bringup: Launch files to bring up the hardware drivers (camera, lidar, imu, ultrasonic, ...) for the real DiffBot robot
  • diffbot_control: Configurations for the diff_drive_controller of ROS Control used in Gazebo simulation and the real robot
  • diffbot_description: URDF description of DiffBot including its sensors
  • diffbot_gazebo: Simulation specific launch and configuration files for DiffBot
  • diffbot_msgs: Message definitions specific to DiffBot, for example the message for encoder data
  • diffbot_navigation: Launch and configuration files for navigation based on move_base
  • diffbot_slam: Simultaneous localization and mapping using different implementations to create a map of the environment
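At the heart of diffbot_control is the diff_drive_controller from ROS Control, which maps a commanded body twist (linear velocity v, angular velocity ω) to left and right wheel velocities. The following Python sketch illustrates that inverse kinematics; the wheel radius and wheel separation values are placeholders for illustration, not DiffBot's actual parameters (those live in the diffbot_control configuration):

```python
# Differential-drive inverse kinematics sketch.
# wheel_radius and wheel_separation below are illustrative
# placeholders, not DiffBot's configured values.

def twist_to_wheel_speeds(v, omega, wheel_radius=0.035, wheel_separation=0.21):
    """Map a body twist (m/s, rad/s) to wheel angular velocities (rad/s)."""
    v_left = v - omega * wheel_separation / 2.0   # left wheel linear speed
    v_right = v + omega * wheel_separation / 2.0  # right wheel linear speed
    return v_left / wheel_radius, v_right / wheel_radius

# Driving straight: both wheels spin at the same rate.
left, right = twist_to_wheel_speeds(0.1, 0.0)

# Turning in place: the wheels spin at opposite rates.
left_turn, right_turn = twist_to_wheel_speeds(0.0, 1.0)
```

The same two parameters also drive the forward direction (wheel speeds to odometry), which is why they must match the physical robot closely for the encoder odometry to be usable.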

Installation

The packages are written for and tested with ROS 1 Noetic on Ubuntu 20.04 Focal Fossa. For the real robot, Ubuntu Mate 20.04 for arm64 is installed on the Raspberry Pi 4 B with 4 GB of RAM. The communication between the mobile robot and the work PC is done by configuring the ROS Network; see also the documentation.

Dependencies

The required Ubuntu packages are listed in the documentation. Other ROS catkin packages, such as rplidar_ros, need to be cloned into the catkin workspace. It is planned to use vcstool in the future to automate the dependency installation.

🔨 How to Build

To build the packages in this repository, clone it in the src folder of your ROS Noetic catkin workspace:

catkin_ws/src$ git clone https://github.com/fjp/diffbot.git

After installing the required dependencies build the catkin workspace, either with catkin_make:

catkin_ws$ catkin_make

or using catkin-tools:

catkin_ws$ catkin build

Finally, source the newly built packages with the devel/setup.* script, depending on the shell you use:

# For bash
catkin_ws$ source devel/setup.bash

# For zsh
catkin_ws$ source devel/setup.zsh

Usage

The following sections describe how to run the robot simulation and how to make use of the real hardware using the available package launch files.

Gazebo Simulation with ROS Control

Control the robot inside Gazebo and view what it sees in RViz using the following launch file:

roslaunch diffbot_control diffbot.launch world_name:='$(find diffbot_gazebo)/worlds/corridor.world'

To run the turtlebot3_world, make sure to download it to your ~/.gazebo/models/ folder, because the turtlebot3_world.world file references the turtlebot3_world model.

corridor.world turtlebot3_world.world

Navigation

To navigate the robot in the simulation run this command:

roslaunch diffbot_navigation diffbot.launch world_name:='$(find diffbot_gazebo)/worlds/turtlebot3_world.world'

Navigate the robot in a known map from the running map_server using the 2D Nav Goal in RViz.

DiffBot navigation

SLAM

To map a new simulated environment using slam gmapping, first run

roslaunch diffbot_gazebo diffbot.launch world_name:='$(find diffbot_gazebo)/worlds/turtlebot3_world.world'

and in a second terminal execute

roslaunch diffbot_slam diffbot_slam.launch slam_method:=gmapping

Then explore the world with the teleop_twist_keyboard or with the already launched rqt_robot_steering GUI plugin:
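If you prefer keyboard teleoperation over the steering GUI, the standard node from the teleop_twist_keyboard package can be started in another terminal (assuming the package is installed; remap its cmd_vel topic if your controller subscribes under a different name):

```shell
rosrun teleop_twist_keyboard teleop_twist_keyboard.py
```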

DiffBot slam

DiffBot Control in Gazebo

roslaunch diffbot_control diffbot.launch

DiffBot Gazebo

RViz

View just the diffbot_description in RViz.

roslaunch diffbot_description view_diffbot.launch

DiffBot RViz

Navigating and Mapping on the real Robot

The following video shows how to map a new environment and navigate in it.

First, bring up the robot hardware, including its laser, with the following launch file from the diffbot_bringup package. Make sure to run this on the real robot (e.g. connect to it via ssh):

roslaunch diffbot_bringup diffbot_bringup_with_laser.launch

Then, in a new terminal on your remote/work PC (not the single board computer), run the SLAM gmapping with the same command as in the simulation:

roslaunch diffbot_slam diffbot_slam.launch slam_method:=gmapping

As you can see in the video, this should open up RViz and the rqt_robot_steering plugin.

Next, steer the robot around manually and save the map with the following command when you are done:

rosrun map_server map_saver -f office

Finally, the created map can be used for navigation after running the following launch files:

On the single board computer (e.g. Raspberry Pi) make sure that the following is launched:

roslaunch diffbot_bringup diffbot_bringup_with_laser.launch

Then on the work/remote PC run diffbot_hw.launch from the diffbot_navigation package:

roslaunch diffbot_navigation diffbot_hw.launch

Among other essential navigation and map server nodes, this will also launch an instance of RViz on your work PC, where you can use its tools to:

  1. Localize the robot with the "2D Pose Estimate" tool (green arrow) in RViz
  2. Use the "2D Nav Goal" tool in RViz (red arrow) to send goals to the robot

🚧 Future Work

Contributions to these tasks are welcome, see also the contribution section below.

ROS 2

  • Migrate from ROS 1 to ROS 2

Drivers, Odometry and Hardware Interface

  • Add diffbot_driver package for ultrasonic ranger, imu and motor driver node code.
  • Make use of the imu odometry data to improve the encoder odometry using robot_pose_ekf.
  • The current implementation of the ROS Control hardware_interface::RobotHW uses a high-level PID controller. This works, but a low-level PID running on the Teensy 3.2 MCU using the Arduino library of the Grove I2C motor driver should be tested as well. This is partly implemented (see diffbot_base/scripts/base_controller). Also replace Wire.h with the improved i2c_t3 library.
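Whether the velocity loop runs on the Raspberry Pi (high level) or on the Teensy (low level), the core is the same discrete PID update. A minimal Python sketch of that loop against a toy first-order motor model; the gains, time step, and plant dynamics here are illustrative placeholders, not the tuned values from diffbot_base:

```python
# Minimal discrete PID controller sketch. Gains and the toy plant
# are illustrative placeholders, not DiffBot's tuned values.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        """Return a control effort driving measurement toward setpoint."""
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy first-order plant: wheel speed lags behind the applied effort.
pid = PID(kp=0.8, ki=0.5, kd=0.0, dt=0.02)
speed = 0.0
for _ in range(1000):  # 20 s of simulated time at 50 Hz
    effort = pid.update(setpoint=1.0, measurement=speed)
    speed += (effort - speed) * 0.02  # simple lag dynamics
```

The integral term is what removes the steady-state error from the wheel speed; moving this loop onto the MCU mainly reduces the latency between encoder measurement and motor command.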

Navigation

  • Test different global and local planners and add documentation
  • Add diffbot_mbf package using move_base_flex, the improved version of move_base.

Perception

To enable object detection or semantic segmentation with the RPi Camera, the Raspberry Pi 4 B will be updated with a Google Coral USB Accelerator. Possibly useful packages:

Mseg Example

Teleoperation

Tooling

  • vcstool to simplify external dependency installation
  • Add instructions on how to use rosdep to install required system dependencies
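As a sketch of what this tooling could look like: vcs import pulls external source dependencies listed in a repos file, and rosdep installs the system dependencies declared in each package.xml. The diffbot.repos file name below is hypothetical until vcstool support lands; the rosdep invocation is the standard one and works today:

```shell
# In the catkin workspace root:
# import external source dependencies (hypothetical repos file)
vcs import src < diffbot.repos

# install system dependencies declared in each package.xml
rosdep install --from-paths src --ignore-src -r -y
```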

Part List DiffBot

SBC RPi 4B MCU Teensy 3.2 IMU Bosch
| Part | Store |
| --- | --- |
| Raspberry Pi 4 B (4 GB) | Amazon.com, Amazon.de |
| SanDisk 64 GB SD Card Class 10 | Amazon.com, Amazon.de |
| Robot Smart Chassis Kit | Amazon.com, Amazon.de |
| SLAMTEC RPLidar A2M8 (12 m) | Amazon.com, Amazon.de |
| Grove Ultrasonic Ranger | Amazon.com, Amazon.de |
| Raspi Camera Module V2, 8 MP, 1080p | Amazon.com, Amazon.de |
| Grove Motor Driver | seeedstudio.com, Amazon.de |
| I2C Hub | seeedstudio.com, Amazon.de |
| Teensy 4.0 or 3.2 | PJRC Teensy 4.0, PJRC Teensy 3.2 |
| Hobby Motor with Encoder - Metal Gear (DG01D-E) | Sparkfun |

Part List Remo

| Part | Store |
| --- | --- |
| Raspberry Pi 4 B (4 GB) | Amazon.com, Amazon.de |
| SanDisk 64 GB SD Card Class 10 | Amazon.com, Amazon.de |
| Remo Base | 3D printable, see remo_description |
| SLAMTEC RPLidar A2M8 (12 m) | Amazon.com, Amazon.de |
| Raspi Camera Module V2, 8 MP, 1080p | Amazon.com, Amazon.de |
| Adafruit DC Motor (+ Stepper) FeatherWing | adafruit.com, Amazon.de |
| Teensy 4.0 or 3.2 | PJRC Teensy 4.0, PJRC Teensy 3.2 |
| Hobby Motor with Encoder - Metal Gear (DG01D-E) | Sparkfun |
| Powerbank (e.g. 15000 mAh) | Amazon.de (this Powerbank from Goobay is close to the maximum possible size, L x W x H: 135.5 x 70 x 18 mm) |
| Battery pack (for four or eight batteries) | Amazon.de |

Additional (Optional) Equipment

| Part | Store |
| --- | --- |
| PicoScope 3000 Series Oscilloscope 2CH | Amazon.de |
| VOLTCRAFT PPS-16005 | Amazon.de |
| 3D Printer for Remo's parts | Prusa, Ultimaker, etc., or use a local print service or an online one such as Sculpteo |

Hardware Architecture and Wiring

DiffBot

Hardware Architecture and Wiring

Remo

Hardware Architecture and Wiring

🤝 Acknowledgment

🔧 Contributing

Your contributions are more than welcome. These can be in the form of raising issues, creating PRs to correct or add documentation, and of course solving existing issues or adding new features.

📝 License

diffbot is licensed under the BSD 3-Clause license. See also open-source-license-acknowledgements-and-third-party-copyrights.md. The documentation is licensed differently; visit its license text to learn more.
