Getting started with virtual GoPiGo3 in ROS

Check the kinematic behaviour using the 3D visualizer RViz

This is the first in a series of articles for ROS beginners who would like to learn state-of-the-art robotics with a physical robot. Since we are starting from scratch, we don't want you to have to deal with hardware complexities at this point. The approach is: first make the robot work in simulation, then reproduce the exercise with the actual hardware.

For the purpose of simulation we built a realistic 3D model of the GoPiGo3 Starter Kit, which also includes a Pi camera, an IMU sensor and an X4 YDlidar laser sensor. So, at this point you only need to be aware of the kinds of sensors that are on board and the purpose of each one:

  • The distance sensor (shown in red in the picture above, part of the Starter Kit) measures the distance to the first object in front of it using a single laser beam.
  • The Inertial Measurement Unit (IMU) sensor (in cyan) contains an accelerometer, a gyroscope and a magnetometer, whose combined measurements allow for the detection of motion, orientation and position of the robot.
  • The Pi camera is an 8-megapixel 2D camera capable of 1080p30 and 720p60 video modes.
  • The X4 YDlidar laser sensor provides a 360º scan angle to detect any obstacle around the robot. The scan frequency is 6–12 Hz, detecting objects up to 10 m away.

For the practical exercise we are going to cover, you only need a common laptop running Ubuntu 18.04 with ROS Melodic installed. If you still work with Ubuntu 16.04, you should install ROS Kinetic instead.

Software installation

Clone the source code into your ROS workspace. Replace catkin_ws with the name of your actual workspace:

$ cd ~/catkin_ws/src
$ git clone

This repository contains a collection of ROS packages specifically developed for GoPiGo3. It allows the user to control the virtual robot, as well as the physical robot. In this first article, we will focus on the simulated robot, hence only two packages of the collection will be needed. Easier, right?

In addition, the repository includes the key_teleop ROS package, part of the basic remote controller bundle known as teleop_tools. This way, you will be able to control the virtual robot with the keyboard.

After cloning the source code, build the ROS workspace and you will be ready to start the simulation:

$ cd ~/catkin_ws
$ catkin_make
$ source ~/catkin_ws/devel/setup.bash

In the following two sections we first describe the gopigo3_description package, which contains the 3D definition of GoPiGo3 as well as its mechanical properties. After that, in the second section, we use the other package, gopigo3_fake, to run the kinematic simulation.

Robot 3D model

The package gopigo3_description contains the URDF description of GoPiGo3 that you can use to perform simulations of the robot. The acronym URDF stands for Unified Robot Description Format, the standard defined by ROS to describe any robot using XML. In particular, it allows you to define the visual, collision and inertial properties of GoPiGo3.
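To give you a feel for the format, here is a minimal URDF sketch of a body with one wheel. This is not the actual GoPiGo3 file; the link names, dimensions and joint placement are illustrative assumptions:

```xml
<?xml version="1.0"?>
<robot name="minimal_diff_drive">
  <!-- The robot body (chassis) -->
  <link name="base_link">
    <visual>
      <geometry><box size="0.2 0.1 0.05"/></geometry>
    </visual>
  </link>
  <!-- One wheel, modeled as a cylinder -->
  <link name="left_wheel">
    <visual>
      <geometry><cylinder radius="0.033" length="0.025"/></geometry>
    </visual>
  </link>
  <!-- A continuous joint lets the wheel spin freely about its axis -->
  <joint name="left_wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="left_wheel"/>
    <origin xyz="0 0.06 0" rpy="-1.5708 0 0"/>
    <axis xyz="0 0 1"/>
  </joint>
</robot>
```

Every link gets its own frame of reference, and the joints define how those frames are connected, which is exactly what RViz renders when you load the model.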

The following command lets you inspect the model of the robot with the 3D visualizer RViz:

$ roslaunch gopigo3_description gopigo3_rviz.launch

You can set the gui parameter to launch a widget that lets you rotate both wheels with the sliders:

$ roslaunch gopigo3_description gopigo3_rviz.launch gui:=true

The figure below shows what the RViz window should look like:

In the bottom-right widget there are two sliders that you can use to rotate the wheels from -180º to 180º (angles are shown in radians, ranging from -3.14 to 3.14 rad).
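The degree-to-radian conversion the sliders display is straightforward to check:

```python
import math

# The GUI sliders span -180º..180º, shown in radians in the widget.
for deg in (-180, -90, 0, 90, 180):
    print(f"{deg:+4d}º = {math.radians(deg):+.2f} rad")
```

At the extremes, ±180º maps to ±π ≈ ±3.14 rad, matching the slider range.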

You can see in the image above that every part of the robot has its own frame of reference. The color convention is universal:

  • X positive axis is red,
  • Y positive axis is green, and
  • Z positive axis is blue.

Once you get familiar with the robot model, you will be ready to teleoperate it.

Kinematic simulation

Kinematic refers to the fact that, in this first kind of simulation, we are only going to deal with the pure motion of the robot, i.e. position, velocity and acceleration. This means that you can command velocities and accelerations as high as you want, even though the actual robot does not have the power to achieve them!

Hence masses, inertia and forces are not taken into account at this level. This is left for the dynamic simulation that we will cover in a later article.

The package gopigo3_fake lets you experiment with GoPiGo3 without needing the physical one. It reproduces its kinematics according to its relevant characteristics: the distance between the wheels and their radius.
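Those two characteristics are all you need to map a commanded body twist (forward velocity and yaw rate) to individual wheel speeds. Here is a minimal sketch of that differential-drive kinematics; the wheel radius and wheel separation below are approximate GoPiGo3 figures, used only for illustration:

```python
# Approximate GoPiGo3 dimensions (assumptions for illustration).
WHEEL_RADIUS = 0.03325   # m (66.5 mm wheel diameter)
WHEEL_BASE   = 0.117     # m (separation between the two wheels)

def wheel_speeds(v, omega):
    """Map a body twist (v m/s forward, omega rad/s yaw) to the
    angular velocities (rad/s) of the left and right wheels."""
    v_left  = v - omega * WHEEL_BASE / 2.0   # left wheel rim speed
    v_right = v + omega * WHEEL_BASE / 2.0   # right wheel rim speed
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Pure rotation in place: wheels spin at equal and opposite speeds.
wl, wr = wheel_speeds(0.0, 1.0)
print(wl, wr)
```

Driving straight (omega = 0) yields equal wheel speeds, while turning in place (v = 0) yields equal and opposite ones, which is exactly the behavior the fake robot reproduces.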

To run the kinematic simulation, first launch the virtual robot:

$ roslaunch gopigo3_fake gopigo3_fake.launch

To make sure that teleoperation is enabled, check in a bash terminal that the cmd_vel topic is being published:

$ rostopic list | grep cmd_vel

If the topic appears in the output, then GoPiGo3 is ready to accept motion commands. You can control it with the keyboard by launching this command in another terminal:

$ rosrun key_teleop key_teleop.py key_vel:=cmd_vel

After some strokes to the arrow keys, the RViz window should show something similar to the following screenshot:

The red arrows represent the sequence of poses (position + orientation) that the robot has covered. The trajectory is thus the line joining all the positions (the starting point of each arrow), tangent to every arrow.
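That chain of arrows is exactly what you get by integrating the commanded velocities over time. A minimal odometry integrator for a differential-drive (unicycle) model could be sketched as follows; the step size and velocities are arbitrary example values:

```python
import math

def integrate_pose(x, y, theta, v, omega, dt):
    """Advance a planar pose (x, y, theta) by one time step,
    assuming the body twist (v forward, omega yaw) is constant
    over dt (simple Euler integration)."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Drive forward at 0.1 m/s while turning at 0.5 rad/s for 2 seconds.
pose = (0.0, 0.0, 0.0)
for _ in range(200):
    pose = integrate_pose(*pose, v=0.1, omega=0.5, dt=0.01)
print(pose)  # the robot traces an arc to the left
```

Each intermediate pose corresponds to one red arrow in RViz: the position is the arrow's origin and the heading theta is its direction.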

In a later article you will find that this teleoperation works in the same manner for the physical GoPiGo3.

Highly autonomous & AI capable mobile robots

You can learn robotics from scratch using the cheap GoPiGo3 robot by following the book (also authored by me) Hands-On ROS for Robotics Programming. You may get the book in the electronic or paperback version at Packt Publishing or Amazon.

The book guides you following a learning by example approach, reaching advanced topics such as Robot Navigation and Deep & Reinforcement Learning applied to Robotics.

See you in the next article about dynamic simulation.

Software Engineer spreading cutting edge technologies. Psychologist applying the current knowledge of the human brain to power intelligent robots & IoT devices
