SCUTTLE Mobile Robot

Sensing, Connected Utility Transport Taxi for Level Environments

SCUTTLE was designed to support teaching within Multidisciplinary Engineering Technology (MXET) at Texas A&M. The bot is a payload-capable mobile platform built from readily available off-the-shelf parts and 3D-printed designs. SCUTTLE is the main platform currently used for in-lab lessons and the semester project in MXET 300, the undergraduate Mobile Robotics course.


Figure 1. SCUTTLE Mobile Robot Rendering

Highlights

A few highlights of this open-source mobile platform

 

Figure 2. Github Repository

Figure 3. Beaglebone Embedded Linux

Figure 4. Full Open-Source CAD Assembly

Figure 5. Machine Vision Software

 

 

Figure 6. Gamepad Driving Integration

Figure 7. 3D Printable Design

 

 

Purpose:

Performance

SCUTTLE is designed to give students, researchers, and tinkerers access to an affordable mobile robot that can carry a payload. This platform supports the load of additional actuators, materials handling, extra battery packs, displays, or other gadgets to suit new projects. We searched high and low and found no robot under one thousand USD that can carry over 20 pounds. So, we created SCUTTLE within the Mechatronics Engineering Technology major at Texas A&M. The acronym tells what we aim to accomplish.

Affordable, Modular, Off-The-Shelf

Affordability is an important target, and this bot can be built for under $350 in parts. After exploring the designs available (from DIY all the way to turnkey solutions), we found there was a need for this kind of robot.

The Bill of Materials (BOM) for this project is offered with suppliers and prices so new teams can reproduce the project.

Kinematics

Kinematic parameters are predefined for classroom instruction and to give students a frame of reference for sampling sensors, calculating motion and creating autonomous navigation software. This guide explains the kinematics for SCUTTLE.

Our choices of coordinate frames, system variables, and driving strategy closely follow a research article published in Advances in Robotics & Automation on the subject of Dynamic Modeling of Differential-Drive Mobile Robots.

Kinematic Frames

The two main frames for defining the robot position are the body fixed frame and the global coordinate frame. For most applications, SCUTTLE boots up with all coordinates at zero, and the body-fixed coordinates move while the global frame remains fixed.

Figure 8. Kinematic Frames

Phi-left and phi-right are defined such that a positive increment in either generates positive movement in the body-fixed x-direction. The x-vector of the body-fixed frame is separated from the global frame by the angle theta.
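To make the frame relationship concrete, a body-fixed velocity can be rotated into the global frame through the angle theta. This is a minimal sketch, not code from the SCUTTLE repository; the function name and use of numpy are our own choices:

```python
import numpy as np

def body_to_global(x_dot_body, y_dot_body, theta):
    """Rotate body-fixed velocities into the global frame.

    theta [rad] is the angle separating the body-fixed x-vector
    from the global frame, as described above.
    """
    c, s = np.cos(theta), np.sin(theta)
    rotation = np.array([[c, -s],
                         [s,  c]])
    return rotation @ np.array([x_dot_body, y_dot_body])
```

For example, a pure forward velocity in the body frame maps onto the global y-axis when the robot has turned 90 degrees.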

Robot Geometry Definitions

The geometry shown here defines the kinematic functions in the SCUTTLE software. Since the robot is modular, the chassis may be modified and the speed controller updated just by changing a few key parameters in the kinematics program.

Figure 9. Robot Geometry

R and L are the main variables that must be defined in order to calculate motion, indicating the wheel radius and half of the wheelbase, respectively. The default values are shown here.
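As an illustration of how R and L enter the motion calculation, here is a sketch of differential-drive forward kinematics. The numeric defaults below are placeholders, not authoritative; substitute the values from Figure 9:

```python
import numpy as np

R = 0.041  # wheel radius [m] -- placeholder; use the Figure 9 value
L = 0.201  # half of wheelbase [m] -- placeholder

def forward_kinematics(phi_dot_left, phi_dot_right):
    """Map wheel angular speeds [rad/s] to chassis motion.

    Returns (x_dot [m/s], theta_dot [rad/s]) in the body-fixed frame.
    Positive speed on both wheels gives positive body-fixed x motion,
    matching the sign convention described above.
    """
    A = np.array([[R / 2,        R / 2],
                  [-R / (2 * L), R / (2 * L)]])
    x_dot, theta_dot = A @ np.array([phi_dot_left, phi_dot_right])
    return x_dot, theta_dot
```

Equal wheel speeds produce pure translation; equal and opposite speeds produce pure rotation in place.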

A Non-Holonomic System

The twin driven-wheel platform was chosen for economy and robustness, but because we control three system variables (x, y, and theta) with only two actuators, the user must move the actuators strategically to achieve a given state. A holonomic robot has at least as many controllable degrees of freedom as total degrees of freedom, which is not our case.

SCUTTLE DOF: (x, y, theta)

SCUTTLE Controllable DOF: (left motor, right motor)

Mecanum Robot Controllable DOF: (x, y, theta)

Therefore, the configuration of the robot depends on the path taken to achieve the movement.

Figure 10. Comparison of Holonomic & Non-Holonomic Designs

Figure 10 compares some qualities of holonomic and non-holonomic designs. In the case of a mecanum robot, four motors are required and the user can directly command a movement in x, y, or theta (provided the software is written for it). The SCUTTLE robot has only two controlled degrees of freedom (phi-dot left and right). Only two motors are required, but the user cannot directly request a movement in the y-direction, for example.

Figure 11. Example of Non-Holonomic Movement

 

 

 

Controller Interaction

Figure 12. Controller Driving Configuration

In the provided demo program, the EasySMX USB wireless controller sends commands to the robot. The left-hand joystick outputs the motor-controlling axes, which correspond to an x_dot request and a theta_dot request as shown in Figure 12. With this configuration, each axis affects both wheels. When an autonomous driving program is implemented, the navigation software replaces the human driver to generate the x_dot and theta_dot requests.
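The joystick-to-wheels mapping described above amounts to inverse kinematics: an (x_dot, theta_dot) request is converted into a speed for each wheel. A sketch, with placeholder R and L values and a function name of our own choosing:

```python
R = 0.041  # wheel radius [m] -- placeholder; see the Kinematics section
L = 0.201  # half of wheelbase [m] -- placeholder

def inverse_kinematics(x_dot, theta_dot):
    """Map an (x_dot [m/s], theta_dot [rad/s]) request
    to (left, right) wheel speeds in rad/s.
    """
    phi_dot_left = (x_dot - L * theta_dot) / R
    phi_dot_right = (x_dot + L * theta_dot) / R
    return phi_dot_left, phi_dot_right
```

A pure forward push (theta_dot = 0) drives both wheels at the same speed, while a pure turn request drives them in opposite directions, which is why each joystick axis affects both wheels.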

How-To Videos:

The videos support fabrication, electronics setup, robot assembly, programming, and robot operation.  They are designed to help students build and operate the mobile robot within one semester.

View the SCUTTLE YouTube playlist to see all SCUTTLE videos.

Robot Assembly Instructions


 

 

 

Fabrication Instructions


 

 

 

Programming Instructions

 

 

 

Demos

 

 

Programming

The SCUTTLE robot software is written in Python 3 on an embedded Linux platform. Both the BeagleBone Blue and the Raspberry Pi have been tested successfully. The software is architected to provide a robust starting point for students to create their own autonomous missions. These slides detail the software architecture.

 

Overall Architecture

Figure 13. Software Architecture

The blocks in yellow are sensors, and the items in orange are actuators or other outputs. The level-2 blocks in teal are specific to the hardware platform (BeagleBone, Raspberry Pi, etc.) and perform communication with the low-level devices. The blocks above level 2 are not hardware-specific.

Each block aside from sensors and actuators represents an individual Python program. The purple text indicates what important information is passed between programs, and the black arrows indicate (for the most part) the direction the data flows. If a level-3 program needs information from another, it must receive that information from the top-level program, in order to maintain the independence of program functions.

This software structure is preferred because it supports subsystem testing. The data flowing through the top level is minimal and can be replaced with artificial data in the event that a sensor is unavailable.

 

Python importing guidelines:

1. Each file should import the files below it in the hierarchy, and not the files above it.

2. Each file should not import files more than one level below itself.

3. Each file may import non-SCUTTLE libraries as needed (import numpy, import time, etc.)
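The guidelines can be spot-checked mechanically. The module names below are hypothetical stand-ins for the real SCUTTLE files, used only to illustrate the rules:

```python
# Hypothetical module names and their hierarchy levels -- the real file
# names live in the SCUTTLE repository; this only illustrates the rules.
LEVELS = {
    "L1_motors": 1, "L1_encoders": 1,
    "L2_kinematics": 2, "L2_speed_control": 2,
    "L3_drive": 3,
}

# What each (hypothetical) module imports.
IMPORTS = {
    "L2_kinematics": ["L1_encoders", "numpy"],
    "L3_drive": ["L2_speed_control", "time"],
}

def import_allowed(importer, imported):
    """Rules 1-2: only import downward, exactly one level below."""
    if imported not in LEVELS:   # rule 3: external libraries are fine
        return True
    return LEVELS[importer] - LEVELS[imported] == 1

for module, deps in IMPORTS.items():
    for dep in deps:
        assert import_allowed(module, dep), f"{module} may not import {dep}"
```

Here a level-3 program importing a level-1 file, or any file importing upward, would fail the check.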

Libraries Utilized:

BeagleBone Blue Integration:

·         RCPY for communicating with MPU9250 & commanding motor drivers

·         Adafruit GPIO for I2C Communication

·         BMP280 for communicating with the onboard BMP280 sensor.

Raspberry Pi integration:

·         pysicktim for accessing LIDAR data

·         gpiozero for controlling GPIO pins.

Common Libraries

·         os for making shell commands via python code.

·         time for keeping track of time

·         threading for performing multithreading

·         numpy for performing math operations

·         pygame for accessing gamepad controller data

·         cayenne.client for sending MQTT messages

·         smbus for accessing the I2C bus through Python commands

 

Electronics Hardware

This section gives background on standard hardware components, their configuration, and links to datasheets.

Wiring Diagrams

The latest wiring diagram PDF is hosted on the SCUTTLE GitHub. This document shows how each sensor and actuator is connected on the robot.

BeagleBone Blue

The BeagleBone Blue has been fully integrated into the SCUTTLE robot. Figure 14 shows the ports on the board.

Figure 14. Beaglebone Bubble Diagram

More information on the BeagleBone Blue is available in the following resources:

·         Beaglebone Blue Wiki

·         Beaglebone Blue Specification Sheet

·         Beaglebone Blue Schematic

 

Dual Motor Driver

A number of motor driver options are available. The standard choice is the HW-231 motor driver, which uses the NXP MC33886 H-bridge. The ground is connected directly to the battery pack, and the board accepts two input pin pairs as PWM channels.
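Since each channel accepts a pair of PWM inputs, reversing a motor means swapping which pin of the pair carries the duty cycle. A generic sketch of that scheme (verify the exact pin behavior against the H-bridge datasheet; the function name is our own):

```python
def duty_to_pins(duty):
    """Map a signed duty cycle in [-1, 1] to an (IN1, IN2) PWM pair.

    One common H-bridge scheme: drive one pin with the duty cycle and
    hold the other low; swap the pair to reverse.  Values outside the
    range are clamped.
    """
    duty = max(-1.0, min(1.0, duty))
    return (duty, 0.0) if duty >= 0 else (0.0, -duty)
```

For example, a half-speed forward request drives IN1 at 50% with IN2 low, and the mirror image drives the motor in reverse.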


Figure 15. Motor Driver

Resources for the motor driver are here:

·         MC33886 Datasheet

Ultrasonic Sensor

The ultrasonic sensor is an optional item to support autonomous driving and obstacle avoidance. There are versions of this board that require 5 V (more common) or only 3.3 V (less common). If your board requires 5.0 V, then power will need to be drawn directly from the power port on the BeagleBone.
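The HC-SR04 reports distance as the width of an echo pulse, so the conversion is simple round-trip timing. Libraries such as gpiozero's DistanceSensor handle this for you; the helper below is just the underlying arithmetic, assuming a speed of sound of about 343 m/s:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def echo_to_distance(echo_seconds):
    """Convert the HC-SR04 echo pulse width to distance in meters.

    The pulse covers the round trip to the obstacle and back,
    so the travel distance is halved.
    """
    return echo_seconds * SPEED_OF_SOUND / 2
```

A 10 ms echo therefore corresponds to an obstacle about 1.7 m away.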


Figure 16. Ultrasonic Sensor HC-SR04

Resources for the HC-SR04 are here:

·         HC-SR04 Datasheet

Encoders

Two AMS AS5048B encoders are required for the robot, with one at each motor pulley. Some details about the encoders:

·         They are actually angular position sensors, returning a value in degrees from 0 to 360.

·         This sensor communicates over I2C. The left-hand encoder is addressed as 0x40 and the right-hand encoder as 0x41, with the pull-up wire connected to the A1 contact.

·         They measure the rotation of the motor pulley, and the software must adjust for the 1:2 wheel:motor turn ratio.
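As a sketch of how raw readings become wheel angles: the AS5048B exposes a 14-bit angle split across two registers (0xFE holds the 8 MSBs and 0xFF the 6 LSBs, per its datasheet; verify against your part), and the result must be halved for the pulley ratio noted above. The function names are our own:

```python
def decode_angle(msb, lsb):
    """Combine the AS5048B angle registers into degrees.

    14-bit resolution: raw values 0..16383 map onto 0..360 degrees.
    Register layout taken from the AS5048B datasheet -- verify there.
    """
    raw = (msb << 6) | (lsb & 0x3F)
    return raw * 360.0 / 16384.0

def wheel_angle(motor_pulley_deg):
    """Adjust for the 1:2 wheel:motor turn ratio noted above."""
    return motor_pulley_deg / 2.0
```

On the robot, the two register bytes would be fetched over I2C (e.g. via the smbus library) from address 0x40 or 0x41 before decoding.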

There is one standardized encoder bracket used for right and left, but different mounting holes are used, as shown in Figure 17. The proper mounting holes center the sensor over the magnet, which mounts to the end of the motor output shaft.

Figure 17. Left and Right Encoder Brackets

Resources for the encoder are here:

·         AMS AS5048B Datasheet

Battery Pack

The battery pack is a 3-cell lithium-ion pack with a nominal voltage of 11.1 V (3.7 V per cell), combined with a few off-the-shelf parts and a 3D-printed case. The capacity is 3400 mAh (we verified!), enough to drive SCUTTLE for several hours. With additional actuators, a significant payload, or demanding sensors such as the SICK LIDAR, it is advised to add more capacity to the robot. The cells must not be drained below 2.8 volts each, to prevent damage.
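The per-cell numbers above imply pack-level figures worth encoding in any battery monitor; a trivial check:

```python
CELLS = 3
V_NOMINAL_PER_CELL = 3.7  # V, nominal
V_MIN_PER_CELL = 2.8      # V, do not discharge below this

pack_nominal = CELLS * V_NOMINAL_PER_CELL  # 11.1 V, as stated above
pack_cutoff = CELLS * V_MIN_PER_CELL       # 8.4 V low-voltage cutoff
```

Any monitoring code should treat a pack reading at or below the cutoff as a signal to stop driving and recharge.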

 

Figure 18. Battery Pack with Cover Removed

Resources for the battery pack are here:

·         Panasonic NCR18650B datasheet

·         Battery cell handling and protection YouTube video

·         Battery Cell supplemental info sheet shows more on assembly.

 

USB Camera

The integrated USB camera is a Microsoft LifeCam HD-3000. On the robot, we remove the camera’s mounting bracket and insert the camera into a 3D-printed bracket.


Resources for the HD-3000 Camera are found here:

·         LifeCam HD-3000 datasheet

·         Camera Supplemental Info sheet shows how to remove the flexible grip and infrared filter

 

Motor

SCUTTLE has two motors for driving the rear wheels. Each is a 12 V DC motor (shown in Figure 19) with a gearbox that reduces the output speed to 200 RPM. The shaft is offset 7 mm from the centerline of the motor, which raises the clearance of the motor housing from the ground in the robot chassis. Three M3x10 screws fasten the motor to the motor plate.

Figure 19. Driving Motors, 12v DC with gearbox

The motor leads must be soldered to wires of 18 AWG or larger. The other ends of those wires connect to an Anderson Powerpole connector with a 15 A crimp connector, or heavier. These Powerpole connectors are used throughout the robot to isolate the battery pack, BeagleBone, motors, and motor driver.

 

LIDAR (optional)

SCUTTLE has been enhanced with a LIDAR manufactured by the SICK sensor company. This LIDAR performs laser-based ranging across a 270-degree plane at 0.33-degree resolution, 15 times per second! The USB interface is used to communicate with the microprocessor, and power is provided directly by the 11.1 V battery pack.
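The quoted specs imply a substantial data rate for the navigation software to digest. A back-of-envelope calculation using the figures above (the beam count is approximate; see the SICK technical documentation for exact values):

```python
FOV_DEG = 270.0  # scan arc, per the description above
RES_DEG = 0.33   # angular resolution, per the description above
SCAN_HZ = 15     # scans per second

# Beams across the arc, counting both endpoints (approximate).
beams_per_scan = int(FOV_DEG / RES_DEG) + 1
points_per_second = beams_per_scan * SCAN_HZ
```

Roughly 800 range points arrive 15 times per second, on the order of 12,000 measurements per second to filter and process.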


Figure 20. Sick TiM561 LIDAR unit.

Resources for the LIDAR unit are here:

·         Pysicktim python library on github

·         Operating instructions from SICK

·         Technical information from SICK

Supporters of the SCUTTLE Mobile Robot Project

Thank you to these groups for their support


The Mobile Integrated Solutions Laboratory at Texas A&M is an applied research team within the Engineering Technology department, and has collaborated in development and testing of the robot.


SICK USA has become a partner of the ETID department and has provided LIDAR units for the MXET Lab which are being integrated into the mobile robots.


Beagle has offered design feedback and support on integrating their education-oriented robotics hardware, the BeagleBone Blue.


TI has held a longstanding partnership with the ETID department at Texas A&M and has supplied electronics hardware equipped with TI microcontrollers.

 

Keywords for this Project:

SCUTTLE, Robot, mechatronics, Beaglebone, beaglebone blue, raspberry pi, arduino, Texas A&M, ETID, Engineering Technology, Computer Vision, Automation, Multidisciplinary Engineering, Anderson Powerpole, powerwerx, vision tracking, autonomous navigation, OpenCV, Ultrasonic, Payload, low cost robot, DIY, IOT, TAMU, sensing connected utility transport taxi for lab environments, Industrial IOT, Python, hackster.io, thingiverse, grabCAD, open source, github, industrial robotics, educational robot, research, mechanical engineering, electronics, electrical engineering, roomba, mobile robotics, autonomous vehicle, dynamics