Robotics is one of my main passions. I have worked on several robotics projects while at Cornell and after graduating. My current project is designing and programming the Stewart-Gough Platform shown in Figure 1. This is a prototype robot designed for Ultra Motion to showcase their high-precision linear actuators.
Ultra Motion Stewart-Gough Platform
Figure 2: UM Stewart-Gough Platform final iteration in Solidworks. This is a rendering of the platform shown in Figure 1. In the bottom left the Galil motion controller is included for scale.
Figure 1: Ultra Motion Stewart-Gough Platform
Figure 5: Skewed Orientation. It is difficult, but possible, to manually position the Platform.
Figure 6: Joints. The platform has high precision spherical joints (left) between the linear actuators and the top platform, and universal joints (right) between the actuators and the bottom platform. The spherical joints are sunk into the top plate at an orientation optimized to maximize the reachable workspace. This is necessary because the spherical joints have a relatively small range of motion (±30 degrees). This orientation was determined using the Python program described in Figure 4. The universal joints have a much wider range of motion, so their joint limits are not generally a limiting factor.
Figure 3: Design Iterations. First, second and final design iterations. Major design changes are obvious between iterations 1 and 2. Between iterations 2 and 3 changes are mostly to the base of the platform.
Autonomous Robotics Final Project
Figure 10: Baxter Robot
Figure 11: Arduino Sumo Bot. The part that looks like eyes is actually the sonar rangefinder. Our robot was coated in foam insulation in an effort to hide it from the other robots' rangefinders. Our robot had a large spatula-like paddle on the front which was used to slide under other robots and reduce their traction.
Figure 7: Simulator. The platform simulator that I wrote in Python allows the user to test out and visualize different motions. Here the robot moves through a circular rocking motion. The plot on the right shows a 3D line visualization of the platform and the plots on the left show how close the actuators are to their minimum or maximum lengths, and how close the ball joints are to the joint limit. Note that the path of the center of the upper platform is traced with a dashed blue line.
In 2017 I designed and programmed the prototype Stewart-Gough Platform shown in Figure 1 for Ultra Motion, which is a high-precision linear actuator manufacturer. The platform uses electronic linear actuators, along with high precision spherical and universal joints, to precisely and rigidly position the top platform in any position and orientation within the robot's reachable workspace. Motion is coordinated using a Galil motion controller and a computer which runs inverse kinematics and supplies the trajectory. Figure 5 shows another view of the completed platform.
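The inverse kinematics of a Stewart-Gough platform have a well-known closed form: once the top plate's pose is fixed, each actuator length is simply the distance between its two joints. A minimal sketch in Python (the joint coordinates and angle conventions here are illustrative, not the actual platform geometry):

```python
import numpy as np

def leg_lengths(base_pts, top_pts, pos, rpy):
    """Inverse kinematics for a Stewart-Gough platform.

    base_pts: (6, 3) universal-joint locations in the base frame.
    top_pts:  (6, 3) spherical-joint locations in the top-plate frame.
    pos:      (3,) desired top-plate center position.
    rpy:      (roll, pitch, yaw) desired orientation in radians.
    Returns the six required actuator lengths.
    """
    r, p, y = rpy
    # Rotation matrix from roll-pitch-yaw (Z-Y-X convention).
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(r), -np.sin(r)],
                   [0, np.sin(r),  np.cos(r)]])
    Ry = np.array([[ np.cos(p), 0, np.sin(p)],
                   [0, 1, 0],
                   [-np.sin(p), 0, np.cos(p)]])
    Rz = np.array([[np.cos(y), -np.sin(y), 0],
                   [np.sin(y),  np.cos(y), 0],
                   [0, 0, 1]])
    R = Rz @ Ry @ Rx
    # Each leg vector runs from its base joint to the rotated,
    # translated top joint; the leg length is its norm.
    legs = pos + top_pts @ R.T - base_pts
    return np.linalg.norm(legs, axis=1)
```

Because this is closed-form, the computer can evaluate it at every trajectory point without any iterative solving.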
The mechanical design for the platform went through several iterations (Figure 3) and was completed using Solidworks. Note that the joints and actuators were either purchased or supplied by Ultra Motion, so the design mostly consists of the upper and lower platforms. Major design considerations included ensuring rigidity in the platform and maximizing the reachable workspace. The platform workspace is determined by the maximum and minimum actuator lengths and joint limits, which I could not change, and by joint locations, which I could change. I wrote a Python script which determines and displays the workspace for a given set of joint positions and orientations, and used this program to optimize the platform geometry (see Figure 4 for details). One consequence of the optimization is that the spherical joints between the actuators and the top plate are sunk into the top plate at an optimized 3D angle (see Figure 6).
Programming and Control
Unfortunately, the project is currently waiting on a couple of replacement parts before it can get moving. The motion of the platform is designed to be coordinated using a Galil motion controller (shown in Figure 2) and an attached computer which runs the inverse kinematics and does 3D interpolation to build the trajectory. The control software is written in Python and also allows the user to simulate motions and preview them before commanding the robot. Some example output from the simulator is shown in Figure 7.
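The 3D interpolation step can take several forms; as an illustrative sketch (not necessarily the exact scheme used here), a simple linear interpolation between two platform poses might look like:

```python
import numpy as np

def interpolate_pose(p0, p1, n_steps):
    """Linearly interpolate between two poses given as 6-vectors
    (x, y, z, roll, pitch, yaw). Returns an (n_steps, 6) array of
    intermediate poses, endpoints included.

    Linear interpolation of roll-pitch-yaw is a reasonable
    approximation for the small rotations a Stewart platform can
    reach; larger rotations would call for quaternion slerp.
    """
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    s = np.linspace(0.0, 1.0, n_steps)[:, None]  # blend parameter 0..1
    return p0 + s * (p1 - p0)
```

Each interpolated pose would then be fed through the inverse kinematics to produce the six actuator length targets sent to the motion controller.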
Figure 4: Robot Workspace Visualization calculated using Python. This shape is a workspace visualization for the region of space that the robot can reach with the center of the top platform while maintaining a constant flat orientation. The 3D region is radially symmetric and forms an umbrella shape that is about twice as wide as it is tall (about 5 inches tall and 10 inches across). The boundary region is found by sampling an inverse kinematics routine to determine if a given point is within the space reachable by all actuators and joints. Using this strategy it was also possible to determine the maximum roll, pitch and yaw angles for the platform. Running this routine for lots of prospective geometries allowed me to determine a close to optimal geometry.
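The sampling strategy described in Figure 4 can be sketched as follows; the geometry and stroke limits are placeholders, and the joint-angle checks are omitted for brevity (they would be tested the same way):

```python
import numpy as np

def reachable(pos, base_pts, top_pts, l_min, l_max):
    """Feasibility test for one platform position at flat orientation:
    every leg length must stay within the actuator stroke. With the
    top plate flat, the rotation matrix is the identity."""
    legs = pos + top_pts - base_pts
    lengths = np.linalg.norm(legs, axis=1)
    return bool(np.all((lengths >= l_min) & (lengths <= l_max)))

def sample_workspace(base_pts, top_pts, l_min, l_max, grid):
    """Return the subset of sampled points the platform can reach;
    the boundary of this point cloud approximates the workspace."""
    return np.array([p for p in grid
                     if reachable(p, base_pts, top_pts, l_min, l_max)])
```

Running this over a dense grid for each candidate geometry, and comparing the resulting workspace volumes, is one straightforward way to carry out the optimization described above.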
Figure 8: Autonomous Robot localization, mapping and path planning competition. For this final competition the goal was to program the robot so that it would be able to autonomously determine its random start location, navigate the above environment to as many goal points as possible, and resolve several uncertain map features within the time limit. The QR code on top of the robot was used to externally record its position for later comparison using an overhead camera system.
I worked on several robotics projects for the graduate course Autonomous Mobile Robots (MAE 5180). The class focused on teaching path planning, localization, mapping, SLAM, and other topics on real robots. The course used modified iRobot Creates, which were essentially Roombas that had been modified to include sonar rangefinders and allow communication with a host computer. The final project, which was completed in three-person teams, was to program the robot so that it could fully autonomously navigate a maze-like map, after being placed in a random starting position, and determine uncertain map features. Figure 8 shows the robot in the process of navigating the map.
Figure 9: Final Competition Record showing the comparison between the recorded position and where the robot's particle filter localization system thought it was. Note that we successfully visited five out of six target points, which was a very good score. Only one team in the competition managed to reach all six points.
Our control program was written in Matlab and used a particle-filter-based system for localization. Part of the competition was to reach as many goal points as possible, so there was also an element of the traveling salesman problem in determining the optimal route to take. To plot a course to each point we used a probabilistic roadmap algorithm. Figure 9 shows recorded data from our competition run.
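A particle filter localizer cycles through predict, weight, and resample steps. A generic sketch in Python (our implementation was in Matlab, and the motion and measurement models here are placeholders passed in by the caller):

```python
import numpy as np

def particle_filter_step(particles, control, measurement,
                         motion_model, meas_likelihood, rng):
    """One predict-weight-resample cycle of a particle filter.

    particles: (N, 3) array of (x, y, theta) pose hypotheses.
    motion_model(particles, control, rng) -> propagated particles.
    meas_likelihood(particles, measurement) -> (N,) weights.
    """
    particles = motion_model(particles, control, rng)   # predict step
    w = meas_likelihood(particles, measurement)         # sensor update
    w = w / np.sum(w)                                   # normalize weights
    # Systematic (low-variance) resampling: one random offset,
    # then evenly spaced draws through the cumulative weights.
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(w), positions)
    return particles[idx]
```

After each cycle the surviving particles concentrate around poses consistent with the sonar readings, which is what lets the robot recover its random start location.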
I completed several in-depth robotics projects for the graduate course Robotic Manipulation (CS 5752), which was taught using the Baxter robot shown in Figure 10. Topics covered in course projects included velocity and torque control, path planning, computer vision and others. Our most successful project was programming Baxter's seven degree of freedom arm to smoothly draw on a whiteboard using a velocity control algorithm. Programming was done in Python and ROS.
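For a redundant seven-degree-of-freedom arm like Baxter's, one standard form of velocity control maps a desired end-effector velocity to joint velocities through the Jacobian pseudoinverse; this is a sketch of that general technique, not necessarily our exact algorithm:

```python
import numpy as np

def joint_velocities(jacobian, ee_velocity):
    """Map a desired end-effector twist (6-vector of linear and
    angular velocity) to joint velocities using the Moore-Penrose
    pseudoinverse. For a 7-DOF arm the Jacobian is 6x7, so the
    system is underdetermined and the pseudoinverse returns the
    minimum-norm joint velocity solution.
    """
    return np.linalg.pinv(jacobian) @ ee_velocity
```

Commanding these joint velocities at a high rate, with the desired twist pointing along the drawing path, produces the smooth whiteboard strokes described above.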
One of my earliest robotics projects was to program and build a small sumo competition robot (Figure 11) for a Mechatronics course. The robot used an Arduino control board, which was programmed in C (use of any Arduino or other libraries was banned). Our robot used a sonar rangefinder to locate other robots and attempt to push them out of the sumo ring.