This is an open-source Mars rover with a full rocker-bogie suspension. Testing the Freedom Robotics platform in new and challenging environments is integral to the work we do, so I am driving my Mars rover through obstacles in my house.
At Freedom Robotics, we strongly believe in field-testing every single element of our system, so we invest in building full-stack demonstration robots that mimic the challenges our users face. The robot below is one I started working on personally before I even joined the Freedom team, and it's a great example of how integrating Freedom can significantly speed up debugging and development, while giving you out-of-the-box features like teleoperation, visualization, and remote editing.
In January, I started building the NASA-JPL Open Source Rover (OSR), a downscaled but functional replica of the Mars Curiosity rover built from off-the-shelf components - minus the on-board fission reactor, of course. The Mars rovers have rocker-bogie suspensions, which keep the robot level with all wheels touching the ground - especially useful when navigating unknown rocky terrain. The real rover's driving direction is chosen so that it can back out of any situation it gets stuck in; the OSR design flips that direction, letting the wheels scale bigger obstacles when driving forward. I spent about 8 weekends assembling all the parts to make the OSR (which I named Robert, after my grandfather), and it's been 100% worth the effort.
The rover’s rocker bogie suspension is great on rough terrain.
The All-terrain Robot
The Mars rover makes for an amazing robotics development platform. It can scale obstacles twice the size of its wheels, which lets it climb stairs and sidewalk ledges. It has fantastic traction, whether you're competing in the Rainforest XPRIZE or driving on rocky terrain like the surface of Mars. The rocker-bogie suspension and its body differential average out the rocker motions nicely without needing any spring-damper suspension, which keeps the sensors level and helps with teleoperation.
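To make the averaging concrete, here's a minimal sketch of the constraint a rocker-bogie differential imposes. The function name and angles are illustrative, not taken from the OSR code: the differential bar couples the two rockers so the body pitch sits at the mean of the left and right rocker angles.

```python
def body_pitch(left_rocker_deg, right_rocker_deg):
    """The differential couples the rockers: when one pitches up,
    it pushes the other down, so the body settles at the average
    of the two rocker angles instead of following either side."""
    return (left_rocker_deg + right_rocker_deg) / 2.0

# One side climbs a 10-degree rock while the other stays flat:
print(body_pitch(10.0, 0.0))  # 5.0 -> the body tilts only half as much
```

This is why the body (and any sensors mounted on it) stays comparatively level even when a single wheel is halfway up an obstacle.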
Robert the rover. Beefier motors and several sensors.
Fast Forwarding Development
After setting up the electronics and mechanics, it was time to dig into the software. The code had several bugs and didn't follow ROS or general programming conventions, so it wasn't usable out of the box with any of the packages in the ROS ecosystem. It didn't offer the standard Twist velocity interface or the odometry used for navigation. Hardware and electronics are hard, software is hard, but combining the two is straight-up evil. A solid foundation was needed to enable work on autonomous navigation, stair climbing, and exploring Mars. An overhaul was in order.
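For context, the standard interface the overhaul targets is a `geometry_msgs/Twist` command (linear and angular velocity) that gets mapped to individual wheel speeds. Here's a rough, ROS-free sketch for the two middle wheels, which don't steer - the wheel radius and track width are illustrative placeholders, not the OSR's real dimensions:

```python
WHEEL_RADIUS = 0.075  # m, illustrative placeholder
TRACK_WIDTH = 0.45    # m, left-to-right middle wheel spacing, illustrative

def twist_to_middle_wheel_speeds(linear_x, angular_z):
    """Map a Twist command (m/s forward, rad/s yaw) to left/right
    middle-wheel angular velocities (rad/s), differential-drive style:
    each side's ground speed is the body speed plus/minus the
    rotation's contribution at that wheel."""
    v_left = linear_x - angular_z * TRACK_WIDTH / 2.0
    v_right = linear_x + angular_z * TRACK_WIDTH / 2.0
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Driving straight at 0.3 m/s: both wheels spin at the same rate.
left, right = twist_to_middle_wheel_speeds(0.3, 0.0)
print(left, right)  # 4.0 4.0
```

With an interface like this in place, any ROS package that publishes Twist messages - a joystick teleop node, a navigation stack - can drive the rover without knowing anything about its six-wheel layout.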
Throughout the development, testing, and debugging process, I used Link ROS extensively to verify that the newly implemented odometry and Twist input were working as expected. Here’s what the response from incoming linear and angular velocity commands to the motors looked like:
Diving into the response from the OSR to commands: linear and angular velocities
The top graph shows the linear velocity command in red, in m/s, alongside the encoder velocities of the two middle wheels in rad/s - the response tracks the commands nicely. The bottom graph shows a similar response for angular velocity, together with the output from the front-left steering wheel. There's a little more delay here due to overcoming static friction, but what's interesting is that for one angular velocity command, the wheel actually turned in the opposite direction. Compare it with the linear velocity and you can see that the OSR was driving backwards at that moment. That explains it: to maintain the same twist while reversing, the OSR has to steer in the opposite direction. Seems to be working!
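The sign flip falls straight out of the steering geometry: the corner wheel's steering angle depends on the ratio of angular to linear velocity, so negating the linear velocity negates the angle. A rough sketch, with an illustrative (not actual OSR) dimension:

```python
import math

WHEELBASE_HALF = 0.2  # m, body center to front axle, illustrative

def front_steer_angle(linear_x, angular_z):
    """Approximate steering angle (rad) for a front corner wheel.
    For a turn in place (no linear velocity), steer to 90 degrees."""
    if linear_x == 0.0:
        return math.copysign(math.pi / 2.0, angular_z) if angular_z else 0.0
    # atan of the rotation's lateral velocity at the axle over forward velocity
    return math.atan(angular_z * WHEELBASE_HALF / linear_x)

fwd = front_steer_angle(0.3, 0.5)   # driving forward, turning left
rev = front_steer_angle(-0.3, 0.5)  # same angular command while reversing
# fwd is positive, rev is negative: same twist, opposite steering angle
```

This is exactly the behavior visible in the bottom graph: identical angular velocity commands produce mirrored steering angles depending on the direction of travel.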
Robotics development ready - Live on GitHub
I sent these commands using the Pilot feature, and then simply went back in time to look at the results, no need to record a rosbag! You can try these useful tools for yourself with a free trial here.
I also made some hardware adjustments to the OSR which will allow autonomous exploration of Mars, the rainforest, and the deep sea.
- I added a 3D lidar (Livox) and a 2D lidar (SICK TIM561), a RealSense camera for monocular and stereo vision, and an IMU. Localization and mapping of unknown terrain require sensors!
- I left out the head of the OSR (a face used to convey emotion). HRI is cool, but I wanted the space and power budget for sensors.
- I upgraded the battery and drive motors to beefier ones.
I’m excited to say that all of the fixes and improvements I made are live on GitHub! Shoutout to the JPL team (Eric Junkins and more recently George Fosmire) for putting in the time and effort to test and review these 4000+ lines of code.
What’s next for Robert the Mars Rover
I will be teaching Robert to start doing things on his own - roving around Mars, or carrying my groceries! This will require extrinsic sensor calibration plus some navigation stack setup and tuning. Calibration on a robot like this is a tricky and fun problem: there are no precise CAD drawings for it, and it has many different sensors and lots of moving parts.
If you have suggestions, ideas, or want to get involved - reach out (firstname.lastname@example.org). Stay tuned!