Software lead for winning entry in NASA Robotic Mining Competition
My main contributions to the team were our perception systems, primarily localization and obstacle detection. For localization, we used multiple ultra-wideband (UWB) transmitters and receivers to trilaterate the robot's position from time-of-flight range measurements. A Kalman filter then fused these position estimates with the robot's IMU and other sensors to localize the robot accurately. Obstacle detection was accomplished with ground-plane detection on an RGB-D point cloud: after filtering out the ground plane, any remaining points are added to an obstacle occupancy map, which is built in the world frame using the localization data.

I also led the software team as a whole, coordinating the overall design of the software system in the Robot Operating System (ROS) and ensuring our autonomy sub-team was able to implement and test their planning algorithms.
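The core of the UWB localization step is solving for a position from time-of-flight ranges. The sketch below is a simplified 2D version with exactly three fixed anchors and a closed-form linear solve; the function name and inputs are illustrative (the real system fused more measurements through the Kalman filter rather than solving each fix in isolation):

```python
import math

def trilaterate_2d(anchors, dists):
    """Estimate (x, y) from three known anchor positions and
    measured ranges. Hypothetical helper, not the robot's actual API."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Subtracting the first range equation from the other two cancels
    # the quadratic terms, leaving a 2x2 linear system A @ [x, y] = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the anchors are collinear
    # Cramer's rule for the 2x2 system.
    return ((b1 * a22 - b2 * a12) / det,
            (a11 * b2 - a21 * b1) / det)

# Example: anchors at three corners, true position (1, 2).
pos = trilaterate_2d([(0, 0), (4, 0), (0, 4)],
                     [math.sqrt(5), math.sqrt(13), math.sqrt(5)])
```

With noisy real-world ranges the circles do not intersect exactly, which is one reason the raw fixes were fed into a Kalman filter alongside the IMU rather than used directly.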
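The obstacle-detection pipeline (filter the ground plane, then rasterize the leftover points into an occupancy map using the localization estimate) can be sketched as below. This is a minimal illustration assuming a flat floor at z = 0 and a 2D robot pose; the names are hypothetical, and the real system detects the ground plane from the RGB-D data rather than assuming its height:

```python
import math

def update_occupancy(points, pose, grid, cell=0.5, ground_tol=0.05):
    """Mark non-ground RGB-D points as occupied cells in a world-frame
    grid. Illustrative sketch, not the robot's actual implementation."""
    rx, ry, theta = pose              # robot pose from localization
    c, s = math.cos(theta), math.sin(theta)
    for x, y, z in points:            # points in the robot frame
        if abs(z) <= ground_tol:      # drop points on the ground plane
            continue
        wx = rx + c * x - s * y       # rotate + translate to world frame
        wy = ry + s * x + c * y
        grid[(int(wx // cell), int(wy // cell))] = True
    return grid

# One ground point (filtered out) and one obstacle point 2 m ahead.
grid = update_occupancy([(1.0, 0.0, 0.0), (2.0, 0.0, 0.5)],
                        (0.0, 0.0, 0.0), {})
```

Because the map is keyed on world-frame cells, repeated scans from different robot poses accumulate into a single consistent obstacle map for the planner.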
Combined with the work of our incredible mechanical and hardware teams, our robot placed third overall and second in autonomy at the 2019 competition at the University of Alabama.
The full codebase for the robot is open source here.
You can see a full demo of our robot here,
and our third-place competition run here.