Autonomous Driving for Adverse Road Conditions

Motivation

This is my MRSD capstone project at CMU. My teammates and I built an autonomous system that adapts to adverse road conditions. Wet asphalt is a leading cause of weather-related traffic accidents, so our system focuses on preventing hydroplaning and loss of vehicle control.

Overview

I focused mainly on the perception and localization subsystems. I trained a real-time segmentation network on 516 images that we collected and annotated using a FLIR Grasshopper3 camera. The model is deployed on an NVIDIA Jetson Xavier, with multithreading used to improve inference throughput. Here is a video of the segmentation. The model achieves a mean IoU of 95.61% on our test set.
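The multithreading can be sketched as a producer-consumer pipeline: one thread pulls frames from the camera while another runs the network, so neither stalls waiting on the other. This is a minimal illustration rather than our deployment code; `frames` stands in for the camera driver loop and `infer` for the segmentation model.

```python
import queue
import threading

def run_pipeline(frames, infer, num_buffered=4):
    """Overlap frame capture and inference using a bounded FIFO queue."""
    q = queue.Queue(maxsize=num_buffered)  # bounded so capture can't run ahead unboundedly
    results = []

    def capture():
        for frame in frames:   # stand-in for the camera driver loop
            q.put(frame)
        q.put(None)            # sentinel: no more frames

    def inference():
        while True:
            frame = q.get()
            if frame is None:
                break
            results.append(infer(frame))

    t_cap = threading.Thread(target=capture)
    t_inf = threading.Thread(target=inference)
    t_cap.start()
    t_inf.start()
    t_cap.join()
    t_inf.join()
    return results
```

Because the queue is FIFO and there is a single consumer, output order matches capture order; the bounded queue size keeps memory use fixed if inference briefly falls behind the camera.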

I also worked on localizing our robot using GPS, an IMU, and custom wheel encoders. Sensor measurements are fused by an extended Kalman filter to estimate the robot state, and we achieved centimeter-level accuracy. The custom encoders provide velocity information: 36 magnets are embedded in a 3D-printed ring mounted on the wheel hub, and a magnetometer mounted near the ring detects passing magnets and outputs data over I2C.
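With 36 magnets per revolution, wheel speed follows directly from the pulse count over a sampling window: each pulse corresponds to 1/36 of a revolution (10 degrees). A minimal sketch of that conversion, where the wheel radius and sampling interval are illustrative values rather than our actual parameters:

```python
import math

MAGNETS_PER_REV = 36  # magnets embedded in the 3D-printed ring

def wheel_velocity(pulse_count, dt, wheel_radius):
    """Linear wheel velocity in m/s from magnet pulses counted over dt seconds."""
    revolutions = pulse_count / MAGNETS_PER_REV
    angular_velocity = 2.0 * math.pi * revolutions / dt  # rad/s
    return angular_velocity * wheel_radius

# 18 pulses in 0.5 s is half a revolution, so the wheel spins at 2*pi rad/s;
# on a 0.1 m radius wheel that is about 0.63 m/s of linear velocity.
```

The resulting velocity estimates are what the extended Kalman filter consumes alongside the GPS and IMU measurements.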

Some other things I worked on include vehicle controls, system architecture design, sensor calibration, parameter tuning, PCB design, vehicle fabrication, wiring, and integration.