Developing Self-Driving Cars: ‘Look Ma, My Other Car Drives Itself’

Autonomous Mobility by Shared Autonomy

A technology stack that, based on the driving context, automatically and seamlessly switches between two driving modes: autonomous and remote. Read the business statement to learn more about this endeavor.
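As a rough illustration of the kind of context-driven mode switching described above, the sketch below shows one way an arbiter between autonomous and remote operation might be structured. The class, field names, and thresholds here are hypothetical and purely illustrative; they are not the actual stack's API.

```python
from enum import Enum


class DrivingMode(Enum):
    AUTONOMOUS = "autonomous"
    REMOTE = "remote"


class ModeArbiter:
    """Hypothetical arbiter that selects a driving mode from the current context."""

    def __init__(self):
        self.mode = DrivingMode.AUTONOMOUS

    def update(self, context):
        # Hand control to a remote operator when the context looks harder than
        # the on-board autonomy can handle (illustrative conditions only).
        if context.get("construction_zone") or context.get("confidence", 1.0) < 0.5:
            self.mode = DrivingMode.REMOTE
        else:
            self.mode = DrivingMode.AUTONOMOUS
        return self.mode


arbiter = ModeArbiter()
print(arbiter.update({"construction_zone": False, "confidence": 0.9}))  # AUTONOMOUS
print(arbiter.update({"construction_zone": True}))                      # REMOTE
```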


Analyzing CPU Utilization Patterns to Understand Dynamic Computational Workload of a Self-driving Car

This study develops a method for predicting the CPU usage patterns of software tasks running on a self-driving car. To ensure the safety of such dynamic systems, worst-case CPU utilization analysis has traditionally been used; however, dynamically changing driving contexts call for a more flexible approach to efficient computing resource management. To better understand these dynamic CPU usage patterns, this work designs a feature vector that encodes the driving environment and uses regression methods to predict the CPU usage of selected tasks under specific driving contexts. Experiments with real-world vehicle data show promising results and validate the usefulness of the proposed method. Read the following papers to learn more about this work:
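As a minimal sketch of the regression setup described above (a feature vector encoding the driving context, a regressor predicting one task's CPU usage), the example below uses scikit-learn. The feature layout, the data, and the choice of a random-forest regressor are assumptions for illustration, not the method used in the papers.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical context features: [num_tracked_vehicles, ego_speed_mps, is_intersection]
X = np.array([
    [2, 10.0, 0],
    [8, 5.0, 1],
    [1, 20.0, 0],
    [12, 3.0, 1],
])
# Observed CPU utilization (%) of one software task under each driving context.
y = np.array([21.0, 55.0, 18.0, 71.0])

# Fit a regressor that maps driving-context features to the task's CPU usage.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# Predict the task's CPU usage for a previously unseen driving context.
new_context = np.array([[6, 7.5, 1]])
print(model.predict(new_context))
```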


Tartan Racing

Boss is an autonomous vehicle that uses on-board sensors (global positioning system, lasers, radars, and cameras) to track other vehicles, detect static obstacles, and localize itself relative to a road model. A three-layer planning system combines mission, behavioral, and motion planning to drive in urban environments. The mission planning layer considers which street to take to achieve a mission goal. The behavioral layer determines when to change lanes, handles precedence at intersections, and performs error-recovery maneuvers. The motion planning layer selects actions to avoid obstacles while making progress toward local goals. The system was developed from the ground up to address the requirements of the DARPA Urban Challenge, using a spiral system-development process with a heavy emphasis on regular, regressive system testing. During the National Qualification Event and the 85 km Urban Challenge Final Event, Boss demonstrated some of its capabilities, qualifying first and winning the challenge. Read the following papers to learn more about this work (a short illustrative sketch of the three-layer structure follows the reference list):

  • Chris Urmson, Joshua Anhalt, Drew Bagnell, Christopher Baker, Robert Bittner, M. N. Clark, John Dolan, Dave Duggins, Tugrul Galatali, Chris Geyer, Michele Gittleman, Sam Harbaugh, Martial Hebert, Thomas M. Howard, Sascha Kolski, Alonzo Kelly, Maxim Likhachev, Matt McNaughton, Nick Miller, Kevin Peterson, Brian Pilnick, Raj Rajkumar, Paul Rybski, Bryan Salesky, Young-Woo Seo, Sanjiv Singh, Jarrod Snider, Anthony Stentz, William Red Whittaker, Ziv Wolkowicki, Jason Ziglar, Hong Bae, Thomas Brown, Daniel Demitrish, Bakhtiar Litkouhi, Jim Nickolaou, Varsha Sadekar, Wende Zhang, Joshua Struble, Michael Taylor, Michael Darms, and Dave Ferguson, Autonomous driving in urban environments: Boss and the urban challenge, Journal of Field Robotics: Special Issue on the 2007 DARPA Urban Challenge, Part I, pp. 425-466, 2008.
  • Young-Woo Seo and Chris Urmson, A perception mechanism for supporting autonomous intersection handling in urban driving, In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS-2008), pp. 1830-1835, Nice, France, September, 2008.
  • Chris Urmson, Joshua Anhalt, Drew Bagnell, Christopher Baker, Robert Bittner, John Dolan, Dave Duggins, Dave Ferguson, Tugrul Galatali, Hartmut Geyer, Michele Gittleman, Sam Harbaugh, Martial Hebert, Thomas M. Howard, Alonzo Kelly, David Kohanbash, Maxim Likhachev, Nick Miller, Kevin Peterson, Raj Rajkumar, Paul Rybski, Bryan Salesky, Sebastian Scherer, Young-Woo Seo, Reid Simmons, Sanjiv Singh, Jarrod Snider, Anthony Stentz, William Red Whittaker, and Jason Ziglar, Tartan racing: a multi-modal approach to the DARPA Urban Challenge, Technical Report, The Robotics Institute, Carnegie Mellon University, 2007.
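To make the mission/behavioral/motion layering described above concrete, here is a minimal sketch of how the three layers compose. The class names, the toy world model, and the decision rules are hypothetical stand-ins for illustration only; they do not reflect Boss's actual software.

```python
from dataclasses import dataclass


@dataclass
class WorldModel:
    """Toy stand-in for perception output: intersection status and lane blockage."""
    at_intersection: bool
    lane_blocked: bool


class MissionPlanner:
    def plan(self, goal):
        # Choose the sequence of streets/lanes leading toward the mission goal.
        return ["lane_A", "lane_B", goal]


class BehavioralLayer:
    def decide(self, route, world):
        # Decide lane changes, intersection precedence, and recovery maneuvers.
        if world.at_intersection:
            return "yield_for_precedence"
        if world.lane_blocked:
            return "change_lane"
        return "follow_lane"


class MotionPlanner:
    def plan(self, behavior, world):
        # Generate a trajectory that executes the behavior while avoiding obstacles.
        return f"trajectory_for_{behavior}"


def plan_step(goal, world):
    route = MissionPlanner().plan(goal)
    behavior = BehavioralLayer().decide(route, world)
    return MotionPlanner().plan(behavior, world)


print(plan_step("goal_lane", WorldModel(at_intersection=False, lane_blocked=True)))
```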