November 14, 2018

12:00 pm – 1:00 pm

Venue

Hackerman B17

Abstract
Perception precedes action, both in the biological world and in the technologies maturing today that will bring us autonomous cars, aerial vehicles, robotic arms and mobile platforms. The problem of probabilistic state estimation via sensor measurements takes on a variety of forms, yielding information about our own motion as well as the structure of the world around us. In this talk, I will discuss some approaches my research group has been developing that focus on estimating these quantities online and in real time in extreme environments where dust, fog and other visually obscuring phenomena are widely present and where sensor calibration is altered or degraded over time. These approaches include new techniques in computer vision, visual-inertial SLAM, geometric reconstruction, nonlinear optimization, and even some sensor development. The methods I discuss have an application-specific focus on ground vehicles in subterranean environments, but they are also currently deployed in agriculture, search and rescue, and industrial human-robot collaboration contexts.
 
Bio
Chris Heckman is an Assistant Professor and the Jacques Pankove Faculty Fellow in the Department of Computer Science at the University of Colorado at Boulder, where he also holds appointments in the Aerospace Engineering Sciences and Electrical and Computer Engineering departments. Professor Heckman earned his B.S. in Mechanical Engineering from UC Berkeley in 2008 and his Ph.D. in Theoretical and Applied Mechanics from Cornell University in 2012, where he was an NSF Graduate Research Fellow. He held postdoctoral appointments at the Naval Research Laboratory in Washington, D.C. as an NRC Research Associate, and in the Autonomous Robotics and Perception Group at CU Boulder as a Research Scientist, before joining the faculty there in 2016. He is currently leading one of the funded competition teams in the DARPA Subterranean Challenge; his past work has been funded by NSF, DARPA and multiple industry partners. His research focuses on developing mathematical and systems-level frameworks for autonomous control and perception, particularly vision and sensor fusion. His work applies concepts from nonlinear dynamical systems to the design of control systems for autonomous agents, in particular ground and aquatic vehicles, enabling them to navigate uncertain and rapidly changing environments. A hallmark of his research is the implementation of these systems on experimental platforms.