June 10th, 2026
from 11:00
Abstract
Robots play a crucial role in inspection, agriculture, logistics, automated driving, and search-and-rescue missions. Yet they still lag behind humans in speed, versatility, and robustness. I will show how combining model-based and machine-learning methods with new, low-latency sensors, such as event cameras, can allow autonomous systems such as drones, legged robots, robot arms, and cars to achieve unprecedented agility and robustness, improving the productivity and safety of future autonomous systems.
Bio
Davide Scaramuzza is a Professor of Robotics at the University of Zurich, where he works on the autonomous navigation of micro flying robots with both standard and neuromorphic cameras. He has done fundamental work on vision-based navigation and low-latency, robust perception with event cameras. He pioneered vision-based drone navigation, which inspired the navigation algorithms of NASA's Mars Helicopter. In 2022, his team demonstrated that an AI-controlled, vision-based drone could outperform the world champions of drone racing, a result published in Nature. He co-founded Zurich-Eye, today Meta Zurich, which developed the localization and mapping algorithms of the Meta Quest. His research has been featured in The New York Times, The Guardian, The Economist, and Forbes.
Host
Loris Roveda, Associate Professor in Integrated Intelligence for Robotics, Intelligent Control Area for Systems and Networks.