Task 010: Self-flying Drones and Visual Analytics
| Event Date: | November 18, 2021 |
|---|---|
| Time: | 11:00 am (ET) / 8:00 am (PT) |
Ziyun (Claude) Wang, University of Pennsylvania
EV-Catcher: High-Speed Object Catching Using Low-latency Event-based Neural Networks
Abstract:
Event-based sensors have recently drawn increasing interest in robotic perception due to their lower latency, higher dynamic range, and lower bandwidth compared with CMOS-based imagers. These properties make them ideal tools for real-time perception tasks in highly dynamic environments.
In this talk, we demonstrate an application where event cameras excel: accurately catching fast-flying objects with an average speed of up to 13 m/s. We demonstrate an end-to-end system that can estimate the impact position of an object in less than 150 ms and issue a command to a linear actuator to capture the flying object. To achieve this level of performance, we introduce a novel event representation called the Binary Event History Image (BEHI) to encode event data at low latency, as well as a real-time differentiable approach that robustly outputs a confidence-enabled control signal to the robot. We show that the system is able to consistently succeed in catching fast-moving objects across a variety of shooting angles. In addition, we conduct a detailed analysis of the various sources of latency in the proposed catching system.
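To give a rough intuition for the kind of representation the abstract describes, the toy sketch below builds a binary per-pixel image from a stream of events. This is only an illustrative assumption of the general idea behind a "binary event history" encoding; the actual BEHI formulation, function names, and event layout here are not taken from the talk or its paper.

```python
import numpy as np

def binary_event_history_image(events, height, width):
    """Illustrative sketch (not the paper's BEHI): mark each pixel
    that received at least one event in the stream.

    `events` is assumed to be an (N, 4) array of
    (x, y, timestamp, polarity) rows -- a common event-camera layout,
    assumed here for illustration.
    """
    behi = np.zeros((height, width), dtype=np.uint8)
    xs = events[:, 0].astype(int)
    ys = events[:, 1].astype(int)
    behi[ys, xs] = 1  # binary: repeated events at a pixel stay 1
    return behi

# Example: three events on a 4x4 sensor, two distinct pixels
events = np.array([
    [0, 0, 0.001,  1],
    [2, 1, 0.002, -1],
    [2, 1, 0.003,  1],  # same pixel as previous event
])
img = binary_event_history_image(events, height=4, width=4)
print(int(img.sum()))  # -> 2 distinct active pixels
```

Because the image is binary rather than an intensity accumulation, it can be packed and processed with very little bandwidth, which matches the low-latency motivation given in the abstract.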
Bio:
Ziyun (Claude) Wang received his BSc in Computer Science from Rice University, and his MSc in Robotics from the University of Pennsylvania, where he is currently pursuing a Ph.D. in computer and information science. He is interested in computer vision and machine learning problems in robotics, focusing on event-based vision sensors.
