Ultimate SLAM? Combining Events, Images, and IMU for Visual SLAM in HDR and High-Speed Scenarios

23,132 views

UZH Robotics and Perception Group

Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. These cameras do not suffer from motion blur and have a very high dynamic range, which enables them to provide reliable visual information during high-speed motion or in scenes with high dynamic range. However, event cameras output little information when the amount of motion is limited, such as when the camera is almost still. Conversely, standard cameras provide instant and rich information about the environment most of the time (at low speed and in good lighting), but they fail severely under fast motion or difficult lighting, such as high-dynamic-range or low-light scenes. In this paper, we present the first state estimation pipeline that leverages the complementary advantages of these two sensors by fusing events, standard frames, and inertial measurements in a tightly coupled manner. We show on the publicly available Event Camera Dataset that our hybrid pipeline leads to an accuracy improvement of 130% over event-only pipelines and 85% over standard-frames-only visual-inertial systems, while still being computationally tractable. Furthermore, we use our pipeline to demonstrate, to the best of our knowledge, the first autonomous quadrotor flight using an event camera for state estimation, unlocking flight scenarios that were not reachable with traditional visual-inertial odometry, such as low-light environments and high-dynamic-range scenes.
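To make the fusion concrete: an event camera reports an asynchronous stream of tuples (t, x, y, polarity), and event-based VIO pipelines typically aggregate a short window of them into a frame-like representation before feature tracking. Below is a minimal illustrative Python sketch of that aggregation step, under stated assumptions: it is not the authors' implementation (the paper additionally motion-compensates events using the IMU), the tuple layout and function name are hypothetical, and the 240x180 resolution matches the DAVIS240C sensor used in the Event Camera Dataset.

import numpy as np

# Assumed sensor resolution (DAVIS240C, as in the Event Camera Dataset).
WIDTH, HEIGHT = 240, 180

def accumulate_event_frame(events, t_start, t_end):
    """Sum signed event polarities per pixel over [t_start, t_end).

    `events` is an (N, 4) float array with rows (t, x, y, polarity),
    where polarity is +1 (brightness increase) or -1 (decrease).
    """
    frame = np.zeros((HEIGHT, WIDTH), dtype=np.float32)
    in_window = (events[:, 0] >= t_start) & (events[:, 0] < t_end)
    for t, x, y, p in events[in_window]:
        frame[int(y), int(x)] += p
    return frame

# Usage example with synthetic events over a 10 ms window.
rng = np.random.default_rng(0)
events = np.column_stack([
    rng.uniform(0.0, 0.01, 1000),   # timestamps in seconds
    rng.integers(0, WIDTH, 1000),   # x coordinates
    rng.integers(0, HEIGHT, 1000),  # y coordinates
    rng.choice([-1.0, 1.0], 1000),  # polarities
])
frame = accumulate_event_frame(events, 0.0, 0.01)

The resulting event frame can then be fed to a standard feature tracker; a tightly coupled fusion, as in the paper, would jointly optimize over features tracked in both event frames and standard frames together with the IMU measurements.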
Reference:
Antoni Rosinol Vidal, Henri Rebecq, Timo Horstschaefer, Davide Scaramuzza
Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High Speed Scenarios
IEEE Robotics and Automation Letters (RA-L), 2018.
DOI: 10.1109/LRA.2018.2793357
PDF: rpg.ifi.uzh.ch/...
Project Webpage:
rpg.ifi.uzh.ch/...
Our research page on event-based vision:
rpg.ifi.uzh.ch/...
Our research page on vision-based navigation for MAVs:
rpg.ifi.uzh.ch/...
For event-camera datasets and an event-camera simulator, see here: rpg.ifi.uzh.ch/...
Other resources on event cameras (publications, software, drivers, where to buy, etc.):
github.com/uzh...
Affiliations: A. R. Vidal, H. Rebecq, T. Horstschaefer, and D. Scaramuzza are with the Robotics and Perception Group, Dept. of Informatics, University of Zurich, and the Dept. of Neuroinformatics, University of Zurich and ETH Zurich, Switzerland. rpg.ifi.uzh.ch/

Comments: 5
@chrislzy4959 5 years ago
awesome project
@azdeatherage 4 years ago
I'm a cave diver trying to build a mapping robot, starting with a system on my person. Can you guys help me?
@PremSai2244 3 years ago
What are the details of the DVS sensor and its SDK?
@TheLex1972 6 years ago
Wow
@unknownplayer8221 6 years ago
Ultimate VO?
Related videos:

Event Cameras: Opportunities and the Road Ahead (CVPR 2020)
18:54
UZH Robotics and Perception Group
30K views
SLAM Robot Mapping - Computerphile
11:35
Computerphile
130K views
MSA- Visual SLAM (no lidar, no IMU) on a drone -- extended version
13:34
Main Street Autonomy
11K views
Champion-level Drone Racing using Deep Reinforcement Learning (Nature, 2023)
4:51
UZH Robotics and Perception Group
236K views
Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization
3:03
Efficient, Data-Driven Perception with Event Cameras (Ph.D. Defense of Daniel Gehrig)
20:35
UZH Robotics and Perception Group
4.3K views
DSO: Direct Sparse Odometry
5:09
cvprtum
110K views
Multi-Level Mapping: Real-time Dense Monocular SLAM
2:51
W. Nicholas Greene
23K views
Intro to Inertial Measurement Units (IMU)
12:16
MicWro Engr
35K views