We present HPS, a method to capture human motion aligned with a 3D reconstruction of the world using only wearable sensors -- inertial measurement units (IMUs) and a head-mounted camera looking outwards. HPS complements more common third-person 3D pose estimation methods. It allows capturing larger recording volumes and longer periods of motion, and could be used for VR/AR applications where humans interact with the scene without requiring direct line of sight to an external camera, or to train agents that navigate and interact with the environment based on first-person visual input, as real humans do.
With HPS, we recorded a dataset of humans interacting with large 3D scenes (300–1000 m²), consisting of 7 subjects and more than 3 hours of diverse motion. The dataset, code and video will be available on the project page: virtualhumans.m....
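At its core, HPS anchors a locally accurate but drifting IMU-based pose estimate to the scene using camera-based self-localization. The sketch below (Python/NumPy) illustrates only that anchoring idea, using a single rigid (Kabsch) alignment of the IMU head trajectory to sparse camera localization fixes. The names imu_heads, cam_fixes, fix_frames and register_imu_trajectory are hypothetical, and the actual HPS pipeline jointly optimizes pose and location over time rather than applying one global alignment.

import numpy as np

def align_rigid(src, dst):
    # Least-squares rigid alignment (Kabsch, no scale):
    # find R, t such that R @ src_i + t best matches dst_i.
    mu_s, mu_d = src.mean(0), dst.mean(0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # fix reflection case
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

def register_imu_trajectory(imu_heads, cam_fixes, fix_frames):
    # Snap a drifting IMU head trajectory (N x 3, local frame) onto
    # sparse camera self-localization fixes (M x 3, scene frame),
    # given at frame indices fix_frames. Hypothetical helper, not the
    # paper's joint optimization.
    R, t = align_rigid(imu_heads[fix_frames], cam_fixes)
    return imu_heads @ R.T + t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic scene-frame head trajectory and a rigidly drifted IMU copy.
    true_traj = np.cumsum(rng.normal(scale=0.1, size=(500, 3)), axis=0)
    theta = 0.3                                   # simulated heading drift
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
    imu_heads = true_traj @ Rz.T + np.array([5.0, -2.0, 0.0])
    fix_frames = np.array([0, 249, 499])          # sparse camera localizations
    recovered = register_imu_trajectory(imu_heads, true_traj[fix_frames],
                                        fix_frames)
    print(np.abs(recovered - true_traj).max())    # ~0: drift removed

In this toy case the drift is one global rigid transform, so three fixes recover it exactly; real IMU drift accumulates over time, which is why HPS integrates camera localization continuously instead.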
BibTeX:
@inproceedings{HPS,
    title = {Human POSEitioning System (HPS): 3D Human Pose Estimation and Self-localization in Large Scenes from Body-Mounted Sensors},
    author = {Guzov, Vladimir and Mir, Aymen and Sattler, Torsten and Pons-Moll, Gerard},
    booktitle = {{IEEE} Conference on Computer Vision and Pattern Recognition (CVPR)},
    month = {jun},
    organization = {{IEEE}},
    year = {2021},
}