Poster
EgoSim: An Egocentric Multi-view Simulator for Body-worn Cameras during Human Motion
Dominik Hollidt · Paul Streli · Jiaxi Jiang · Yasaman Haghighi · Changlin Qian · Xintong Liu · Christian Holz
Fri 13 Dec 4:30 p.m. – 7:30 p.m. PST
Abstract:
Research on egocentric tasks in computer vision has mostly focused on head-mounted cameras, such as fisheye cameras or those integrated into immersive headsets. We argue that the increasing miniaturization of optical sensors will lead to the prolific integration of cameras into body-worn devices at various locations. This will bring fresh perspectives to established tasks in computer vision and benefit key areas such as human motion tracking, body pose estimation, and action recognition, particularly for the lower body, which is typically occluded. In this paper, we introduce EgoSim, a novel simulator of body-worn cameras that generates realistic egocentric renderings from multiple perspectives across a wearer's body. A key feature of EgoSim is its use of real motion capture data and a physical simulation of camera attachments to render motion artifacts, which especially affect arm- or leg-worn cameras. We also present MultiEgoView, a dataset of egocentric footage from six body-worn cameras together with 3D body poses during several activities: 77 hours of data are based on AMASS motion sequences rendered in two virtual environments, and ~5 hours stem from real-world motion data of 13 participants wearing six GoPro cameras and an Xsens motion-capture suit. We demonstrate EgoSim's effectiveness by training an end-to-end video-only pose estimation network. Analyzing the domain gap shows that our dataset and simulator substantially aid training for inference on real-world data.
EgoSim code and MultiEgoView dataset:
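
The physical attachment simulation is a distinguishing feature: a loosely mounted camera does not follow the limb rigidly, so fast arm or leg motion produces wobble that head-mounted setups rarely exhibit. The sketch below is a minimal, hypothetical illustration of this idea, not EgoSim's actual code: it treats the camera as a point mass coupled to its mount by a damped spring and integrates its offset along the mo-cap trajectory. All parameter values are illustrative assumptions.

import numpy as np

def simulate_camera_wobble(mount_pos, dt=1.0 / 120.0,
                           mass=0.05, stiffness=400.0, damping=4.0):
    """Return (T, 3) camera positions given (T, 3) mount positions.

    Hypothetical spring-damper model of a loosely attached camera;
    mass [kg], stiffness [N/m], and damping [N*s/m] are made-up values.
    """
    # Mount acceleration via central finite differences of the mo-cap track.
    mount_acc = np.gradient(np.gradient(mount_pos, dt, axis=0), dt, axis=0)

    offset = np.zeros(3)      # camera displacement relative to the mount
    velocity = np.zeros(3)    # relative velocity
    cam_pos = np.empty_like(mount_pos)
    for t in range(len(mount_pos)):
        # In the mount's frame the camera feels a pseudo-force -m * a_mount,
        # the restoring spring force, and viscous damping.
        force = -stiffness * offset - damping * velocity - mass * mount_acc[t]
        velocity += (force / mass) * dt   # semi-implicit Euler step
        offset += velocity * dt
        cam_pos[t] = mount_pos[t] + offset
    return cam_pos

# Demo: an oscillating mount (e.g., a swinging wrist) excites residual wobble.
time = np.arange(0, 2, 1.0 / 120.0)
mount = np.stack([np.sin(4 * time), np.zeros_like(time), np.cos(4 * time)], axis=1)
cam = simulate_camera_wobble(mount)   # -> (240, 3) camera track

Feeding such a perturbed camera track (rather than the rigid mount track) to the renderer is one plausible way to obtain the motion artifacts the abstract describes for arm- or leg-worn cameras.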
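The abstract does not detail the pose estimation network, but one plausible reading of "end-to-end video-only" with six body-worn views is a shared per-view frame encoder whose features are fused across cameras and passed to a temporal model that regresses 3D joint positions. The PyTorch sketch below illustrates that reading; the architecture, joint count, and tensor shapes are assumptions, not the paper's actual model.

import torch
import torch.nn as nn

class MultiViewPoseNet(nn.Module):
    """Hypothetical multi-view video-to-pose regressor (illustrative only)."""

    def __init__(self, num_views=6, num_joints=24, feat_dim=256):
        super().__init__()
        # Shared per-view frame encoder; a tiny CNN stands in for any backbone.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )
        # Temporal model over the concatenated multi-view features.
        self.gru = nn.GRU(num_views * feat_dim, 512, batch_first=True)
        self.head = nn.Linear(512, num_joints * 3)  # 3D position per joint
        self.num_joints = num_joints

    def forward(self, video):
        # video: (B, T, V, 3, H, W) — batch, time, views, RGB frames
        b, t, v, c, h, w = video.shape
        feats = self.encoder(video.reshape(b * t * v, c, h, w))
        feats = feats.reshape(b, t, v * feats.shape[-1])
        out, _ = self.gru(feats)
        return self.head(out).reshape(b, t, self.num_joints, 3)

# Smoke test on dummy data: 2 clips, 8 frames, 6 views, 64x64 frames.
net = MultiViewPoseNet()
poses = net(torch.randn(2, 8, 6, 3, 64, 64))   # -> (2, 8, 24, 3)

Training such a network first on the 77 hours of simulated MultiEgoView footage and then evaluating or fine-tuning on the ~5 hours of real GoPro data is the kind of sim-to-real transfer the domain-gap analysis above refers to.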