Egocentric Body Motion Tracking, Synthesis and Action Recognition
Lingni Ma, Yuting Ye, Robin Kips, Siyu Tang, Gen Li, Karen Liu, Boxiao Pan, Richard Newcombe
Abstract
EgoMotion, now in its second edition, is a continuation workshop focused on human motion modeling using egocentric, multi-modal data from wearable devices. We focus on motion tracking, synthesis, and understanding algorithms that operate on egocentric/exocentric cameras, non-visual sensors, and high-level derived data. The workshop also covers research that applies egocentric motion to character animation, simulation, robot learning, and related areas. In addition to algorithms, the workshop promotes recent open-source projects, research platforms, datasets, and associated challenges to encourage and accelerate research in the field. We will include live demo sessions to encourage discussion.