Poster
ContactField: Implicit Field Representation for Multi-Person Interaction Geometry
Hansol Lee · Tackgeun You · Hansoo Park · Woohyeon Shim · Sanghyeon Kim · Hwasup Lim
We introduce a novel implicit field representation tailored for multi-person interaction geometry in 3D spaces, capable of simultaneously reconstructing occupancy, instance identification (ID) tags, and contact fields. Volumetric representation of interacting people presents substantial challenges, including inaccurately captured geometries, varying degrees of occlusion, and data scarcity. Existing multi-view methods, which either reconstruct each subject in isolation or merge nearby 3D surfaces into a single unified mesh, often fail to capture the intricate geometry among interacting instances, and they rely on training datasets with limited viewpoints and small groups of people. Our approach utilizes an implicit representation for interaction geometry contextualized by a transformer-based multi-view fusion module. This module aggregates both local and global information from individual views and interacting groups, enabling precise modeling of close physical interactions through dense point retrieval in small regions supported by the implicit fields. Furthermore, we develop a synthetic dataset encompassing diverse multi-person interaction scenarios to enhance the robustness of our geometry estimation. Experimental results demonstrate the superiority of our method in accurately reconstructing human geometries and ID tags in three-dimensional space, outperforming conventional multi-view techniques. Notably, our method facilitates unsupervised estimation of contact points without requiring contact supervision during training.
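At its core, the described representation is a field that maps a 3D query point (together with features fused across views) to three per-point outputs: an occupancy value, instance-ID logits, and a contact probability. The sketch below is purely illustrative of that input/output interface, assuming randomly initialized weights and hypothetical dimensions (`feat_dim`, `num_instances`); it is not the authors' trained model or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

class InteractionField:
    """Toy stand-in for an implicit interaction field: maps a 3D query
    point plus a fused multi-view feature vector to occupancy, instance-ID
    logits, and a contact probability. Weights are random, so the outputs
    are meaningless; only the interface is illustrated."""

    def __init__(self, feat_dim=32, num_instances=4, hidden=64):
        in_dim = 3 + feat_dim
        self.w1 = rng.standard_normal((in_dim, hidden)) / np.sqrt(in_dim)
        self.w_occ = rng.standard_normal((hidden, 1)) / np.sqrt(hidden)
        self.w_id = rng.standard_normal((hidden, num_instances)) / np.sqrt(hidden)
        self.w_con = rng.standard_normal((hidden, 1)) / np.sqrt(hidden)

    def query(self, points, fused_feats):
        """points: (N, 3) query locations; fused_feats: (N, feat_dim)
        per-point features aggregated across camera views."""
        h = np.tanh(np.concatenate([points, fused_feats], axis=1) @ self.w1)
        occupancy = 1.0 / (1.0 + np.exp(-(h @ self.w_occ)))  # (N, 1), in [0, 1]
        id_logits = h @ self.w_id                             # (N, num_instances)
        contact = 1.0 / (1.0 + np.exp(-(h @ self.w_con)))     # (N, 1), in [0, 1]
        return occupancy, id_logits, contact

field = InteractionField()
pts = rng.standard_normal((5, 3))     # five 3D query points
feats = rng.standard_normal((5, 32))  # stand-in fused multi-view features
occ, ids, con = field.query(pts, feats)
```

Dense querying of such a field in a small region around two nearby surfaces is what allows close contacts to be resolved at arbitrary resolution, rather than being limited by a fixed voxel grid.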