Hi4D: 4D Instance Segmentation of Close Human Interaction

Bibliographic Details
Main Authors: Yin, Yifei; Guo, Chen; Kaufmann, Manuel; Zarate, Juan Jose; Song, Jie; Hilliges, Otmar
Format: Journal Article
Language: English
Published: 27.03.2023
DOI: 10.48550/arxiv.2303.15380


More Information
Summary: We propose Hi4D, a method and dataset for the automatic analysis of physically close human-human interaction under prolonged contact. Robustly disentangling several in-contact subjects is a challenging task due to occlusions and complex shapes. Hence, existing multi-view systems typically fuse 3D surfaces of close subjects into a single, connected mesh. To address this issue we leverage i) individually fitted neural implicit avatars; ii) an alternating optimization scheme that refines pose and surface through periods of close proximity; and iii) thus segment the fused raw scans into individual instances. From these instances we compile Hi4D, a dataset of 4D textured scans of 20 subject pairs, 100 sequences, and a total of more than 11K frames. Hi4D contains rich interaction-centric annotations in 2D and 3D alongside accurately registered parametric body models. We define varied human pose and shape estimation tasks on this dataset and provide results from state-of-the-art methods on these benchmarks.
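The record contains no implementation details, but the summary's alternating optimization over pose and surface can be illustrated with a minimal, hypothetical sketch. Everything below (the toy pose and shape tensors, the placeholder recon_loss objective, and the optimizer setup) is an assumption for illustration only; it is not the authors' code or the actual Hi4D pipeline, which fits neural implicit avatars per subject and refines them against fused multi-view scans.

    # Hypothetical sketch (not the authors' code): alternately refine per-subject
    # pose parameters and a surface latent against a placeholder reconstruction
    # loss, holding the other block fixed in each stage.
    import torch

    torch.manual_seed(0)

    # Toy stand-ins for two in-contact subjects; a real system would use
    # individually fitted neural implicit avatars instead of these tensors.
    pose  = torch.zeros(2, 72,  requires_grad=True)   # 2 subjects x 72 pose params
    shape = torch.zeros(2, 128, requires_grad=True)   # 2 subjects x surface latent
    W_pose, W_shape = torch.randn(72, 200), torch.randn(128, 200)
    target = torch.randn(2, 200)                      # placeholder scan observations

    def recon_loss():
        # Placeholder objective; a real pipeline would compare the rendered or
        # sampled avatar surfaces against the fused multi-view scan.
        pred = torch.tanh(pose @ W_pose) + torch.tanh(shape @ W_shape)
        return ((pred - target) ** 2).mean()

    opt_pose  = torch.optim.Adam([pose],  lr=1e-2)
    opt_shape = torch.optim.Adam([shape], lr=1e-2)

    for it in range(100):
        # Stage 1: refine pose while the surface latent is held fixed.
        opt_pose.zero_grad(); opt_shape.zero_grad()
        recon_loss().backward()
        opt_pose.step()

        # Stage 2: refine the surface latent while pose is held fixed.
        opt_pose.zero_grad(); opt_shape.zero_grad()
        recon_loss().backward()
        opt_shape.step()

    print(f"final reconstruction loss: {recon_loss().item():.4f}")

In this toy version each stage steps only one optimizer, so the other parameter block stays fixed; the actual method would additionally use the refined per-subject avatars to segment the fused raw scans into individual instances.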