Generating realistic human motion is essential for many computer vision and graphics applications. The wide variety of human body shapes and sizes greatly impacts how people move. However, most existing motion models ignore these differences, relying on a standardized, average body. This leads to uniform motion across different body types, in which movements do not match their physical characteristics, limiting diversity. To address this, we introduce a new approach for developing a generative motion model conditioned on body shape. We show that it is possible to train this model using unpaired data by applying cycle consistency, intuitive physics, and stability constraints, which capture the relationship between identity and movement. The resulting model generates diverse, physically plausible, and dynamically stable human motions that are quantitatively and qualitatively more realistic than those of current state-of-the-art methods.
@inproceedings{tripathi2024humos,
title = {{HUMOS}: Human Motion Model Conditioned on Body Shape},
author = {Tripathi, Shashank and Taheri, Omid and Lassner, Christoph and Black, Michael J. and Holden, Daniel and Stoll, Carsten},
booktitle = {European Conference on Computer Vision ({ECCV})},
year = {2024},
}