# Human Motion Synthesis in 3D Scenes via Unified Scene Semantic Occupancy

by Jingyu Gong, Kunkun Tong, Zhuoran Chen, Chuanhan Yuan, Mingang Chen, Zhizhong Zhang, Xin Tan*, Yuan Xie

## Introduction

This repository provides the official implementation of our AAAI 2026 paper *Human Motion Synthesis in 3D Scenes via Unified Scene Semantic Occupancy*.

## Preparation

### Installation

Please follow these instructions to set up your environment:

```shell
cd SSOMotion
conda env create -f environment.yml
conda activate ssomotion
python -m spacy download en_core_web_sm
pip install git+https://github.com/openai/CLIP.git
```

### Body Models

Please download the SMPL-X body model and place it in the ./body_models/ folder.
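The repository does not ship the model files, so the layout below is only an assumption based on the standard SMPL-X release; adjust the subfolder and file names to whatever the repo's loader expects:

```
body_models/
└── smplx/
    ├── SMPLX_NEUTRAL.npz
    ├── SMPLX_MALE.npz
    └── SMPLX_FEMALE.npz
```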

### Dataset

We train our models on AMASS and HUMANISE. We additionally evaluate our method on cluttered scenes from DIMOS+ShapeNet, PROX+PROX-S, and Replica. All downloaded datasets should be placed under the project's `dataset/` folder.

For textual annotations, please download BABEL and HumanML3D and place them in the `./dataset/amass/` folder.
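Putting the pieces above together, the resulting `dataset/` tree might look roughly like this; the exact subfolder names are assumptions, so match them to the paths used in the preprocessing scripts:

```
dataset/
├── amass/          # AMASS motion data
│   ├── babel/      # BABEL annotations
│   └── humanml3d/  # HumanML3D annotations
├── humanise/
├── dimos/          # DIMOS + ShapeNet scenes
├── prox/           # PROX + PROX-S scenes
└── replica/
```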

## Usage

### Data Preprocessing

Please process the scene and training data separately. To process the scene data into the semantic occupancy form, run the following script in your terminal:

```shell
sh shell_scripts/data_process_scripts/process_scene_occ.sh
```
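Conceptually, the semantic occupancy form is a voxel grid whose cells hold semantic labels. The following is a minimal, dependency-free sketch of that idea, not the repo's actual preprocessing; the function name and label convention here are invented for illustration:

```python
def points_to_semantic_occupancy(points, labels, origin, voxel_size, dims):
    """Voxelize labeled 3D points into a dense semantic occupancy grid.

    points: list of (x, y, z) tuples; labels: parallel list of int class ids
    origin: (x, y, z) minimum corner of the grid; dims: (nx, ny, nz) cell counts
    Returns a nested list grid[i][j][k] with 0 = empty, else a class id.
    """
    nx, ny, nz = dims
    grid = [[[0] * nz for _ in range(ny)] for _ in range(nx)]
    for (x, y, z), lab in zip(points, labels):
        # Map the point to its voxel index.
        i = int((x - origin[0]) / voxel_size)
        j = int((y - origin[1]) / voxel_size)
        k = int((z - origin[2]) / voxel_size)
        # Ignore points that fall outside the grid bounds.
        if 0 <= i < nx and 0 <= j < ny and 0 <= k < nz:
            grid[i][j][k] = lab  # last point in a cell wins
    return grid
```

A real pipeline would resolve label conflicts per cell (e.g. by majority vote) rather than letting the last point win, but the grid-shaped output is the same.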

Run this script to preprocess the motion data:

```shell
sh shell_scripts/data_process_scripts/process_mdm_data.sh
```

### Training

First, train the base diffusion model with the following script:

```shell
bash shell_scripts/train_scripts/train_action2motion.sh
```

Then, use the following script to train the motion controller:

```shell
sh shell_scripts/train_scripts/train_action2motion_control.sh
```

### Generation

You can run the following command for motion generation in scenes from DIMOS:

```shell
sh shell_scripts/generate_scripts/generate_for_eval.sh $ACTION
```

where `$ACTION` is one of `walk`, `sit`, or `lie`.
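To sweep all three actions in one go, a plain loop over the argument works. This sketch only echoes the commands so nothing heavy runs by accident; drop the `echo` to actually execute them:

```shell
# Print the generation command for each supported action.
for ACTION in walk sit lie; do
  echo sh shell_scripts/generate_scripts/generate_for_eval.sh "$ACTION"
done
```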

We also provide a convenient script for motion generation in scenes from PROX and Replica:

```shell
sh shell_scripts/generate_scripts/generate_scene2motion.sh
```

For motion prediction on the HUMANISE dataset, run:

```shell
sh shell_scripts/generate_scripts/predict_for_eval.sh
```

### Evaluation

You can evaluate the generated motions in scenes from DIMOS using the following command:

```shell
sh shell_scripts/evaluate_scripts/eval_metric.sh $ACTION
```

We also provide a script for motion prediction evaluation on the HUMANISE dataset:

```shell
sh shell_scripts/evaluate_scripts/eval_metric.sh prediction
```

## Acknowledgement

This code is based on HumanML3D, MDM, OmniControl, HUMANISE, SMPL-X, COINS, and DIMOS. If you find them useful, please consider citing them in your work.
