Repository files navigation

iControl3D: An Interactive System for Controllable 3D Scene Generation (ACM MM 2024)

Xingyi Li¹,², Yizheng Wu¹,², Jun Cen², Juewen Peng², Kewei Wang¹,², Ke Xian¹, Zhe Wang³, Zhiguo Cao¹*, Guosheng Lin²

¹Huazhong University of Science and Technology, ²Nanyang Technological University, ³SenseTime Research

Paper | arXiv | Video | Supp | Poster

This repository contains the official PyTorch implementation of our ACM MM 2024 paper "iControl3D: An Interactive System for Controllable 3D Scene Generation".

Environment Setup

First install dependencies:

conda create -n icontrol3d python=3.10
conda activate icontrol3d
conda install pytorch=1.13.0 torchvision torchaudio pytorch-cuda=11.6 -c pytorch -c nvidia
conda install scipy scikit-image
conda install -c conda-forge diffusers transformers ftfy accelerate
pip install opencv-python
pip install -U gradio
pip install pytorch-lightning==1.7.7 einops==0.4.1 omegaconf==2.2.3
pip install timm

# Install diffusers
git clone https://github.com/takuma104/diffusers.git
cd diffusers
git checkout 9a37409663a53f775fa380db332d37d7ea75c915
pip install .

# Update transformers and huggingface_hub
pip install git+https://github.com/huggingface/transformers
pip install -U huggingface_hub

# Pytorch3D
conda install -c iopath iopath
conda install -c bottler nvidiacub
conda install pytorch3d -c pytorch3d

# skylibs
pip install --upgrade skylibs
conda install -c conda-forge openexr-python openexr
conda install -c conda-forge pyshtools

# Grounded-Segment-Anything
python -m pip install -e segment_anything
python -m pip install -e GroundingDINO
pip install opencv-python pycocotools matplotlib onnxruntime onnx ipykernel

Follow https://github.com/haofanwang/ControlNet-for-Diffusers and download pipeline_stable_diffusion_controlnet_inpaint.py to enable ControlNet for diffusers:

# assume you already know the absolute path of installed diffusers
cp pipeline_stable_diffusion_controlnet_inpaint.py PATH/pipelines/stable_diffusion
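If you are unsure where pip installed diffusers, a small generic helper (not part of this repo) can locate the package directory to use as PATH above:

```python
import importlib.util
import os

def package_dir(name):
    """Return the on-disk directory of an installed package, or None if absent."""
    spec = importlib.util.find_spec(name)
    if spec is None or spec.origin is None:
        return None
    return os.path.dirname(spec.origin)

# Use the printed directory as PATH in the cp command above.
print(package_dir("diffusers"))
```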

Then, import the newly added pipeline in the corresponding files:

PATH/pipelines/stable_diffusion/__init__.py
PATH/pipelines/__init__.py
PATH/__init__.py
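For reference, these edits typically just re-export the pipeline class. Assuming the class defined in the downloaded file is named StableDiffusionControlNetInpaintPipeline (check the file you copied), the added lines would look roughly like:

```python
# In PATH/pipelines/stable_diffusion/__init__.py
from .pipeline_stable_diffusion_controlnet_inpaint import StableDiffusionControlNetInpaintPipeline

# In PATH/pipelines/__init__.py
from .stable_diffusion import StableDiffusionControlNetInpaintPipeline

# In PATH/__init__.py
from .pipelines import StableDiffusionControlNetInpaintPipeline
```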

Last but not least, as noted in haofanwang/ControlNet-for-Diffusers#6, to use any of the pretrained ControlNet control models with this pipeline:

Download the models and annotators from the ControlNet Hugging Face repo ([lllyasviel/ControlNet](https://huggingface.co/lllyasviel/ControlNet)) and place them under the models folder. Then convert the models so they can be used with the pipeline:

cd diffusers
python ./scripts/convert_controlnet_to_diffusers.py --checkpoint_path ./models/control_sd15_***.pth --dump_path ../controlnet_models/control_sd15_*** --device cpu

For Grounded-SAM:

cd lib/grounded_sam

wget https://dl.fbaipublicfiles.com/segment_anything/sam_vit_h_4b8939.pth
wget https://github.com/IDEA-Research/GroundingDINO/releases/download/v0.1.0-alpha/groundingdino_swint_ogc.pth
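To sanity-check the SAM download, you can load the checkpoint with the standard segment_anything API (a sketch; it assumes the wget above completed and the file sits in the current directory):

```python
from segment_anything import SamPredictor, sam_model_registry

# "vit_h" matches the sam_vit_h_4b8939.pth checkpoint downloaded above.
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)
print("SAM checkpoint loaded")
```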

Usage

conda activate icontrol3d
# scribble
python app_controlnet_inpaint.py
# depth
# python app_controlnet_inpaint_depth.py
# hed
# python app_controlnet_inpaint_hed.py
# seg
# python app_controlnet_inpaint_seg.py
# canny
# python app_controlnet_inpaint_canny.py
# mlsd
# python app_controlnet_inpaint_mlsd.py

You can add --outdoor and adjust parameters such as --box_threshold to handle outdoor scenes. Please refer to lib/utils/opt.py for more information.
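The actual definitions and defaults live in lib/utils/opt.py; a minimal, hypothetical sketch of how such flags behave (the option names match the README, the default value is purely illustrative):

```python
import argparse

# Hypothetical sketch: the real options and defaults are in lib/utils/opt.py.
parser = argparse.ArgumentParser()
parser.add_argument("--outdoor", action="store_true",
                    help="enable handling of outdoor scenes")
parser.add_argument("--box_threshold", type=float, default=0.3,
                    help="Grounding DINO box confidence threshold (illustrative default)")

opt = parser.parse_args(["--outdoor", "--box_threshold", "0.25"])
print(opt.outdoor, opt.box_threshold)  # True 0.25
```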

After this, you can use nerfstudio to train a NeRF and render videos.
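For instance, with nerfstudio installed, a typical workflow looks like the following (the data path is a placeholder, and the exact ns-render subcommands and flags vary across nerfstudio versions; consult the nerfstudio docs for yours):

```shell
# Train a NeRF on the exported scene data (placeholder path)
ns-train nerfacto --data /path/to/exported/scene

# Render a video along a camera path (check your nerfstudio version's CLI)
ns-render camera-path --load-config outputs/scene/nerfacto/TIMESTAMP/config.yml \
    --camera-path-filename camera_path.json --output-path renders/scene.mp4
```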

Acknowledgement

This code builds on stablediffusion-infinity, Text2Room, and many other projects. We thank their authors for making great code openly available.
