
SVG-IR: Spatially-Varying Gaussian Splatting for Inverse Rendering (CVPR 2025)

Hanxiao Sun¹, Yupeng Gao², Jin Xie², Jian Yang¹, Beibei Wang²
¹Nankai University, ²Nanjing University

This is the official implementation of Relightable 2D Gaussian for the paper:

SVG-IR: Spatially-Varying Gaussian Splatting for Inverse Rendering.

Installation

Clone this repo

git clone https://github.com/learner-shx/SVG-IR.git --recursive

Install dependencies

# install environment
conda env create --file environment.yml
conda activate r3dg

# install pytorch=1.12.1
conda install pytorch==1.12.1 torchvision==0.13.1 torchaudio==0.12.1 cudatoolkit=11.6 -c pytorch -c conda-forge

# install torch_scatter==2.1.1
pip install torch_scatter==2.1.1

# install kornia==0.6.12
pip install kornia==0.6.12

# install nvdiffrast=0.3.1
git clone https://github.com/NVlabs/nvdiffrast
pip install ./nvdiffrast

# install slang-torch
pip install slangtorch==1.2.1

Install the pytorch extensions

We recommend compiling the extensions with CUDA 11.8 to avoid the potential problems mentioned in 3D Gaussian Splatting.

# install custom-knn
pip install ./submodules/custom-knn

# install bvh
pip install ./submodules/bvh

# install simple-knn
pip install ./submodules/simple-knn

# install rgss (relightable gaussian surfels splatting)
pip install ./rgss-rasterization

# install svgss (spatially-varying gaussian surfels splatting)
pip install ./svgss-rasterization
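
As a quick sanity check after installation, a minimal Python sketch like the one below can confirm that PyTorch sees the GPU and that the main pip-installed dependencies import. Only those packages are imported here; the compiled submodules expose their own extension modules, whose exact names are not listed in this README.

# check_env.py -- sanity-check sketch; adjust to your setup
import torch
import kornia
import torch_scatter
import slangtorch
import nvdiffrast.torch as dr

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("CUDA build:", torch.version.cuda)
print("kornia:", kornia.__version__)
# If everything above imports and CUDA is available, the training and
# evaluation scripts should find their dependencies.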

Data preparation

TensoIR Synthetic Dataset

Download the TensoIR synthetic dataset from the download link provided by TensoIR.

Data Structure

We organize the datasets like this:

Relightable3DGaussian
├── datasets
│   ├── TensoIR
│   │   ├── armadillo
│   │   ├── ...
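
To confirm the layout before training, a minimal sketch (the scene names printed are only illustrative) that lists the scenes found under datasets/TensoIR:

import os

data_root = os.path.join("datasets", "TensoIR")
scenes = sorted(d for d in os.listdir(data_root)
                if os.path.isdir(os.path.join(data_root, d)))
print("Found TensoIR scenes:", scenes)  # e.g. ['armadillo', ...]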

Running

We run the code on a single NVIDIA GeForce RTX 3090 GPU (24 GB). To reproduce the results in the paper on the TensoIR Synthetic dataset, run:

sh script/run_tensoIR.sh

Evaluating

Run the following command to evaluate Novel View Synthesis:

# e.g. TensoIR dataset
# stage 1
python eval_nvs.py --eval \
    -m output/TensoIR/${i}/3dgs \
    -c output/TensoIR/${i}/3dgs/chkpnt30000.pth

# stage 2
python eval_nvs.py --eval \
    -m output/TensoIR/${i}/render_relight \
    -c output/TensoIR/${i}/render_relight/chkpnt50000.pth \
    -t render_relight
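
Here ${i} stands for a scene name. To evaluate several trained scenes in one go, a minimal sketch that wraps both stages (the scene list is illustrative; use the scenes you actually trained):

import subprocess

scenes = ["armadillo", "hotdog"]  # illustrative; adjust to your runs

for i in scenes:
    # stage 1
    subprocess.run(["python", "eval_nvs.py", "--eval",
                    "-m", f"output/TensoIR/{i}/3dgs",
                    "-c", f"output/TensoIR/{i}/3dgs/chkpnt30000.pth"], check=True)
    # stage 2
    subprocess.run(["python", "eval_nvs.py", "--eval",
                    "-m", f"output/TensoIR/{i}/render_relight",
                    "-c", f"output/TensoIR/{i}/render_relight/chkpnt50000.pth",
                    "-t", "render_relight"], check=True)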

Before relighting, you need to collect HDR environment maps first. You can download them from open-source websites such as polyhaven.com, and then modify the environment map path in eval_relighting_tensoIR.py.

# e.g. an HDR image located at `{ROOT_DIRECTORY}/env_map/bridge.hdr`
# modify the env map path in `eval_relighting_tensoIR.py`
task_dict = {
    "bridge": {
        "capture_list": ["pbr",  "base_color", "lights", "local_lights", "direct",  "visibility"],
        "envmap_path": "{ROOT_DIRECTORY}/env_map/bridge.hdr",   # modify this line
    },
}
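
Before launching the relighting evaluation, it can be worth checking that the HDR file actually loads. A minimal sketch, assuming OpenCV is available in the environment (any Radiance-HDR-capable reader works):

import os
import cv2  # reads Radiance .hdr files as float32

envmap_path = "env_map/bridge.hdr"  # same path as configured in task_dict above
assert os.path.isfile(envmap_path), f"missing HDR file: {envmap_path}"

hdr = cv2.imread(envmap_path, cv2.IMREAD_UNCHANGED)  # H x W x 3, BGR, float32
print("env map shape:", hdr.shape, "dtype:", hdr.dtype)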

Run the following command to evaluate Relighting (for Synthetic4Relight only):

# e.g.
python eval_relighting_tensoIR.py \
    -m output/TensoIR/hotdog/render_relight \
    -c output/TensoIR/hotdog/render_relight/chkpnt50000.pth \
    --sample_num 384

Trying on your own data

We recommend reorganizing your own data into a render_relightpp-like dataset and then optimizing. A modified VisMVSNet and auxiliary scripts for preparing your own data will come soon.

Citation

If you find our work useful in your research, please consider citing:

@article{SVGIR2025,
    author    = {Sun, Hanxiao and Gao, Yupeng and Xie, Jin and Yang, Jian and Wang, Beibei},
    title     = {SVG-IR: Spatially-Varying Gaussian Splatting for Inverse Rendering},
    journal   = {arXiv:2504.06815},
    year      = {2025},
}

Acknowledgement

The code is built on Relightable3DGS, GaussianSurfels, and MIRRes. Thanks to these great projects!
