UniTEX-FLUX: Flux training code used in UniTEX

Flux (LoRA) training code used in UniTEX: Universal High Fidelity Generative Texturing for 3D Shapes

Yixun Liang*, Kunming Luo*, Xiao Chen*, Rui Chen, Hongyu Yan, Weiyu Li, Jiarui Liu, Ping Tan†

Installation

Prepare your environment following fluxgym.

Training

NOTE: Since the dataset used in our training cannot be released, the code cannot be used directly for training in a new environment. We provide the necessary training code framework; please check and modify the data/datasets.py implementation accordingly!

You first need to prepare your test data and its corresponding .json files. An example can be found in val.json (we also leave one in val_script/val_6_views.json); just prepare your own data in the same format. See line 587 in tasks/texturing/trainer.py for how the .json file and its corresponding images are loaded into validation_prompt_item, as sketched below.
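
A minimal loading sketch is shown below; the keys "prompt" and "images" are assumptions, so use whatever schema val_script/val_6_views.json and the loading code in tasks/texturing/trainer.py actually define.

import json

# Hypothetical sketch: load the validation .json into validation_prompt_item.
# The key names below are assumptions; match them to your own .json schema
# and to the loading logic around line 587 of tasks/texturing/trainer.py.
with open("val_script/val_6_views.json", "r") as f:
    entries = json.load(f)

validation_prompt_item = [
    {"prompt": e["prompt"], "image_paths": e["images"]}  # assumed keys
    for e in entries
]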

Then, if you want to add new models, you can add new files following the structure below (a hypothetical trainer skeleton is sketched after the tree):

data
├── datasets.py
task/{your_task}
├── pipeline.py   # the custom pipeline, structured like a diffusers pipeline
├── trainer.py    # a wrapping class that adds training / evaluation
└── ...           # other supports, e.g. attention_processor.py
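
As a rough illustration, a new task's trainer.py could look like the following skeleton. All names here are assumptions, not the actual interface; mirror tasks/texturing/trainer.py for the real one.

# Hypothetical skeleton for task/{your_task}/trainer.py.
from .pipeline import YourTaskPipeline  # your custom diffusers-style pipeline


class YourTaskTrainer:
    """Wrapping class that adds training and evaluation around the pipeline."""

    def __init__(self, args):
        self.args = args
        # "pretrained_model" is an illustrative argument name.
        self.pipeline = YourTaskPipeline.from_pretrained(args.pretrained_model)

    def training_step(self, batch):
        # Compute the training loss for one batch produced by data/datasets.py.
        raise NotImplementedError

    def evaluate(self, validation_prompt_item):
        # Run the pipeline on the items loaded from the validation .json.
        raise NotImplementedError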

Once your data and pipeline preparation is complete, you can run the framework with

bash train.sh

or debug with

bash debug.sh

Drop Training

You can add --random_drop_condition / --random_drop_noise in train.sh to enable or disable drop training. You can also change --random_drop_condition_probability / --random_drop_noise_probability to adjust the drop rates; a rough sketch of the mechanism is shown below.
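
For intuition only, condition dropping is typically implemented roughly as follows. This is an assumed, classifier-free-guidance-style sketch, not the exact code in tasks/texturing/trainer.py; --random_drop_noise / --random_drop_noise_probability would follow the same pattern for the noise input, but check the trainer for the exact semantics.

import torch

# Assumed sketch of how the drop flags could act inside a training step.
# The flag names mirror the CLI options above; everything else is illustrative.
def apply_random_drops(condition_latents,
                       random_drop_condition=True,
                       random_drop_condition_probability=0.1):
    if random_drop_condition and torch.rand(()) < random_drop_condition_probability:
        # Drop the conditioning by zeroing it, so the model also learns an
        # unconditional branch (classifier-free-guidance-style training).
        condition_latents = torch.zeros_like(condition_latents)
    return condition_latents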

Inference

The inference code can be found in the flux_piplines directory of the UniTEX repository.

📍 Citation

@article{liang2025UnitTEX,
  title={UniTEX: Universal High Fidelity Generative Texturing for 3D Shapes},
  author={Yixun Liang and Kunming Luo and Xiao Chen and Rui Chen and Hongyu Yan and Weiyu Li and Jiarui Liu and Ping Tan},
  journal={arXiv preprint arXiv:2505.23253},
  year={2025}
}

Acknowledgement

This work builds on many amazing research works and open-source projects; thanks a lot to all the authors for sharing!
