Flux training code used in UniTEX: Universal High Fidelity Generative Texturing for 3D Shapes
Yixun Liang
Prepare your environment as in fluxgym.
NOTE:
Since the dataset used in our training cannot be released, the code cannot be used directly for training in a new environment.
We provide the necessary training code framework; please check and modify the data/datasets.py implementation!
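As a reference, here is a minimal sketch of what a dataset class in data/datasets.py might look like. The record keys (condition_image, target_image, caption) and the returned dictionary fields are assumptions for illustration only; align them with whatever tasks/texturing/trainer.py actually consumes.

```python
# Hypothetical sketch of a dataset for data/datasets.py; field names are assumptions.
import json

from PIL import Image
from torch.utils.data import Dataset
from torchvision import transforms


class TexturingDataset(Dataset):
    def __init__(self, meta_json: str, image_size: int = 512):
        # meta_json is assumed to be a JSON list of records with image paths and a caption.
        with open(meta_json, "r") as f:
            self.records = json.load(f)
        self.transform = transforms.Compose([
            transforms.Resize((image_size, image_size)),
            transforms.ToTensor(),
            transforms.Normalize([0.5, 0.5, 0.5], [0.5, 0.5, 0.5]),  # map to [-1, 1]
        ])

    def __len__(self):
        return len(self.records)

    def __getitem__(self, idx):
        rec = self.records[idx]
        condition = self.transform(Image.open(rec["condition_image"]).convert("RGB"))
        target = self.transform(Image.open(rec["target_image"]).convert("RGB"))
        return {
            "condition_pixel_values": condition,
            "pixel_values": target,
            "caption": rec.get("caption", ""),
        }
```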
You first need to prepare your test data and its corresponding .json files. An example can be found in val.json; see line 587 in tasks/texturing/trainer.py for how we load the JSON and its corresponding images into validation_prompt_item.
We leave an example in val_script/val_6_views.json; just prepare your own data in the same format.
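The snippet below is a hedged sketch of how such a validation JSON could be turned into validation_prompt_item entries (compare with the code around line 587 of tasks/texturing/trainer.py). The keys "prompt" and "image_paths" are hypothetical; use the keys from your own JSON.

```python
# Sketch only: loads a validation JSON and its images into a list of items.
import json

from PIL import Image


def load_validation_items(json_path):
    with open(json_path, "r") as f:
        entries = json.load(f)
    validation_prompt_item = []
    for entry in entries:
        images = [Image.open(p).convert("RGB") for p in entry["image_paths"]]
        validation_prompt_item.append({
            "prompt": entry.get("prompt", ""),
            "images": images,
        })
    return validation_prompt_item


# e.g. items = load_validation_items("val_script/val_6_views.json")
```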
Then, if you want to add new models, you can add new files following the layout below; a rough trainer skeleton is sketched after it.
```
data
├── datasets.py
task/{your_task}
├── pipeline.py    # the custom pipeline structure, like diffusers
├── trainer.py     # a wrapper class adding training / evaluation
└── ...            # other supports: attention_processor.py
```
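For orientation, here is a rough skeleton of a task/{your_task}/trainer.py wrapper. The class name, constructor arguments, and method names are assumptions, not the repo's actual interface; mirror the existing tasks/texturing/trainer.py when adding a real task.

```python
# Hypothetical trainer wrapper skeleton; method and argument names are assumptions.
import torch


class MyTaskTrainer:
    def __init__(self, pipeline, optimizer, device="cuda"):
        self.pipeline = pipeline    # custom diffusers-style pipeline from pipeline.py
        self.optimizer = optimizer
        self.device = device

    def training_step(self, batch):
        # Compute the loss for one batch and update the model parameters.
        loss = self.compute_loss(batch)
        self.optimizer.zero_grad()
        loss.backward()
        self.optimizer.step()
        return loss.detach()

    def compute_loss(self, batch):
        # Placeholder: implement the actual flow-matching / diffusion objective here.
        raise NotImplementedError

    @torch.no_grad()
    def evaluate(self, validation_prompt_item):
        # Run the pipeline on each validation item and collect outputs for logging.
        return [self.pipeline(**item) for item in validation_prompt_item]
```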
Once your data and pipeline preparation is complete, you can run the framework with

```
bash train.sh
```

or debug with

```
bash debug.sh
```

You can add --random_drop_condition / --random_drop_noise in train.sh to enable / disable drop training; you can also change --random_drop_noise_probability / --random_drop_condition_probability to adjust the drop rates.
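The sketch below illustrates how these drop flags could act inside a training loop. The flag names follow train.sh, but the tensors and the zeroing strategy are assumptions for illustration; the repo's actual dropping logic may differ.

```python
# Illustrative sketch of probability-based condition / noise dropping.
import torch


def maybe_drop(condition, noise_aug, args):
    # Randomly drop the conditioning input (classifier-free-guidance-style dropping).
    if args.random_drop_condition and torch.rand(1).item() < args.random_drop_condition_probability:
        condition = torch.zeros_like(condition)
    # Randomly drop the extra noise augmentation.
    if args.random_drop_noise and torch.rand(1).item() < args.random_drop_noise_probability:
        noise_aug = torch.zeros_like(noise_aug)
    return condition, noise_aug
```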
The inference code can be found in flux_piplines in the UniTEX repo.
@article{liang2025UnitTEX,
title={UniTEX: Universal High Fidelity Generative Texturing for 3D Shapes},
author={Yixun Liang and Kunming Luo and Xiao Chen and Rui Chen and Hongyu Yan and Weiyu Li and Jiarui Liu and Ping Tan},
journal={arXiv preprint arXiv:2505.23253},
year={2025}
}
This work is built on many amazing research works and open-source projects; thanks a lot to all the authors for sharing!