This repository hosts the ComfyUI implementation of UNO (Unity and Novel Output), supporting FLUX models. The implementation includes several new features and optimizations: it can run the full version with 24GB VRAM, as well as quickly run the FP8 version.
You can also use this plugin and model for free online via RunningHub. Run & download this workflow: https://www.runninghub.ai/post/1910316871583789058
- Updated to match the author's latest version
- Support for `flux-dev-fp8` and `flux-schnell-fp8`
  - Note: `flux-schnell-fp8` offers lower consistency but much faster generation (4 steps)
- Support for memory optimization through block swapping
  - Run BF16 models on 24GB GPUs
  - Support for both `flux-dev` and `flux-schnell` in BF16 mode on 24GB GPUs
- Progress bar to display denoising progress in real-time
- Local model loading configured via `config.json`
Models are configured in the root `config.json` file. The expected default structure is:
```
ComfyUI/models
    flux/
        FLUX.1-schnell/        ### download from https://huggingface.co/black-forest-labs/FLUX.1-schnell
            text_encoder/
            tokenizer/
            text_encoder_2/
            tokenizer_2/
    unet/
        flux1-schnell.sft
        flux1-dev.sft
    vae/
        ae.safetensors
    UNO/
        dit_lora.safetensors
```
For T5 and CLIP models, there are two organization options:
- Single directory (XLabs-AI/xflux_text_encoders style):
  - Set `"t5-in-one": 1` or `"clip-in-one": 1`
  - Text encoder and tokenizer are in the same folder
- Official structure (separate directories):
  - Set `"t5-in-one": 0` or `"clip-in-one": 0`
  - Point `"t5"` or `"clip"` to the parent directory
  - Child directories must follow the official naming conventions:
    - CLIP: `text_encoder` and `tokenizer`
    - T5: `text_encoder_2` and `tokenizer_2`
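As a sketch only (the key names come from the options above; the path values are illustrative and assume the default layout), the official-structure case could be expressed in `config.json` like this:

```json
{
  "t5-in-one": 0,
  "t5": "ComfyUI/models/flux/FLUX.1-schnell",
  "clip-in-one": 0,
  "clip": "ComfyUI/models/flux/FLUX.1-schnell"
}
```

For the single-directory layout, you would instead set `"t5-in-one": 1` (or `"clip-in-one": 1`) and point `"t5"` (or `"clip"`) at the folder that contains both the text encoder and its tokenizer.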
- VAE:
  - Default: `comfyui/models/vae/`
  - Configure with `"vae_base"` in `config.json`
- FLUX Models:
  - Default: `comfyui/models/unet/`
  - Configure with `"model_base"` in `config.json`
  - Note: The author believes current FP8 FLUX models have issues, so BF16 models are used in both modes
- DIT-LoRA Models:
  - Default: `comfyui/models/UNO/`
  - Configure with `"lora_base"` in `config.json`
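These locations are likewise read from `config.json`. A minimal sketch using only the keys named above (path values are illustrative and match the default layout shown earlier):

```json
{
  "vae_base": "ComfyUI/models/vae",
  "model_base": "ComfyUI/models/unet",
  "lora_base": "ComfyUI/models/UNO"
}
```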
Thanks to the original author. Visit the official repository at: https://github.com/bytedance/UNO
