[training] add Kontext i2i training #11858
Conversation
to_tensor = transforms.ToTensor()
normalize = transforms.Normalize([0.5], [0.5])
These should be initialized only once: all deterministic transformations should be created one time rather than on every call. To be addressed in a future PR.
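As a sketch of the suggestion, the transforms can be built once in the dataset's constructor and reused per sample. The lambdas below are stand-ins for `transforms.ToTensor()` and `transforms.Normalize([0.5], [0.5])` so the example runs without torchvision; the class name is hypothetical.

```python
class Preprocessor:
    def __init__(self):
        # Deterministic transforms are constructed once here, not on
        # every __getitem__ call. The lambdas are stand-ins for
        # torchvision's ToTensor() and Normalize([0.5], [0.5]).
        self.to_tensor = lambda px: [v / 255.0 for v in px]
        self.normalize = lambda px: [(v - 0.5) / 0.5 for v in px]

    def __call__(self, pixels):
        return self.normalize(self.to_tensor(pixels))

prep = Preprocessor()     # constructed one time
result = prep([0, 255])   # the same instance is reused for every sample
```

The same pattern applies directly in the training script: move the `transforms.*` constructors out of the per-sample code path.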
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs remain available until 30 days after the last update.
diffusers/examples/dreambooth/train_dreambooth_lora_flux_kontext.py
Lines 1380 to 1393 in bce55a9
Let's also add `proj_out` and `proj_mlp` here; it seems to improve results, and other trainers target these modules as well.
But `proj_out` will also include the final output layer, right? 👁️
So, maybe let's just add `proj_mlp` for now, given #11874?
Co-authored-by: Linoy Tsaban <[email protected]>
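To illustrate the concern: PEFT matches `target_modules` entries by suffix, so a bare `proj_out` entry would also catch the model's top-level output projection, not just the per-block ones. A minimal sketch of that matching behavior (the layer names below are illustrative assumptions, not taken from the actual FLUX model definition):

```python
def is_targeted(name, targets):
    # Approximation of peft's target_modules matching: a module is
    # targeted if its full name equals a target, or ends with ".<target>".
    return any(name == t or name.endswith("." + t) for t in targets)

layer_names = [
    "transformer_blocks.0.attn.to_q",
    "transformer_blocks.0.proj_mlp",
    "transformer_blocks.0.proj_out",
    "proj_out",  # hypothetical top-level final output projection
]

# Targeting "proj_out" matches the final output layer too, which is
# why the suggestion settles on adding only "proj_mlp" for now.
hits = [n for n in layer_names if is_targeted(n, ["proj_mlp", "proj_out"])]
```

Dropping `"proj_out"` from the target list leaves only the `proj_mlp` hit, keeping the final projection out of the LoRA.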
@bot /style
linoytsaban
left a comment
thanks @sayakpaul 🚀
Add a note on installing from commit `05e7a854d0a5661f5b433f6dd5954c224b104f0b`.
What does this PR do?
Test command:
I haven't fully finished it yet.
Additionally, I have taken the liberty of modifying our training script to precompute the text embeddings when `train_dataset.custom_instance_prompts` is set. These would be better named `custom_instruction_prompts`, IMO, so in a future PR we could switch to better variable names.
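The precomputation amounts to encoding each unique instruction prompt once before the training loop, instead of re-running the text encoder every step. A minimal sketch with a stub encoder (the helper name and counting encoder are hypothetical, not the script's actual API):

```python
calls = {"n": 0}

def encode_prompt(prompt):
    # Stub for the real text encoder; counts invocations so we can
    # verify each unique prompt is encoded exactly once.
    calls["n"] += 1
    return [float(ord(c)) for c in prompt]  # fake embedding

def precompute_text_embeddings(prompts):
    # Encode every unique custom instance/instruction prompt once,
    # up front; the training loop then looks embeddings up by prompt.
    cache = {}
    for p in prompts:
        if p not in cache:
            cache[p] = encode_prompt(p)
    return cache

prompts = ["make it snow", "make it snow", "add a hat"]
cache = precompute_text_embeddings(prompts)
```

With the embeddings cached, the text encoders can also be moved off the accelerator (or deleted) for the duration of training, which is the usual motivation for this change.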