Author | Commit | Message | Date
AUTOMATIC | 79e39fae61 | CLIP hijack rework | 3 years ago
AUTOMATIC | 683287d87f | rework saving training params to file #6372 | 3 years ago
AUTOMATIC1111 | 88e01b237e | Merge pull request #6372 from timntorres/save-ti-hypernet-settings-to-txt-revised: Save hypernet and textual inversion settings to text file, revised. | 3 years ago
Faber | 81133d4168 | allow loading embeddings from subdirectories | 3 years ago
Kuma | fda04e620d | typo in TI | 3 years ago
timntorres | b6bab2f052 | Include model in log file. Exclude directory. | 3 years ago
timntorres | b85c2b5cf4 | Clean up ti, add same behavior to hypernetwork. | 3 years ago
timntorres | eea8fc40e1 | Add option to save ti settings to file. | 3 years ago
AUTOMATIC1111 | eeb1de4388 | Merge branch 'master' into gradient-clipping | 3 years ago
AUTOMATIC | 525cea9245 | use shared function from processing for creating dummy mask when training inpainting model | 3 years ago
AUTOMATIC | 184e670126 | fix the merge | 3 years ago
AUTOMATIC1111 | da5c1e8a73 | Merge branch 'master' into inpaint_textual_inversion | 3 years ago
AUTOMATIC1111 | 7bbd984dda | Merge pull request #6253 from Shondoit/ti-optim: Save Optimizer next to TI embedding | 3 years ago
Vladimir Mandic | 192ddc04d6 | add job info to modules | 3 years ago
Shondoit | bddebe09ed | Save Optimizer next to TI embedding. Also add check to load only .PT and .BIN files as embeddings (since we add .optim files in the same directory). | 3 years ago
Philpax | c65909ad16 | feat(api): return more data for embeddings | 3 years ago
AUTOMATIC | 311354c0bb | fix the issue with training on SD2.0 | 3 years ago
AUTOMATIC | bdbe09827b | changed embedding accepted shape detection to use existing code and support the new alt-diffusion model, and reformatted messages a bit #6149 | 3 years ago
Vladimir Mandic | f55ac33d44 | validate textual inversion embeddings | 3 years ago
Yuval Aboulafia | 3bf5591efe | fix F541 f-string without any placeholders | 3 years ago
Jim Hays | c0355caefe | Fix various typos | 3 years ago
AUTOMATIC1111 | c9a2cfdf2a | Merge branch 'master' into racecond_fix | 3 years ago
AUTOMATIC1111 | a2feaa95fc | Merge pull request #5194 from brkirch/autocast-and-mps-randn-fixes: Use devices.autocast() and fix MPS randn issues | 3 years ago
PhytoEpidemic | 119a945ef7 | Fix divide by 0 error: fix for the edge case of 0 weight that occasionally pops up in some specific situations; this was crashing the script. | 3 years ago
brkirch | 4d5f1691dd | Use devices.autocast instead of torch.autocast | 3 years ago
AUTOMATIC1111 | 39827a3998 | Merge pull request #4688 from parasi22/resolve-embedding-name-in-filewords: resolve [name] after resolving [filewords] in training | 3 years ago
AUTOMATIC | b48b7999c8 | Merge remote-tracking branch 'flamelaw/master' | 3 years ago
flamelaw | 755df94b2a | set TI AdamW default weight decay to 0 | 3 years ago
AUTOMATIC | ce6911158b | Add support Stable Diffusion 2.0 | 3 years ago
flamelaw | 89d8ecff09 | small fixes | 3 years ago
flamelaw | 5b57f61ba4 | fix pin_memory with different latent sampling method | 3 years ago
AUTOMATIC | c81d440d87 | moved deepdanbooru to pure pytorch implementation | 3 years ago
flamelaw | 2d22d72cda | fix random sampling with pin_memory | 3 years ago
flamelaw | a4a5735d0a | remove unnecessary comment | 3 years ago
flamelaw | bd68e35de3 | Gradient accumulation, autocast fix, new latent sampling method, etc | 3 years ago
AUTOMATIC1111 | 89daf778fb | Merge pull request #4812 from space-nuko/feature/interrupt-preprocessing: Add interrupt button to preprocessing | 3 years ago
AUTOMATIC | cdc8020d13 | change StableDiffusionProcessing to internally use sampler name instead of sampler index | 3 years ago
space-nuko | c8c40c8a64 | Add interrupt button to preprocessing | 3 years ago
parasi | 9a1aff645a | resolve [name] after resolving [filewords] in training | 3 years ago
AUTOMATIC1111 | 73776907ec | Merge pull request #4117 from TinkTheBoush/master: Adding optional tag shuffling for training | 3 years ago
KyuSeok Jung | a1e271207d | Update dataset.py | 3 years ago
KyuSeok Jung | b19af67d29 | Update dataset.py | 3 years ago
KyuSeok Jung | 13a2f1dca3 | adding tag drop out option | 3 years ago
Muhammad Rizqi Nur | d85c2cb2d5 | Merge branch 'master' into gradient-clipping | 3 years ago
AUTOMATIC | 8011be33c3 | move functions out of main body for image preprocessing for easier hijacking | 3 years ago
Muhammad Rizqi Nur | bb832d7725 | Simplify grad clip | 3 years ago
TinkTheBoush | 821e2b883d | change option position to Training setting | 3 years ago
Fampai | 39541d7725 | Fixes race condition in training when VAE is unloaded: set_current_image can attempt to use the VAE when it is unloaded to the CPU while training | 3 years ago
Muhammad Rizqi Nur | 237e79c77d | Merge branch 'master' into gradient-clipping | 3 years ago
KyuSeok Jung | af6fba2475 | Merge branch 'master' into master | 3 years ago