107 Commits (52cc83d36b7663a77b79fd2258d2ca871af73e55)

Author SHA1 Message Date
flamelaw 1bd57cc979 last_layer_dropout default to False 3 years ago
flamelaw d2c97fc3fe fix dropout, implement train/eval mode 3 years ago
flamelaw 89d8ecff09 small fixes 3 years ago
flamelaw 5b57f61ba4 fix pin_memory with different latent sampling method 3 years ago
flamelaw bd68e35de3 Gradient accumulation, autocast fix, new latent sampling method, etc 3 years ago
AUTOMATIC cdc8020d13 change StableDiffusionProcessing to internally use sampler name instead of sampler index 3 years ago
AUTOMATIC 62e3d71aa7 rework the code to not use the walrus operator because colab's 3.7 does not support it 3 years ago
AUTOMATIC1111 cb84a304f0 Merge pull request #4273 from Omegastick/ordered_hypernetworks (Sort hypernetworks list) 3 years ago
Isaac Poulton 08feb4c364 Sort straight out of the glob 3 years ago
Isaac Poulton fd62727893 Sort hypernetworks 3 years ago
aria1th 1ca0bcd3a7 only save if option is enabled 3 years ago
aria1th f5d394214d split before declaring file name 3 years ago
aria1th 283249d239 apply 3 years ago
AngelBottomless 179702adc4 Merge branch 'AUTOMATIC1111:master' into force-push-patch-13 3 years ago
AngelBottomless 0d07cbfa15 I blame code autocomplete 3 years ago
aria1th 0abb39f461 resolve conflict - first revert 3 years ago
AUTOMATIC1111 4918eb6ce4 Merge branch 'master' into hn-activation 3 years ago
aria1th 1764ac3c8b use hash to check valid optim 3 years ago
aria1th 0b143c1163 Separate .optim file from model 3 years ago
aria1th 9d96d7d0a0 resolve conflicts 3 years ago
AngelBottomless 20194fd975 We have duplicate linear now 3 years ago
AUTOMATIC1111 17a2076f72 Merge pull request #3928 from R-N/validate-before-load (Optimize training a little) 3 years ago
Muhammad Rizqi Nur 3d58510f21 Fix dataset still being loaded even when training will be skipped 3 years ago
Muhammad Rizqi Nur a07f054c86 Add missing info on hypernetwork/embedding model log 3 years ago
    Mentioned here: https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/1528#discussioncomment-3991513
    Also group the saving into one
Muhammad Rizqi Nur ab05a74ead Revert "Add cleanup after training" 3 years ago
    This reverts commit 3ce2bfdf95.
Muhammad Rizqi Nur 3ce2bfdf95 Add cleanup after training 3 years ago
Muhammad Rizqi Nur ab27c111d0 Add input validations before loading dataset for training 3 years ago
timntorres e98f72be33 Merge branch 'AUTOMATIC1111:master' into 3825-save-hypernet-strength-to-info 3 years ago
AUTOMATIC1111 810e6a407d Merge pull request #3858 from R-N/log-csv (Fix log off by 1 #3847) 3 years ago
AUTOMATIC1111 d3b4b9d7ec Merge pull request #3717 from benkyoujouzu/master (Add missing support for linear activation in hypernetwork) 3 years ago
AngelBottomless f361e804eb Re enable linear 3 years ago
Muhammad Rizqi Nur 9ceef81f77 Fix log off by 1 3 years ago
timntorres db5a354c48 Always ignore "None.pt" in the hypernet directory. 3 years ago
benkyoujouzu b2a8b263b2 Add missing support for linear activation in hypernetwork 3 years ago
AngelBottomless 462e6ba667 Disable unavailable or duplicate options 3 years ago
AngelBottomless 029d7c7543 Revert unresolved changes in Bias initialization 3 years ago
    it should be zeros_ or parameterized in future properly.
guaneec cc56df996e Fix dropout logic 3 years ago
AngelBottomless 85fcccc105 Squashed commit of fixing dropout silently 3 years ago
    fix dropouts for future hypernetworks
    add kwargs for Hypernetwork class
    hypernet UI for gradio input
    add recommended options
    remove as options
    revert adding options in ui
guaneec b6a8bb123b Fix merge 3 years ago
timntorres a524d137d0 patch bug (SeverianVoid's comment on 5245c7a) 3 years ago
guaneec 91bb35b1e6 Merge fix 3 years ago
guaneec 649d79a8ec Merge branch 'master' into hn-activation 3 years ago
guaneec 877d94f97c Back compatibility 3 years ago
AngelBottomless 7207e3bf49 remove duplicate keys and lowercase 3 years ago
AngelBottomless de096d0ce7 Weight initialization and More activation func 3 years ago
    add weight init
    add weight init option in create_hypernetwork
    fstringify hypernet info
    save weight initialization info for further debugging
    fill bias with zero for He/Xavier
    initialize LayerNorm with Normal
    fix loading weight_init
guaneec c702d4d0df Fix off-by-one 3 years ago
guaneec 2f4c91894d Remove activation from final layer of HNs 3 years ago
AngelBottomless e9a410b535 check length for variance 3 years ago
AngelBottomless 0d2e1dac40 convert deque -> list 3 years ago
    I don't feel this being efficient
AngelBottomless 348f89c8d4 statistics for pbar 3 years ago