de096d0ce7  Weight initialization and More activation func ...  [AngelBottomless, 3 years ago]
    - add weight init
    - add weight init option in create_hypernetwork
    - fstringify hypernet info
    - save weight initialization info for further debugging
    - fill bias with zero for He/Xavier
    - initialize LayerNorm with Normal
    - fix loading weight_init
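For context on what these initialization options amount to, here is a minimal PyTorch sketch, assuming nn.Linear and nn.LayerNorm modules; the option strings, standard deviations, and the exact LayerNorm treatment are illustrative assumptions, not the project's actual code.

```python
import torch.nn as nn

# Hedged sketch: per-module weight-init choices for a hypernetwork.
# Option names and constants are assumptions for illustration only.
def init_weights(module, weight_init="Normal"):
    if isinstance(module, nn.Linear):
        if weight_init == "Normal":
            nn.init.normal_(module.weight, mean=0.0, std=0.01)
            nn.init.normal_(module.bias, mean=0.0, std=0.005)
        elif weight_init == "XavierUniform":
            nn.init.xavier_uniform_(module.weight)
            nn.init.zeros_(module.bias)   # fill bias with zero for Xavier
        elif weight_init == "KaimingUniform":   # i.e. He initialization
            nn.init.kaiming_uniform_(module.weight, nonlinearity="relu")
            nn.init.zeros_(module.bias)   # fill bias with zero for He
    elif isinstance(module, nn.LayerNorm):
        # initialize LayerNorm parameters from a normal distribution
        nn.init.normal_(module.weight, mean=1.0, std=0.01)
        nn.init.zeros_(module.bias)
```

A layer stack could then be initialized with something like `net.apply(lambda m: init_weights(m, chosen_option))`, since `Module.apply` visits every submodule.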
dcb45dfecf  Merge branch 'master' of upstream  [discus0434, 3 years ago]
0e8ca8e7af  add dropout  [discus0434, 3 years ago]
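A rough sketch of where dropout can sit in a hypernetwork's layer stack; the width multipliers, ReLU activation, and dropout probability below are assumptions for illustration, not the commit's exact behaviour.

```python
import torch.nn as nn

# Hedged sketch: an MLP built from width multipliers, with dropout after each
# hidden activation. All constants are illustrative.
def build_hypernetwork_mlp(dim, layer_structure=(1, 2, 1), use_dropout=True, p=0.3):
    widths = [int(dim * m) for m in layer_structure]
    layers = []
    for i in range(len(widths) - 1):
        layers.append(nn.Linear(widths[i], widths[i + 1]))
        if i < len(widths) - 2:   # no activation or dropout on the output layer
            layers.append(nn.ReLU())
            if use_dropout:
                layers.append(nn.Dropout(p=p))
    return nn.Sequential(*layers)

# Example: dim=768 gives Linear(768, 1536) -> ReLU -> Dropout -> Linear(1536, 768)
net = build_hypernetwork_mlp(768)
```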
51e3dc9cca  Sanitize hypernet name input.  [timntorres, 3 years ago]
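A hypothetical example of the kind of sanitization this refers to, assuming the hypernetwork name ends up in a saved filename; the helper name and allowed character set are assumptions.

```python
import re

# Hypothetical sanitizer: replace characters that are unsafe in filenames,
# since the hypernetwork name is typically used to name the saved checkpoint.
def sanitize_name(name: str) -> str:
    return re.sub(r"[^\w .()\[\]-]+", "_", name).strip()
```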
0c5522ea21  Merge branch 'master' into training-help-text  [AUTOMATIC1111, 3 years ago]
6b38c2c19c  Merge branch 'AUTOMATIC1111:master' into master  [discus0434, 3 years ago]
930b4c64f7  allow float sizes for hypernet's layer_structure  [AUTOMATIC, 3 years ago]
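To show what float sizes for layer_structure mean in practice, here is a hypothetical parser; the helper name and the start/end-with-1 sanity check are assumptions.

```python
# Hypothetical helper: parse a user-supplied layer_structure string into float
# width multipliers, e.g. "1, 1.5, 1" -> [1.0, 1.5, 1.0].
def parse_layer_structure(text: str):
    multipliers = [float(part) for part in text.split(",") if part.strip()]
    # Assumed sanity check: the structure begins and ends with a multiplier of 1
    if not multipliers or multipliers[0] != 1.0 or multipliers[-1] != 1.0:
        raise ValueError("layer_structure must start and end with 1")
    return multipliers
```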
6f98e89486  update  [discus0434, 3 years ago]
166be3919b  allow overwriting an old hypernetwork  [DepFA, 3 years ago]
3770b8d2fa  let the user specify the hypernetwork's layer structure themselves  [discus0434, 3 years ago]
42fbda83bb  move layer options into the create-hypernetwork UI  [discus0434, 3 years ago]
6be32b31d1  reports that training with medvram is possible.  [AUTOMATIC, 3 years ago]
d4ea5f4d86  add an option to unload models during hypernetwork training to save VRAM  [AUTOMATIC, 3 years ago]
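A hedged sketch of the unload idea: move the parts of the Stable Diffusion model that sit idle between training steps to the CPU so only the actively trained pieces occupy VRAM. The attribute names below follow the common latent-diffusion layout (first_stage_model for the VAE, cond_stage_model for the text encoder) and are assumptions, not the commit's actual code.

```python
import torch

# Hedged sketch: park idle submodels on the CPU while the hypernetwork trains,
# and move them back to the GPU when the option is disabled.
def set_idle_parts_device(sd_model, unload: bool, gpu: str = "cuda"):
    device = torch.device("cpu") if unload else torch.device(gpu)
    sd_model.first_stage_model.to(device)   # VAE (assumed attribute name)
    sd_model.cond_stage_model.to(device)    # text encoder (assumed attribute name)
```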
6d09b8d1df  produce error when training with medvram/lowvram enabled  [AUTOMATIC, 3 years ago]
d682444ecc  add option to select hypernetwork modules when creating  [AUTOMATIC, 3 years ago]
b0583be088  more renames  [AUTOMATIC, 3 years ago]
873efeed49  rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have  [AUTOMATIC, 3 years ago]