54 Commits (f3f2ffd448bae76c0f731ecd96550a1aecf24ea9)

Author SHA1 Message Date
timntorres a524d137d0 patch bug (SeverianVoid's comment on 5245c7a) 3 years ago
AngelBottomless 7207e3bf49 remove duplicate keys and lowercase 3 years ago
AngelBottomless de096d0ce7 Weight initialization and more activation functions
    add weight init option in create_hypernetwork
    fstringify hypernet info
    save weight initialization info for further debugging
    fill bias with zero for He/Xavier
    initialize LayerNorm with Normal
    fix loading weight_init
3 years ago
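The initialization scheme described in de096d0ce7 (He/Xavier weights, bias filled with zero) can be sketched roughly as below. This is an illustrative NumPy version with made-up names, not the repo's actual code, which uses PyTorch's `torch.nn.init` routines:

```python
import numpy as np

def init_weight(fan_in, fan_out, method="kaimingnormal", rng=None):
    """Sketch of He (Kaiming) or Xavier (Glorot) initialization for one
    linear layer; returns (weight, bias). Names are hypothetical."""
    rng = np.random.default_rng(rng)
    if method == "kaimingnormal":      # He: suited to ReLU-family activations
        w = rng.normal(0.0, np.sqrt(2.0 / fan_in), (fan_out, fan_in))
    elif method == "xavieruniform":    # Glorot: suited to tanh/sigmoid
        limit = np.sqrt(6.0 / (fan_in + fan_out))
        w = rng.uniform(-limit, limit, (fan_out, fan_in))
    else:                              # fallback: small random normal
        w = rng.normal(0.0, 0.01, (fan_out, fan_in))
    b = np.zeros(fan_out)              # "fill bias with zero for He/Xavier"
    return w, b
```

Zeroing the bias keeps the pre-activation distribution centered, which is the assumption the He/Xavier variance formulas rely on.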
AngelBottomless e9a410b535 check length for variance 3 years ago
AngelBottomless 0d2e1dac40 convert deque -> list
    the deque didn't feel efficient here
3 years ago
AngelBottomless 348f89c8d4 statistics for pbar 3 years ago
AngelBottomless 40b56c9289 cleanup some code 3 years ago
AngelBottomless b297cc3324 Hypernetworks - fix KeyError in statistics caching
    Statistics logging changed to {filename: list[losses]}, so it has to use loss_info[key].pop()
3 years ago
DepFA 1fbfc052eb Update hypernetwork.py 3 years ago
AngelBottomless 48dbf99e84 Allow tracking real-time loss
    Someone had 6000 images in their dataset, and the loss was shown as 0, which was confusing.
    This allows tracking the real-time dataset-average loss for registered objects.
3 years ago
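The statistics caching these commits describe (a plain dict of {filename: list[losses]}, bounded lists instead of deques, and a running dataset-average loss) can be sketched as follows; the function names are illustrative, not the repo's actual API:

```python
def log_statistics(loss_dict, filename, loss, maxlen=1024):
    """Append a loss sample for one image; a plain list kept bounded
    with pop(0), per the deque -> list conversion above."""
    losses = loss_dict.setdefault(filename, [])   # avoids the KeyError fix
    losses.append(loss)
    if len(losses) > maxlen:
        losses.pop(0)

def dataset_average_loss(loss_dict):
    """Real-time average loss over every registered image's samples."""
    all_losses = [l for losses in loss_dict.values() for l in losses]
    return sum(all_losses) / len(all_losses) if all_losses else 0.0
```

`setdefault` sidesteps the KeyError on first sight of a filename, and averaging over all cached samples gives the "real-time dataset-average loss" shown in the progress bar.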
AngelBottomless 24694e5983 Update hypernetwork.py 3 years ago
discus0434 6a4fa73a38 small fix 3 years ago
discus0434 97749b7c7d Merge branch 'AUTOMATIC1111:master' into master 3 years ago
discus0434 7912acef72 small fix 3 years ago
discus0434 fccba4729d add an option to avoid dying relu 3 years ago
AUTOMATIC 7fd90128eb added a guard for hypernet training that will stop early if weights are getting no gradients 3 years ago
discus0434 dcb45dfecf Merge branch 'master' of upstream 3 years ago
discus0434 0e8ca8e7af add dropout 3 years ago
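The "avoid dying relu" option (fccba4729d) and the dropout commit above both concern the activations between hypernetwork layers. A minimal NumPy sketch of the two ideas, assuming a leaky-ReLU-style fix and standard inverted dropout (the repo's actual implementation uses PyTorch modules):

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    """Keeps a small gradient for x < 0, so units can't 'die' the way
    plain ReLU units do when they get stuck outputting zero."""
    return np.where(x > 0, x, negative_slope * x)

def dropout(x, p=0.3, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p during
    training and rescale survivors by 1/(1-p); identity at inference."""
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng(rng)
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)
```

The 1/(1-p) rescaling keeps the expected activation magnitude the same in training and inference, so no weight adjustment is needed when dropout is switched off.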
timntorres 272fa527bb Remove unused variable. 3 years ago
timntorres 19818f023c Match hypernet name with filename in all cases. 3 years ago
AUTOMATIC 03a1e288c4 turns out LayerNorm also has weight and bias and needs to be pre-multiplied and trained for hypernets 3 years ago
AUTOMATIC1111 0c5522ea21 Merge branch 'master' into training-help-text 3 years ago
timntorres 4ff274e1e3 Revise comments. 3 years ago
timntorres 5245c7a493 Issue #2921: Give PNG info to Hypernet previews. 3 years ago
AUTOMATIC c23f666dba a more strict check for activation type and a more reasonable check for type of layer in hypernets 3 years ago
aria1th f89829ec3a Revert "fix bugs and optimizations"
    This reverts commit 108be15500.
3 years ago
AngelBottomless 108be15500 fix bugs and optimizations 3 years ago
AngelBottomless a71e021236 only linear 3 years ago
AngelBottomless d8acd34f66 generalized some functions and added an option for ignoring the first layer 3 years ago
discus0434 6f98e89486 update 3 years ago
DepFA d6ea584137 change html output 3 years ago
discus0434 2ce52d32e4 fix for #3086 failing to load any previous hypernet 3 years ago
discus0434 42fbda83bb layer options moved into the create-hypernetwork UI 3 years ago
discus0434 7f8670c4ef Merge branch 'master' into master 3 years ago
Silent da72becb13 Use training width/height when training hypernetworks. 3 years ago
discus0434 e40ba281f1 update 3 years ago
discus0434 a5611ea502 update 3 years ago
discus0434 6021f7a75f add options to custom hypernetwork layer structure 3 years ago
AngelBottomless 703e6d9e4e check NaN for hypernetwork tuning 3 years ago
AUTOMATIC c7a86f7fe9 add option to use batch size for training 3 years ago
AUTOMATIC 03d62538ae remove duplicate code for log loss, add step, make it read from options rather than gradio input 3 years ago
AUTOMATIC 326fe7d44b Merge remote-tracking branch 'Melanpan/master' 3 years ago
AUTOMATIC c344ba3b32 add option to read generation params for learning previews from txt2img 3 years ago
AUTOMATIC 354ef0da3b add hypernetwork multipliers 3 years ago
Melan 8636b50aea Add learn_rate to csv and removed a left-over debug statement 3 years ago
Melan 1cfc2a1898 Save a csv containing the loss while training 3 years ago
AUTOMATIC c3c8eef9fd train: change filename processing to be simpler and configurable
    train: make it possible to make text files with prompts
    train: rework scheduler so that there's less repeating code in textual inversion and hypernets
    train: move epochs setting to options
3 years ago
AUTOMATIC ee015a1af6 change textual inversion tab to train
    remake train interface to use tabs
3 years ago
Milly 2d006ce16c xy_grid: Find hypernetwork by closest name 3 years ago
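Finding a hypernetwork "by closest name", as in Milly's xy_grid commit, could be approximated with the stdlib's `difflib`. This is an assumption about the matching strategy; the repo's actual lookup logic may differ:

```python
import difflib

def find_hypernetwork(name, available):
    """Return an exact match if present, otherwise the closest-named
    hypernetwork above a similarity cutoff, or None. Illustrative only."""
    if name in available:
        return name
    matches = difflib.get_close_matches(name, available, n=1, cutoff=0.6)
    return matches[0] if matches else None
```

Fuzzy lookup like this lets an X/Y grid axis value such as a shortened name still resolve to the intended hypernetwork instead of failing outright.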
AUTOMATIC d6fcc6b87b apply lr schedule to hypernets 3 years ago