Muhammad Rizqi Nur
4123be632a
Fix merge conflicts
3 years ago
Muhammad Rizqi Nur
cd4d59c0de
Merge master
3 years ago
AUTOMATIC1111
17a2076f72
Merge pull request #3928 from R-N/validate-before-load
Optimize training a little
3 years ago
Muhammad Rizqi Nur
3d58510f21
Fix dataset still being loaded even when training will be skipped
3 years ago
Muhammad Rizqi Nur
a07f054c86
Add missing info on hypernetwork/embedding model log
Mentioned here: https://github.com/AUTOMATIC1111/stable-diffusion-webui/discussions/1528#discussioncomment-3991513
Also group the saving into one
3 years ago
Muhammad Rizqi Nur
ab05a74ead
Revert "Add cleanup after training"
This reverts commit 3ce2bfdf95.
3 years ago
Muhammad Rizqi Nur
3ce2bfdf95
Add cleanup after training
3 years ago
Muhammad Rizqi Nur
ab27c111d0
Add input validations before loading dataset for training
3 years ago
Muhammad Rizqi Nur
05e2e40537
Merge branch 'master' into gradient-clipping
3 years ago
timntorres
e98f72be33
Merge branch 'AUTOMATIC1111:master' into 3825-save-hypernet-strength-to-info
3 years ago
AUTOMATIC1111
810e6a407d
Merge pull request #3858 from R-N/log-csv
Fix log off by 1 #3847
3 years ago
Muhammad Rizqi Nur
9ceef81f77
Fix log off by 1
3 years ago
Muhammad Rizqi Nur
16451ca573
Learning rate sched syntax support for grad clipping
3 years ago
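The schedule syntax referenced in this commit follows the learning-rate style used elsewhere in the training code ("value:step, value:step, ..."). Below is a minimal sketch of how such a string could drive the clipping value at a given step; parse_schedule and value_at_step are hypothetical helpers for illustration, not the repo's actual scheduler class.

```python
# Hypothetical sketch: reuse a learning-rate-style schedule string
# ("value:step, value:step, ...") to pick the gradient-clipping value
# for the current step, then apply it with torch's built-in clipper.
import torch
import torch.nn as nn

def parse_schedule(text):
    """Parse 'value:step, value:step' into [(value, end_step), ...]."""
    pairs = []
    for chunk in text.split(","):
        value, _, step = chunk.partition(":")
        pairs.append((float(value), int(step) if step.strip() else float("inf")))
    return pairs

def value_at_step(schedule, step):
    for value, end_step in schedule:
        if step < end_step:
            return value
    return schedule[-1][0]

model = nn.Linear(8, 8)
schedule = parse_schedule("0.1:500, 0.05:1000, 0.01")

loss = model(torch.randn(4, 8)).sum()
loss.backward()

clip_value = value_at_step(schedule, step=750)                  # -> 0.05
torch.nn.utils.clip_grad_norm_(model.parameters(), clip_value)  # or clip_grad_value_
```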
timntorres
db5a354c48
Always ignore "None.pt" in the hypernet directory.
3 years ago
benkyoujouzu
b2a8b263b2
Add missing support for linear activation in hypernetwork
3 years ago
Muhammad Rizqi Nur
2a25729623
Gradient clipping in train tab
3 years ago
AngelBottomless
029d7c7543
Revert unresolved changes in Bias initialization
it should be zeros_ or parameterized in future properly.
3 years ago
guaneec
cc56df996e
Fix dropout logic
3 years ago
AngelBottomless
85fcccc105
Squashed commit of fixing dropout silently
fix dropouts for future hypernetworks
add kwargs for Hypernetwork class
hypernet UI for gradio input
add recommended options
remove as options
revert adding options in ui
3 years ago
guaneec
b6a8bb123b
Fix merge
3 years ago
timntorres
a524d137d0
patch bug (SeverianVoid's comment on 5245c7a)
3 years ago
guaneec
91bb35b1e6
Merge fix
3 years ago
guaneec
649d79a8ec
Merge branch 'master' into hn-activation
3 years ago
guaneec
877d94f97c
Back compatibility
3 years ago
AngelBottomless
7207e3bf49
remove duplicate keys and lowercase
3 years ago
AngelBottomless
de096d0ce7
Weight initialization and More activation func
add weight init
add weight init option in create_hypernetwork
fstringify hypernet info
save weight initialization info for further debugging
fill bias with zero for He/Xavier
initialize LayerNorm with Normal
fix loading weight_init
3 years ago
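A rough illustration of the initialization choices listed in the commit body: He (Kaiming) or Xavier for Linear weights with the bias filled with zero, and a Normal distribution for LayerNorm parameters. The apply_weight_init name and the exact standard deviations are assumptions, not the repo's code.

```python
# Sketch of He/Xavier weight init with zero bias, and Normal init for LayerNorm.
import torch.nn as nn

def apply_weight_init(module, weight_init="KaimingNormal"):
    if isinstance(module, nn.Linear):
        if weight_init == "KaimingNormal":       # "He" initialization
            nn.init.kaiming_normal_(module.weight)
        elif weight_init == "XavierNormal":
            nn.init.xavier_normal_(module.weight)
        else:
            nn.init.normal_(module.weight, mean=0.0, std=0.01)
        nn.init.zeros_(module.bias)              # bias filled with zero for He/Xavier
    elif isinstance(module, nn.LayerNorm):
        nn.init.normal_(module.weight, mean=1.0, std=0.02)  # LayerNorm initialized with Normal
        nn.init.zeros_(module.bias)

layers = nn.Sequential(nn.Linear(768, 320), nn.ReLU(), nn.LayerNorm(320), nn.Linear(320, 768))
layers.apply(apply_weight_init)
```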
guaneec
c702d4d0df
Fix off-by-one
3 years ago
guaneec
2f4c91894d
Remove activation from final layer of HNs
3 years ago
Melan
18f86e41f6
Removed two unused imports
3 years ago
AngelBottomless
e9a410b535
check length for variance
3 years ago
AngelBottomless
0d2e1dac40
convert deque -> list
I don't feel this is efficient
3 years ago
AngelBottomless
348f89c8d4
statistics for pbar
3 years ago
AngelBottomless
40b56c9289
cleanup some code
3 years ago
AngelBottomless
b297cc3324
Hypernetworks - fix KeyError in statistics caching
Statistics logging has changed to {filename : list[losses]}, so it has to use loss_info[key].pop()
3 years ago
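A small sketch of the {filename: list[losses]} bookkeeping the commit body describes, with an averaged readout of the kind a progress bar could show; the helper names are illustrative only, not the repo's actual functions.

```python
# Losses cached per file as {filename: [loss, loss, ...]}, so per-entry access
# goes through loss_info[filename] rather than a single flat deque.
from collections import defaultdict
from statistics import mean, stdev

loss_info = defaultdict(list)   # {filename: [loss, loss, ...]}

def log_loss(filename, loss):
    loss_info[filename].append(loss)

def dataset_loss_summary():
    """Average the most recent loss of every file for a progress-bar readout."""
    recent = [losses[-1] for losses in loss_info.values() if losses]
    if len(recent) < 2:
        return (recent[0] if recent else 0.0), 0.0
    return mean(recent), stdev(recent)

log_loss("0001.png", 0.12)
log_loss("0001.png", 0.10)
log_loss("0002.png", 0.09)
print(dataset_loss_summary())   # e.g. (0.095, ~0.007)
```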
DepFA
1fbfc052eb
Update hypernetwork.py
3 years ago
AngelBottomless
48dbf99e84
Allow tracking real-time loss
Someone had 6000 images in their dataset, and it was shown as 0, which was confusing.
This will allow tracking the real-time dataset-average loss for registered objects.
3 years ago
AngelBottomless
24694e5983
Update hypernetwork.py
3 years ago
discus0434
6a4fa73a38
small fix
3 years ago
discus0434
97749b7c7d
Merge branch 'AUTOMATIC1111:master' into master
3 years ago
discus0434
7912acef72
small fix
3 years ago
discus0434
fccba4729d
add an option to avoid dying relu
3 years ago
AUTOMATIC
7fd90128eb
added a guard for hypernet training that will stop early if weights are getting no gradients
3 years ago
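A hedged sketch of such a guard, assuming the trained weights are available as a flat list after backward(); the helper name and error message are invented for illustration.

```python
# If none of the trained hypernetwork weights received a gradient after
# backward(), abort instead of silently training nothing.
import torch
import torch.nn as nn

def assert_weights_have_grad(weights):
    if all(w.grad is None for w in weights):
        raise RuntimeError(
            "Trained weights received no gradients; check that the hypernetwork "
            "layers are actually part of the forward pass."
        )

model = nn.Linear(16, 16)
weights = list(model.parameters())

loss = model(torch.randn(2, 16)).mean()
loss.backward()
assert_weights_have_grad(weights)   # passes here; raises if no .grad was populated
```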
discus0434
dcb45dfecf
Merge branch 'master' of upstream
3 years ago
discus0434
0e8ca8e7af
add dropout
3 years ago
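A sketch of inserting dropout between the hidden layers of a hypernetwork-style MLP, with no activation or dropout after the final layer (consistent with the "Remove activation from final layer of HNs" commit above). build_mlp, layer_structure, and dropout_p are illustrative names, not the actual API.

```python
# Dropout inserted after each hidden activation, but never after the last layer.
import torch.nn as nn

def build_mlp(dim, layer_structure=(1, 2, 1), use_dropout=True, dropout_p=0.3):
    layers = []
    sizes = [int(dim * m) for m in layer_structure]
    for i in range(len(sizes) - 1):
        layers.append(nn.Linear(sizes[i], sizes[i + 1]))
        if i < len(sizes) - 2:              # skip activation/dropout on the final layer
            layers.append(nn.ReLU())
            if use_dropout:
                layers.append(nn.Dropout(p=dropout_p))
    return nn.Sequential(*layers)

print(build_mlp(320))
```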
timntorres
272fa527bb
Remove unused variable.
3 years ago
timntorres
19818f023c
Match hypernet name with filename in all cases.
3 years ago
AUTOMATIC
03a1e288c4
turns out LayerNorm also has weight and bias and needs to be pre-multiplied and trained for hypernets
3 years ago
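A sketch of what it means for LayerNorm to "also have weight and bias": those parameters have to be collected for the optimizer alongside the Linear ones, and scaled when a strength multiplier is applied. The helper names and the multiplier value are assumptions for illustration.

```python
# Collect trainable parameters from both Linear and LayerNorm layers, and
# include both when scaling the hypernetwork's weights by a multiplier.
import torch
import torch.nn as nn

layers = nn.Sequential(nn.Linear(320, 640), nn.LayerNorm(640), nn.Linear(640, 320))

def trainables(module):
    params = []
    for layer in module:
        if isinstance(layer, (nn.Linear, nn.LayerNorm)):   # LayerNorm included
            params += [layer.weight, layer.bias]
    return params

def apply_strength(module, multiplier=0.8):
    with torch.no_grad():
        for layer in module:
            if isinstance(layer, (nn.Linear, nn.LayerNorm)):
                layer.weight.mul_(multiplier)
                layer.bias.mul_(multiplier)

optimizer = torch.optim.AdamW(trainables(layers), lr=5e-3)
apply_strength(layers, multiplier=0.8)
```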
AUTOMATIC1111
0c5522ea21
Merge branch 'master' into training-help-text
3 years ago
timntorres
4ff274e1e3
Revise comments.
3 years ago
timntorres
5245c7a493
Issue #2921: Give PNG info to Hypernet previews.
3 years ago
AUTOMATIC
c23f666dba
a more strict check for activation type and a more reasonable check for type of layer in hypernets
3 years ago
Melan
7543cf5e3b
Fixed some typos in the code
3 years ago
Melan
8f59129847
Some changes to the tensorboard code and hypernetwork support
3 years ago
aria1th
f89829ec3a
Revert "fix bugs and optimizations"
This reverts commit 108be15500.
3 years ago
AngelBottomless
108be15500
fix bugs and optimizations
3 years ago
AngelBottomless
a71e021236
only linear
3 years ago
AngelBottomless
d8acd34f66
generalized some functions and option for ignoring first layer
3 years ago
discus0434
6f98e89486
update
3 years ago
DepFA
d6ea584137
change html output
3 years ago
discus0434
2ce52d32e4
fix for #3086 failing to load any previous hypernet
3 years ago
AUTOMATIC
c6e9fed500
fix for #3086 failing to load any previous hypernet
3 years ago
discus0434
42fbda83bb
layer options moves into create hnet ui
3 years ago
discus0434
7f8670c4ef
Merge branch 'master' into master
3 years ago
Silent
da72becb13
Use training width/height when training hypernetworks.
3 years ago
discus0434
e40ba281f1
update
3 years ago
discus0434
a5611ea502
update
3 years ago
discus0434
6021f7a75f
add options to custom hypernetwork layer structure
3 years ago
AngelBottomless
703e6d9e4e
check NaN for hypernetwork tuning
3 years ago
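A minimal sketch of a NaN check on the training loss, assuming a hypothetical check_loss helper; the idea is to stop before a corrupted checkpoint gets written.

```python
# Bail out of the training loop as soon as the loss goes NaN.
import torch

def check_loss(loss):
    if torch.isnan(loss).any():
        raise RuntimeError("Loss is NaN - stopping hypernetwork training.")
    return loss

loss = torch.tensor(0.123)
check_loss(loss)                               # fine
# check_loss(torch.tensor(float("nan")))       # would raise RuntimeError
```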
AUTOMATIC
c7a86f7fe9
add option to use batch size for training
3 years ago
AUTOMATIC
03d62538ae
remove duplicate code for log loss, add step, make it read from options rather than gradio input
3 years ago
AUTOMATIC
326fe7d44b
Merge remote-tracking branch 'Melanpan/master'
3 years ago
AUTOMATIC
c344ba3b32
add option to read generation params for learning previews from txt2img
3 years ago
AUTOMATIC
354ef0da3b
add hypernetwork multipliers
3 years ago
Melan
8636b50aea
Add learn_rate to csv and removed a left-over debug statement
3 years ago
Melan
1cfc2a1898
Save a csv containing the loss while training
3 years ago
AUTOMATIC
c3c8eef9fd
train: change filename processing to be simpler and configurable
train: make it possible to make text files with prompts
train: rework scheduler so that there's less repeating code in textual inversion and hypernets
train: move epochs setting to options
3 years ago
AUTOMATIC
ee015a1af6
change textual inversion tab to train
remake train interface to use tabs
3 years ago
Milly
2d006ce16c
xy_grid: Find hypernetwork by closest name
3 years ago
AUTOMATIC
d6fcc6b87b
apply lr schedule to hypernets
3 years ago
AUTOMATIC
6a9ea5b41c
prevent extra modules from being saved/loaded with hypernet
3 years ago
AUTOMATIC
d4ea5f4d86
add an option to unload models during hypernetwork training to save VRAM
3 years ago
AUTOMATIC
d682444ecc
add option to select hypernetwork modules when creating
3 years ago
AUTOMATIC
873efeed49
rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have
3 years ago