zhaohu xing
52cc83d36b
fix bugs
...
Signed-off-by: zhaohu xing <920232796@qq.com>
3 years ago
zhaohu xing
0831ab476c
Merge branch 'master' into master
3 years ago
zhaohu xing
75c4511e6b
add AltDiffusion to webui
...
Signed-off-by: zhaohu xing <920232796@qq.com>
3 years ago
AUTOMATIC
b48b7999c8
Merge remote-tracking branch 'flamelaw/master'
3 years ago
AUTOMATIC
64c7b7975c
restore hypernetworks to seemingly working state
3 years ago
AUTOMATIC
ce6911158b
Add support for Stable Diffusion 2.0
3 years ago
flamelaw
bd68e35de3
Gradient accumulation, autocast fix, new latent sampling method, etc
3 years ago
killfrenzy96
17e4432820
cleanly undo circular hijack #4818
3 years ago
AUTOMATIC
c62d17aee3
use the new devices.has_mps() function in register_buffer for DDIM/PLMS fix for OSX
3 years ago
AUTOMATIC
7ba3923d5b
move DDIM/PLMS fix for OSX out of the file with inpainting code.
3 years ago
Jairo Correa
af758e97fa
Unload sd_model before loading the other
3 years ago
AUTOMATIC
2b91251637
removed aesthetic gradients as built-in
...
added support for extensions
3 years ago
AUTOMATIC
9286fe53de
make aesthetic embedding compatible with prompts longer than 75 tokens
3 years ago
AUTOMATIC
7d6b388d71
Merge branch 'ae'
3 years ago
C43H66N12O12S2
73b5dbf72a
Update sd_hijack.py
3 years ago
C43H66N12O12S2
786ed49922
use legacy attnblock
3 years ago
MalumaDev
9324cdaa31
ui fix, reorganization of the code
3 years ago
MalumaDev
e4f8b5f00d
ui fix
3 years ago
MalumaDev
523140d780
ui fix
3 years ago
MalumaDev
b694bba39a
Merge remote-tracking branch 'origin/test_resolve_conflicts' into test_resolve_conflicts
3 years ago
MalumaDev
9325c85f78
fixed dropdown update
3 years ago
MalumaDev
97ceaa23d0
Merge branch 'master' into test_resolve_conflicts
3 years ago
C43H66N12O12S2
529afbf4d7
Update sd_hijack.py
3 years ago
MalumaDev
37d7ffb415
fix token length, add embeddings generator, add new features to edit the embedding before generation using text
3 years ago
MalumaDev
bb57f30c2d
init
3 years ago
AUTOMATIC
429442f4a6
fix iterator bug for #2295
3 years ago
hentailord85ez
80f3cf2bb2
Account for when lines are mismatched
3 years ago
brkirch
98fd5cde72
Add check for psutil
3 years ago
brkirch
c0484f1b98
Add cross-attention optimization from InvokeAI
...
* Add cross-attention optimization from InvokeAI (~30% speed improvement on MPS)
* Add command line option for it
* Make it default when CUDA is unavailable
3 years ago
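The entry above describes adding the InvokeAI cross-attention optimization, a flag for it, and making it the default when CUDA is unavailable (e.g. on MPS). The selection logic can be sketched roughly as below; every name here is illustrative, not the repository's actual API:

```python
def choose_attention_optimization(cuda_available,
                                  opt_split_attention_invokeai=False,
                                  force_xformers=False,
                                  xformers_available=False):
    """Pick a cross-attention implementation (hypothetical sketch).

    Mirrors the described behaviour: an explicit flag or missing CUDA
    selects the InvokeAI optimization (~30% faster on MPS per the
    commit message); otherwise fall back to the split-attention path.
    """
    if force_xformers and xformers_available:
        return "xformers"
    if opt_split_attention_invokeai or not cuda_available:
        return "invokeai"
    return "split_attention"  # default CUDA path
```

For example, `choose_attention_optimization(cuda_available=False)` returns `"invokeai"`, matching the "make it default when CUDA is unavailable" bullet.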
AUTOMATIC
873efeed49
rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have
3 years ago
AUTOMATIC
5de806184f
Merge branch 'master' into hypernetwork-training
3 years ago
hentailord85ez
5e2627a1a6
Comma backtrack padding (#2192)
...
Comma backtrack padding
3 years ago
C43H66N12O12S2
623251ce2b
allow Pascal onwards
3 years ago
hentailord85ez
d5c14365fd
Add back in output hidden states parameter
3 years ago
hentailord85ez
460bbae587
Pad beginning of textual inversion embedding
3 years ago
hentailord85ez
b340439586
Unlimited Token Works
...
Unlimited tokens actually work now. Works with textual inversion too. Replaces the previous not-so-much-working implementation.
3 years ago
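The "Unlimited Token Works" entry above replaces a hard prompt-length cap by encoding long prompts chunk by chunk. A common way to do this, sketched here with hypothetical names, is to split the token stream into 75-token chunks (leaving room for CLIP's BOS/EOS tokens within its 77-token window) and pad the last chunk so every chunk encodes uniformly:

```python
def chunk_tokens(tokens, chunk_size=75, pad_id=0):
    """Split a tokenized prompt into fixed-size chunks (sketch only).

    chunk_size=75 leaves room for the BOS/EOS tokens CLIP adds within
    its 77-token context window; pad_id is a hypothetical padding
    token id, not the model's real one.
    """
    chunks = [tokens[i:i + chunk_size]
              for i in range(0, len(tokens), chunk_size)]
    # Pad the final chunk to a full window so each chunk can be
    # encoded with the same fixed-length forward pass.
    if chunks and len(chunks[-1]) < chunk_size:
        chunks[-1] = chunks[-1] + [pad_id] * (chunk_size - len(chunks[-1]))
    return chunks
```

Each chunk would then be run through the text encoder separately and the resulting embeddings concatenated, which is what lets textual inversion embeddings participate regardless of prompt length.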
Fampai
1824e9ee3a
Removed unnecessary tmp variable
3 years ago
Fampai
ad3ae44108
Updated code for legibility
3 years ago
Fampai
e59c66c008
Optimized code for ignoring last CLIP layers
3 years ago
Fampai
1371d7608b
Added ability to ignore last n layers in FrozenCLIPEmbedder
3 years ago
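The entry above adds the "CLIP skip" feature: conditioning on the output of an earlier transformer layer instead of the final one. Under the convention that a setting of 1 means "use the final layer" and 2 means "use the second-to-last", the selection can be sketched as follows (function name and layout are assumptions, not the actual webui code):

```python
def clip_hidden_state(hidden_states, stop_at_last_layers=1):
    """Select which CLIP hidden state feeds conditioning (sketch).

    hidden_states: per-layer outputs, index 0 = embedding output,
    index -1 = final transformer layer. stop_at_last_layers=1 keeps
    the default final-layer output; n>1 ignores the last n-1 layers.
    """
    if stop_at_last_layers <= 1:
        return hidden_states[-1]            # default: final layer
    return hidden_states[-stop_at_last_layers]  # skip last n-1 layers
```

In practice the earlier hidden state is usually re-normalized before use; that step is omitted here for brevity.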
AUTOMATIC
3061cdb7b6
add --force-enable-xformers option and also add messages to console regarding cross attention optimizations
3 years ago
C43H66N12O12S2
cc0258aea7
check for Ampere without destroying the optimizations. again.
3 years ago
C43H66N12O12S2
017b6b8744
check for Ampere
3 years ago
AUTOMATIC
cfc33f99d4
why did you do this
3 years ago
AUTOMATIC
27032c47df
restore old opt_split_attention/disable_opt_split_attention logic
3 years ago
AUTOMATIC
dc1117233e
simplify xformers options: --xformers to enable and that's it
3 years ago
AUTOMATIC1111
48feae37ff
Merge pull request #1851 from C43H66N12O12S2/flash
...
xformers attention
3 years ago
C43H66N12O12S2
970de9ee68
Update sd_hijack.py
3 years ago
C43H66N12O12S2
26b459a379
default to split attention if cuda is available and xformers is not
3 years ago
MrCheeze
5f85a74b00
fix bug where, when using prompt composition, hijack_comments generated before the final AND would be dropped
3 years ago