83 Commits (ac085628540d0ec6a988fad93f5b8f2154209571)

Author SHA1 Message Date
Jairo Correa af758e97fa Unload sd_model before loading the other 3 years ago
AUTOMATIC 2b91251637 removed aesthetic gradients as built-in; added support for extensions 3 years ago
AUTOMATIC 9286fe53de make aesthetic embedding compatible with prompts longer than 75 tokens 3 years ago
AUTOMATIC 7d6b388d71 Merge branch 'ae' 3 years ago
C43H66N12O12S2 73b5dbf72a Update sd_hijack.py 3 years ago
C43H66N12O12S2 786ed49922 use legacy attnblock 3 years ago
MalumaDev 9324cdaa31 ui fix, re organization of the code 3 years ago
MalumaDev e4f8b5f00d ui fix 3 years ago
MalumaDev 523140d780 ui fix 3 years ago
MalumaDev b694bba39a Merge remote-tracking branch 'origin/test_resolve_conflicts' into test_resolve_conflicts 3 years ago
MalumaDev 9325c85f78 fixed dropbox update 3 years ago
MalumaDev 97ceaa23d0 Merge branch 'master' into test_resolve_conflicts 3 years ago
C43H66N12O12S2 529afbf4d7 Update sd_hijack.py 3 years ago
MalumaDev 37d7ffb415 fix to token lengths, added embs generator, add new features to edit the embedding before generation using text 3 years ago
MalumaDev bb57f30c2d init 3 years ago
AUTOMATIC 429442f4a6 fix iterator bug for #2295 3 years ago
hentailord85ez 80f3cf2bb2 Account when lines are mismatched 3 years ago
brkirch 98fd5cde72 Add check for psutil 3 years ago
brkirch c0484f1b98 Add cross-attention optimization from InvokeAI (~30% speed improvement on MPS); add command line option for it; make it default when CUDA is unavailable 3 years ago
AUTOMATIC 873efeed49 rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have 3 years ago
AUTOMATIC 5de806184f Merge branch 'master' into hypernetwork-training 3 years ago
hentailord85ez 5e2627a1a6 Comma backtrack padding (#2192) 3 years ago
C43H66N12O12S2 623251ce2b allow pascal onwards 3 years ago
hentailord85ez d5c14365fd Add back in output hidden states parameter 3 years ago
hentailord85ez 460bbae587 Pad beginning of textual inversion embedding 3 years ago
hentailord85ez b340439586 Unlimited Token Works: unlimited tokens actually work now, with textual inversion too; replaces the previous not-so-much-working implementation 3 years ago
Fampai 1824e9ee3a Removed unnecessary tmp variable 3 years ago
Fampai ad3ae44108 Updated code for legibility 3 years ago
Fampai e59c66c008 Optimized code for Ignoring last CLIP layers 3 years ago
Fampai 1371d7608b Added ability to ignore last n layers in FrozenCLIPEmbedder 3 years ago
AUTOMATIC 3061cdb7b6 add --force-enable-xformers option and also add messages to console regarding cross attention optimizations 3 years ago
C43H66N12O12S2 cc0258aea7 check for ampere without destroying the optimizations. again. 3 years ago
C43H66N12O12S2 017b6b8744 check for ampere 3 years ago
AUTOMATIC cfc33f99d4 why did you do this 3 years ago
AUTOMATIC 27032c47df restore old opt_split_attention/disable_opt_split_attention logic 3 years ago
AUTOMATIC dc1117233e simplify xfrmers options: --xformers to enable and that's it 3 years ago
AUTOMATIC1111 48feae37ff Merge pull request #1851 from C43H66N12O12S2/flash (xformers attention) 3 years ago
C43H66N12O12S2 970de9ee68 Update sd_hijack.py 3 years ago
C43H66N12O12S2 26b459a379 default to split attention if cuda is available and xformers is not 3 years ago
MrCheeze 5f85a74b00 fix bug where when using prompt composition, hijack_comments generated before the final AND will be dropped 3 years ago
AUTOMATIC 77f4237d1c fix bugs related to variable prompt lengths 3 years ago
AUTOMATIC 4999eb2ef9 do not let user choose his own prompt token count limit 3 years ago
AUTOMATIC 706d5944a0 let user choose his own prompt token count limit 3 years ago
C43H66N12O12S2 91d66f5520 use new attnblock for xformers path 3 years ago
C43H66N12O12S2 b70eaeb200 delete broken and unnecessary aliases 3 years ago
AUTOMATIC 12c4d5c6b5 hypernetwork training mk1 3 years ago
AUTOMATIC f7c787eb7c make it possible to use hypernetworks without opt split attention 3 years ago
C43H66N12O12S2 5e3ff846c5 Update sd_hijack.py 3 years ago
C43H66N12O12S2 5303df2428 Update sd_hijack.py 3 years ago
C43H66N12O12S2 35d6b23162 Update sd_hijack.py 3 years ago