1466 Commits (d3ffc962dd1d5c8d0ed763a9d05832c153ff15ea)
Author SHA1 Message Date
AUTOMATIC cfc33f99d4 why did you do this 3 years ago
Greendayle 2e8ba0fa47 fix conflicts 3 years ago
Milly 4f33289d0f Fixed typo 3 years ago
AUTOMATIC 27032c47df restore old opt_split_attention/disable_opt_split_attention logic 3 years ago
AUTOMATIC dc1117233e simplify xfrmers options: --xformers to enable and that's it 3 years ago
AUTOMATIC 7ff1170a2e emergency fix for xformers (continue + shared) 3 years ago
AUTOMATIC1111 48feae37ff Merge pull request #1851 from C43H66N12O12S2/flash: xformers attention 3 years ago
C43H66N12O12S2 970de9ee68 Update sd_hijack.py 3 years ago
C43H66N12O12S2 7ffea15078 Update requirements_versions.txt 3 years ago
C43H66N12O12S2 ca5f0f149c Update launch.py 3 years ago
C43H66N12O12S2 69d0053583 update sd_hijack_opt to respect new env variables 3 years ago
C43H66N12O12S2 ddfa9a9786 add xformers_available shared variable 3 years ago
C43H66N12O12S2 26b459a379 default to split attention if cuda is available and xformers is not 3 years ago
C43H66N12O12S2 d0e85873ac check for OS and env variable 3 years ago
MrCheeze 5f85a74b00 fix bug where when using prompt composition, hijack_comments generated before the final AND will be dropped 3 years ago
guaneec 32e428ff19 Remove duplicate event listeners 3 years ago
ddPn08 772db721a5 fix glob path in hypernetwork.py 3 years ago
AUTOMATIC 7001bffe02 fix AND broken for long prompts 3 years ago
AUTOMATIC 77f4237d1c fix bugs related to variable prompt lengths 3 years ago
C43H66N12O12S2 3f166be1b6 Update requirements.txt 3 years ago
C43H66N12O12S2 4201fd14f5 install xformers 3 years ago
AUTOMATIC 4999eb2ef9 do not let user choose his own prompt token count limit 3 years ago
Trung Ngo 00117a07ef check specifically for skipped 3 years ago
Trung Ngo 786d9f63aa Add button to skip the current iteration 3 years ago
AUTOMATIC 45cc0ce3c4 Merge remote-tracking branch 'origin/master' 3 years ago
AUTOMATIC 706d5944a0 let user choose his own prompt token count limit 3 years ago
leko 616b7218f7 fix: handles when state_dict does not exist 3 years ago
C43H66N12O12S2 91d66f5520 use new attnblock for xformers path 3 years ago
C43H66N12O12S2 76a616fa6b Update sd_hijack_optimizations.py 3 years ago
C43H66N12O12S2 5d54f35c58 add xformers attnblock and hypernetwork support 3 years ago
AUTOMATIC 87db6f01cc add info about cross attention javascript shortcut code 3 years ago
DepFA 21679435e5 implement removal 3 years ago
DepFA 83749bfc72 context menu styling 3 years ago
DepFA e21e473253 Context Menus 3 years ago
brkirch f2055cb1d4 Add hypernetwork support to split cross attention v1 (add hypernetwork support to split_cross_attention_forward_v1; fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device) 3 years ago
Jairo Correa a958f9b3fd edit-attention browser compatibility and readme typo 3 years ago
C43H66N12O12S2 b70eaeb200 delete broken and unnecessary aliases 3 years ago
C43H66N12O12S2 c9cc65b201 switch to the proper way of calling xformers 3 years ago
AUTOMATIC 12c4d5c6b5 hypernetwork training mk1 3 years ago
Greendayle 5f12e7efd9 linux test 3 years ago
Greendayle fa2ea648db even more powerfull fix 3 years ago
Greendayle 54fa613c83 loading tf only in interrogation process 3 years ago
Greendayle 537da7a304 Merge branch 'master' into dev/deepdanbooru 3 years ago
AUTOMATIC f7c787eb7c make it possible to use hypernetworks without opt split attention 3 years ago
AUTOMATIC 97bc0b9504 do not stop working on failed hypernetwork load 3 years ago
AUTOMATIC d15b3ec001 support loading VAE 3 years ago
AUTOMATIC bad7cb29ce added support for hypernetworks (???) 3 years ago
C43H66N12O12S2 5e3ff846c5 Update sd_hijack.py 3 years ago
C43H66N12O12S2 5303df2428 Update sd_hijack.py 3 years ago
C43H66N12O12S2 35d6b23162 Update sd_hijack.py 3 years ago