AUTOMATIC
3b2141c5fb
add 'Ignore last layers of CLIP model' option as a parameter to the infotext
3 years ago
AUTOMATIC
e6e42f98df
make --force-enable-xformers work without needing --xformers
3 years ago
Fampai
1371d7608b
Added ability to ignore last n layers in FrozenCLIPEmbedder
3 years ago
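The two CLIP-skip commits above amount to taking the text encoder's hidden state n layers before the last one instead of its final output. A minimal sketch, assuming the Hugging Face transformers CLIP classes; the function name and `clip_skip` argument are illustrative, not the repo's actual code:

```python
# Sketch of "ignore last n layers" for a CLIP text encoder (illustrative,
# not the repo's exact implementation).
import torch
from transformers import CLIPTextModel, CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
encoder = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")

def encode(prompt: str, clip_skip: int = 1) -> torch.Tensor:
    tokens = tokenizer(prompt, return_tensors="pt")
    out = encoder(**tokens, output_hidden_states=True)
    if clip_skip > 1:
        # Take the hidden state clip_skip layers from the end and re-apply
        # the final layer norm, instead of using the last layer's output.
        h = out.hidden_states[-clip_skip]
        return encoder.text_model.final_layer_norm(h)
    return out.last_hidden_state
```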
DepFA
b458fa48fe
Update ui.py
3 years ago
DepFA
15c4278f1a
TI preprocess wording
...
I had to check the code to work out what splitting was 🤷🏿
3 years ago
Greendayle
0ec80f0125
Merge branch 'master' into dev/deepdanbooru
3 years ago
AUTOMATIC
3061cdb7b6
add --force-enable-xformers option and also add messages to console regarding cross attention optimizations
3 years ago
AUTOMATIC
f9c5da1592
add fallback for xformers_attnblock_forward
3 years ago
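A sketch of the fallback pattern this commit describes: attempt the xformers path first, and drop back to the plain forward when xformers rejects the input. Both function names below are illustrative stand-ins:

```python
# Sketch of a fallback wrapper: try the xformers attention path, and if
# xformers rejects the input (e.g. an unsupported dtype or shape), fall
# back to the plain implementation.
def attnblock_forward_with_fallback(block, x):
    try:
        return xformers_attnblock_forward(block, x)
    except NotImplementedError:
        return plain_attnblock_forward(block, x)
```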
Greendayle
01f8cb4447
made deepdanbooru optional, added to readme, automatic download of deepbooru model
3 years ago
Artem Zagidulin
a5550f0213
alternate prompt
3 years ago
DepFA
34acad1628
Add GZipMiddleware to root demo
3 years ago
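A sketch of the change above, assuming the standard FastAPI/Starlette middleware API that Gradio apps expose; the `minimum_size` value is an arbitrary choice:

```python
# Sketch: compress responses from the app with GZipMiddleware, as the
# commit above describes.
from fastapi import FastAPI
from fastapi.middleware.gzip import GZipMiddleware

app = FastAPI()
app.add_middleware(GZipMiddleware, minimum_size=1000)
```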
C43H66N12O12S2
cc0258aea7
check for Ampere without destroying the optimizations. again.
3 years ago
C43H66N12O12S2
017b6b8744
check for Ampere
3 years ago
C43H66N12O12S2
7e639cd498
check for Python 3.10
3 years ago
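The three checks above gate the prebuilt xformers wheel. A sketch of what such gating can look like, assuming PyTorch's device-capability API; the exact thresholds in the repo may differ:

```python
# Sketch of environment checks before enabling a prebuilt xformers wheel
# (illustrative; the thresholds are assumptions).
import sys
import torch

def can_use_prebuilt_xformers() -> bool:
    if sys.version_info < (3, 10):       # "check for Python 3.10"
        return False
    if not torch.cuda.is_available():
        return False
    major, _minor = torch.cuda.get_device_capability()
    return major >= 8                    # Ampere GPUs are compute capability 8.x
```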
Greendayle
5329d0aba0
Merge branch 'master' into dev/deepdanbooru
3 years ago
AUTOMATIC
cfc33f99d4
why did you do this
3 years ago
Greendayle
2e8ba0fa47
fix conflicts
3 years ago
Milly
4f33289d0f
Fixed typo
3 years ago
AUTOMATIC
27032c47df
restore old opt_split_attention/disable_opt_split_attention logic
3 years ago
AUTOMATIC
dc1117233e
simplify xformers options: --xformers to enable and that's it
3 years ago
AUTOMATIC
7ff1170a2e
emergency fix for xformers (continue + shared)
3 years ago
AUTOMATIC1111
48feae37ff
Merge pull request #1851 from C43H66N12O12S2/flash
...
xformers attention
3 years ago
C43H66N12O12S2
970de9ee68
Update sd_hijack.py
3 years ago
C43H66N12O12S2
7ffea15078
Update requirements_versions.txt
3 years ago
C43H66N12O12S2
ca5f0f149c
Update launch.py
3 years ago
C43H66N12O12S2
69d0053583
update sd_hijack_opt to respect new env variables
3 years ago
C43H66N12O12S2
ddfa9a9786
add xformers_available shared variable
3 years ago
C43H66N12O12S2
26b459a379
default to split attention if CUDA is available and xformers is not
3 years ago
C43H66N12O12S2
d0e85873ac
check for OS and env variable
3 years ago
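A sketch of the dispatch logic these two commits converge on: use xformers when it is importable and the platform allows it, otherwise fall back to split attention on CUDA. The `XFORMERS_ALWAYS` environment variable name here is hypothetical, standing in for whichever variable the commit actually checks:

```python
# Sketch of cross attention optimization selection (names illustrative).
import importlib.util
import os
import platform
import torch

xformers_available = importlib.util.find_spec("xformers") is not None

def choose_cross_attention_optimization() -> str:
    if xformers_available and (platform.system() == "Linux"
                               or os.environ.get("XFORMERS_ALWAYS") == "1"):
        return "xformers"
    if torch.cuda.is_available():
        return "split-attention"   # default when CUDA is present but xformers is not
    return "none"
```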
MrCheeze
5f85a74b00
fix bug where when using prompt composition, hijack_comments generated before the final AND will be dropped
3 years ago
guaneec
32e428ff19
Remove duplicate event listeners
3 years ago
ddPn08
772db721a5
fix glob path in hypernetwork.py
3 years ago
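A sketch of the portable recursive glob such a fix typically installs; the directory layout and the .pt extension are assumptions:

```python
# Sketch: build the glob pattern with os.path.join so it works across
# operating systems.
import glob
import os

def list_hypernetworks(dirname: str) -> list[str]:
    return sorted(glob.glob(os.path.join(dirname, "**", "*.pt"), recursive=True))
```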
AUTOMATIC
7001bffe02
fix AND broken for long prompts
3 years ago
AUTOMATIC
77f4237d1c
fix bugs related to variable prompt lengths
3 years ago
C43H66N12O12S2
3f166be1b6
Update requirements.txt
3 years ago
C43H66N12O12S2
4201fd14f5
install xformers
3 years ago
AUTOMATIC
4999eb2ef9
do not let user choose his own prompt token count limit
3 years ago
Trung Ngo
00117a07ef
check specifically for skipped
3 years ago
Trung Ngo
786d9f63aa
Add button to skip the current iteration
3 years ago
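These two commits add a skip flag that is checked separately from a full interrupt. A sketch of the flow, with an illustrative `State` object standing in for the repo's shared state:

```python
# Sketch: the UI button sets state.skipped; the sampling loop checks
# specifically for it, distinct from state.interrupted.
class State:
    skipped = False
    interrupted = False

state = State()

def sampling_loop(steps: int):
    for i in range(steps):
        if state.skipped:
            state.skipped = False   # abandon only the current image, then continue
            break
        if state.interrupted:
            return                  # abort the whole job
        # ... one denoising step ...
```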
AUTOMATIC
45cc0ce3c4
Merge remote-tracking branch 'origin/master'
3 years ago
AUTOMATIC
706d5944a0
let user choose his own prompt token count limit
3 years ago
leko
616b7218f7
fix: handles when state_dict does not exist
3 years ago
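A sketch of the usual guard for this failure mode: some checkpoints nest their weights under a "state_dict" key and some store them at the top level, so the loader accepts both:

```python
# Sketch: fall back to the top-level dict when "state_dict" is absent.
import torch

def read_state_dict(path: str):
    pl_sd = torch.load(path, map_location="cpu")
    return pl_sd.get("state_dict", pl_sd)
```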
C43H66N12O12S2
91d66f5520
use new attnblock for xformers path
3 years ago
C43H66N12O12S2
76a616fa6b
Update sd_hijack_optimizations.py
3 years ago
C43H66N12O12S2
5d54f35c58
add xformers attnblock and hypernetwork support
3 years ago
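At the core of the xformers attnblock work is a single memory-efficient attention call. A sketch, with the tensor plumbing the real code performs reduced to a comment:

```python
# Sketch of the attention computation behind the xformers attnblock path.
import xformers.ops

def attention(q, k, v):
    # In the real code q, k, v are reshaped from (batch, seq_len, heads * dim_head)
    # to (batch, seq_len, heads, dim_head) before this call.
    return xformers.ops.memory_efficient_attention(q, k, v)
```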
AUTOMATIC
87db6f01cc
add info about cross attention javascript shortcut code
3 years ago
DepFA
21679435e5
implement removal
3 years ago
DepFA
83749bfc72
context menu styling
3 years ago
DepFA
e21e473253
Context Menus
3 years ago
brkirch
f2055cb1d4
Add hypernetwork support to split cross attention v1
...
* Add hypernetwork support to split_cross_attention_forward_v1
* Fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device
3 years ago
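A sketch of what hypernetwork support in cross attention means: when a hypernetwork is loaded, the context is passed through its small per-layer modules before the k and v projections. The `hypernetwork_layers` attribute name is illustrative:

```python
# Sketch: apply hypernetwork modules to the context before projecting
# k and v; q and the rest of the attention are unchanged.
def cross_attention_forward(self, x, context=None):
    context = context if context is not None else x
    q = self.to_q(x)
    hypernetwork_layers = getattr(self, "hypernetwork_layers", None)
    if hypernetwork_layers is not None:
        k = self.to_k(hypernetwork_layers[0](context))
        v = self.to_v(hypernetwork_layers[1](context))
    else:
        k = self.to_k(context)
        v = self.to_v(context)
    # ... attention over q, k, v proceeds as before ...
```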