27 Commits (ac085628540d0ec6a988fad93f5b8f2154209571)

Author SHA1 Message Date
Cheka 2fd7935ef4 Remove wrong self reference in CUDA support for invokeai 3 years ago
C43H66N12O12S2 c71008c741 Update sd_hijack_optimizations.py 3 years ago
C43H66N12O12S2 84823275e8 readd xformers attnblock 3 years ago
C43H66N12O12S2 2043c4a231 delete xformers attnblock 3 years ago
brkirch 861db783c7 Use apply_hypernetwork function 3 years ago
brkirch 574c8e554a Add InvokeAI and lstein to credits, add back CUDA support 3 years ago
brkirch 98fd5cde72 Add check for psutil 3 years ago
brkirch c0484f1b98 Add cross-attention optimization from InvokeAI 3 years ago
* Add cross-attention optimization from InvokeAI (~30% speed improvement on MPS)
* Add command line option for it
* Make it default when CUDA is unavailable
AUTOMATIC 873efeed49 rename hypernetwork dir to hypernetworks to prevent clash with an old filename that people who use zip instead of git clone will have 3 years ago
AUTOMATIC 530103b586 fixes related to merge 3 years ago
AUTOMATIC 948533950c replace duplicate code with a function 3 years ago
C43H66N12O12S2 3e7a981194 remove functorch 3 years ago
Fampai 122d42687b Fix VRAM Issue by only loading in hypernetwork when selected in settings 3 years ago
AUTOMATIC e6e42f98df make --force-enable-xformers work without needing --xformers 3 years ago
AUTOMATIC f9c5da1592 add fallback for xformers_attnblock_forward 3 years ago
AUTOMATIC dc1117233e simplify xfrmers options: --xformers to enable and that's it 3 years ago
AUTOMATIC 7ff1170a2e emergency fix for xformers (continue + shared) 3 years ago
AUTOMATIC1111 48feae37ff Merge pull request #1851 from C43H66N12O12S2/flash 3 years ago
* xformers attention
C43H66N12O12S2 69d0053583 update sd_hijack_opt to respect new env variables 3 years ago
C43H66N12O12S2 76a616fa6b Update sd_hijack_optimizations.py 3 years ago
C43H66N12O12S2 5d54f35c58 add xformers attnblock and hypernetwork support 3 years ago
brkirch f2055cb1d4 Add hypernetwork support to split cross attention v1 3 years ago
* Add hypernetwork support to split_cross_attention_forward_v1
* Fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device
C43H66N12O12S2 c9cc65b201 switch to the proper way of calling xformers 3 years ago
AUTOMATIC bad7cb29ce added support for hypernetworks (???) 3 years ago
C43H66N12O12S2 f174fb2922 add xformers attention 3 years ago
Jairo Correa ad0cc85d1f Merge branch 'master' into stable 3 years ago
AUTOMATIC 820f1dc96b initial support for training textual inversion 3 years ago