text-generation-webui
Add flash-attention 2 for windows #4235
Merged

kingbri1 force-pushed from 82fb209b to 722fac52 2 years ago
kingbri1 Requirements: Add cuda 12.1 and update one click (06ad1450)
kingbri1 force-pushed from 722fac52 to 06ad1450 2 years ago
oobabooga Bump to CUDA 12.1 & Python 3.11 (0b714d71)
oobabooga Merge branch 'main' into bdashore3-flash-attention-windows (cef34a83)
oobabooga Change choise to "Would you like CUDA 11.8?" (9bc14062)
oobabooga Minor fixes (0f25ee56)
oobabooga Merge branch 'main' into bdashore3-flash-attention-windows (b4f56533)
oobabooga Don't use python 3.11 (8e7d7655)
oobabooga Minor change (83140c3f)
oobabooga Update README (e7c662ee)
oobabooga CUDA_118 environment variable (991a9ff9)
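
The CUDA_118 commit suggests the one-click installer branches on an environment variable when deciding which CUDA build of torch to install. A minimal sketch of that pattern, with hypothetical names (the variable semantics, the default, and the index URLs are assumptions, not the installer's actual code):

    import os

    # Hypothetical sketch: choose the PyTorch wheel index from the
    # CUDA_118 environment variable. Set CUDA_118=1 to stay on CUDA 11.8;
    # otherwise default to the newer CUDA 12.1 index.
    use_cuda_118 = os.environ.get("CUDA_118", "0") == "1"
    cuda_suffix = "cu118" if use_cuda_118 else "cu121"
    torch_index_url = f"https://download.pytorch.org/whl/{cuda_suffix}"

    print(f"Installing torch from {torch_index_url}")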
oobabooga Fix typo (63f4c344)
oobabooga More foolproof (eb4cc7d6)
oobabooga cu118 -> cu121 (81f9112e)
oobabooga cu122 for flash-attn (2d0ea337)
oobabooga Add new wheels (5a61197e)
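
Since flash-attn publishes no official Windows build on PyPI, "Add new wheels" implies pinning prebuilt wheels per platform in requirements.txt. A sketch of that pin style using pip environment markers; the version number and wheel URL below are placeholders for illustration, not the PR's actual pins:

    # Linux installs from PyPI; Windows pulls a prebuilt wheel.
    # Placeholder version and URL, not the real pins.
    flash-attn==2.3.2; platform_system == "Linux"
    https://example.com/wheels/flash_attn-2.3.2+cu122-cp311-cp311-win_amd64.whl; platform_system == "Windows"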
oobabooga Use python 3.11 (df9966d5)
oobabooga More robust flash-attention import (cd33585f)
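
A more robust import here typically means failing soft when the wheel is missing or was built against a mismatched CUDA/torch combination. A minimal sketch of that guard pattern (the flag name and warning text are illustrative, not the commit's exact code):

    import logging

    logger = logging.getLogger(__name__)

    # Guarded import: flash-attn wheels are platform- and CUDA-specific,
    # so a failed import should disable the feature instead of crashing
    # the whole web UI at startup.
    try:
        import flash_attn  # noqa: F401
        has_flash_attn = True
    except Exception:
        logger.warning(
            "flash-attn is not installed or failed to load; "
            "falling back to the default attention implementation."
        )
        has_flash_attn = False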
oobabooga Lint (ec9ddeeb)
oobabooga Update dockerfile (8465893f)
oobabooga merged 3345da2e into main 2 years ago

