Flash Transformers modeling backend support #2913
Commits:
ade0f44a  add transformers_flash
da222900  inits
b3b07474  switch version to make it work
738f0b0e  Update Makefile-flash-att-v2
a84ecf26  Update Makefile-flash-att-v2
372799a4  Update Makefile-flash-att-v2
a0035e66  Update Makefile-flash-att-v2
e69a384d  Update Makefile-flash-att-v2
3a636ed1  Update Makefile-flash-att-v2
649cb1f5  runnable version
f843b62a  push change
715b2d19  fix high dim
e93ab925  init
f4c60ca5  default
2e2631e0  latest transformers changes
44b36793  revert
266377b3  simplify check
32488c1a  remove flag
ac62bd15  improve type hints + required args
b03d7ae9  Update based on transformers PR
b40c8893  small fix
42ae6dea  Remove Warpers for Processor
f01014de  fix compatibility version issue
Narsil commented on 2025-01-20 (three review comments)
2659b599  raise error if needed
a2fe8427  Simplify with monkey patch
6e0f37c0  revert + style + minor improvements
52afdcc2  update comment
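For context on commit a2fe8427 ("Simplify with monkey patch"): monkey patching means rebinding an attribute of an already-imported class or module at runtime, so callers pick up the replacement without the upstream source being edited. A minimal generic sketch of the technique; the class and function names here are hypothetical and are not the actual code touched by this PR:

```python
class Model:
    """Stand-in for an upstream class whose behavior we want to override."""

    def forward(self, x):
        return x + 1


def patched_forward(self, x):
    # Replacement behavior injected at runtime.
    return x * 2


# Monkey patch: rebind the class attribute; every instance,
# existing or future, now dispatches to the patched function.
Model.forward = patched_forward

m = Model()
print(m.forward(3))  # 6
```

The appeal over subclassing is that third-party code which already holds references to `Model` transparently gets the new behavior; the cost is that the patch is invisible at the call site, which is why such changes are usually kept small and well-commented.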
Cyrilvallez changed the title from "Transformers backend" to "Flash Transformers modeling backend support" 335 days ago
9af3ea4b  device check
6d9c011f  move the import to avoid device issue
Narsil dismissed these changes on 2025-01-20
2ef3002c  Update __init__.py
Cyrilvallez dismissed their stale review via 2ef3002c 335 days ago
70ada578  check for non-native models
0d9ec75f  oupsi
Narsil approved these changes on 2025-01-21
Narsil merged commit b980848a into main 334 days ago
Narsil deleted the transformers-backend branch 334 days ago