Fix static generation when compiling! #28937
wow I was scared!
21876858
ArthurZucker
changed the title wow I was scared! Fix static generation when compiling! 1 year ago
fix everything
4922c924
nits
56768a02
make it BC?
b5650515
add todo
99afd1ad
nits
edc498fb
is_tracing should still be used to pass tracing tests
651c4bd8
nits
f69626e1
ArthurZucker
marked this pull request as ready for review 1 year ago
some nits to make sure generation works with static cache uncompiled
96136acb
fix sdpa
d5ebd806
gante
commented
on 2024-02-12
gante
approved these changes
on 2024-02-13
fix FA2 for both static and dynamic in a better way?
70adcf66
style
61ed4cb4
fix-copies
fedc5633
fix fix copies
0195d58d
fix sequential beam search
07f3adbb
style
9402c25e
use `keys_to_ignore`
86303c4d
nit
fb9e9072
correct dtype inference when init
9aa667e0
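The commit above fixes dtype inference when the static cache is initialized: pre-allocated cache tensors need to match the model's weight dtype (e.g. float16 vs float32) rather than a hard-coded default. A minimal sketch of the idea, with an illustrative helper name that is not the actual transformers API:

```python
import torch

def infer_cache_dtype(model: torch.nn.Module) -> torch.dtype:
    # Illustrative helper: pick the dtype of the model's parameters so that
    # pre-allocated static-cache tensors match the weights. In transformers
    # the equivalent logic lives inside the cache initialization, not in a
    # standalone function like this one.
    return next(model.parameters()).dtype

# Example: a half-precision model should get half-precision cache tensors.
model = torch.nn.Linear(8, 8).to(torch.float16)
cache = torch.zeros(1, 4, 8, dtype=infer_cache_dtype(model))
```

Without this kind of inference, a float16 model paired with a float32 cache forces silent upcasts (or dtype errors) on every attention step.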
:( the fix for FA2 is still not optimal, to investigate!
68a5f294
styling
3b9969b7
Merge branch 'main' of github.com:huggingface/transformers into fix-s…
162ab877
nits
914b0d7d
nit
e79f79f6
this might work better
ee2317d3
add comment
93b2691a
Update src/transformers/models/llama/modeling_llama.py
3619ed30
"position_ids" -> "cache_position"
c23cdc42
style
717a8e75
Merge branch 'main' of github.com:huggingface/transformers into fix-s…
7fe09642
Merge branch 'main' of github.com:huggingface/transformers into fix-s…
464c4637
nit
80148abe
Remove changes that should not be propagated just yet
c9f3c828
Apply suggestions from code review
5f54d84e
Styling
b3fc0428
make sure we raise an error for static cache with FA2 enabled
5fdb2da8
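This commit adds a guard so that requesting a static cache together with FlashAttention-2 fails fast instead of producing wrong results. A minimal sketch of such a check, using illustrative names (the real validation happens inside the model's generation setup, not a standalone function):

```python
def validate_cache_config(cache_implementation: str, attn_implementation: str) -> None:
    # Illustrative guard: reject the unsupported combination of a static
    # (pre-allocated, fixed-length) KV cache with the flash_attention_2
    # backend, and point the user at backends that do support it.
    if cache_implementation == "static" and attn_implementation == "flash_attention_2":
        raise ValueError(
            "Static cache is not compatible with flash_attention_2; "
            "use attn_implementation='sdpa' or 'eager' instead."
        )

validate_cache_config("static", "sdpa")  # supported combination, no error
```

Failing eagerly at configuration time is cheaper to debug than a shape or masking error deep inside a compiled attention kernel.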
move to the bottom of the signature
03edf912
style
b762304e
Update src/transformers/models/llama/modeling_llama.py
9fbe9014
Update src/transformers/models/llama/modeling_llama.py
7afe7d93
nit in the name
3772d1ca
Merge branches 'fix-static-kv-cache' and 'fix-static-kv-cache' of git…
cf0bc324
ArthurZucker
deleted the fix-static-kv-cache branch 1 year ago