vllm #28542 (Merged)
Update `rope_scaling` to `rope_parameters` in preparation for Transformers v5

Commits by hmellor:
- `a62c2df7` Rename `rope_scaling` -> `rope_parameters` in `get_rope`
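The first commit renames the `rope_scaling` keyword of `get_rope` to `rope_parameters`. A minimal sketch of how such a rename can be made without breaking old call sites (the signature and shim below are hypothetical, not vLLM's actual implementation):

```python
import warnings

def get_rope(head_size, rotary_dim, max_position,
             rope_parameters=None, rope_scaling=None):
    """Hypothetical sketch of a keyword rename with a compatibility shim.

    Old call sites that still pass `rope_scaling` keep working (with a
    deprecation warning) while new call sites use `rope_parameters`.
    """
    if rope_scaling is not None:
        warnings.warn(
            "`rope_scaling` is deprecated, use `rope_parameters` instead",
            DeprecationWarning, stacklevel=2)
        if rope_parameters is None:
            rope_parameters = rope_scaling
    # The real function would build and return a rotary embedding module;
    # returning the resolved dict is enough to illustrate the shim.
    return rope_parameters
```

With this shim in place, `get_rope(64, 64, 2048, rope_scaling={"rope_type": "yarn"})` resolves to the same parameters as the new-style `rope_parameters` call.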
- `f42b03d5` Patch rope parameters to new name, `rope_parameters`
- `a2a94374` Update models where it's a simple rename
- `fba5bf5a` Fix model config overrides
- `ee5cf666` Update examples
- `080530dd` Update benchmarks
mergify added labels: documentation, llama, performance, qwen, gpt-oss, speculative-decoding
- `889b9002` More renaming in transformers utils
- `50b1a870` Fix `patch_rope_parameters` for when `rope_scaling` was explicitly `N…
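Several commits deal with patching legacy configs into the new shape: Transformers v4 checkpoints carry `rope_scaling` (sometimes written as an explicit `null` in `config.json`) plus a separate `rope_theta`, while v5 groups everything under `rope_parameters`. A hedged sketch of that folding logic, with the explicit-`None` case handled (helper name and key handling are illustrative, not vLLM's code):

```python
def patch_rope_parameters(config: dict) -> dict:
    """Hypothetical sketch: fold legacy rope config into `rope_parameters`."""
    if config.get("rope_parameters") is not None:
        return config  # already v5-style, nothing to patch
    scaling = config.get("rope_scaling")
    # `rope_scaling: null` in config.json must behave like a missing key,
    # so guard with `is not None` rather than a plain truthiness check.
    params = dict(scaling) if scaling is not None else {"rope_type": "default"}
    if "rope_theta" in config:
        params.setdefault("rope_theta", config["rope_theta"])
    config["rope_parameters"] = params
    return config
```

The `is not None` guard is the point of the `50b1a870` fix: a config that explicitly sets `rope_scaling` to `null` still gets a usable default `rope_parameters` dict.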
- `bd182e06` Update Gemma3 and Gemma3n
- `4c61e2ea` Merge branch 'main' into update-rope-config
- `65c8658a` Get `rope_theta` from the new location too
- `5d657391` Fix condition for non gemma3 models
- `b4e1967c` Make Transformers backend torch compile check work with new rope params
- `ee77bd7b` Re-enable a load of Transformers nightly tests which are now fixed
- `df4c0075` Update the custom configs
- `325ff8d2` Make sure scaling factor always exists
- `11c23a72` A couple more models that now init on v5
mergify added label: ci/build
- `4ea113c6` Update Commandr
- `59b0f270` Update Qwen3Next
- `064441be` Update Olmo2
- `bdd0e6c5` rope_parameters always present because of rope_theta
- `f224ef4c` Update LFM2MoE
- `19dcc189` Update LFM2
- `2eecd312` Update the rest
hmellor added label: ready
hmellor force-pushed from `f56dbc78` to `2eecd312` 27 days ago
mergify added label: deepseek
- `e95ccd4d` update tests
- `f2bac156` Update configs
- `36e8a1f8` Missed 2
hmellor marked this pull request as ready for review 27 days ago
hmellor requested reviews from noooop, patrickvonplaten, sighingnow, mgoin, tlrmchlsmth, WoosukKwon, yewentao256, simon-mo, youkaichao, robertgshaw2-redhat, houseroad, and ProExpertProg 27 days ago
chatgpt-codex-connector commented on 2025-11-13
- `dfa75cff` Improve comment about what `rope_parameters` is
- `708ea0c3` Move scaling factor out of loop
- `4a285129` Early exit `patch_rope_parameters` if no rope params present
- `dfb476f7` Be more explicit about v4 vs v5 behaviour
- `97bb3394` Update a few models to not pass `base` outside of `rope_parameters`
- `97766f5e` Update some more models
- `783962ba` Update some more models
- `797fbeae` Add back `type` -> `rope_type` for legacy custom models
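Commit `797fbeae` restores support for very old custom configs that use the legacy `type` key instead of `rope_type`. A minimal sketch of that normalization (helper name is hypothetical):

```python
def normalize_rope_type(rope_parameters: dict) -> dict:
    """Hypothetical sketch: accept the legacy `type` key from old custom
    model configs and map it to the current `rope_type` key."""
    params = dict(rope_parameters)  # don't mutate the caller's dict
    if "type" in params and "rope_type" not in params:
        params["rope_type"] = params.pop("type")
    return params
```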
- `b7808920` More models
- `ad9dff2b` Fix docs build
- `461ff947` Update some more models
- `fa2ccedc` Update some more models
- `4127d543` Remove last references to `base` arg of `get_rope`
- `1ebd0e4f` Update mrope test
- `ec30fef4` Check everything
- `6368078d` fix
- `482f378d` Merge branch 'main' into update-rope-config
- `d4b2fbb7` Don't delete the legacy attributes when still using v4
- `1e68d271` Fix typo in commandr
- `db6a8806` Fix typo in deepseek v2
- `26a51d45` Handle multimodal models where vision model uses RoPE
- `dd692448` Use new default value of rope_parameters in kernels test
- `132dc4b4` Use `rope_parameters` instead of `base` in compile test
- `d7a6ded3` Don't overwrite main config for v4 style Gemma 3
- `8ceffd6f` Only raise for `disable_sliding_window` if the model actually has `sl…
- `08126a9a` Fix arctic config docstring for docs
- `f1c3c33c` Fix typo in gpt-oss
- `a2601ce0` Remove disable_sliding_window errors
- `03d50e06` Fix olmo2
- `93827b64` Fix custom code mm models
- `3b3c2336` Fix models with no rope info at all in their `config.json`
- `3f9ce074` Fix unaccounted for style of config
- `f1714ac0` Hopefully final fix for multimodal rope overrides
- `981aac45` Fix condition for raising error
- `5c2f394e` Only override `rope_type` to `deepseek_yarn` if it was not `default`
- `6c64ba51` Make 10000 the default base for `get_rope` if `rope_parameters == None`
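Commit `6c64ba51` makes 10000 the default base when `rope_parameters == None`, so models with no rope info in their `config.json` still get the conventional RoPE base. A hedged sketch of that fallback (function name hypothetical):

```python
DEFAULT_ROPE_THETA = 10000.0  # conventional RoPE base frequency

def resolve_rope_theta(rope_parameters=None):
    """Hypothetical sketch: when `rope_parameters` is None (or lacks a
    `rope_theta` key), fall back to the conventional base of 10000
    instead of requiring a separate `base` argument."""
    if rope_parameters is None:
        return DEFAULT_ROPE_THETA
    return rope_parameters.get("rope_theta", DEFAULT_ROPE_THETA)
```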
- `6beee2b4` Set all model defaults which are not 10000
- `002fb907` Update models which can default to 10000
- `99c5d476` Fix nemotron config
- `c38e8bbb` Fix ernie 4.5 vl
- `eebe73c5` Fix benchmarks/tests where `get_rope` is called with positional argum…
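Several commits fix call sites that passed `get_rope` arguments positionally: once a parameter is renamed or removed, positional calls can silently bind values to the wrong parameter. One defensive pattern (the signature below is a hypothetical sketch, not vLLM's actual one) is to make the trailing parameters keyword-only:

```python
def get_rope(head_size, rotary_dim, max_position, *,
             is_neox_style=True, rope_parameters=None):
    """Hypothetical signature: everything after `max_position` is
    keyword-only, so a rename like `rope_scaling` -> `rope_parameters`
    turns stale positional calls into an immediate TypeError instead of
    a silent mis-binding."""
    theta = (rope_parameters or {}).get("rope_theta", 10000.0)
    return {"rotary_dim": rotary_dim, "theta": theta,
            "is_neox_style": is_neox_style}

# get_rope(64, 64, 2048, True)  # TypeError: too many positional arguments
# get_rope(64, 64, 2048, is_neox_style=True,
#          rope_parameters={"rope_theta": 1e6})  # OK
```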
- `540a46b1` Merge branch 'main' into update-rope-config
- `a60b5ec2` Fix get_rope kwargs in vision transformers
- `00f28532` Update new model
- `717a7044` Missed positional args
- `a9fa3b04` Fix nemotron config validation
vllm-bot merged `a8b70304` into main 21 days ago
hmellor deleted the update-rope-config branch 21 days ago
