vllm
Update `rope_scaling` to `rope_parameters` in preparation for Transformers v5
#28542
Merged
vllm-bot merged 73 commits into vllm-project:main from hmellor:update-rope-config
Rename `rope_scaling` -> `rope_parameters` in `get_rope` (`a62c2df7`)
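For orientation, this first commit changes the keyword through which RoPE configuration reaches `get_rope`. A minimal before/after sketch of a call site, assuming the keyword names from the commit titles rather than copying the diff (the `before` form is the pre-PR API and no longer works after this change):

```python
from vllm.model_executor.layers.rotary_embedding import get_rope

# Before this PR: scaling dict and base frequency passed separately.
rotary_emb = get_rope(
    head_size=128,
    rotary_dim=128,
    max_position=8192,
    base=10000,
    rope_scaling={"rope_type": "linear", "factor": 4.0},
)

# After this PR: one `rope_parameters` dict carries the type, factor,
# and theta, mirroring the Transformers v5 config layout.
rotary_emb = get_rope(
    head_size=128,
    rotary_dim=128,
    max_position=8192,
    rope_parameters={"rope_type": "linear", "factor": 4.0, "rope_theta": 10000},
)
```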
Patch rope parameters to new name, `rope_parameters` (`f42b03d5`)
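This commit adds the shim that rewrites older configs in place. A hypothetical sketch of what such a shim does, with a plain namespace standing in for a real `PretrainedConfig`; only the helper name comes from the commit titles, the body is an assumption:

```python
from types import SimpleNamespace

def patch_rope_parameters(config) -> None:
    """Fold legacy `rope_theta`/`rope_scaling` attributes into the single
    `rope_parameters` dict that Transformers v5 expects (sketch only)."""
    if getattr(config, "rope_parameters", None) is not None:
        return  # already v5-style
    rope_parameters = dict(getattr(config, "rope_scaling", None) or {})
    # Legacy configs keep the base frequency in a separate attribute.
    rope_parameters.setdefault("rope_theta", getattr(config, "rope_theta", 10000))
    rope_parameters.setdefault("rope_type", "default")
    config.rope_parameters = rope_parameters

# Example: a v4-style config with linear scaling.
cfg = SimpleNamespace(rope_theta=1e6, rope_scaling={"rope_type": "linear", "factor": 4.0})
patch_rope_parameters(cfg)
print(cfg.rope_parameters)
# {'rope_type': 'linear', 'factor': 4.0, 'rope_theta': 1000000.0}
```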
Update models where it's a simple rename (`a2a94374`)
Fix model config overrides (`fba5bf5a`)
Update examples (`ee5cf666`)
Update benchmarks (`080530dd`)
mergify added the documentation, llama, performance, qwen, gpt-oss, and speculative-decoding labels
More renaming in transformers utils (`889b9002`)
Fix `patch_rope_parameters` for when `rope_scaling` was explicitly `N… (`50b1a870`)
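The truncated title of `50b1a870` appears to cover the case where a `config.json` sets `rope_scaling` explicitly to null rather than omitting it. A small self-contained sketch of why that case needs its own guard in a shim like the one above:

```python
from types import SimpleNamespace

# A config.json with `"rope_scaling": null` yields an attribute that
# exists but is None; `getattr(..., None)` alone cannot tell the two
# cases apart, so both must fall back to an empty dict:
config = SimpleNamespace(rope_scaling=None)
rope_scaling = getattr(config, "rope_scaling", None)
rope_parameters = dict(rope_scaling) if rope_scaling is not None else {}
assert rope_parameters == {}
```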
Update Gemma3 and Gemma3n (`bd182e06`)
Merge branch 'main' into update-rope-config (`4c61e2ea`)
Get `rope_theta` from the new location too (`65c8658a`)
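Commit `65c8658a` reads `rope_theta` from its v5 home inside `rope_parameters` while keeping the legacy fallback. A hypothetical helper showing the two-location lookup; the name and the default value are assumptions:

```python
def get_rope_theta(config, default: float = 10000.0) -> float:
    # v5 layout: theta lives inside the rope_parameters dict.
    rope_parameters = getattr(config, "rope_parameters", None) or {}
    if "rope_theta" in rope_parameters:
        return rope_parameters["rope_theta"]
    # v4 layout: theta is a top-level config attribute.
    return getattr(config, "rope_theta", default)
```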
Fix condition for non gemma3 models (`5d657391`)
Make Transformers backend torch compile check work with new rope params (`b4e1967c`)
Re-enable a load of Transformers nightly tests which are now fixed (`ee77bd7b`)
Update the custom configs (`df4c0075`)
Make sure scaling factor always exists (`325ff8d2`)
A couple more models that now init on v5 (`11c23a72`)
mergify added the ci/build label
Update Commandr (`4ea113c6`)
Update Qwen3Next (`59b0f270`)
Update Olmo2 (`064441be`)
rope_parameters always present because of rope_theta (`bdd0e6c5`)
Update LFM2MoE (`f224ef4c`)
Update LFM2 (`19dcc189`)
Update the rest (`2eecd312`)
hmellor added the ready label
hmellor force-pushed from `f56dbc78` to `2eecd312` 27 days ago
mergify added the deepseek label
update tests (`e95ccd4d`)
Update configs (`f2bac156`)
Missed 2 (`36e8a1f8`)
hmellor marked this pull request as ready for review 27 days ago
hmellor requested reviews from noooop, patrickvonplaten, sighingnow, mgoin, tlrmchlsmth, WoosukKwon, yewentao256, simon-mo, youkaichao, robertgshaw2-redhat, houseroad, and ProExpertProg 27 days ago
chatgpt-codex-connector commented on 2025-11-13
Improve comment about what `rope_parameters` is (`dfa75cff`)
Move scaling factor out of loop (`708ea0c3`)
Early exit `patch_rope_parameters` if no rope params present (`4a285129`)
Be more explicit about v4 vs v5 behaviour (`dfb476f7`)
Update a few models to not pass `base` outside of `rope_parameters` (`97bb3394`)
Update some more models (`97766f5e`)
Update some more models (`783962ba`)
Add back `type` -> `rope_type` for legacy custom models (`797fbeae`)
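Commit `797fbeae` restores tolerance for very old custom configs that spell the key `type` instead of `rope_type`. A sketch of that back-compat rename; the helper name is hypothetical:

```python
def normalize_rope_type(rope_parameters: dict) -> dict:
    """Accept the legacy `{"type": ...}` key from old custom configs and
    rewrite it to the modern `rope_type` key (sketch; name assumed)."""
    if "type" in rope_parameters and "rope_type" not in rope_parameters:
        rope_parameters = dict(rope_parameters)  # don't mutate the caller's dict
        rope_parameters["rope_type"] = rope_parameters.pop("type")
    return rope_parameters

assert normalize_rope_type({"type": "yarn", "factor": 2.0}) == {
    "rope_type": "yarn",
    "factor": 2.0,
}
```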
More models (`b7808920`)
Fix docs build (`ad9dff2b`)
Update some more models (`461ff947`)
Update some more models (`fa2ccedc`)
Remove last references to `base` arg of `get_rope` (`4127d543`)
Update mrope test (`1ebd0e4f`)
Check everything (`ec30fef4`)
fix (`6368078d`)
Merge branch 'main' into update-rope-config (`482f378d`)
Don't delete the legacy attributes when still using v4 (`d4b2fbb7`)
Fix typo in commandr (`1e68d271`)
Fix typo in deepseek v2 (`db6a8806`)
Handle multimodal models where vision model uses RoPE (`26a51d45`)
Use new default value of rope_parameters in kernels test (`dd692448`)
Use `rope_parameters` instead of `base` in compile test (`132dc4b4`)
Don't overwrite main config for v4 style Gemma 3 (`d7a6ded3`)
Only raise for `disable_sliding_window` if the model actually has `sl… (`8ceffd6f`)
Fix arctic config docstring for docs (`08126a9a`)
Fix typo in gpt-oss (`f1c3c33c`)
Remove disable_sliding_window errors (`a2601ce0`)
Fix olmo2 (`03d50e06`)
Fix custom code mm models (`93827b64`)
Fix models with no rope info at all in their `config.json` (`3b3c2336`)
Fix unaccounted for style of config (`3f9ce074`)
Hopefully final fix for multimodal rope overrides (`f1714ac0`)
Fix condition for raising error (`981aac45`)
Only override `rope_type` to `deepseek_yarn` if it was not `default` (`5c2f394e`)
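Commit `5c2f394e` stops the DeepSeek path from clobbering an explicitly configured rope type. A sketch of the guarded override, assuming the config shape from the earlier sketches:

```python
from types import SimpleNamespace

config = SimpleNamespace(rope_parameters={"rope_type": "default", "rope_theta": 10000})

# Upgrade only the untouched default; an explicit choice such as "yarn"
# in the user's config must survive.
if config.rope_parameters.get("rope_type", "default") == "default":
    config.rope_parameters["rope_type"] = "deepseek_yarn"
```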
Make 10000 the default base for `get_rope` if `rope_parameters == None` (`6c64ba51`)
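Commit `6c64ba51` gives `get_rope` a conventional fallback when a model ships no RoPE information at all; 10000 is the base from the original RoPE formulation. A signature-level sketch, with the parameter list abbreviated and the body assumed:

```python
def get_rope(head_size, rotary_dim, max_position, rope_parameters=None):
    # No rope info in config.json at all: fall back to plain RoPE with
    # the conventional base of 10000.
    if rope_parameters is None:
        rope_parameters = {"rope_type": "default", "rope_theta": 10000}
    base = rope_parameters.get("rope_theta", 10000)
    ...
```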
Set all model defaults which are not 10000 (`6beee2b4`)
Update models which can default to 10000 (`002fb907`)
Fix nemotron config (`99c5d476`)
Fix ernie 4.5 vl (`c38e8bbb`)
Fix benchmarks/tests where `get_rope` is called with positional argum… (`eebe73c5`)
Merge branch 'main' into update-rope-config (`540a46b1`)
Fix get_rope kwargs in vision transformers (`a60b5ec2`)
Update new model (`00f28532`)
Missed positional args (`717a7044`)
Fix nemotron config validation (`a9fa3b04`)
vllm-bot merged `a8b70304` into main 21 days ago
hmellor deleted the update-rope-config branch 21 days ago
Reviewers: chatgpt-codex-connector, noooop, patrickvonplaten, sighingnow, mgoin, tlrmchlsmth, WoosukKwon, yewentao256, simon-mo, youkaichao, robertgshaw2-redhat, houseroad, ProExpertProg
Assignees: no one assigned
Labels: documentation, performance, speculative-decoding, ready, ci/build, llama, qwen, deepseek, gpt-oss
Milestone: no milestone