vllm
Don't assume `position_embedding_type` will be present for BERT and RoBERTa models
#30770
Merged


hmellor authored commit 2347a2aa: "Don'e assume `position_embedding_type` will be present for BERT and R…"
gemini-code-assist commented on 2025-12-16
DarkLight1337 approved these changes on 2025-12-16
DarkLight1337 enabled auto-merge (squash) 4 days ago
hmellor changed the title from "Don'e assume `position_embedding_type` will be present for BERT and RoBERTa models" to "Don't assume `position_embedding_type` will be present for BERT and RoBERTa models" 4 days ago
github-actions added the ready label
DarkLight1337 merged 6f15ac5d into main 4 days ago
hmellor deleted the bert-rope-fix branch 3 days ago
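The kind of fix the PR title describes can be sketched as follows. This is a hypothetical helper, not the PR's actual code: some BERT/RoBERTa-style configs omit the `position_embedding_type` field, so instead of reading the attribute directly, the loader can fall back to `"absolute"` (the default used by Hugging Face's `BertConfig`).

```python
from types import SimpleNamespace


def get_position_embedding_type(config) -> str:
    """Read `position_embedding_type` from a model config without
    assuming the attribute exists.

    Hypothetical helper for illustration: falls back to "absolute",
    which matches Hugging Face's BertConfig default.
    """
    return getattr(config, "position_embedding_type", "absolute")


# A config that omits the field entirely (stand-in for an older checkpoint).
legacy = SimpleNamespace(hidden_size=768)
# A config that sets it explicitly.
rotary = SimpleNamespace(position_embedding_type="rotary")

print(get_position_embedding_type(legacy))  # → absolute
print(get_position_embedding_type(rotary))  # → rotary
```

Using `getattr` with a default avoids an `AttributeError` when loading checkpoints whose configs predate (or simply never set) the field.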

