[bug] Fix "Current vLLM config is not set." warnings when FlashInfer attention is used #30241
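The warning in the title typically appears when code reads a process-wide config outside the context manager that installs it. A minimal sketch of that pattern, assuming a simplified setter/getter pair (`set_current_config` / `get_current_config` are illustrative names, not vLLM's actual API):

```python
import warnings
from contextlib import contextmanager

_current_config = None  # module-level slot holding the active config


@contextmanager
def set_current_config(config):
    """Install `config` as the active config for the duration of the block."""
    global _current_config
    prev = _current_config
    _current_config = config
    try:
        yield
    finally:
        _current_config = prev  # restore whatever was active before


def get_current_config():
    """Return the active config, warning (as in the PR title) when unset."""
    if _current_config is None:
        warnings.warn("Current config is not set.")
    return _current_config


# Inside the context manager the getter is silent; outside, it warns.
with set_current_config({"attention_backend": "FLASHINFER"}):
    cfg = get_current_config()  # no warning here
```

A fix in this style wraps the code path that triggers the warning (here, the FlashInfer attention setup) inside the `with` block so the getter always finds a config.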
Commit 446ee645
nvpohanh marked this pull request as ready for review 19 hours ago
hmellor requested changes on 2025-12-08
Assignees: no one assigned