vllm: Pull Request #30241 (Open)

[bug] Fix "Current vLLM config is not set." warnings when FlashInfer attention is used

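For context on the warning in the title: vLLM tracks the active `VllmConfig` through a context variable, and `get_current_vllm_config()` logs this warning when it is called outside a `set_current_vllm_config(...)` block. The sketch below illustrates that pattern under the assumption that the fix wraps the relevant FlashInfer code path in the config context; `build_attention_backend` is a hypothetical name, not code from this PR.

```python
# Minimal sketch of the config-context pattern behind the warning, NOT the
# exact diff in this PR. set_current_vllm_config / get_current_vllm_config
# are real helpers in vllm.config; build_attention_backend is hypothetical.
from vllm.config import (VllmConfig, get_current_vllm_config,
                         set_current_vllm_config)


def build_attention_backend(vllm_config: VllmConfig):
    # get_current_vllm_config() logs "Current vLLM config is not set." and
    # falls back to a default config when called outside this context manager.
    with set_current_vllm_config(vllm_config):
        # Any code running here (e.g. FlashInfer backend construction) that
        # calls get_current_vllm_config() sees the intended config instead
        # of triggering the warning.
        return get_current_vllm_config()
```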

nvpohanh committed 446ee645: [bug] Fix "Current vLLM config is not set." warnings when FlashInfer …
nvpohanh marked this pull request as ready for review 19 hours ago
nvpohanh requested a review from mgoin 19 hours ago
nvpohanh requested a review from pavanimajety 19 hours ago
mergify added the nvidia label
mergify added the v1 label
gemini-code-assist commented on 2025-12-08
hmellor requested changes on 2025-12-08
MatthewBonanni
