vllm
[Feature] AWQ Marlin quantization support for fused MoE with LoRA #30442
Merged

princepride added commit 25ee0bdb: add awq moe lora support
princepride requested a review from mgoin 9 days ago
princepride requested a review from robertgshaw2-redhat 9 days ago
princepride requested a review from tlrmchlsmth 9 days ago
princepride requested a review from yewentao256 9 days ago
princepride requested a review from pavanimajety 9 days ago
gemini-code-assist commented on 2025-12-11
chatgpt-codex-connector commented on 2025-12-11
princepride added commit 744327f5: fix some bug
jeejeelee approved these changes on 2025-12-11
jeejeelee enabled auto-merge (squash) 8 days ago
jeejeelee added the ready label
jeejeelee added commit aee10ac9: Merge branch 'main' into awq_moe_lora_support
jeejeelee merged 0e71eaa6 into main 8 days ago
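For context, a minimal, hypothetical sketch of the usage this feature enables: loading an AWQ-quantized MoE model through vLLM's Marlin kernels while applying a LoRA adapter on the fused MoE path. The checkpoint name and adapter path below are placeholders, not taken from this PR.

```python
# Hypothetical usage sketch; model and adapter paths are placeholders.
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

llm = LLM(
    model="org/some-moe-model-awq",   # placeholder: an AWQ-quantized MoE checkpoint
    quantization="awq_marlin",        # route AWQ weights through the Marlin kernels
    enable_lora=True,                 # LoRA on the fused MoE layers is what this PR adds
    max_loras=1,
)

outputs = llm.generate(
    ["Explain mixture-of-experts in one sentence."],
    SamplingParams(max_tokens=64),
    lora_request=LoRARequest("my_adapter", 1, "/path/to/lora_adapter"),  # placeholder adapter
)
print(outputs[0].outputs[0].text)
```

Before this change, combining `quantization="awq_marlin"` with `enable_lora=True` on a fused-MoE model was not supported; the PR title indicates this combination is now handled.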
