vllm
[Bugfix] Fix precision loss in LoRA-wrapped RowParallelLinear by fusing bias into GEMM
#28972
Merged
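The title describes the fix: adding the output bias as a separate half-precision op after the LoRA-wrapped GEMM introduces an extra rounding step, whereas fusing the bias into the GEMM lets the addition happen in the fp32 accumulator before a single cast. A minimal numpy sketch of that rounding difference; the specific values are illustrative assumptions, not taken from the PR (PyTorch's fused path would be e.g. `torch.addmm`):

```python
import numpy as np

# Illustrative values (assumptions, not from the PR): one GEMM output
# element whose fp32 accumulator holds ~2048.6, plus a bias of 0.6.
acc = np.float32(2048.6)
bias = np.float32(0.6)

# Fused: bias is added inside the fp32 accumulator, then the result is
# rounded to half precision exactly once.
fused = np.float16(acc + bias)                # fp16(2049.2) -> 2050.0

# Unfused: the GEMM output is rounded to fp16 first, then the bias is
# added as a separate fp16 op -- two roundings instead of one.  At this
# magnitude fp16 spacing is 2.0, so the bias is lost entirely.
unfused = np.float16(acc) + np.float16(bias)  # 2048.0 + ~0.6 -> 2048.0

print(float(fused), float(unfused))           # 2050.0 2048.0
```

The unfused path is off by a full fp16 ulp (2.0) for this input, which is the kind of drift the PR eliminates by passing the bias into the GEMM.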


prashanth058 requested a review from jeejeelee 22 days ago
gemini-code-assist commented on 2025-11-19
prashanth058 committed [Bugfix] Fix precision loss in LoRA-wrapped RowParallelLinear by fusi… (58e30d49)
prashanth058 force-pushed from b58afac5 to 58e30d49 22 days ago
jeejeelee approved these changes on 2025-11-20
jeejeelee enabled auto-merge (squash) 21 days ago
github-actions added the ready label
jeejeelee merged 0cca9b4d into main 21 days ago

