Added support for quantization in vLLM backend #690
Added support for quantization in vLLM backend
e985799c
Merge branch 'main' into main
0c3e3d77
Fixed style issues
d6ebe58f
Merge branch 'main' into main
2992d7fe
Merge branch 'main' into main
eb38a159
NathanHB approved these changes on 2025-05-12
NathanHB merged 04a74a28 into main 322 days ago