vllm
[Bugfix][Quantization] Support BF16 tensors on GGUF
#29948
Merged


a4lg added commit 925ac913: Support BF16 tensors on GGUF
a4lg requested a review from DarkLight1337 29 days ago
a4lg requested a review from ywang96 29 days ago
a4lg requested a review from 22quinn 29 days ago
gemini-code-assist commented on 2025-12-03
chatgpt-codex-connector commented on 2025-12-03
Isotr0py approved these changes on 2025-12-03
Isotr0py enabled auto-merge (squash) 29 days ago
github-actions added the ready label
Isotr0py merged 42c19496 into main 28 days ago
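For context on what BF16 support in a GGUF loader entails: BF16 is the top 16 bits of an IEEE-754 float32 (same sign and exponent layout, truncated mantissa), so raw BF16 tensor data can be widened to float32 by placing each 16-bit value in the high half of a 32-bit word. The sketch below is an illustration of that bit-level trick with NumPy, not the actual vLLM implementation; the function name is hypothetical.

```python
import numpy as np

def bf16_bytes_to_float32(raw: bytes) -> np.ndarray:
    """Reinterpret little-endian BF16 tensor bytes as float32 values.

    BF16 shares float32's sign/exponent layout, so each 16-bit value
    becomes the high half of a 32-bit word (low half zeroed).
    Hypothetical helper for illustration only.
    """
    u16 = np.frombuffer(raw, dtype="<u2")      # raw BF16 code units
    u32 = u16.astype(np.uint32) << 16          # shift into the high half
    return u32.view(np.float32)                # reinterpret bits as float32

# 0x3F80 is BF16 for 1.0 (top bits of float32 0x3F800000);
# 0xC000 is BF16 for -2.0 (top bits of float32 0xC0000000).
vals = bf16_bytes_to_float32(
    np.array([0x3F80, 0xC000], dtype="<u2").tobytes()
)
# vals -> [1.0, -2.0]
```

The shift-and-view approach is lossless in this direction (every BF16 value is exactly representable in float32), which is why loaders can treat BF16 tensors as a cheap cast rather than a quantization format.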
