vllm
[Bugfix][Quantization] Support BF16 tensors on GGUF
#29948
Merged
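
The PR description is not captured here, so the following is only a minimal sketch of how a GGUF weight loader might treat BF16 alongside the other unquantized tensor types (F32/F16). The dtype mapping and the helper name are assumptions for illustration, not the actual vLLM change.

```python
# Hypothetical sketch: treat GGUF BF16 tensors as unquantized weights
# that only need a dtype reinterpretation, no dequantization kernel.
import torch
from gguf import GGMLQuantizationType

# Assumed mapping of unquantized GGUF tensor types to torch dtypes.
UNQUANTIZED_TYPES = {
    GGMLQuantizationType.F32: torch.float32,
    GGMLQuantizationType.F16: torch.float16,
    GGMLQuantizationType.BF16: torch.bfloat16,  # the type this PR adds support for
}

def view_unquantized(raw_bytes: torch.Tensor,
                     qtype: GGMLQuantizationType) -> torch.Tensor:
    """Reinterpret a raw uint8 GGUF tensor as its unquantized dtype.

    Quantized types (Q4_K, Q8_0, ...) would instead go through a
    dequantization path; only F32/F16/BF16 are handled here.
    """
    dtype = UNQUANTIZED_TYPES.get(qtype)
    if dtype is None:
        raise ValueError(f"{qtype} is quantized and needs dequantization")
    # .view(dtype) reinterprets the underlying bytes without copying.
    return raw_bytes.view(dtype)
```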

Commits
  • Support BF16 tensors on GGUF
    a4lg committed 43 days ago