vllm
ca179d0f
- [Bugfix] Fix activation quantization for compressed-tensors W4A16 (#31572)
Commit
26 days ago
[Bugfix] Fix activation quantization for compressed-tensors W4A16 (#31572)

Signed-off-by: Tmn07 <tmn0796@gmail.com>
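The commit title refers to the W4A16 scheme in compressed-tensors: 4-bit weights, 16-bit activations, i.e. weight-only quantization in which activations are never quantized. As a point of reference for what that scheme means, here is a minimal NumPy sketch of group-wise symmetric 4-bit weight quantization with fp16 activations; all names and the group size are illustrative, not vLLM's or compressed-tensors' actual implementation:

```python
import numpy as np

def quantize_w4(w, group_size=8):
    """Symmetric per-group 4-bit quantization of a weight matrix (illustrative)."""
    groups = w.reshape(-1, group_size)
    # Map the largest magnitude in each group to 7 (symmetric int4 range).
    scale = np.abs(groups).max(axis=1, keepdims=True) / 7.0
    q = np.clip(np.round(groups / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize_w4(q, scale, shape):
    """Recover fp16 weights from int4 codes and per-group scales."""
    return (q.astype(np.float16) * scale.astype(np.float16)).reshape(shape)

rng = np.random.default_rng(0)
w = rng.standard_normal((16, 16)).astype(np.float32)   # weights: quantized to 4 bits
x = rng.standard_normal((4, 16)).astype(np.float16)    # activations: stay fp16 (the "A16")

q, scale = quantize_w4(w)
w_dq = dequantize_w4(q, scale, w.shape)
y = x @ w_dq.T  # activations remain fp16 end to end; only weights were quantized
```

The key property the bugfix concerns is the "A16" half: under W4A16 no activation quantization step should run at all, which the toy code reflects by passing `x` through untouched.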
References
#31572 - [Bugfix] Fix activation quantization for compressed-tensors W4A16
Author
Tmn07
Parents
013b5408