fp8 compressed_tensors w8a8 support #3242
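For context only: this PR adds W8A8 FP8 support (8-bit floating-point weights and activations) loaded from the compressed-tensors checkpoint format. The snippet below is a minimal, hypothetical sketch of what per-tensor symmetric FP8 quantization computes; it is not taken from this PR and does not use the TGI or compressed-tensors APIs.

```python
# Illustrative only: symmetric, per-tensor FP8 (e4m3) W8A8 quantization,
# i.e. both weights and activations stored as 8-bit floats.
# All function names here are hypothetical, not library APIs.
import torch

FP8_MAX = torch.finfo(torch.float8_e4m3fn).max  # 448.0 for e4m3

def quantize_fp8(x: torch.Tensor):
    """Return (fp8 tensor, scale) using per-tensor symmetric scaling."""
    scale = x.abs().max().clamp(min=1e-12) / FP8_MAX
    q = (x / scale).clamp(-FP8_MAX, FP8_MAX).to(torch.float8_e4m3fn)
    return q, scale

def w8a8_linear(x: torch.Tensor, w_q: torch.Tensor, w_scale: torch.Tensor):
    """W8A8 matmul: quantize activations on the fly, dequantize for the GEMM."""
    x_q, x_scale = quantize_fp8(x)
    # Reference path: dequantize and use a regular matmul. Real kernels
    # would call an FP8 GEMM and fold the two scales into the output.
    return (x_q.to(torch.float32) * x_scale) @ (w_q.to(torch.float32) * w_scale).t()

# Usage: quantize a weight once, then run activations through it.
w = torch.randn(256, 128)
w_q, w_scale = quantize_fp8(w)
x = torch.randn(4, 128)
y = w8a8_linear(x, w_q, w_scale)
print(y.shape, (y - x @ w.t()).abs().max())  # small quantization error
```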
4ffa111f
Merge branch 'main' into fp8_compressor (a2934644)
remove print (ce8978f9)
regisss dismissed these changes on 2025-05-26
add multi-weight for GPTQ weight loader (475f6e21)
sywangyi dismissed their stale review via 475f6e21 (209 days ago)
Narsil approved these changes on 2025-05-28
regisss merged f1404400 into main (208 days ago)