transformers
28c9541c - Attention Quantization with FBGemm & TP (#37384)

Commit · 282 days ago

Attention Quantization with FBGemm & TP (#37384)

* fix
* keep fused
* contiguous
* rm print
* update
* update
* rm print
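The commit subject points at FP8 quantization via FBGemm combined with tensor parallelism in transformers. As a rough illustration only (not code from this commit), a minimal sketch of loading a model with `FbgemmFp8Config` and `tp_plan="auto"` might look like the following; the model id, the exact option combination, and the launch setup are assumptions.

```python
# Minimal sketch: FBGemm FP8 quantization together with tensor parallelism.
# Assumed to be launched with torchrun, e.g. `torchrun --nproc-per-node 4 run.py`.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, FbgemmFp8Config

model_id = "meta-llama/Meta-Llama-3-8B"  # placeholder model id (assumption)

# FP8 quantization config backed by fbgemm-gpu kernels
quant_config = FbgemmFp8Config()

# tp_plan="auto" shards the model across the processes started by torchrun;
# combining it with quantization_config here is an illustrative assumption.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    quantization_config=quant_config,
    tp_plan="auto",
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```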