text-generation-inference
Commit e14ae3b5 · 2 years ago

feat(server): support quantization for flash models (#200)

Closes #197