text-generation-inference
Commit e14ae3b5
Date: 2 years ago

feat(server): support quantization for flash models (#200)

Closes #197
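The commit title is the only description given, so as context: a common way to add int8 quantization to a Transformer serving stack is bitsandbytes' Linear8bitLt, which swaps a model's nn.Linear layers for 8-bit equivalents. The sketch below is a minimal illustration under that assumption, not this commit's actual code; the function `quantize_linear_layers` and its `threshold` default are hypothetical.

```python
import torch.nn as nn
import bitsandbytes as bnb


def quantize_linear_layers(module: nn.Module, threshold: float = 6.0) -> nn.Module:
    """Recursively replace nn.Linear layers with 8-bit bitsandbytes layers.

    Hypothetical helper; illustrates the Linear8bitLt swap, not the commit's code.
    """
    for name, child in module.named_children():
        if isinstance(child, nn.Linear):
            replacement = bnb.nn.Linear8bitLt(
                child.in_features,
                child.out_features,
                bias=child.bias is not None,
                has_fp16_weights=False,  # keep weights in int8 after quantization
                threshold=threshold,     # outlier threshold for mixed int8/fp16 matmul
            )
            # Wrap the existing weights; bitsandbytes quantizes them to int8
            # when the module is later moved to a CUDA device.
            replacement.weight = bnb.nn.Int8Params(
                child.weight.data, requires_grad=False
            )
            if child.bias is not None:
                replacement.bias = child.bias
            setattr(module, name, replacement)
        else:
            quantize_linear_layers(child, threshold)
    return module
```

With `has_fp16_weights=False`, the actual int8 conversion happens when the model is moved to a CUDA device (e.g. `model.to("cuda")`). In text-generation-inference, quantization is exposed to users through the launcher's quantize option rather than called directly like this.
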
References
#200 - feat(server): support quantization for flash models
Author
OlivierDehaene
Parent
2475aede