text-generation-inference
299217c9
- feat(server): add flash attention llama (#144)
Committed 2 years ago
References
#144 - feat(server): add flash attention llama
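The commit title says flash attention was added for the Llama model in the server. As context, a minimal NumPy sketch of the scaled dot-product attention that flash-attention kernels compute (the fused kernel avoids materializing the full score matrix; this reference version materializes it for clarity, and the shapes and names here are illustrative, not from the commit):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Reference scaled dot-product attention:
    #   out = softmax(q @ k^T / sqrt(d)) @ v
    # Flash attention computes the same result in tiles,
    # without storing the full (seq, seq) score matrix.
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    weights = softmax(scores, axis=-1)
    return weights @ v

# Tiny illustrative example: seq_len=4, head_dim=8.
rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))
k = rng.standard_normal((4, 8))
v = rng.standard_normal((4, 8))
out = attention(q, k, v)
print(out.shape)  # (4, 8)
```

Mathematically the fused kernel is equivalent to this reference computation; the gain is memory traffic, which matters at the long sequence lengths Llama serving targets.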
Author: OlivierDehaene
Parent: 99879600