llama.cpp
333c9ead - llama : bump max seq limit from 64 to 256
Commit (1 day ago)
llama : bump max seq limit from 64 to 256 ggml-ci
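The change raises the cap on parallel sequences a single context may track from 64 to 256. A minimal sketch of how a client could request and verify a higher sequence count through the public C API (llama.h); the model path is a placeholder, and the value 128 is just an example above the old cap:

```c
// Sketch: ask for more than 64 parallel sequences and read back the cap.
#include <stdio.h>
#include "llama.h"

int main(void) {
    llama_backend_init();

    struct llama_model_params mparams = llama_model_default_params();
    struct llama_model * model = llama_model_load_from_file("model.gguf", mparams); // placeholder path
    if (!model) { fprintf(stderr, "failed to load model\n"); return 1; }

    struct llama_context_params cparams = llama_context_default_params();
    cparams.n_seq_max = 128; // above the previous limit of 64, within the new limit of 256

    struct llama_context * ctx = llama_init_from_model(model, cparams);
    if (!ctx) { fprintf(stderr, "failed to create context\n"); return 1; }

    // Reports the number of parallel sequences this context supports.
    printf("n_seq_max = %u\n", llama_n_seq_max(ctx));

    llama_free(ctx);
    llama_model_free(model);
    llama_backend_free();
    return 0;
}
```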
References
gg/llama-bump-max-seq
#15916 - llama : bump max seq limit from 64 to 256
Author
ggerganov
Parents
10d8b2b6