llama.cpp
Commit da3913d8
Committed 1 year ago
batched: fix n_predict parameter (#8527)
References: #8527 - batched: fix n_predict parameter
Author: msy-kato
Parents: d65a8361
Files changed (1): examples/batched/batched.cpp
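The commit only touches examples/batched/batched.cpp, and the diff itself is not shown on this page. As background, the role of an n_predict setting in a batched example is to cap how many new tokens each parallel sequence generates. The stand-alone C++ sketch below illustrates that control flow only; it does not use the llama.cpp API, and the command-line layout and variable names are assumptions for illustration, not the contents of this commit.

```cpp
// Illustrative sketch only: how an n_predict cap typically bounds a batched
// generation loop. No llama.cpp API is used; all names here are hypothetical.
#include <cstdio>
#include <cstdlib>
#include <vector>

int main(int argc, char ** argv) {
    // hypothetical CLI: ./demo [n_parallel] [n_predict]
    const int n_parallel = argc > 1 ? std::atoi(argv[1]) : 4;   // parallel sequences
    const int n_predict  = argc > 2 ? std::atoi(argv[2]) : 32;  // max new tokens per sequence

    std::vector<int> n_decoded(n_parallel, 0); // tokens generated so far, per sequence

    // decode loop: each sequence keeps generating until it reaches n_predict
    bool any_active = true;
    while (any_active) {
        any_active = false;
        for (int seq = 0; seq < n_parallel; ++seq) {
            if (n_decoded[seq] >= n_predict) {
                continue; // this sequence is done: the n_predict cap is respected
            }
            // placeholder for "sample one token and append it to sequence `seq`"
            n_decoded[seq]++;
            any_active = true;
        }
    }

    for (int seq = 0; seq < n_parallel; ++seq) {
        std::printf("sequence %d generated %d tokens (cap: %d)\n", seq, n_decoded[seq], n_predict);
    }
    return 0;
}
```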