llama.cpp
faac0bae
- common : ensure llama_batch size does not exceed max size (#9668)
Commit
1 year ago
common : ensure llama_batch size does not exceed max size (#9668)

A crash was observed when the number of tokens added to a batch exceeded the llama_batch size. An assertion was added in llama_batch_add to protect against llama_batch size overflow.
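The fix follows a common defensive pattern: check remaining capacity before writing into the batch's fixed-size arrays, and fail loudly on overflow instead of corrupting memory. Below is a minimal, self-contained sketch of that pattern; the struct, helper name, and assertion here are simplified stand-ins for illustration, not the actual llama_batch layout or the exact check from the commit.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

using llama_token = int32_t;
using llama_pos   = int32_t;

// Simplified stand-in for llama_batch: parallel arrays with a fixed
// capacity chosen at construction time (hypothetical, for illustration).
struct batch {
    std::vector<llama_token> token;
    std::vector<llama_pos>   pos;
    int32_t n_tokens = 0;

    explicit batch(int32_t capacity) : token(capacity), pos(capacity) {}
};

// Mirrors the idea of llama_batch_add after the fix: adding a token
// beyond the capacity the batch was initialized with is a programming
// error, so assert before writing past the end of the arrays.
void batch_add(batch & b, llama_token id, llama_pos p) {
    // The guard described in the commit message (sketched here with a
    // plain assert rather than the library's own assertion macro).
    assert((size_t) b.n_tokens < b.token.size() && "llama_batch size exceeded");

    b.token[b.n_tokens] = id;
    b.pos  [b.n_tokens] = p;
    b.n_tokens++;
}

int main() {
    batch b(4);                 // room for 4 tokens
    for (llama_token t = 0; t < 4; ++t) {
        batch_add(b, t, t);     // fills the batch exactly to capacity
    }
    // batch_add(b, 4, 4);      // a fifth token would trip the assertion
    return 0;
}
```

With the guard in place, an overflowing caller aborts at the point of misuse with a clear message, rather than crashing later from out-of-bounds writes.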
References
#9668 - common: ensure token addition to batch does not exceed llama_batch size
Author
matiaslin
Parents
f99d3f83