llama.cpp
Commit 0c898640 · 1 day ago
retrieval : use at most n_seq_max chunks (#18400)
References
#18400 - Fix in the example in examples/retrieval
Author
hectorem2
Parents
daa242df
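
The change caps how many chunks the retrieval example packs into one batch, so that a single decode call never carries more sequences than the context allows. A minimal sketch of the idea, assuming pre-tokenized chunks and a hypothetical embed_chunks helper; this is not the actual examples/retrieval code:

// Minimal sketch, not the actual examples/retrieval code: group the text
// chunks so that one llama_decode call holds at most n_seq_max sequences.
// `chunks` (pre-tokenized) and `embed_chunks` are illustrative assumptions.
#include <algorithm>
#include <cstdint>
#include <vector>

#include "llama.h"

static void embed_chunks(llama_context * ctx, const std::vector<std::vector<llama_token>> & chunks) {
    const uint32_t n_seq_max = llama_n_seq_max(ctx);

    for (size_t start = 0; start < chunks.size(); start += n_seq_max) {
        const size_t end = std::min(chunks.size(), start + (size_t) n_seq_max);

        // count the tokens in this group so the batch is sized exactly
        size_t n_tokens_group = 0;
        for (size_t i = start; i < end; ++i) {
            n_tokens_group += chunks[i].size();
        }

        llama_batch batch = llama_batch_init((int32_t) n_tokens_group, /*embd*/ 0, (int32_t) n_seq_max);

        // each chunk in the group gets its own sequence id in [0, n_seq_max)
        for (size_t i = start; i < end; ++i) {
            const llama_seq_id seq_id = (llama_seq_id) (i - start);
            for (size_t t = 0; t < chunks[i].size(); ++t) {
                batch.token   [batch.n_tokens]    = chunks[i][t];
                batch.pos     [batch.n_tokens]    = (llama_pos) t;
                batch.n_seq_id[batch.n_tokens]    = 1;
                batch.seq_id  [batch.n_tokens][0] = seq_id;
                batch.logits  [batch.n_tokens]    = t + 1 == chunks[i].size(); // output on last token
                batch.n_tokens++;
            }
        }

        llama_decode(ctx, batch); // embeddings are then read per sequence id
        llama_batch_free(batch);
        // the real example would also reset the context's KV/memory state between groups
    }
}

Grouping by n_seq_max rather than submitting all chunks at once keeps per-chunk sequence ids within the range the context was created for, regardless of how many chunks the input files produce.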