llama.cpp
2a4bcbac - llama : remove n_threads from llama_decode_internal (#3614)

This commit removes `n_threads` from the `llama_decode_internal` function's doc comment, as the parameter no longer exists. It looks like this parameter was removed in commit 16bc66d9479edd5ee12ec734973554d4493c5dfa ("llama.cpp : split llama_context_params into model and context params").

Signed-off-by: Daniel Bevenius <daniel.bevenius@gmail.com>
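For illustration only, here is a hedged sketch of the kind of doc-comment cleanup the commit describes; the comment wording below is an assumption reconstructed from the commit message, not a verbatim copy of the actual diff.

```cpp
// Approximate shape of the stale doc comment (wording is an assumption,
// not copied from llama.cpp at commit 2a4bcbac):
//
//   // decode a batch of tokens by evaluating the transformer
//   //
//   //   - lctx:      llama context
//   //   - batch:     batch to evaluate
//   //   - n_threads: number of threads to use   <-- stale: this parameter was
//   //                dropped when llama_context_params was split into model
//   //                and context params (16bc66d9), so this commit simply
//   //                deletes the line from the comment.
//
// After the split, the thread count is carried by the context's parameters
// rather than being passed to llama_decode_internal as an argument.
```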