langchain
77bb6c99 - llamacpp wrong default value passed for `f16_kv` (#3320)

Fixes the default `f16_kv` value in the llamacpp wrapper; corrects an incorrect parameter being passed. See: https://github.com/abetlen/llama-cpp-python/blob/ba3959eafd38080f3bf3028746406f350a8ef793/llama_cpp/llama.py#L33

Fixes #3241
Fixes #3301
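The fix aligns the wrapper's `f16_kv` default with the default used by llama-cpp-python itself (`f16_kv=True` at the linked llama.py#L33). As a minimal sketch, not the actual patch, a caller can also set the flag explicitly on the LangChain `LlamaCpp` wrapper rather than relying on either default; the model path below is hypothetical:

```python
from langchain.llms import LlamaCpp

# Explicitly set f16_kv so the value forwarded to llama-cpp-python is
# unambiguous, regardless of the wrapper's default.
llm = LlamaCpp(
    model_path="./models/ggml-model.bin",  # hypothetical local GGML model path
    f16_kv=True,  # use half-precision for the key/value cache
)

print(llm("Q: What does the f16_kv option control? A:"))
```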