community[patch]: Added support for Ollama's num_predict option in ChatOllama (#16633)
A simple addition to the default options payload of an Ollama generate call, supporting a max-new-tokens style parameter (`num_predict`).
Should fix issue: https://github.com/langchain-ai/langchain/issues/14715
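For illustration, a minimal sketch of how a `num_predict` value lands in the request body: Ollama's generate API reads `num_predict` from the `options` object to cap the number of newly generated tokens. The helper below is hypothetical, not the actual ChatOllama implementation.

```python
def build_generate_payload(prompt, model="llama2", num_predict=None, **extra_options):
    """Illustrative helper (not ChatOllama itself): assemble an Ollama
    /api/generate request body, threading num_predict into `options`."""
    options = dict(extra_options)
    if num_predict is not None:
        # num_predict caps the number of tokens the model will generate
        options["num_predict"] = num_predict
    return {"model": model, "prompt": prompt, "options": options}

payload = build_generate_payload("Hello", num_predict=128)
```

With this shape, leaving `num_predict` unset preserves Ollama's server-side default, while passing it through ChatOllama bounds the response length.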