langchain
e9799d68 - improves huggingface_hub example (#988)

The provided example uses the default `max_length` of `20` tokens, which causes the example generation to get cut off. 20 tokens is far too short to show chain-of-thought (CoT) reasoning, so I raised it to `64`. Without knowing the Hugging Face API well, it can be hard to figure out where those `model_kwargs` come from, and `max_length` is a critical one.
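A minimal sketch of the effect being fixed. This does not call the Hugging Face API; the `generate` helper below is hypothetical and simply mimics a hard `max_length` cutoff, to show why a 20-token cap truncates a CoT answer mid-reasoning while 64 leaves room for it. (In the LangChain example itself, the setting is passed as `model_kwargs={"max_length": 64}` when constructing the `HuggingFaceHub` LLM.)

```python
def generate(tokens, max_length=20):
    """Hypothetical stand-in for a model call: return at most
    max_length output tokens, mimicking HF's max_length cutoff."""
    return tokens[:max_length]

# A short chain-of-thought style answer, pre-tokenized on whitespace.
cot_answer = (
    "First , the cafeteria had 23 apples . They used 20 , "
    "leaving 3 . They bought 6 more , so 3 + 6 = 9 . "
    "The answer is 9 ."
).split()

short = generate(cot_answer, max_length=20)  # default: cut off mid-reasoning
full = generate(cot_answer, max_length=64)   # boosted: full answer fits

print(len(cot_answer))  # the complete answer needs more than 20 tokens
print(short[-1])        # truncated output ends mid-sentence
print(full == cot_answer)
```

With the default cap the output stops partway through the arithmetic; with `64` the whole worked answer, including the final "The answer is 9 .", comes through.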