langchain
ea6a5b03 - Fix output final text for HuggingFaceTextGenInference when streaming (#6211)

The LLM integration [HuggingFaceTextGenInference](https://github.com/hwchase17/langchain/blob/master/langchain/llms/huggingface_text_gen_inference.py) already supports streaming. However, when streaming is enabled, it always returns an empty string as the final output text once the LLM finishes. This happens because `text` is initialized to an empty string and never updated. This PR fixes the collection of the final output text by concatenating new tokens as they arrive.
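The bug and the fix can be illustrated with a minimal sketch (this is a simplified stand-in, not the actual langchain source; `stream_tokens` is a hypothetical generator standing in for the text-generation-inference streaming response):

```python
def stream_tokens():
    # Hypothetical stand-in for the token stream returned by the
    # text-generation-inference server when streaming is enabled.
    yield from ["Hello", ",", " world"]

def call_buggy() -> str:
    text = ""
    for token in stream_tokens():
        pass  # token is forwarded to a callback, but `text` is never updated
    return text  # always "" — the bug described above

def call_fixed() -> str:
    text = ""
    for token in stream_tokens():
        text += token  # the fix: concatenate each new token into `text`
    return text

print(repr(call_buggy()))  # ''
print(repr(call_fixed()))  # 'Hello, world'
```

In the buggy version the streamed tokens reach the callback handler but are never accumulated, so the caller receives an empty final string; the fix simply appends each token to `text` inside the streaming loop.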