community: Handle FunctionMessage type in ChatOllama #19815
Closed

vishalmanohar commented 1 year ago (edited)

Description: When using function calling with OllamaFunctions, ChatOllama raises an error when it encounters a FunctionMessage: "Received unsupported message type for Ollama." This change handles FunctionMessage in the message conversion so the call no longer fails.

Issue: #18450
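
A minimal sketch of the kind of change being proposed: add a FunctionMessage branch to ChatOllama's message-to-role conversion. The helper name and structure below are illustrative, not the actual langchain_community code, and the review comment further down suggests the PR mapped the message to the "assistant" role.

from langchain_core.messages import (
    AIMessage,
    BaseMessage,
    FunctionMessage,
    HumanMessage,
    SystemMessage,
)


def _role_for_message(message: BaseMessage) -> str:
    # Illustrative helper: map a LangChain message to an Ollama chat role.
    if isinstance(message, HumanMessage):
        return "user"
    elif isinstance(message, AIMessage):
        return "assistant"
    elif isinstance(message, SystemMessage):
        return "system"
    elif isinstance(message, FunctionMessage):
        # Before this change, a FunctionMessage fell through to the error
        # below; adding a branch lets function-call results be accepted.
        return "assistant"
    else:
        raise ValueError("Received unsupported message type for Ollama.")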

vishalmanohar added commit c23f0fc1: community: handle FunctionMessage type in ChatOllama
vishalmanohar force-pushed to c23f0fc1 1 year ago
isahers1 commented 344 days ago
ccurme commented 273 days ago

I think this should be mapped to a different role than assistant. Or, ideally, use a ToolMessage, which is accommodated in the langchain-ollama package:

# Excerpt from the langchain-ollama message conversion:
elif isinstance(message, ToolMessage):
    role = "tool"
    tool_call_id = message.tool_call_id
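
For context, a hedged example of how a tool result is typically represented so the branch quoted above can assign it the "tool" role; the exact dict the package emits may differ.

from langchain_core.messages import ToolMessage

# A tool result tied back to the tool call that produced it.
msg = ToolMessage(content='{"temperature": 22}', tool_call_id="call_abc123")

# The branch quoted above would set role="tool" and carry the tool_call_id,
# yielding roughly:
# {"role": "tool", "content": '{"temperature": 22}', "tool_call_id": "call_abc123"}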

ccurme closed this 273 days ago
