langchain
3beba77e - feat(ollama): support `response_format` (#34612)

Fixes #34610

---

This PR resolves an issue where `ChatOllama` would raise an `unexpected keyword argument 'response_format'` error when used with `create_agent` or when passed an OpenAI-style `response_format`.

When using `create_agent` (especially with models like `gpt-oss`), LangChain creates a `response_format` argument (e.g., `{"type": "json_schema", ...}`). `ChatOllama` previously passed this argument directly to the underlying Ollama client, which does not support `response_format` and instead expects a `format` parameter.

## The Fix

I updated `_chat_params` in `libs/partners/ollama/langchain_ollama/chat_models.py` to:

1. Intercept the `response_format` argument.
2. Map it to the native Ollama `format` parameter:
   * `{"type": "json_schema", "json_schema": {"schema": ...}}` -> `format=schema`
   * `{"type": "json_object"}` -> `format="json"`
3. Remove `response_format` from the kwargs passed to the client.

## Validation

* **Reproduction Script**: Verified the fix with a script covering `json_schema`, `json_object`, and explicit `format` priority scenarios.
* **New Tests**: Added 3 new unit tests to `libs/partners/ollama/tests/unit_tests/test_chat_models.py` covering these scenarios.
* **Regression**: Ran the full test suite (`make -C libs/partners/ollama test`), passing 29 tests (previously 26).
* **Lint/Format**: Verified with `make lint_package` and `make format`.

---------

Co-authored-by: Mohan Kumar Sagadevan <mohankumarsagadevan@Mohans-MacBook-Air.local>
Co-authored-by: Mason Daugherty <mason@langchain.dev>
Co-authored-by: Mason Daugherty <github@mdrxy.com>
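As a rough sketch of the mapping described in the fix: the function below is a standalone illustration of the translation logic, not the actual `_chat_params` implementation in `langchain_ollama` (the function name and exact structure here are hypothetical).

```python
# Illustrative sketch of mapping an OpenAI-style `response_format` kwarg
# to Ollama's native `format` parameter. `map_response_format` is a
# hypothetical helper, not the real langchain_ollama internals.
from typing import Any


def map_response_format(kwargs: dict[str, Any]) -> dict[str, Any]:
    """Translate `response_format` into Ollama's `format` parameter.

    - {"type": "json_schema", "json_schema": {"schema": ...}} -> format=<schema dict>
    - {"type": "json_object"} -> format="json"
    An explicitly provided `format` takes priority, and `response_format`
    is always removed so it never reaches the Ollama client.
    """
    params = dict(kwargs)
    response_format = params.pop("response_format", None)
    if response_format is None or "format" in params:
        # Nothing to map, or the caller already set `format` explicitly.
        return params
    if response_format.get("type") == "json_schema":
        params["format"] = response_format["json_schema"]["schema"]
    elif response_format.get("type") == "json_object":
        params["format"] = "json"
    return params
```

This mirrors the three scenarios exercised by the unit tests: schema mapping, plain JSON mode, and explicit-`format` priority.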