langchain
730a3676 - fix(core): strip message IDs from cache keys using `model_copy` (#33915)

Commit
116 days ago
fix(core): strip message IDs from cache keys using `model_copy` (#33915)

**Description:** *Closes #[33883](https://github.com/langchain-ai/langchain/issues/33883)*

Chat model cache keys are generated by serializing messages via `dumps(messages)`. The optional `BaseMessage.id` field (a UUID used solely for tracing/threading) is included in this serialization, causing functionally identical messages to produce different cache keys. This results in repeated API calls, cache bloat, and degraded performance in production workloads (e.g., agents, RAG chains, long conversations).

This change normalizes messages **only for cache key generation** by stripping the nonsemantic `id` field using Pydantic V2's `model_copy(update={"id": None})`. The normalization is applied in both synchronous and asynchronous cache paths (`_generate_with_cache` / `_agenerate_with_cache`) immediately before `dumps()`.

```python
normalized_messages = [
    msg.model_copy(update={"id": None}) if getattr(msg, "id", None) is not None else msg
    for msg in messages
]
prompt = dumps(normalized_messages)
```

---------

Co-authored-by: Mason Daugherty <mason@langchain.dev>
Co-authored-by: Mason Daugherty <github@mdrxy.com>
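To illustrate the effect of the fix in isolation, here is a minimal sketch using a plain Pydantic V2 model as a hypothetical stand-in for `BaseMessage` (the `Message` class and `cache_key` helper below are illustrative, not langchain APIs): two messages that differ only in `id` serialize to the same cache key once the `model_copy(update={"id": None})` normalization is applied.

```python
import json
from typing import Optional

from pydantic import BaseModel


class Message(BaseModel):
    """Hypothetical stand-in for BaseMessage: content plus an optional tracing id."""

    content: str
    id: Optional[str] = None


def cache_key(messages: list[Message]) -> str:
    # Strip the nonsemantic id before serializing, mirroring the fix:
    # model_copy(update=...) returns a new instance, leaving the original untouched.
    normalized = [
        m.model_copy(update={"id": None}) if m.id is not None else m
        for m in messages
    ]
    return json.dumps([m.model_dump() for m in normalized], sort_keys=True)


# Same content, different ids -> identical cache keys after normalization.
a = [Message(content="hello", id="run-123")]
b = [Message(content="hello", id="run-456")]
assert cache_key(a) == cache_key(b)
# Without normalization the raw dumps would differ:
assert a[0].model_dump() != b[0].model_dump()
```

Because `model_copy` is shallow and non-mutating, the caller's original messages keep their `id`s for tracing; only the serialized form used as the cache key is normalized.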