langchain
(RFC) core: add RemoveMessage
#22870
Closed


vbarda wants to merge 10 commits into master from vb/add-remove-message
vbarda commented 344 days ago (edited 340 days ago)

This change adds a new message type RemoveMessage (and RemoveMessageChunk) to allow the following behaviors:

(1) in langchain_core -- when using a model w/ fallbacks, if one of the models fails mid-stream, we would want to roll back the chunks already streamed
(2) in langgraph -- to allow user or graph node to manually modify the state

Example usage:

(1) langchain_core -- RunnableWithFallbacks

from langchain_core.language_models.fake_chat_models import FakeListChatModel
from langchain_core.runnables.fallbacks import RunnableWithFallbacks
from langchain_core.callbacks import StdOutCallbackHandler, CallbackManager, BaseCallbackHandler


class CustomCallbackHandler(BaseCallbackHandler):
    def on_chain_end(self, outputs, **kwargs):
        print("Finished chain", outputs)


callback_manager = CallbackManager([StdOutCallbackHandler()])
callbacks = [CustomCallbackHandler()]

# model1 errors after streaming 2 chunks, which triggers the fallback to model2
fake_list_chat_model_1 = FakeListChatModel(name="model1", responses=["one", "two"], error_on_chunk_number=2)
fake_list_chat_model_2 = FakeListChatModel(name="model2", responses=["three", "four"])

llm_with_fallbacks = fake_list_chat_model_1.with_fallbacks([fake_list_chat_model_2], exceptions_to_handle=(Exception,)).with_config(callbacks=callbacks)

for s in llm_with_fallbacks.stream("hi"):
    print(s.__class__.__name__, s.content, s.id)

Output:

AIMessageChunk o run-4b83315c-1ffc-4474-818a-2efb3811ac38
AIMessageChunk n run-4b83315c-1ffc-4474-818a-2efb3811ac38
RemoveMessageChunk modifier run-4b83315c-1ffc-4474-818a-2efb3811ac38
AIMessageChunk t run-2208adaf-26a7-43ce-b203-8a279626cacd
AIMessageChunk h run-2208adaf-26a7-43ce-b203-8a279626cacd
AIMessageChunk r run-2208adaf-26a7-43ce-b203-8a279626cacd
AIMessageChunk e run-2208adaf-26a7-43ce-b203-8a279626cacd
AIMessageChunk e run-2208adaf-26a7-43ce-b203-8a279626cacd
Finished chain content='three' id='run-2208adaf-26a7-43ce-b203-8a279626cacd'
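To make the intended consumer behavior concrete, here is a minimal sketch (with hypothetical stand-in classes, not the real `langchain_core` ones) of how a streaming client could honor a `RemoveMessageChunk` by dropping every buffered chunk from the failed run:

```python
from dataclasses import dataclass


@dataclass
class AIMessageChunk:
    content: str
    id: str


@dataclass
class RemoveMessageChunk:
    id: str


def collect(stream):
    """Accumulate chunk contents per run id, rolling back on removal."""
    buffered = {}
    for chunk in stream:
        if isinstance(chunk, RemoveMessageChunk):
            # roll back everything streamed by the failed run
            buffered.pop(chunk.id, None)
        else:
            buffered.setdefault(chunk.id, []).append(chunk.content)
    return {run_id: "".join(parts) for run_id, parts in buffered.items()}


chunks = [
    AIMessageChunk("o", "run-1"),
    AIMessageChunk("n", "run-1"),
    RemoveMessageChunk("run-1"),   # model1 failed mid-stream
    AIMessageChunk("three", "run-2"),
]
print(collect(chunks))  # {'run-2': 'three'}
```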

(2) langgraph

  • allow users to delete messages from state by calling
    graph.update_state(config, values=[RemoveMessage(id=state.values[-1].id)])
  • allow nodes to delete messages
    graph.add_node("delete_messages", lambda state: [RemoveMessage(id=state[-1].id)])
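The langgraph side of this amounts to a state reducer that deletes by id. A minimal sketch of that semantics (hypothetical stand-in classes and a hypothetical `add_messages` helper, not the actual langgraph code):

```python
from dataclasses import dataclass


@dataclass
class AIMessage:
    content: str
    id: str


@dataclass
class RemoveMessage:
    id: str


def add_messages(existing, updates):
    """Append new messages; delete any whose id matches a RemoveMessage."""
    to_remove = {u.id for u in updates if isinstance(u, RemoveMessage)}
    merged = [m for m in existing if m.id not in to_remove]
    merged.extend(u for u in updates if not isinstance(u, RemoveMessage))
    return merged


state = [AIMessage("hi", "1"), AIMessage("bye", "2")]
state = add_messages(state, [RemoveMessage(id="2")])
print([m.id for m in state])  # ['1']
```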
vbarda (RFC) core: add RemoveMessage
a8e1580c
vbarda requested a review from baskaryan 344 days ago
vbarda update naming & remove chunk modifier
50d6a454
vbarda (wip) working w/ messages
d895a16c
vbarda experiment w/ updating chat model interface directly
a1b6abc9
vbarda add RemoveMessageChunk & move the exception handling to ChatModel.str…
70285e14
vbarda Merge branch 'master' into vb/add-remove-message
fe2e3c88
vbarda fix
2596b3c3
vbarda remove redundant logic
536b4748
vbarda commented on 2024-06-17
libs/core/langchain_core/runnables/fallbacks.py
        yield chunk
        output: Optional[Output] = chunk
        try:
            for chunk in stream:
                yield chunk
                try:
                    output = output + chunk  # type: ignore
                except TypeError:
                    output = None
        except BaseException as e:
            run_manager.on_chain_error(e)
            raise e
        run_manager.on_chain_end(output)

    async def astream(
vbarda commented 340 days ago

note: this still needs updating

baskaryan commented on 2024-06-17
libs/core/langchain_core/language_models/chat_models.py
                seen_chunk_ids.append(chunk.message.id)
                last_chunk_id = chunk.message.id
            assert generation is not None
        except self.exceptions_to_handle:
            for chunk_id in seen_chunk_ids:
                yield RemoveMessageChunk(id=chunk_id)  # type: ignore[call-arg]
baskaryan commented 340 days ago

i think we'd still want to raise an error at the end of this? otherwise the user code has no way of knowing that nothing has actually streamed

vbarda commented 340 days ago

yes, agreed
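The agreed behavior (yield removals for everything already streamed, then re-raise so the error still surfaces) can be sketched like this -- a hypothetical stand-alone generator, not the actual `_stream` implementation:

```python
from dataclasses import dataclass


@dataclass
class RemoveMessageChunk:
    id: str


@dataclass
class Chunk:
    content: str
    id: str


def stream_with_rollback(chunks_iter):
    """Yield chunks; on failure, yield one removal per run id seen, then re-raise."""
    seen_chunk_ids = []
    try:
        for chunk in chunks_iter:
            if chunk.id not in seen_chunk_ids:
                seen_chunk_ids.append(chunk.id)
            yield chunk
    except Exception:
        for chunk_id in seen_chunk_ids:
            yield RemoveMessageChunk(id=chunk_id)
        raise  # caller still learns that nothing actually streamed


def failing_stream():
    yield Chunk("o", "run-1")
    yield Chunk("n", "run-1")
    raise RuntimeError("model failed mid-stream")


received = []
try:
    for c in stream_with_rollback(failing_stream()):
        received.append(c)
except RuntimeError:
    print("error propagated")

print([type(c).__name__ for c in received])
# ['Chunk', 'Chunk', 'RemoveMessageChunk']
```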

libs/core/langchain_core/language_models/chat_models.py
    callback_manager: Optional[BaseCallbackManager] = Field(default=None, exclude=True)
    """[DEPRECATED] Callback manager to add to the run trace."""

    exceptions_to_handle: Tuple[Type[BaseException], ...] = ()
baskaryan commented 340 days ago

do we need this? could just always yield remove messages on error (assuming we always raise error anyways)

vbarda commented 340 days ago

yea if we always raise we don't need!

vbarda code review
3129793e
baskaryan commented on 2024-06-17
libs/core/langchain_core/messages/modifier.py
class ModifierMessage(BaseMessage):
    """Message responsible for modifying other messages (deleting / updating.)"""

    def __init__(self, id: str, **kwargs: Any) -> None:
baskaryan commented 340 days ago

should we raise an error if any other kwargs are specified

vbarda commented 340 days ago

i considered this -- there is an issue w/ content field during serialization/deserialization. since it's required on the base message, we still need to pass it here at deserialization. i can raise for any other keys though!
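The constraint described here -- let `content` through for deserialization but reject every other key -- could look roughly like this (a hypothetical stand-alone sketch; the real class subclasses BaseMessage):

```python
from typing import Any


class RemoveMessage:
    """Sketch: reject all kwargs except content, which deserialization requires."""

    def __init__(self, id: str, **kwargs: Any) -> None:
        extra = {k for k in kwargs if k != "content"}
        if extra:
            raise ValueError(f"RemoveMessage does not accept fields: {sorted(extra)}")
        self.id = id
        self.content = kwargs.get("content", "")


RemoveMessage(id="run-1")                # fine
RemoveMessage(id="run-1", content="")    # fine: needed when round-tripping
try:
    RemoveMessage(id="run-1", name="oops")
except ValueError as e:
    print(e)  # RemoveMessage does not accept fields: ['name']
```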

libs/core/langchain_core/messages/modifier.py
    """Message responsible for modifying other messages (deleting / updating.)"""

    def __init__(self, id: str, **kwargs: Any) -> None:
        return super().__init__("modifier", id=id)
baskaryan commented 340 days ago

is "modifier" the content here? should we just leave it blank?

vbarda commented 340 days ago

yea, can just be an empty string "", wasn't sure which is less confusing

eyurtsev commented 340 days ago

If we don't need the content, do we want to change the hierarchy so that content is only for messages that contain content? Or could the content be helpful here?

baskaryan commented on 2024-06-17

dunno if this is a good idea but: we could add the message removal logic to BaseChatModel. ie check for RemoveMessages in the input and perform specified removals before passing messages on. that way user doesn't have to write this logic everywhere
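The idea floated here -- have the chat model apply removals to its input so callers don't reimplement the logic -- could be sketched like this (hypothetical helper and stand-in classes, not actual BaseChatModel code):

```python
from dataclasses import dataclass


@dataclass
class AIMessage:
    content: str
    id: str


@dataclass
class RemoveMessage:
    id: str


def apply_removals(messages):
    """Drop RemoveMessages and the messages they target before invoking the model."""
    to_remove = {m.id for m in messages if isinstance(m, RemoveMessage)}
    return [
        m for m in messages
        if not isinstance(m, RemoveMessage) and m.id not in to_remove
    ]


history = [AIMessage("hello", "1"), AIMessage("oops", "2"), RemoveMessage(id="2")]
print([m.id for m in apply_removals(history)])  # ['1']
```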

vbarda fix
d68fb038
eyurtsev self-assigned this 340 days ago
nfcampos commented on 2024-06-18
libs/core/langchain_core/language_models/chat_models.py
            generation += chunk

            if chunk.message.id != last_chunk_id:
                seen_chunk_ids.append(chunk.message.id)
nfcampos commented 339 days ago

let's just use a set?

vbarda commented 339 days ago

set makes sense conceptually, but i think we might want to maintain the order?
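One common middle ground between the two: a dict preserves insertion order (guaranteed since Python 3.7) while giving set-like O(1) membership checks, so order and fast lookup don't have to trade off.

```python
# Dedupe run ids while keeping the order they were first seen.
seen_chunk_ids = {}
for chunk_id in ["run-1", "run-1", "run-2", "run-1"]:
    if chunk_id not in seen_chunk_ids:
        seen_chunk_ids[chunk_id] = None

print(list(seen_chunk_ids))  # ['run-1', 'run-2']
```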

ccurme added the Ɑ: core label
vbarda closed this 329 days ago
vbarda deleted the vb/add-remove-message branch 262 days ago
