Conversation Summary Buffer Memory
The Conversation Summary Buffer Memory node is a specialized memory component used in conversational AI systems. It keeps a buffer of recent messages alongside a running summary of the conversation, automatically summarizing older parts of the conversation when a token limit is reached. This allows context to be maintained over longer conversations without exceeding token constraints.
Node Details
- Name: ConversationSummaryBufferMemory
- Type: Memory
- Version: 1.0
- Icon: memory.svg
- Category: Memory
Base Classes
- ConversationSummaryBufferMemory
- (Additional base classes from LangChain)
Description
This node uses token length to decide when to summarize conversations. It’s particularly useful for maintaining context in long-running conversations or when dealing with large amounts of dialogue history.
Parameters
Inputs
- Chat Model
  - Type: BaseChatModel
  - Description: The language model used to summarize the conversation
- Max Token Limit (optional)
  - Type: number
  - Default: 2000
  - Description: The maximum number of tokens to keep in memory before summarizing
- Session Id (optional)
  - Type: string
  - Description: A unique identifier for the conversation session. If not specified, a random ID will be used
- Memory Key (optional)
  - Type: string
  - Default: 'chat_history'
  - Description: The key used to store and retrieve the chat history in memory
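The inputs above can be modeled as a small configuration object. This is only an illustrative sketch mirroring the documented parameters and defaults, not the node's actual implementation:

```python
import uuid
from dataclasses import dataclass, field
from typing import Any

@dataclass
class SummaryBufferMemoryConfig:
    """Hypothetical config mirroring the node's documented inputs."""
    chat_model: Any                          # BaseChatModel used for summarization
    max_token_limit: int = 2000              # summarize once the buffer exceeds this
    session_id: str = field(default_factory=lambda: uuid.uuid4().hex)  # random if unset
    memory_key: str = "chat_history"         # key under which history is stored

config = SummaryBufferMemoryConfig(chat_model=None)
print(config.max_token_limit)  # 2000
print(config.memory_key)       # chat_history
```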
Functionality
- Maintains a buffer of recent messages and a summary of older messages.
- Automatically summarizes older messages when the token limit is reached.
- Integrates with a database to persist conversation history across sessions.
- Supports retrieval of chat messages, including options to prepend messages and return as base messages or interface messages.
- Handles pruning of messages to stay within token limits.
- Special handling for Anthropic models, which don’t support multiple system messages.
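The buffer-plus-summary behavior above can be illustrated with a simplified sketch. The token counter and summarizer below are crude stand-ins; the real node delegates token counting and summarization to the attached chat model:

```python
def count_tokens(text: str) -> int:
    # Crude stand-in: roughly one token per whitespace-separated word.
    return len(text.split())

def summarize(existing_summary: str, dropped: list[str]) -> str:
    # Stand-in for the chat model's summarization call: keep the
    # first word of each dropped message as a mock "summary".
    return (existing_summary + " " + " ".join(m.split()[0] for m in dropped)).strip()

class SummaryBufferMemory:
    def __init__(self, max_token_limit: int = 2000):
        self.max_token_limit = max_token_limit
        self.buffer: list[str] = []   # recent messages, kept verbatim
        self.summary: str = ""        # rolling summary of older messages

    def add_message(self, message: str) -> None:
        self.buffer.append(message)
        self.prune()

    def prune(self) -> None:
        # Move the oldest messages into the summary until the buffer
        # is back under the token limit.
        dropped = []
        while self.buffer and sum(count_tokens(m) for m in self.buffer) > self.max_token_limit:
            dropped.append(self.buffer.pop(0))
        if dropped:
            self.summary = summarize(self.summary, dropped)

memory = SummaryBufferMemory(max_token_limit=5)
memory.add_message("hello there friend")
memory.add_message("how are you today")
print(memory.buffer)   # ['how are you today'] — oldest message was pruned
print(memory.summary)  # 'hello'
```

Queries against the memory would then return both `summary` (older context) and `buffer` (recent messages), which is what keeps long conversations within the token budget.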
Input/Output
- Input: Receives new messages and integrates them into the conversation history.
- Output: Provides summarized conversation history and recent messages when queried.
Usage
This node is typically used in conversational AI flows where maintaining context over long conversations is crucial. It’s particularly useful in scenarios where:
- Conversations can become lengthy and exceed normal memory constraints.
- A summary of previous interactions is needed to maintain context.
- The AI needs to refer back to earlier parts of the conversation without storing the entire dialogue history.
Integration
- Integrates with database systems for persistent storage of conversation history.
- Works with various chat models, with special handling for Anthropic models.
- Can be used in conjunction with other memory types and conversational AI components.
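The Anthropic-specific handling mentioned above (Anthropic models accept at most one system prompt) might look roughly like the following. The message shape here is a simplification for illustration, not the node's actual internal type:

```python
def merge_system_messages(messages: list[dict]) -> list[dict]:
    """Collapse all system messages into a single leading system message,
    since Anthropic models do not support multiple system messages."""
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    others = [m for m in messages if m["role"] != "system"]
    if not system_parts:
        return others
    merged = {"role": "system", "content": "\n".join(system_parts)}
    return [merged] + others

msgs = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Hi"},
    {"role": "system", "content": "Summary of the earlier conversation."},
]
print(merge_system_messages(msgs))
# The two system messages are merged into one, placed first.
```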
Note
This node extends the functionality of the base ConversationSummaryBufferMemory from LangChain, adding custom features for integration with AI systems and database persistence.