Buffer Window Memory
The Buffer Window Memory node is a memory component used in conversational AI systems. It stores and retrieves a fixed number of recent messages in a conversation, providing context for language models.
Node Details
- Name: bufferWindowMemory
- Type: BufferWindowMemory
- Version: 2.0
- Category: Memory
- Icon: memory.svg
Description
This node keeps a window of size k, surfacing the last k back-and-forth exchanges as memory for the conversation. It’s useful for maintaining recent context without storing the entire conversation history.
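To make the windowing idea concrete, here is a minimal, self-contained sketch (not the node's actual implementation) of keeping only the last k exchanges from a message history:

```typescript
// Sliding-window memory in miniature: keep only the last k exchanges.
interface ChatMessage {
  role: "human" | "ai";
  text: string;
}

// Return the most recent k exchanges (2 * k messages) from a history.
function windowedHistory(history: ChatMessage[], k: number): ChatMessage[] {
  return k > 0 ? history.slice(-2 * k) : [];
}

const history: ChatMessage[] = [
  { role: "human", text: "Hi" },
  { role: "ai", text: "Hello! How can I help?" },
  { role: "human", text: "What can you do?" },
  { role: "ai", text: "I can answer questions and keep track of our chat." },
  { role: "human", text: "Great, summarise what we said so far." },
  { role: "ai", text: "You greeted me and asked about my capabilities." },
];

// With k = 2, only the last two exchanges are passed to the model as context.
console.log(windowedHistory(history, 2));
```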
Base Classes
- BufferWindowMemory
- (Additional base classes from langchain’s BufferWindowMemory)
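For reference, the underlying langchain class can also be used directly. A minimal sketch, assuming the langchain JS `BufferWindowMemory` API (the import path and option names may differ between langchain versions):

```typescript
import { BufferWindowMemory } from "langchain/memory";

// Keep only the last 4 exchanges; store them under the "chat_history" key.
const memory = new BufferWindowMemory({
  k: 4,
  memoryKey: "chat_history",
  returnMessages: true, // return message objects rather than a formatted string
});
```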
Parameters
Inputs
- Size (k)
  - Type: number
  - Default: 4
  - Description: The number of recent message pairs (back-and-forth exchanges) to keep in memory
- Session Id (optional)
  - Type: string
  - Description: A unique identifier for the conversation session. If not specified, a random id will be used
- Memory Key
  - Type: string
  - Default: "chat_history"
  - Description: The key used to store and retrieve the chat history in the memory
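As an illustration, these inputs map onto a configuration object roughly like the one below. The field names here are hypothetical; the exact shape depends on the host framework:

```typescript
// Hypothetical shape of the node's inputs; actual field names may differ.
interface BufferWindowMemoryInputs {
  k: number;          // Size (k): number of recent exchanges to keep
  sessionId?: string; // Session Id: omitted => a random id is generated
  memoryKey: string;  // Memory Key: key under which chat history is stored
}

const inputs: BufferWindowMemoryInputs = {
  k: 4,
  sessionId: "user-1234",
  memoryKey: "chat_history",
};
```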
Functionality
- Initialization:
  - Creates a BufferWindowMemoryExtended instance with the specified parameters.
  - Connects to a database using the provided DataSource.
- Message Retrieval (see the sketch after this list):
  - Fetches messages from the database based on the session ID and chatflow ID.
  - Limits the number of messages based on the specified window size (k).
  - Can return messages as either IMessage or BaseMessage types.
- Message Storage:
  - Actual message storage is handled at the server level, not within this node.
- Memory Clearing:
  - Clearing of chat messages is also handled at the server level.
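The following is a simplified sketch of the initialization and retrieval behaviour described above. The `MessageStore` interface and its method are placeholders standing in for the framework's database layer, not the actual implementation:

```typescript
interface StoredMessage {
  role: "human" | "ai";
  text: string;
}

// Placeholder for the framework's database access layer (an assumption).
interface MessageStore {
  getMessages(sessionId: string, chatflowId: string): Promise<StoredMessage[]>;
}

class WindowedMemory {
  constructor(
    private store: MessageStore,
    private k: number,
    private sessionId: string,
    private chatflowId: string
  ) {}

  // Fetch the full history for this session, then keep only the last
  // k exchanges (2 * k messages), oldest first.
  async loadWindow(): Promise<StoredMessage[]> {
    const all = await this.store.getMessages(this.sessionId, this.chatflowId);
    return this.k > 0 ? all.slice(-2 * this.k) : [];
  }
}
```

Storage and clearing are intentionally absent from this sketch, mirroring the node's behaviour of delegating those operations to the server.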
Use Cases
- Maintaining recent context in chatbots or conversational AI systems.
- Limiting the amount of historical data passed to language models to reduce token usage and improve relevance.
- Providing a sliding window of conversation history for context-aware responses.
Integration
This node is designed to work within a larger conversational AI framework. It interfaces with a database to persist and retrieve conversation history, making it suitable for applications that require stateful conversations across multiple interactions or sessions.