ConversationBufferMemory
ConversationBufferMemory is a memory utility in the LangChain package that stores messages in a buffer and exposes them as a string or as a list of messages. It is useful for keeping conversation history in a chatbot or conversational AI system.
It simply keeps a buffer of every interaction in a conversation and places no limit on how many interactions it can store.
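A minimal sketch of typical usage, assuming the classic langchain package (import paths have moved in later releases) and an OpenAI API key in the environment:

from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

llm = OpenAI(temperature=0)
memory = ConversationBufferMemory()

# The chain prepends the full accumulated history to every prompt.
conversation = ConversationChain(llm=llm, memory=memory)
conversation.predict(input="Hi, my name is Sam.")
conversation.predict(input="What is my name?")  # answerable only via the history

# The buffer can also be read directly, as a single string by default
# (pass return_messages=True to get a list of message objects instead).
print(memory.load_memory_variables({})["history"])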
ConversationBufferWindowMemory
ConversationBufferWindowMemory is a type of memory in the LangChain package that keeps a list of the interactions in a conversation over time but uses only the last K of them. This maintains a sliding window of the most recent interactions, so the buffer does not grow too large. By contrast, ConversationBufferMemory simply keeps a buffer of all the interactions in a conversation, with no limit on how many it can store.
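Usage is identical apart from the window size k; a sketch with an illustrative k=1:

from langchain.memory import ConversationBufferWindowMemory

# Keep only the last k=1 exchange; older turns drop out of the window.
memory = ConversationBufferWindowMemory(k=1)
memory.save_context({"input": "Hi"}, {"output": "Hello!"})
memory.save_context({"input": "How are you?"}, {"output": "Fine, thanks."})

# Only the most recent exchange is returned.
print(memory.load_memory_variables({})["history"])
# Human: How are you?
# AI: Fine, thanks.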
ConversationTokenBufferMemory
ConversationTokenBufferMemory keeps a buffer of recent interactions in memory and uses token length, rather than the number of interactions, to decide when to flush old interactions. The class lives in the langchain.memory.token_buffer module and is used in conversational AI applications to store and retrieve conversation history.
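Because it counts tokens, the memory needs an LLM to measure them; a sketch under the same classic-API assumption, with an arbitrary max_token_limit of 50:

from langchain.llms import OpenAI
from langchain.memory import ConversationTokenBufferMemory

llm = OpenAI(temperature=0)

# The llm is used only to count tokens here; the oldest exchanges are
# pruned once the buffer exceeds max_token_limit.
memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=50)
memory.save_context({"input": "Hi"}, {"output": "Hello!"})
memory.save_context({"input": "Tell me about LangChain"},
                    {"output": "It is a framework for building LLM apps."})

print(memory.load_memory_variables({})["history"])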
ConversationSummaryMemory
ConversationSummaryMemory is a type of memory utility in the LangChain package that maintains a running summary of the conversation. It condenses information as the conversation proceeds, which is useful for keeping track of long histories without storing every turn. It differs from the other memory utilities in LangChain in that it focuses specifically on summarizing the conversation, whereas the others keep a buffer of recent interactions or use token length to decide when to flush them.
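An LLM must be supplied to write the summary; a sketch under the same assumptions:

from langchain.llms import OpenAI
from langchain.memory import ConversationSummaryMemory

llm = OpenAI(temperature=0)

# After each saved exchange, the llm rewrites the running summary.
memory = ConversationSummaryMemory(llm=llm)
memory.save_context({"input": "Hi, I'm Sam and I work on robotics."},
                    {"output": "Nice to meet you, Sam!"})

# The history variable now holds an LLM-written summary, not a transcript.
print(memory.load_memory_variables({})["history"])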
ConversationSummaryBufferMemory
ConversationSummaryBufferMemory is a type of memory utility in the LangChain package that keeps a buffer of recent interactions in memory and, rather than discarding older interactions when they are flushed, compiles them into a summary; like ConversationTokenBufferMemory, it uses token length rather than the number of interactions to decide when to flush. It thus combines a buffer of recent interactions with a summary of the conversation over time. Related utilities in the package include ConversationSummaryMemory, which keeps only the summary, and ConversationBufferMemory, which keeps only the raw buffer of all interactions.
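A sketch under the same assumptions, with an illustrative max_token_limit of 40; turns that no longer fit in the token budget are folded into the summary:

from langchain.llms import OpenAI
from langchain.memory import ConversationSummaryBufferMemory

llm = OpenAI(temperature=0)

# Recent turns stay verbatim; older turns are summarized by the llm
# once the buffer would exceed max_token_limit tokens.
memory = ConversationSummaryBufferMemory(llm=llm, max_token_limit=40)
memory.save_context({"input": "Hi, I'm Sam."}, {"output": "Hello, Sam!"})
memory.save_context({"input": "I'm planning a trip to Japan."},
                    {"output": "Sounds exciting!"})

# Output is a summary of the older turns plus the verbatim recent ones.
print(memory.load_memory_variables({})["history"])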
Entity Memory
Entity Memory is a memory module in LangChain that remembers information about specific entities. It uses an LLM to extract facts about each entity mentioned in the conversation and builds up its knowledge of that entity over time. Compared with the other memory utilities in LangChain, which manage previous chat messages in general, Entity Memory focuses on what has been said about particular people, places, and things. It is designed to be used in chains and provides easy ways to incorporate it into them.
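In the classic API the class is ConversationEntityMemory, usually paired with the entity-aware prompt template that ships with the package; a sketch (the names in the example are made up):

from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.chains.conversation.prompt import ENTITY_MEMORY_CONVERSATION_TEMPLATE
from langchain.memory import ConversationEntityMemory

llm = OpenAI(temperature=0)

# The llm both spots entities in each turn and updates the per-entity
# summaries kept in the memory's entity store.
conversation = ConversationChain(
    llm=llm,
    prompt=ENTITY_MEMORY_CONVERSATION_TEMPLATE,
    memory=ConversationEntityMemory(llm=llm),
)
conversation.predict(input="Sam and Deven are working on a hackathon project.")

# Facts accumulated about the entity "Deven" so far.
print(conversation.memory.entity_store.store.get("Deven"))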
VectorStore-Backed Memory
VectorStore-Backed Memory is a memory component in LangChain that stores memories in a vector database and queries the top-K most “salient” docs every time it is called; in this case, the “docs” are previous conversation snippets. It differs from most of the other memory classes in that it doesn’t explicitly track the order of interactions. This is useful for recalling relevant pieces of information that the AI was told earlier in the conversation, however long ago.
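A sketch using LangChain’s FAISS wrapper, assuming the classic API plus the faiss-cpu package and an OpenAI key; the index dimension 1536 matches OpenAI’s text-embedding-ada-002:

import faiss
from langchain.docstore import InMemoryDocstore
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.memory import VectorStoreRetrieverMemory
from langchain.vectorstores import FAISS

# An empty FAISS index sized for 1536-dimensional OpenAI embeddings.
embeddings = OpenAIEmbeddings()
index = faiss.IndexFlatL2(1536)
vectorstore = FAISS(embeddings.embed_query, index, InMemoryDocstore({}), {})

# Each saved exchange becomes a doc; k controls how many of the most
# similar past snippets are retrieved per query.
retriever = vectorstore.as_retriever(search_kwargs={"k": 1})
memory = VectorStoreRetrieverMemory(retriever=retriever)
memory.save_context({"input": "My favorite sport is soccer"}, {"output": "Noted!"})
memory.save_context({"input": "I don't like the Celtics"}, {"output": "Okay."})

# Retrieval is by similarity, not recency: the soccer snippet comes back.
print(memory.load_memory_variables({"input": "What sport do I like?"})["history"])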