Category: 《自然语言处理从入门到应用》 master table of contents

Conversation Token Buffer Memory (ConversationTokenBufferMemory)

ConversationTokenBufferMemory keeps the most recent conversation turns in memory and uses token length, rather than the number of interactions, to decide when to flush older turns.
from langchain.memory import ConversationTokenBufferMemory
from langchain.llms import OpenAI
llm = OpenAI()
memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=10)
memory.save_context({"input": "hi"}, {"output": "whats up"})
memory.save_context({"input": "not much you"}, {"output": "not much"})
memory.load_memory_variables({})

Output:
{'history': 'Human: not much you\nAI: not much'}
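Internally, every time save_context is called the buffer is pruned from its oldest turns until the token count, as measured by the LLM, fits within max_token_limit. A simplified sketch of that pruning behavior (not the library's exact code):

# Simplified sketch of the pruning behavior (not LangChain's exact implementation):
# drop the oldest messages until the buffer fits within max_token_limit.
buffer = memory.chat_memory.messages
while buffer and llm.get_num_tokens_from_messages(buffer) > memory.max_token_limit:
    buffer.pop(0)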
We can also get the history as a list of messages, which is useful when working with a chat model:

memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=10, return_messages=True)
memory.save_context({"input": "hi"}, {"output": "whats up"})
memory.save_context({"input": "not much you"}, {"output": "not much"})
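With return_messages=True, load_memory_variables returns the history as a list of message objects rather than a single formatted string. Given the 10-token limit above, only the most recent exchange should survive pruning, roughly as follows (assumed output; the exact content depends on the tokenizer):

memory.load_memory_variables({})
# Roughly (assumed): {'history': [HumanMessage(content='not much you'), AIMessage(content='not much')]}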
Using It in a Chain

Let's walk through an example of using ConversationTokenBufferMemory in a chain, again setting verbose=True so that we can see the prompt.
from langchain.chains import ConversationChain
conversation_with_summary = ConversationChain(
    llm=llm,
    # We set a very low max_token_limit for the purposes of testing.
    memory=ConversationTokenBufferMemory(llm=OpenAI(), max_token_limit=60),
    verbose=True
)
conversation_with_summary.predict(input="Hi, what's up?")

Log output:
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
Human: Hi, what's up?
AI:

> Finished chain.

Output:
Hi there! I'm doing great, just enjoying the day. How about you?

Input:
conversation_with_summary.predict(input="Just working on writing some documentation!")

Log output:
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
Human: Hi, what's up?
AI: Hi there! I'm doing great, just enjoying the day. How about you?
Human: Just working on writing some documentation!
AI:

> Finished chain.

Output:
Sounds like a productive day! What kind of documentation are you writing?

Input:
conversation_with_summary.predict(input="For LangChain! Have you heard of it?")

Log output:
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
Human: Hi, what's up?
AI: Hi there! I'm doing great, just enjoying the day. How about you?
Human: Just working on writing some documentation!
AI: Sounds like a productive day! What kind of documentation are you writing?
Human: For LangChain! Have you heard of it?
AI:

> Finished chain.

Output:
Yes, I have heard of LangChain! It is a decentralized language-learning platform that connects native speakers and learners in real time. Is that the documentation you're writing about?

Input:
# We can see that the buffer has been updated here
conversation_with_summary.predict(input="Haha nope, although a lot of people confuse it for that")

Log output:
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
Human: For LangChain! Have you heard of it?
AI: Yes, I have heard of LangChain! It is a decentralized language-learning platform that connects native speakers and learners in real time. Is that the documentation you're writing about?
Human: Haha nope, although a lot of people confuse it for that
AI:

> Finished chain.

Output:
Oh, I see. Is there another language learning platform you're referring to?

Vector-Store-Backed Memory (VectorStoreRetrieverMemory)
VectorStoreRetrieverMemory stores memories in a vector database and queries the top-K most salient documents every time it is called. Unlike most other Memory classes, it does not explicitly track the order of interactions. Here, the "documents" are snippets of previous conversation, which makes this memory useful for referring back to relevant information the AI was told earlier in the conversation.
from datetime import datetime
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.llms import OpenAI
from langchain.memory import VectorStoreRetrieverMemory
from langchain.chains import ConversationChain
from langchain.prompts import PromptTemplate

Initialize Your VectorStore

Depending on which store you choose, this step may look different; consult the relevant VectorStore documentation for more details.
import faiss

from langchain.docstore import InMemoryDocstore
from langchain.vectorstores import FAISS

embedding_size = 1536  # Dimensions of the OpenAIEmbeddings
index = faiss.IndexFlatL2(embedding_size)
embedding_fn = OpenAIEmbeddings().embed_query
vectorstore = FAISS(embedding_fn, index, InMemoryDocstore({}), {})
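For comparison, the same step with a different store might look like the sketch below. This is not part of the original example and assumes the chromadb package is installed:

# Alternative sketch (not part of the original example): an in-memory Chroma store
# built directly from the embeddings object, assuming chromadb is installed.
from langchain.vectorstores import Chroma
vectorstore = Chroma(embedding_function=OpenAIEmbeddings())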
Create the VectorStoreRetrieverMemory

The memory object is instantiated from a VectorStoreRetriever.
# In actual usage, you would set k to be a higher value, but we use k=1 to show that
# the vector lookup still returns the semantically relevant information.
retriever = vectorstore.as_retriever(search_kwargs=dict(k=1))
memory = VectorStoreRetrieverMemory(retriever=retriever)

# When added to an agent, the memory object can save pertinent information from conversations or used tools
memory.save_context({"input": "My favorite food is pizza"}, {"output": "thats good to know"})
memory.save_context({"input": "My favorite sport is soccer"}, {"output": "..."})
memory.save_context({"input": "I don't like the Celtics"}, {"output": "ok"})

# Notice that the first result returned is the memory about soccer, which the vector
# lookup deems the most semantically relevant to a question about sports.
print(memory.load_memory_variables({"prompt": "what sport should i watch?"})["history"])

Output:
input: My favorite sport is soccer
output: ...
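Because the lookup is semantic rather than keyword-based, a query that only hints at a topic should still surface the related snippet. For instance, the following hypothetical query (not part of the original example) would most likely return the pizza memory saved above:

# Hypothetical query, not in the original example; semantic retrieval would most
# likely surface the pizza memory saved earlier.
print(memory.load_memory_variables({"prompt": "what should I have for dinner?"})["history"])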
Using It in a Conversation Chain

Let's walk through an example; here we again set verbose=True so that we can see the prompt.
llm = OpenAI(temperature=0)  # Can be any valid LLM

_DEFAULT_TEMPLATE = """The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Relevant pieces of previous conversation:
{history}

(You do not need to use these pieces of information if not relevant)

Current conversation:
Human: {input}
AI:"""

PROMPT = PromptTemplate(
    input_variables=["history", "input"], template=_DEFAULT_TEMPLATE
)
conversation_with_summary = ConversationChain(
    llm=llm,
    prompt=PROMPT,
    # We use the VectorStoreRetrieverMemory defined above.
    memory=memory,
    verbose=True
)
conversation_with_summary.predict(input="Hi, my name is Perry, what's up?")

Log output:
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Relevant pieces of previous conversation:
input: My favorite food is pizza
output: thats good to know

(You do not need to use these pieces of information if not relevant)

Current conversation:
Human: Hi, my name is Perry, what's up?
AI:

> Finished chain.

Output:
Hi Perry, I'm doing well. How about you?

Input:
# Here, the sports-related content is surfaced
conversation_with_summary.predict(input="whats my favorite sport?")

Log output:
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Relevant pieces of previous conversation:
input: My favorite sport is soccer
output: ...

(You do not need to use these pieces of information if not relevant)

Current conversation:
Human: whats my favorite sport?
AI:

> Finished chain.

Output:
You told me earlier that your favorite sport is soccer.

Input:
# Even though the language model is stateless, since relevant memory is fetched, it can reason about the time.
# Timestamping memories and data is useful in general to let the agent determine temporal relevance.
conversation_with_summary.predict(input="Whats my favorite food")

Log output:
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Relevant pieces of previous conversation:
input: My favorite food is pizza
output: thats good to know

(You do not need to use these pieces of information if not relevant)

Current conversation:
Human: Whats my favorite food
AI:

> Finished chain.

Output:
You said your favorite food is pizza.

Input:
# The memories from the conversation are automatically stored; since this query
# best matches the introduction chat above, the agent is able to remember the user's name.
conversation_with_summary.predict(input="Whats my name?")

Log output:
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Relevant pieces of previous conversation:
input: Hi, my name is Perry, what's up?
response: Hi Perry, I'm doing well. How about you?

(You do not need to use these pieces of information if not relevant)

Current conversation:
Human: Whats my name?
AI:

> Finished chain.

Output:
Your name is Perry.
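The imports above include datetime, and the comments note that timestamping memories helps the agent judge temporal relevance. A hedged sketch, not part of the original example, of how a timestamp could be folded into a saved memory:

# Hypothetical sketch (not from the original example): prefix the saved text with a
# timestamp so retrieved snippets carry temporal context the model can reason about.
now = datetime.now().strftime("%Y-%m-%d %H:%M")
memory.save_context(
    {"input": f"[{now}] I just booked a trip to Japan"},
    {"output": "sounds exciting"},
)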