How to use LangChain memory with Streamlit?

2,651 views

TechLycan

9 months ago

This video tutorial explains the challenges of attaching memory to an LLM in a Streamlit application and a simple solution to overcome them. #ai #llm #sql #database #datascience #artificialintelligence #largelanguagemodels #langchain #streamlit #python #pythonprogramming #llmapplication #robot #robotics #entrepreneur #enterprisesolutions #innovation #innovative #promptengineering #promptengineer #chatgpt #gpt #database #snowflake #gpt4 #gpt3 #dataengineering #dataengineers #openai

Comments: 6
@datasciencebyyogi623 3 months ago
Can you please share the code git repo?
@user-xx8xg5yf4d 1 month ago
import os
import streamlit as st
from app_secrets import OPENAI_API_KEY  # local module holding the key
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

os.environ["OPENAI_API_KEY"] = OPENAI_API_KEY

memory = ConversationBufferMemory()
st.title("Conversational Bot")
user_input = st.text_input("Enter your message here")

if 'chat_history' not in st.session_state:
    st.session_state.chat_history = []
else:
    # Replay stored exchanges into memory on each Streamlit rerun.
    for message in st.session_state.chat_history:
        memory.save_context({"input": message['human']}, {'output': message['AI']})

prompt_template = PromptTemplate(
    input_variables=['history', 'input'],
    template="""
You are a conversational bot. Maintain a formal tone in your responses.
History: {history}
Human: {input}
AI:
"""
)

llm = OpenAI(temperature=0.0)
conversation_chain = LLMChain(llm=llm, prompt=prompt_template, memory=memory, verbose=True)

if user_input:
    # LLMChain.run returns the output string directly, not a dict.
    response = conversation_chain.run(input=user_input)
    message = {"human": user_input, "AI": response}
    st.session_state.chat_history.append(message)
    st.write(response)
    with st.expander(label="Chat History", expanded=True):
        st.write(st.session_state.chat_history)
@ahmaddajani3639 26 days ago
Great video. But I have a question: what happens if multiple users ask at the same time? Say user A asks and memory is now filled with the first question and answer; if user B asks a question immediately after user A, will the conversation history be shared between both users, or will each get a separate history since it is a local variable using session state? Do we need to implement a session?
@techlycan 26 days ago
A session is created by default. Different users get different sessions.
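[Editor's note] The per-session isolation described above can be sketched as follows. This is an illustration, not code from the video: Streamlit gives each browser connection its own `st.session_state`, which is simulated here with plain dicts (`new_session_state` and `handle_message` are hypothetical helper names).

```python
# Each browser session gets its own fresh state object,
# so one user's chat_history never leaks into another's.

def new_session_state():
    # Stand-in for the st.session_state a new session would receive.
    return {"chat_history": []}

def handle_message(state, user_input, ai_reply):
    # Append one human/AI exchange to this session's history only.
    state["chat_history"].append({"human": user_input, "AI": ai_reply})

user_a = new_session_state()
user_b = new_session_state()

handle_message(user_a, "Hello", "Hi there!")
handle_message(user_b, "What is LangChain?", "A framework for LLM apps.")

print(len(user_a["chat_history"]))  # 1
print(len(user_b["chat_history"]))  # 1
```

Because each run of the script operates on the state for the current session only, no explicit session management is needed on top of Streamlit.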
@renaudgg 4 months ago
Will every new question we ask the AI with the "memory" push the entire conversation as tokens every single time? For example, if at the beginning there is no history and, for argument's sake, my first question is 10 tokens, then the AI answers and the total so far is 10 + 5 = 15. Then I ask a second question that is, say, 20 tokens. When I press enter, will it "eat up" 15 + 20?
@techlycan 4 months ago
Yes, that's how the context is set, but in a real case you won't need messages that far back... Realistically, no more than 3-5 previous messages should be passed, and even that depends on what you are using the LLM app for... so fix the number of previous messages.
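[Editor's note] A minimal sketch of the "fix the number of previous messages" idea, similar in spirit to LangChain's ConversationBufferWindowMemory(k=...). The `windowed_history` helper is hypothetical, not from the video; it simply keeps the last k exchanges so token usage stays bounded.

```python
# Keep only the most recent k human/AI exchanges before building the prompt,
# so the context passed to the LLM does not grow without limit.

def windowed_history(chat_history, k=3):
    # Return the last k exchanges (or all of them if fewer exist).
    return chat_history[-k:]

history = [{"human": f"q{i}", "AI": f"a{i}"} for i in range(10)]
recent = windowed_history(history, k=3)
print([m["human"] for m in recent])  # ['q7', 'q8', 'q9']
```

In the Streamlit app above, the same effect could be had by replaying only `st.session_state.chat_history[-k:]` into memory instead of the whole list.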