A video on custom memory will be helpful. Thanks for the series.
@vq8gef32 10 months ago
Yes, please build a custom memory! Thank you.
@atylerblack164 1 year ago
Thank you so much! I had an app built without any conversation memory, just using chains, and was struggling to convert it to use memory. You made this very easy to follow and understand.
@SaifBagmaru 1 year ago
Hey, how did you do that? I am trying to implement the same thing. Do you have a repo?
@aiamfree 1 year ago
I've been experimenting with entity memory in my own ways and it's pretty wild, and probably the most useful for general use. I imagine word-for-word memory would really only matter in something like a story generator.
@resistance_tn 1 year ago
Great explanation! Would love to see the custom/combinations one :)
@hiranga 1 year ago
Yeah - would love to see a custom memory tute!
@RandyHawkinsMD 1 year ago
Custom memory seems potentially useful for letting human experts' input shape the knowledge graph that represents the state of users' concerns, based on the experts' knowledge. I'd be very interested in a video on this subject. :)
@Jasonknash101 1 year ago
Another great video. I want to create my own agent with memory, and I'm thinking a vector database is the best way of doing it. It would be great if you could do a similar video outlining some of the different vector databases and the pros and cons of each.
@kenchang3456 1 year ago
Indeed, this was helpful. Thank you for this video series. The more I work through them, the more my questions are being answered :-)
@ketolm 5 months ago
Love the videos! Thank you for making them. Dying at the B-roll footage.
@samwitteveenai 5 months ago
Thanks. Feedback is really appreciated. We have tried to reduce the stock video a lot in the newer vids.
@hikariayana 1 year ago
This is exactly what I needed, thanks so much!
@m_ke 1 year ago
Oh how much I missed that voice. Keep the videos coming and maybe get some sunglasses and a webcam.
@samwitteveenai 1 year ago
Long time no see. :D Working on getting a cam setup, but traveling a fair bit till April. Will DM you later.
@blackpixels9841 1 year ago
This was the voice that got me started on my Deep Learning journey! Let us know if you're ever in Singapore again some day
@krisszostak4849 1 year ago
This is awesome! I love the way you explain things, Sam! If you ever create an in-depth video course about using LangChain and LLMs, especially regarding extracting particular knowledge from a personal or business knowledge base, please let me know. I'll be the first one to buy it 😍
@joer3650 1 year ago
Best explanation I've found, thanks.
@samwitteveenai 1 year ago
thanks, much appreciated
@abhirj87 1 year ago
wow!!! super helpful and thanks a ton for making this tutorial!!
@musabalsaifi8993 2 months ago
Great work, Thanks a lot
@LearningWorldChatGPT 1 year ago
Great class! Thank you very much for sharing your knowledge. You gained a follower!
@sahil5124 9 months ago
it's really helpful, thanks man
@abdoualgerian5396 1 year ago
I think the best option is to create a small AI handler that manages all of the memory on your device and then sends a very brief summary to the LLM with the necessary info about what the user means. That way we avoid sending too much data, with much more effective prompts than all of the ones mentioned above.
@ghinwamoujaes9059 6 months ago
Very helpful - Thank you!
@WissemBellara 7 months ago
Nice Video, very well made
@MannyBernabe 1 year ago
Super helpful overview. Thank you.
@DavidTowers-f1y 1 year ago
I love these tutorials. Learning so much. Thanks.
@sysadmin9396 9 months ago
Hi Sam, how do we keep the conversation context of multiple users separate?
@hussienhassin7334 7 months ago
Have you resolved it? I am still struggling too
@owszystkim5415 6 months ago
Is it cost-effective to use ConversationSummaryMemory? From my understanding it needs to summarize the conversation every time.
@noone-jq1xw 1 year ago
Great video! I'm such a big fan of your work now! I'm sure this channel is going places once LLMs become a bit more mainstream in the programming stack. Please keep up the awesome work! I have a question about the knowledge graph memory section. The sample code shows that the relevant information section never gets populated. Furthermore, the prompt structure has two inputs, {history} and {input}, but we only pass the {input} part, which might explain why the relevant information is empty. In this case, do you know if there is any use for the relevant information section? A second query is about the knowledge graph: since the prompt seems to be contextually aware, even though the buffer doesn't show the chat history, is it safe to say that in addition to the chat log shown (as part of verbose), it also sends the extracted knowledge-graph triplets to the LLM to produce the response?
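One way to see what the knowledge-graph memory actually contributes is to call load_memory_variables directly. A minimal sketch, assuming the classic langchain.memory API and an OpenAI key (the sample turns are made up):

from langchain.llms import OpenAI
from langchain.memory import ConversationKGMemory

llm = OpenAI(temperature=0)
memory = ConversationKGMemory(llm=llm)

# Save a couple of turns so the memory can extract triplets from them.
memory.save_context({"input": "Hi, I'm Sam and my TV is broken."},
                    {"output": "Sorry to hear that, Sam."})
memory.save_context({"input": "The TV is a 2021 model and still under warranty."},
                    {"output": "Thanks, that helps."})

# load_memory_variables takes the *current* input and returns whatever
# triplets it has stored about the entities mentioned in that input.
print(memory.load_memory_variables({"input": "What do you know about my TV?"}))

When the memory is attached to a ConversationChain, that same string is what fills the relevant-information slot of the prompt, so, as far as I can tell, the extracted triplets are sent to the LLM along with the raw input.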
@ghghgjkjhggugugbb 1 year ago
revolutionary video..
@dogtens1060 1 year ago
nice overview, thanks!
@JimCh-g6w 1 year ago
I've built this with a Streamlit UI as the front-end and deployed it as a Cloud Run service. Now, if multiple users try to chat with the bot, the entire chat_history, combining all users' conversations, is being referenced. If I want a user_id/session_id-specific chat_history, how can I do it? Could you please help me?
@sysadmin9396 9 months ago
I have this same exact issue. Did you ever figure it out??
@naveennirban 4 months ago
Hey guys, I am working on this too. I am trying to create multiple vector DBs with a runtime knowledge feed for each specific user. The name of the vector DB could be a unique ID attached to your user model.
@binstitus3909 11 months ago
How can I keep the conversation context of multiple users separately?
@sysadmin9396 9 months ago
I’m looking for this answer as well. Did you ever figure it out?
@insight-guy 1 year ago
Thank you, Sam.
@elyakimlev 1 year ago
Thanks for this great tutorial series. Question: how do you set the k value for the ConversationSummaryBufferMemory option? I didn't see where you set it in your code. Is it always 2?
@z-ro 1 year ago
Amazing explanation! I'm currently trying to use LangChain's JavaScript library to "persist" memory across multiple "sessions" or reloads. Do you have a video on the types of memory that can do that?
@untypicalien 1 year ago
Hey there, I'd love to know if, after a month, you found any useful resources or documentation about this. I'm trying to achieve this as well. 😄
@hussamsayeed3012 1 year ago
How do we add a custom prompt with some variable data and use memory in ConversationChain? I'm trying this but getting a validation error:
PROMPT = PromptTemplate(
    input_variables=["chat_history_lines", "input", "tenant_prompt", "context"],
    template=_DEFAULT_TEMPLATE
)
llm = OpenAI(temperature=0)
conversation = ConversationChain(
    llm=llm,
    verbose=True,
    memory=memory,
    prompt=PROMPT
)
Error: 1 validation error for ConversationChain
__root__ Got unexpected prompt input variables. The prompt expects ['chat_history_lines', 'input', 'tenant_prompt', 'context'], but got ['chat_history_lines', 'history'] as inputs from memory, and input as the normal input key. (type=value_error)
@samwitteveenai 1 year ago
You overwrite the 'prompt.template' and make sure it takes in the same inputs as the previous one, etc. Take a look at one of the early vids about LangChain prompts and chains.
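If it helps, a minimal sketch of that approach with the classic (pre-LCEL) API. The tenant-specific wording here is just an assumed example, and it is baked into the template text rather than added as new input variables, since extra variables are what trigger the validation error:

from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

conversation = ConversationChain(
    llm=OpenAI(temperature=0),
    memory=ConversationBufferMemory(),
    verbose=True,
)

# Overwrite the template text but keep the same variables the chain expects:
# 'history' comes from the memory and 'input' is the normal input key.
conversation.prompt.template = """You are a helpful support assistant for ACME Corp.
Only answer questions about ACME products and policies.

Current conversation:
{history}
Human: {input}
AI:"""

print(conversation.predict(input="Hi, my TV is broken."))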
@starmorph 1 year ago
I like the iced out parrot thumbnails 😎
@jintao824 1 year ago
Great content Sam! Subbed. Just wanted to ask - are there technical limitations to why these LLMs have limited context windows? Any pointers to papers will be very helpful should they exist!
@samwitteveenai 1 year ago
Mostly this is about the attention layers: the wider the span gets, the more you run into compounding computation. Take a look at this: stackoverflow.com/questions/65703260/computational-complexity-of-self-attention-in-the-transformer-model
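For a rough feel for that scaling, a small back-of-the-envelope sketch (standard scaled dot-product attention; the model width of 768 is just an assumed example):

def attention_score_cost(seq_len: int, d_model: int = 768) -> int:
    # The QK^T score matrix alone is seq_len x seq_len, so the work for it
    # grows quadratically with the context length (span).
    return seq_len * seq_len * d_model

for n in (512, 1024, 2048, 4096):
    print(f"context {n:>5}: ~{attention_score_cost(n):.3e} multiply-adds per layer")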
@jintao824 1 year ago
@@samwitteveenai Thanks Sam, I will check this out!
@mautkajuari 1 year ago
beautifully explained
@CookFu 6 months ago
Can a retrieval chain work with the memory function? I have been trying that for a couple of days, but it doesn't work.
@caiyu538 1 year ago
great tutorial
@viktor4207 1 year ago
Can you use both? So you can start working on a user profile by creating a knowledge graph associated with a user and storing it but then pass information to the bot in a summarized way?
@carlosquiala8698 7 months ago
Can I mix two types of memories? For example, entity and graph?
@Aidev7876 11 months ago
I'm using an SQL chain. I'd like to add memory to that. Do we have any ideas on that? Thanks
@pec8377 7 months ago
How do you use the different conversation memories with LCEL?
@aibasics7206 1 year ago
Hi Sam, nice video! Can you please clarify whether we can fine-tune and still use the memory here? For fine-tuning with our own data we are using GPT Index, and for the LLM predictor we are using LangChain. Can you tell me a way to use LangChain's memory together with GPT Index while loading our own custom chat data?
@RedCloudServices 1 year ago
Sam, can you help clarify? Do we still need to fine-tune a custom LLM with our own corpus if we can use LangChain methods (e.g. webhooks, Python REPL, PDF loaders, etc.), or are both still necessary for all custom use cases?
@samwitteveenai 1 year ago
LLMs that you fine-tune for your purpose should always have an advantage in regard to unique data etc. If you can get away with LangChain and an API, though, and you don't mind the costs, then that will be easier.
@lordsairolaas 11 months ago
Hello! I'm making a chatbot using Conversation with KG, but it keeps throwing this error, and has for the past few days. Could you help?
Got unexpected prompt input variables. The prompt expects [], but got ['history'] as inputs from memory, and input as the normal input key. (type=value_error)
@wukao1985 1 year ago
Thanks Sam for this great video. I found it really hard to understand how to make these memory functions work with the ChatOpenAI model. Can you help create a video on that? This video was all using davinci models.
@samwitteveenai 1 year ago
Yes, good point, these were made before that API existed. I might make some updated versions.
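In the meantime, a minimal sketch of plugging the same memory classes into the chat model, assuming the classic langchain.chat_models.ChatOpenAI wrapper (the formatted prompt is sent to the chat endpoint under the hood):

from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferWindowMemory

chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

conversation = ConversationChain(
    llm=chat,
    memory=ConversationBufferWindowMemory(k=2),  # same memory classes as with the davinci models
    verbose=True,
)

print(conversation.predict(input="Hi there, I'm Sam."))
print(conversation.predict(input="What was my name again?"))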
@harinisri2962 1 year ago
Hi, I have a doubt. I am implementing ConversationBufferWindowMemory for a document question-answering chatbot:
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferWindowMemory
conversation = ConversationChain(llm=llm, verbose=True, memory=ConversationBufferWindowMemory(k=2))
Is it possible to return the source documents of the answer using any parameters?
@samwitteveenai 1 year ago
Yes, that will require using metadata.
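A rough sketch of one way that can look with the classic API; the toy document and its 'source' metadata are placeholders, and setting output_key="answer" on the memory is needed once the chain also returns sources:

from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.memory import ConversationBufferWindowMemory
from langchain.schema import Document
from langchain.vectorstores import FAISS

# Tiny stand-in index; in practice you build this from your real documents.
docs = [Document(page_content="TVs carry a two-year warranty.",
                 metadata={"source": "warranty.pdf"})]
vectorstore = FAISS.from_documents(docs, OpenAIEmbeddings())

memory = ConversationBufferWindowMemory(
    k=2, memory_key="chat_history", return_messages=True, output_key="answer"
)

qa = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(temperature=0),
    retriever=vectorstore.as_retriever(),
    memory=memory,
    return_source_documents=True,
)

result = qa({"question": "What does the warranty cover?"})
print(result["answer"])
for doc in result["source_documents"]:
    print(doc.metadata.get("source"))  # comes from the metadata set at load time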
@ranjithkumarkalal1810 1 year ago
Great videos
@vq8gef32 10 months ago
Amazing, appreciate it! But I can't run some of the code :( Is there an updated version?
@samwitteveenai 10 months ago
Sorry, I am working on an updated LangChain vid in which I will update the code. Some of these vids are a year old now.
@vq8gef32 10 months ago
Thank you @@samwitteveenai, amazing work. I am still watching your channel. Thank you heaps.
@prayagbrahmbhatt6375 1 year ago
Great stuff! Thanks for the tutorial! I do have a question regarding open-source models. How can we use an alternative to the OpenAI models, like Vicuna or LLaMA? What if we don't have an OpenAI API key?
@samwitteveenai 1 year ago
I have some vids using open source LLMs for this kind of thing
@srishtinagu1857 10 months ago
Hi Sam, awesome video. I am trying to add conversation memory to my RAG application, but it is not giving correct responses. Can you make a video or share some references for that? It would be really helpful. Thanks!
@samwitteveenai 9 months ago
I need to make a full LangChain update; this vid is a year old now. I am working on it, so hopefully soon.
@srishtinagu1857 9 months ago
@@samwitteveenai ok thanks! Waiting for it.
@sanakmukherjee3929 1 year ago
Nice explanation. Can you help me add this to a custom CSV dataset?
@tubingphd 1 year ago
Thank you Sam
@embeddedelligence-926 1 year ago
So how do we create conversational memory and use it with the CSV agent?
@lorenzoleongutierrez7927 1 year ago
Thanks for sharing!
@Fluffynix 1 year ago
How does this compare to Haystack, which has been around for years?
@samwitteveenai 1 year ago
It's quite different from Haystack. This is all about prompts and generative LLM manipulation rather than search. LangChain can do search with vector stores. You could probably use Haystack as a tool with LangChain, which could be cool for certain use cases.
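For anyone who wants to try that combination, a hedged sketch of wrapping an external search function as a LangChain tool; the haystack_search body is only a stub standing in for a real Haystack pipeline call:

from langchain.agents import Tool, initialize_agent
from langchain.llms import OpenAI

def haystack_search(query: str) -> str:
    # Stub: call your Haystack pipeline here and return the top passages as text.
    return "No documents found for: " + query

tools = [
    Tool(
        name="DocumentSearch",
        func=haystack_search,
        description="Useful for answering questions about the document collection.",
    )
]

agent = initialize_agent(tools, OpenAI(temperature=0),
                         agent="zero-shot-react-description", verbose=True)
print(agent.run("What does the warranty cover for TVs?"))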
@kenchang3456 1 year ago
I just enjoy learning from your videos, thank you very much. Do you have any videos, suggestions or advice on how to handle a conversation that goes off on a tangent and bring it back to the purpose of the conversation? E.g. a chatbot for laptop troubleshooting. System: "Hi, how can I help you?" User: "My laptop is broken." System: "Can you describe the problem in more detail?" User: "What's the weather like in Hawaii?" System: "The weather is pleasant in Hawaii. Can you describe the problem with your laptop in more detail?"
@samwitteveenai 1 year ago
With big models this is dealt with by good prompts that make it clear what the bot can and can't talk about, and by discontinuing the conversation if people go too far off the main topics.
@kenchang3456 1 year ago
@@samwitteveenai Ah, so it's in the prompts... interesting, thanks!
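A small sketch of the prompt-level steering Sam describes; the instruction wording and the laptop-support scenario are just examples:

from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferWindowMemory
from langchain.prompts import PromptTemplate

template = """You are a support assistant for laptop troubleshooting.
Only discuss the user's laptop problem. If the user asks about anything else,
answer in one short sentence at most and steer the conversation back to the
laptop issue. If they repeatedly go off topic, politely end the conversation.

Current conversation:
{history}
Human: {input}
AI:"""

conversation = ConversationChain(
    llm=OpenAI(temperature=0),
    memory=ConversationBufferWindowMemory(k=3),
    prompt=PromptTemplate(input_variables=["history", "input"], template=template),
)

print(conversation.predict(input="My laptop is broken."))
print(conversation.predict(input="What's the weather like in Hawaii?"))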
@adumont 1 year ago
Really interesting. The last one, about graphs and entities, could have a lot of potential. I wonder how one could use retrieval on a knowledge database, for example, to also enrich the context/prompt with information from it. For example, suppose the AI had access to the warranty database and could check the warranty status for the TV's serial number. It could maybe ask the user for the serial number, automatically check the warranty for that serial number, and answer "your TV is under warranty number xxx". Are there examples of how to do that?
@Jasonknash101 1 year ago
Totally agree, it would be great to show. How would you integrate this with something like Node.js?
@svenandreas5947 1 year ago
I'm wondering: this works as long as the human gives the expected information. Is there any way to ask for information (like a warranty number)?
@samwitteveenai 1 year ago
Yes, you can do this with context and retrieval, e.g. adding a search for data and passing the results into the context of the prompt.
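A minimal sketch of that pattern; the warranty lookup is a made-up placeholder standing in for a real database or API call:

from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

def check_warranty(serial_number: str) -> str:
    # Placeholder for a real database/API lookup.
    fake_records = {"SN-12345": "covered until 2025-01-31 under warranty #W-998"}
    return fake_records.get(serial_number, "no warranty record found")

prompt = PromptTemplate(
    input_variables=["context", "question"],
    template=(
        "You are a TV support assistant.\n"
        "Warranty lookup result: {context}\n"
        "Customer: {question}\n"
        "Assistant:"
    ),
)

llm = OpenAI(temperature=0)
serial = "SN-12345"  # in a real bot you would first ask the user for this
answer = llm(prompt.format(context=check_warranty(serial),
                           question="Is my TV still under warranty?"))
print(answer)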
@svenandreas5947 1 year ago
@@samwitteveenai Will google and search for this :-) Thanks for the hint. I just figured out a way via prompt engineering, but this wasn't exactly what I was looking for. Thanks again.
@samwitteveenai 1 year ago
What exactly do you want to do?
@memesofproduction27 1 year ago
LangChain's self-ask-with-search agent sounds relevant.
@stanTrX 2 months ago
Thanks. How about agents with memory?
@samwitteveenai 2 months ago
will make a video about this soon
@gmdl007 1 year ago
Hi Sam, is there a way to combine this with QA over your own PDF files?
@samwitteveenai 1 year ago
Yes, I have a few videos about that if you look for PDF etc.
@gmdl007 1 year ago
@@samwitteveenai fantastic, can you share?
@samwitteveenai 1 year ago
@@gmdl007 There are a number; take a look in this playlist: kzbin.info/www/bejne/gJCToqmIqZl_hM0
@souvickdas5564 1 year ago
How do I use memory with ChatVectorDBChain, where we can specify vector stores? Could you please give a code snippet for this? Thanks
@samwitteveenai 1 year ago
I will make a video about vector stores at some point.
@stonaraptor8196 1 year ago
There has to be a simpler way to get a personalized AI stored locally on my PC that has long-term memory and is able to keep up long conversations. Maybe I am very naive, but as a non-programmer my main interest in AI is more philosophical in nature, I guess. Where/how would I start, or even get an offline version? Reading the OpenAI site is, let's say, slightly challenging...
@pengchengwu447 1 year ago
I wonder if it's possible to specify *predefined* entities?
@samwitteveenai 1 year ago
you could do it with custom prompts etc.
@souvickdas5564 1 year ago
How do we create a conversational bot for non-English languages and languages that are not supported by the OpenAI embeddings? For example, if I want to build a conversational agent for articles written in Indian languages (Bengali/Bangla), how can we do it?
@samwitteveenai 1 year ago
You would use a multilingual embedding model, which you can find on HuggingFace. Check out huggingface.co/sentence-transformers/stsb-xlm-r-multilingual; there are others as well. There are also a number of multilingual LLMs, including mT5, which supports Bengali. You would get the best results by fine-tuning some of these models.
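A small sketch of using that multilingual model through LangChain, assuming sentence-transformers is installed (the Bengali sentences are just toy examples):

from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import FAISS

# Multilingual sentence embeddings, so Bengali text can be indexed and queried.
embeddings = HuggingFaceEmbeddings(
    model_name="sentence-transformers/stsb-xlm-r-multilingual"
)

texts = [
    "ঢাকা বাংলাদেশের রাজধানী।",        # "Dhaka is the capital of Bangladesh."
    "বাংলা দক্ষিণ এশিয়ার একটি ভাষা।",  # "Bengali is a language of South Asia."
]
store = FAISS.from_texts(texts, embeddings)

# Query in Bengali: "What is the capital of Bangladesh?"
print(store.similarity_search("বাংলাদেশের রাজধানী কী?", k=1)[0].page_content)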
@souvickdas5564 1 year ago
@@samwitteveenai thanks a lot.
@428manish 1 year ago
It works fine with GPT-3.5 Turbo. How do I make it work with a FAISS DB using local data (PDFs)?
@ambrosionguema9200 1 year ago
Hi Sam, how can I load my own data file in this code? Please help me.
@samwitteveenai 1 year ago
I have a video coming out this weekend on using your own data from CSV and Excel files. I will make one for larger datasets.
@sooryaprabhu14122 11 months ago
bro please include the deployment also
@foysalmamun5106 1 year ago
Thank you a lot.
@Pure_Science_and_Technology 1 year ago
Not sure of the difference, but I use print(conversation.memory.entity_store). From print(dir(conversation.memory)) I can see I don't have a 'store' attribute.
@emmanuelkolawole6720 1 year ago
Are you saying that Alpaca can only take in 2,000 tokens? If that is true, how can we increase it?
@samwitteveenai 1 year ago
increasing it requires some substantial retraining.
@foysalmamun5106 1 year ago
waiting for video on custom memory 🙂
@WissemBellara 7 months ago
Is it possible to add chapters with timestamps, please? It would make it easier.
@nilendughosal6084 1 year ago
How to handle memory for multiple users?
@samwitteveenai 1 year ago
You serialize this out and load in the memory based on who is calling the model etc.
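A rough sketch of that idea: keep one memory per user id and serialize it between calls (the in-memory dict here is only a stand-in for a real database or Redis):

from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain.schema import messages_from_dict, messages_to_dict

llm = OpenAI(temperature=0)
saved_sessions = {}  # user_id -> serialized message list

def chat(user_id: str, text: str) -> str:
    memory = ConversationBufferMemory()
    # Load only this user's previous messages, so contexts never mix.
    if user_id in saved_sessions:
        memory.chat_memory.messages = messages_from_dict(saved_sessions[user_id])
    conversation = ConversationChain(llm=llm, memory=memory)
    reply = conversation.predict(input=text)
    # Serialize the updated history back out for this user only.
    saved_sessions[user_id] = messages_to_dict(memory.chat_memory.messages)
    return reply

print(chat("alice", "Hi, I'm Alice and my TV is broken."))
print(chat("bob", "Hi, I'm Bob."))
print(chat("alice", "What did I say was broken?"))  # only Alice's history is loaded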
@mandarbagul3008 1 year ago
Hello sir, greetings. What is "span"? (3:42)
@samwitteveenai 1 year ago
The span (context size) refers to the number of tokens (subwords) that you can pass into a model in a single shot.
@mandarbagul3008 1 year ago
@@samwitteveenai Got it. Thank you very much, sir :)
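For a concrete feel for token counts, a quick sketch with tiktoken (assuming the OpenAI tokenizer; other models tokenize differently, so counts will vary):

import tiktoken

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

text = "My TV is broken and it is still under warranty."
tokens = enc.encode(text)
print(len(tokens), "tokens")   # the span / context size caps how many of these fit
print(enc.decode(tokens[:4]))  # tokens are sub-word pieces, not whole words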
@RahulD600 4 months ago
But still, this is not unlimited memory, right?
@alizhadigerov9599 1 year ago
Can we use gpt-3.5-turbo instead of davinci-003 here?
@samwitteveenai 1 year ago
You can, but the code has to be changed to use the new chat options.
@aaroldaaroldson708 1 year ago
@@samwitteveenai Thanks. Are you planning to record a video on that? Would be very helpful!
@Danpage04 1 year ago
Btw, it would be nice if you show yourself on cam when you’re not coding. The clips are weirdly distracting 😅
@samwitteveenai 1 year ago
Lol, yeah, I plan to get a camera at some point. I cut back on the B-roll stuff after the early videos, if that helps.
@neerajmahapatra5239 1 year ago
How can we add a prompt with these memory chains?
1 year ago
select * from stock_videos where label like '%typing%' :D