6-Building Advanced RAG Q&A Project With Multiple Data Sources With Langchain

43,861 views

Krish Naik


A day ago

Comments: 74
@nishantchoudhary3245 · 8 months ago
One of the best LangChain series. Thank God I was able to find such good content.
@rajamailtome · 8 months ago
Instead of OpenAI, please use a downloadable open-source LLM that can be run locally 😊
@XX-ge8ng · 5 days ago
This example works like a champ :) Thank you!
@jayashreesanthanam816 · 10 days ago
Very nice tutorial, very helpful. Please add a note about including the LangChain API key when using prompts from the hub. I am learning a lot; thanks so much, Krish.
@dhanashrikolekar-j7e · 8 months ago
Thank you, sir. You are making great videos.
@athulroby3082 · 5 months ago
Simple and best LangChain series; keep up the good work. 👏
@carlosbelleza5104 · 8 months ago
Your videos are amazing... state of the art.
@jayanthAILab · 8 months ago
Sir, I understood the complete flow. Great explanation. ❤❤
@deepaksingh9318 · 4 months ago
Amazing information, Krish. Thanks for making this series.
@canyouvish · 8 months ago
Very comprehensive and super helpful!
@tintintintin576 · 8 months ago
God bless you, sir!
@DoomsdayDatabase · 8 months ago
Wow, this is what I wanted! Can I use the same code with an open-source model by just changing the way it loads, sir? I wrote an email to you yesterday and the video came today! This is next level, Krish sir! Thank you ❤!
@dharmikmehta5593 · 3 months ago
Yes, you can use Ollama with Llama 3. I did the same thing with an open-source model.
@shankarpentyala2390 · 6 months ago
Thanks for introducing tools and agents.
@Nishant-xu1ns · 8 months ago
Waiting for the next video.
@lalaniwerake881 · 7 months ago
Amazing - thank you!
@varindanighanshyam · 8 months ago
Krish, fantastic work. Can you explain it with Ollama in context?
@hemanthram7907 · 8 months ago
This video is very comprehensive and easy to understand; really grateful for your efforts, sir. However, could you please create a session on how to achieve function calling, tools, and agents using Gemini Pro or any other open-source LLM? Unfortunately, there is no alternative to the OpenAI version (create_openai_tools_agent). Please explain the workaround for using other LLMs.
@aj.arijit · 6 months ago
Exactly - I spent one full day on this and read a whole lot of documentation (although I gained a lot of knowledge). I found a function, which a person wrote using Gemini, since there is no agent-formation tool for Ollama that supports chat generation using different tools and Ollama together at the same time:

def process_user_request(user_input):
    # Parse user input for potential tool usage
    if "{" in user_input and "}" in user_input:
        # Extract tool name and arguments
        tool_call = user_input.split("{")[1].split("}")[0]
        tool_name, arguments = tool_call.split(":")
        arguments = eval(arguments)
        # Find the corresponding tool function
        for tool in tools:
            if tool.__name__ == tool_name:
                # Execute the tool with user arguments
                tool_output = tool(arguments)
                return tool_output
    # User request doesn't involve a tool, respond normally
    return f"I understand, but I can't use a tool for this request. {user_input}"

while True:
    # Get user input
    user_input = input("User: ")
    # Process user request and generate response with Llama 2
    response = model.generate(
        input_ids=model.tokenizer.encode(prompt.format(
            user_input=user_input,
            list_of_available_tools=" * ".join([t.__name__ for t in tools]))),
        max_length=1024,
        num_beams=5,
        no_repeat_ngram_size=2,
        early_stopping=True
    )
    # Extract and format the generated response
    generated_text = model.tokenizer.decode(response[0]["generated_tokens"], skip_special_tokens=True)
    tool_output = process_user_request(user_input)
    final_response = generated_text.replace("{generated_response}", tool_output)
    # Print the final response to the user
    print(final_response)

There is something called binding of functions which I could not understand at all.
@asadpanhwar634 · 5 months ago
I used create_openai_tools_agent with Llama 3 and it worked fine. I think you can use it; even the prompt loaded from the hub is meant for OpenAI models, but it worked fine with the Llama 3 70B model.
@dharmikmehta5593 · 3 months ago
@@aj.arijit You can use create_react_agent instead of create_openai_tools_agent to prepare an agent with Ollama and the Llama 3.2 model. I built the same thing with an open-source model.
@THOSHI-cn6hg · 8 months ago
You can use Llama 2 or another open-source API instead...
@binayashrestha4131 · 8 months ago
Thank you so much!
@santhiyac8252 · 8 months ago
Hello Krish, I built a PDF bot using Google PaLM 2, but it doesn't give proper responses because the prompt behaves too generically. How do I handle the prompt for a specific dataset? Please reply.
@123arskas · 7 months ago
You're awesome.
@CyberSavvyMind · 4 months ago
Amazing videos, Krish! I have a question that you haven't covered yet: after getting results from the similarity search in RAG mode, you attach them to the prompt and send them to the LLM. Given the character limit when querying the LLM, what approaches do you take if this limit is exceeded? Please explain this or create a video with code on this topic.
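One common answer to the limit question above is to trim the retrieved chunks to a fixed budget before building the prompt. A minimal sketch in plain Python - the function name, the character-based budget, and the drop-on-overflow policy are all illustrative assumptions, not something shown in the video:

```python
def fit_context(chunks, budget_chars=8000):
    """Keep retrieved chunks (assumed sorted best-first by the retriever)
    until the next one would exceed the character budget, then stop."""
    kept, used = [], 0
    for chunk in chunks:
        if used + len(chunk) > budget_chars:
            break  # alternatively, truncate this chunk instead of dropping it
        kept.append(chunk)
        used += len(chunk)
    return "\n\n".join(kept)

# Toy usage: three 30-char chunks, but the budget only fits the first two.
context = fit_context(["a" * 30, "b" * 30, "c" * 30], budget_chars=70)
print(len(context))  # prints 62 (two chunks plus the "\n\n" separator)
```

Other common options are summarizing the retrieved chunks first (map-reduce style) or simply lowering the retriever's `k`.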
@cartolla · 4 months ago
Hi, very interesting video! How do I get the documents and their metadata returned by the retriever? I would like to show the user, for example, the Wikipedia links or articles related to the answer.
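For the sources question: LangChain retrievers return a list of Document objects, each with a `page_content` string and a `metadata` dict that usually carries the source URL or title. The sketch below mimics that shape with a stand-in class so it runs standalone - the `Doc` class and the `"source"`/`"title"` keys are illustrative assumptions (real keys depend on the loader you used):

```python
from dataclasses import dataclass, field

@dataclass
class Doc:  # stand-in for LangChain's Document (page_content + metadata)
    page_content: str
    metadata: dict = field(default_factory=dict)

def sources_from_docs(docs):
    """Collect de-duplicated source links/titles from retrieved docs,
    preserving retrieval order."""
    seen, out = set(), []
    for d in docs:
        src = d.metadata.get("source") or d.metadata.get("title")
        if src and src not in seen:
            seen.add(src)
            out.append(src)
    return out

docs = [  # imagine these came back from retriever.invoke(query)
    Doc("LangChain is a framework...", {"source": "https://en.wikipedia.org/wiki/LangChain"}),
    Doc("Agents use tools...", {"source": "https://en.wikipedia.org/wiki/LangChain"}),
]
print(sources_from_docs(docs))  # prints ['https://en.wikipedia.org/wiki/LangChain']
```

With a real retriever you would call it on the list returned by the retriever and render the strings as links next to the answer.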
@piyush_nimbokar_07 · 8 months ago
Sir, can you please elaborate on using a Neo4j knowledge graph to build a RAG application?
@AIConverge · 8 months ago
Great video. Any alternatives to LangSmith?
@yerasam · 5 months ago
Hi, can you make a video on multitenancy using agents and tools?
@arID3371ER · 8 months ago
Man, I thought you got your hair back! 😂😂😂❤❤❤
@kannansingaravelu · 7 months ago
Hi Krish, if we use create_conversational_retrieval_agent, how do we pass the prompt - is a prompt mandatory?
@yashthakkar2629 · 7 months ago
I tried adding an SQLDatabase tool to the tools list. I got an error because, I think, QuerySQLDataBaseTool is not really returning a tool. What am I supposed to do if I want to add an SQL database to this list of tools without any error? Kindly help.
@ashishmalhotra2230 · 8 months ago
Hi Krish, can you make a video on a conversational chatbot trained on our own data source?
@AsifKhan-cc3ye · 8 months ago
Hey Krish, I developed an app using RAG for QC of manually populated data in Excel, but the model is not performing accurately. I used Llama 2; is there another open-source LLM that is better at arithmetic?
@thetagang6854 · 8 months ago
Great video. Ignore the comments about not using OpenAI; if they don't want to pay, they wouldn't be the ones to develop actual apps anyway.
@mohsenghafari7652 · 8 months ago
Hi dear friend, thank you for your efforts. How can I apply this tutorial to PDFs in other languages (for example, Persian)? I made many attempts and tested different models, but the results when asking questions about the PDFs are not good or accurate! Thank you for the explanation.
@Mabzone-q4p · 6 months ago
Hi @Krish Naik, I saw many videos on generative AI, but I feel there are many missing connections between basic-level understanding and coding understanding. I think everyone is capable of loading the libraries, using the classes, and getting the code done. Please also cover the core concepts of prompts, chat models, tools, agents, memory, and chains, with their types and where to use them, through code. The basic knowledge in these videos is scattered across different parts, times, and places. Hope you will find this comment.
@vinayaksharma3650 · 6 months ago
Learn from the blog posts on their websites; from there it will be easy to understand things like agents, tools, etc.
@Mabzone-q4p · 6 months ago
@@vinayaksharma3650 Hi, thanks for the suggestion. I read and tried to understand, but some of the concepts are covered at such a high level that they still need explaining. My point in the comment above was: if someone is explaining things that are already explained in the documentation, the explanation should be easier to follow than the documentation itself.
@ayonbanerjee1969 · 5 months ago
Is this updated LangChain content available in your Udemy course? Or is the Udemy course in need of updates?
@divya-ob3jq · 8 months ago
Sir, please make a video on a virtual car assistant using LLMs.
@muhammedyaseenkm9292 · 8 months ago
How can we extract multi-column tabular data, especially from images?
@karansingh-fk4gh · 7 months ago
Hi Krish, can you please create a video on LangGraph?
@atharvsakalley9633 · 8 months ago
How do we measure the accuracy of our search with this implementation?
@SrinithiMalar · 8 months ago
Is blockchain a good career to start in 2024, and what is its future scope?
@naudua9272 · 3 months ago
The PDF upload agent is not implemented right.
@DamanjeetGTBIT · 8 months ago
I am learning stats, SQL, and ML from miscellaneous videos. I want to start with a clean course. Which one is better for pursuing a data science career: IBM Machine Learning or Google Advanced Data Analytics?
@slayer_dan · 6 months ago
Find a roadmap and follow it loosely, without any big reroute. Then find a person who teaches the concepts in a way you can absorb, and practice whether they recommend it or not. For example, I found these people very much to my taste:
- Codebasics and Krish for ML concepts with analogies and practical implementation.
- StatQuest for visual interpretation and understanding of statistical concepts.
I hope this is useful for you.
@tekionixkeshavag.452 · 8 months ago
Please don't use OpenAI in this project... as its API is paid, we can't access it. Instead use Gemini or any other open-source model, so that we can also try it on our end.
@datatalkswithchandranshu2028 · 8 months ago
Not Gemini... as they are removing free features quickly.
@tekionixkeshavag.452 · 8 months ago
@@datatalkswithchandranshu2028 OK.
@quezinmark8225 · 8 months ago
Break your training data into chunks smaller than the token limit so that you can use the free version even for big data.
@theinhumaneme · 8 months ago
Switch the LLM in LangChain.
@tekionixkeshavag.452 · 8 months ago
That is somewhat complicated, and we need help with that from Krish sir @@theinhumaneme
@AlanSabuJohn · 8 months ago
Sir, can you do Embedchain tutorials?
@rishiraj2548 · 8 months ago
🙏🙂👍
@mvuyisogqwaru2409 · 8 months ago
Hey guys, is anyone else having an issue with the invoke call when using Ollama (llama2 and llama3)? I get the following error: ValueError: Ollama call failed with status code 400. Details: {"error":"invalid options: tools"}
@aj.arijit · 6 months ago
Exactly - I spent one full day on this and read a whole lot of documentation (although I gained a lot of knowledge). I found a function written using Gemini, since there is no agent-formation tool for Ollama that supports chat generation using different tools and Ollama together at the same time:

def process_user_request(user_input):
    # Parse user input for potential tool usage
    if "{" in user_input and "}" in user_input:
        # Extract tool name and arguments
        tool_call = user_input.split("{")[1].split("}")[0]
        tool_name, arguments = tool_call.split(":")
        arguments = eval(arguments)
        # Find the corresponding tool function
        for tool in tools:
            if tool.__name__ == tool_name:
                # Execute the tool with user arguments
                tool_output = tool(arguments)
                return tool_output
    # User request doesn't involve a tool, respond normally
    return f"I understand, but I can't use a tool for this request. {user_input}"

while True:
    # Get user input
    user_input = input("User: ")
    # Process user request and generate response with Llama 2
    response = model.generate(
        input_ids=model.tokenizer.encode(prompt.format(
            user_input=user_input,
            list_of_available_tools=" * ".join([t.__name__ for t in tools]))),
        max_length=1024,
        num_beams=5,
        no_repeat_ngram_size=2,
        early_stopping=True
    )
    # Extract and format the generated response
    generated_text = model.tokenizer.decode(response[0]["generated_tokens"], skip_special_tokens=True)
    tool_output = process_user_request(user_input)
    final_response = generated_text.replace("{generated_response}", tool_output)
    # Print the final response to the user
    print(final_response)

There is something called binding of functions, which I could not understand at all, that could solve the problem using the langchain.agents function create_tool_calling_agent.
@omkarjamdar4076 · 8 months ago
I decided not to use the retriever tool, so tools=[wiki, arxiv]. The following error occurred: TypeError: type 'Result' is not subscriptable
@mohamed_deshaune · 8 months ago
Can you provide the whole code? I can help you if you want.
@dear_nidhi · 8 months ago
This works for me, and I have also created a UI for it; it's working fine. Please send the code or more details about it... we will help you.
@dear_nidhi · 8 months ago
Please don't use any paid API key... we can't access it.
@rakeshkumar-pf6yu · 8 months ago
With due respect... you are a little fast here. I don't know why you are in a hurry these days. Please go a little slower - maybe 80% of your current speed. Thanks.
@thelifehackerpro9943 · 7 months ago
The code explanation is not good; each class and component should be explained properly. It is very confusing - you are just writing the code that you have been working with.
@madhugarg7499 · 20 hours ago
An average kind of project; this kind of project is not suitable to add to a resume. It would also be better if you explained the new terms and libraries more clearly. Currently you are just watching one screen and writing the same thing on the other, which will not really help the audience.
@venky433 · 8 months ago
Did anyone else face the "orjson.orjson" module error? Error: ModuleNotFoundError: No module named 'orjson.orjson'. It comes after running "from langchain_community.tools import WikipediaQueryRun".
@venky433 · 8 months ago
This error comes with Python 3.12, but it worked with a lower version, i.e., 3.10.