🔥 Join the AI Engineer Bootcamp: Hey there! The second edition of the AI Engineering Cohort is starting soon! 🚀
- Learn with step-by-step lessons and exercises
- Join a community of like-minded and amazing people
- I'll be there to personally answer all your questions 🤓
- Spots are limited, since I'll be interacting with you directly
You can join the waitlist now 👉 course.alejandro-ao.com/ Cheers!
@chercm2 ай бұрын
I am not able to get the schema of the db.
@palanikumarmsc4 ай бұрын
Thank you for this wonderful tutorial. Using it, I built chat-with-database integrations for MySQL, MS SQL, PostgreSQL, and MongoDB using Hugging Face's mistralai/Mistral-7B-Instruct-v0.3.
@davidtindell95010 ай бұрын
Hi, I reviewed this "Chat with MySQL DB" tutorial yet again and decided to try a more difficult SQL query: "determine the most popular artist in the database based upon total sales.". Of course, I manually ran the full Select -- with three table joins -- to make sure that this new query would work and produce the correct answer(s). To my surprise, the "natural language query" was properly processed by my modified version of your program and produced the correct response: " Iron Maiden with total sales of $138.60 " ! To further test the program, I changed the question to "top three most popular artists". The correct result was again returned: "Iron Maiden, U2, and Metallica" ! Glad to see 'U2' near the top ! Since "langchain-openai" is only at version 0.0.8 and "SQLDatabase" at 0.0.26, we may expect even more NLP Query improvements in the near future [i.e., if 'Altman' & 'Musk' do not mess everything up for all of us !?!]. P.S. Looking forward to the next MySQL vid(s) that you post !
@EricLofland10 ай бұрын
I got an error when I tried your first query - the query itself was correct (it validated in the database), but it came back prepended with "sql", like this:
[SQL: ```sql
SELECT a.Name AS Artist, SUM(il.UnitPrice * il.Quantity) AS TotalSales
FROM Artist a
JOIN Album al ON a.ArtistId = al.ArtistId
JOIN Track t ON al.AlbumId = t.AlbumId
JOIN InvoiceLine il ON t.TrackId = il.TrackId
JOIN Invoice i ON il.InvoiceId = i.InvoiceId
GROUP BY a.Name
ORDER BY TotalSales DESC
LIMIT 1;
```]
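A workaround that should handle this is to strip the markdown fence from the model output before executing it. A rough sketch (the helper is just an illustration, not from the video):

```python
import re

def strip_sql_fences(text: str) -> str:
    """Remove a ```sql ... ``` (or plain ```) wrapper that some models add around the query."""
    match = re.search(r"```(?:sql)?\s*(.*?)```", text, flags=re.DOTALL | re.IGNORECASE)
    return match.group(1).strip() if match else text.strip()

# it can be piped into the chain right after the output parser, e.g.:
# sql_chain = prompt | llm | StrOutputParser() | strip_sql_fences
print(strip_sql_fences("```sql\nSELECT COUNT(*) FROM Album;\n```"))  # SELECT COUNT(*) FROM Album;
```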
@davidtindell95010 ай бұрын
@@EricLofland Thank you for the feedback! The natural language query worked well for me, and ChatGPT-4 validated the SQL query. I have since moved on to testing SQLite and also saving query vectors in ChromaDB. Later, I will try to re-check and reproduce the MySQL database query and compare it to your results.
@davidtindell95010 ай бұрын
@@EricLofland I reset my laptop and reran my extension of Alejandro's "Chat with MySQL" Python program. It ran as before and produced both the SQL and the natural language query.
SQL:
run_query("select artist.Name AS ArtistName, COUNT(*) AS TotalSales from invoiceline \
    join track on invoiceline.TrackId = track.TrackId \
    join album on track.AlbumId = album.AlbumId \
    join artist on album.ArtistId = artist.ArtistId \
    group by artist.Name order by TotalSales desc limit 3;")
NLQ:
user_question = 'determine the top three most popular artists in the database based upon total sales.'
Both results were very similar: "The top three most popular artists in the database based on total sales are Iron Maiden with $138.60, U2 with $105.93, and Metallica with $90.09."
Further, I researched the conditions under which "sql" would be prepended to the SELECT statement. I did not find any examples of this artifact; however, I did find a good, very recent Medium article on the subject by Senthil E at levelup.gitconnected.com/llms-meet-sql-revolutionizing-data-querying-with-natural-language-processing-52487337f043. As we say, "Hope This Helps!" - and thanks again for your detailed feedback.
@warrenmarkham889110 ай бұрын
@@davidtindell950 Are you writing up any of your experiments/tests? I'd be interested in seeing the caching of queries and responses into a vector database.
@davidtindell95010 ай бұрын
@@warrenmarkham8891 Hi, I expect to write a new Medium article fairly soon; however, I am still continuing my R&D. In addition to Alejandro's excellent tutorials, I have found another good tutorial that uses PyTorch for fast processing of large vector databases: "Maximize ChromaDB Embedding Vectorization Speed with NVidia CUDA GPU and Python Multiprocessing" by "Johnny Code" at kzbin.info/www/bejne/bXfZlaqtq9alepY and kzbin.info/www/bejne/bXfZlaqtq9alepYsi=a18-dKxTYk2UvRMT. There are also several good current Medium articles on this research subject.
@imranonthenet11 ай бұрын
I really love your tutorials; you are teaching us to create such powerful and genuinely useful AI tools in Python. I'm surprised that you have only 26K subscribers - you should have millions.
@alejandro_ao11 ай бұрын
thank you man! i hope i will get there someday!
@davidtindell95010 ай бұрын
Thanks for including MySQL and not just SQLite.
@alejandro_ao10 ай бұрын
sure :)
@jatinnandwani667811 ай бұрын
Thanks Alejandro!
@alejandro_ao11 ай бұрын
Thank you man, you are amazing
@p80mod27 күн бұрын
THANK YOU SO MUCH ALEJANDRO! I struggled quite a bit, for the following reasons:
1. I use Azure OpenAI instead of OpenAI, which is a little different.
2. GPT-4o returned a wordy answer ("to answer that question, you could use the following query...") from the SQL chain instead of the query only, and that couldn't be passed to the full chain as-is, so I had to work around it.
3. I had to reinstall several LangChain libraries.
Nevertheless, in the end it works beautifully, answering questions based on our MSSQL database. This video was a tremendous help. Thank you!
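For anyone else on Azure, a minimal sketch of the swap I mean (deployment name and API version are placeholders, and the extra prompt line is just one way to push the model to return only the query):

```python
from langchain_openai import AzureChatOpenAI

# assumes AZURE_OPENAI_API_KEY and AZURE_OPENAI_ENDPOINT are set in the environment
llm = AzureChatOpenAI(
    azure_deployment="my-gpt4o-deployment",  # placeholder: your deployment name
    api_version="2024-02-01",                # placeholder: your API version
)

# one way to nudge GPT-4o away from wordy answers in the SQL chain is an extra prompt line such as:
# "Write only the SQL query and nothing else. Do not wrap it in backticks or add explanations."
```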
@1sanak11 ай бұрын
Nice, looking forward to part 2!
@alejandro_ao11 ай бұрын
Coming soon! 😎
@atagymx2 ай бұрын
Thank you so much for this tutorial. You have no idea how much I had been struggling with some basic things like ports and pip installs, and you showed ALL the STEPS, unlike other videos/channels - I really appreciate it. I finally understood it and built an AI project thanks to you. Keep up the good work, man!
@alejandro_ao2 ай бұрын
i'm very glad to hear you got this working! great job man! i'll keep posting more tutorials here :)
@madhutera29810 ай бұрын
Thanks!
@madhutera29810 ай бұрын
Hi - how does this work when the response returns table data? For example: show me the top 10 artists by song streaming count.
@alejandro_ao10 ай бұрын
hey there! thank you so much for the tip!! it totally would work. the results are always a table actually. what happens is that the LLM receives a table-like prompt and reads it as though it were simple text. so in this case, your LLM would receive the table in the prompt and return something like "the top 10 artists are...." and it may even give you more details depending on your initial instructions :)
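to make it concrete, this is roughly what happens under the hood (the values are just an example):

```python
# db.run returns the result set as a plain string, e.g.:
result = db.run("SELECT Name FROM Artist LIMIT 3")
print(result)  # "[('AC/DC',), ('Accept',), ('Aerosmith',)]"

# the full chain then simply interpolates that string into the answer prompt,
# so the model "reads" the table as text and writes the natural language answer from it.
```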
@funmiemore19410 ай бұрын
First time here and I'm glad I found your channel. Thanks for sharing!
@alejandro_ao10 ай бұрын
hey there! welcome to the channel :) very happy to have you here :)
@funmiemore19410 ай бұрын
@@alejandro_ao Thanks!!
@krisograbek10 ай бұрын
Loving this video, Alejandro! Two things I'd like to add:
1. I removed the {schema} part from the full_chain prompt and it works great. Only the sql_chain needs the schema (to produce the SQL query).
2. Your website is down :(
Thanks for explaining every bit of your code so well!
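For reference, the trimmed answer prompt I mean looks roughly like this (variable names follow the video):

```python
from langchain_core.prompts import ChatPromptTemplate

# no {schema} here: only the sql_chain needs it to write the query
template = """Based on the question, the SQL query, and the SQL response below,
write a natural language answer.

Question: {question}
SQL Query: {query}
SQL Response: {response}"""

prompt_response = ChatPromptTemplate.from_template(template)
```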
@dgm194910 ай бұрын
Thank you for your videos. As a subscriber to your channel, I look forward to each new one. I would like to make a comment, and perhaps a suggestion for a future video. As a Finance Manager for the last 40+ years, I have come to find three areas important when it comes to retrieving information. What every office needs is a way to search:
1) URLs, Excel, and PDF files on the internet.
2) Excel and CSV files locally (and securely).
3) PDF and DOC documents locally (and securely).
And nowadays, you could never get a non-GUI program adopted by the office staff. Of course, all three search types would be incorporated into the same GUI. Python would be the preferred language. Ollama-based. No Docker. No wrapper programs like Streamlit, etc. Thanks for your time, and keep up the good work.
@alejandro_ao10 ай бұрын
Hey there, thanks for following the channel and for your suggestions! I see what you mean and agree that in order to get an app adopted internally, it should be very straightforward and easy to use. However, I don't see why it shouldn't use streamlit? Streamlit is just a way to build the GUI in a few lines of code. And if what you are interested in is the privacy, you can totally host it internally and have your data never leave your network.
@shreyasb.s38192 ай бұрын
Really wonderful explanation and presentation. The explanation of each and every line of code is mind-blowing. I liked it ❤ thank you so much.
@amitabhranjan9668Ай бұрын
Thank you so much for this. It worked flawlessly
@alejandro_aoАй бұрын
thanks! that's great! keep it up 💪
@tannerdio3339 ай бұрын
Dude, incredible tutorial, right on the money for what I needed.
@zhiouzhu202Ай бұрын
Hi, I have a question: I have created a model myself and I want to use it with LangChain - how can I do that? I tried changing llm = OpenAI() to a path pointing to my model directory, but it doesn't work.
@JayasuriyaM-z8vАй бұрын
Thanks mate. Does this allow executing CRUD operations on the DB?
@alejandro_aoАй бұрын
absolutely. but be careful about allowing your LLM to perform write operations. you probably want a human in the loop to validate the queries before they are executed
@_deepak__jangra17 күн бұрын
@@alejandro_ao or you can give it read-only access to the db
@SanjayRoy-vz5ih11 ай бұрын
I did this with SAP HANA DB 6 months back. Hallucination-related issues came up, and the token size limit is also a constraint with OpenAI GPT-3.5 Turbo.
@kaiser_the_emperor8 ай бұрын
Did you make it work in the end? Or was it not worth the struggle?
@SanjayRoy-vz5ih8 ай бұрын
@@kaiser_the_emperor I haven't done much further work on it, but of course you can use a combination of an SQL agent and various prompt techniques. The issue is different, though: SAP would not support or recommend working directly with the SQL tables. I would still try it as a Q&A bot, simply as an "art of the possible" solution. I am now trying to do the same thing using a combination of the OData API, going through BTP with function calls and an agent architecture.
@adnank498011 ай бұрын
Love your videos. I started watching all your LangChain content and it has really helped me, so I wanted to say thanks. I would also like to see the use of agents, if it isn't too much to ask.
@alejandro_ao11 ай бұрын
hey there! thank you for telling me this :) keep it up and keep learning 🚀 i'll bring up agents here very soon!
@AIWALABRO10 ай бұрын
Can you tell me what 'grep sql' is at timestamp 14:29? I got confused when I tried it - it says the command is not recognized.
@alejandro_ao10 ай бұрын
hey there, that's just to return only the lines that contain the string "sql" when running 'pip freeze'. otherwise i would get the huge list of all installed packages. 'grep' is a unix command that filters text and returns only the lines containing the given string 👍 (if you're on windows, grep isn't available in cmd/powershell by default, which is why it's not recognized - 'findstr' does the same job there)
@victorchrist989910 ай бұрын
For the GUI, what tool would you suggest for returning a table, just like in MySQL?
@msssouza28 ай бұрын
Hi Alejandro. Great post! It helped me a lot. I was trying to find a Gemini alternative to a solution that I learned from a Udemy course, using LangChain and OpenAI Agents to access a SQLite database and pass the results to the OpenAI LLM. I searched for days and found nothing, until I saw your video. Now my code is running and I can see many possibilities for accessing enterprise databases to enable users to obtain results using generative AI. Thank you and greetings from Brazil.
@anismairi86510 ай бұрын
When I write the code in a .py file, the schema variable in the full_chain is returning an error (it expects a dict type and get_schema returns a str). Do you have any idea how to fix it? I've checked the LangChain docs + your article but I still can't find a solution... Thanks for all the tutorials, I've learned a lot - keep going!
@justinchang35739 ай бұрын
Having the same issue
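For anyone debugging this, the wiring from the video expects get_schema to take the chain input as an argument and to be passed uncalled to RunnablePassthrough.assign - roughly like this (db, prompt and llm are the objects from the video):

```python
from langchain_core.runnables import RunnablePassthrough
from langchain_core.output_parsers import StrOutputParser

def get_schema(_):
    # receives the chain's input dict (ignored) and returns the schema as a string
    return db.get_table_info()

sql_chain = (
    RunnablePassthrough.assign(schema=get_schema)  # pass the function itself, not get_schema()
    | prompt
    | llm.bind(stop=["\nSQLResult:"])
    | StrOutputParser()
)
```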
@BrandonFoltz11 ай бұрын
Fabulous!
@alejandro_ao11 ай бұрын
hey Brandon! thanks!
@heaton92211 ай бұрын
Love to watch your tutorials. It's very details.
@alejandro_ao11 ай бұрын
thanks! keep it up!
@abhishekbourai18329 ай бұрын
Thanks for such an amazing resource, Alejandro. I am getting this error when I try to invoke the chain: AttributeError: 'dict' object has no attribute 'get_table_info'.
@alejandro_ao9 ай бұрын
seems to me like your database instance is not being created. try logging the type of your SQL client to see if it was actually defined
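something like this as a quick sanity check (credentials are placeholders):

```python
from langchain_community.utilities import SQLDatabase

db = SQLDatabase.from_uri("mysql+mysqlconnector://user:password@localhost:3306/Chinook")
print(type(db))             # should be <class 'langchain_community.utilities.sql_database.SQLDatabase'>
print(db.get_table_info())  # should print CREATE TABLE statements, not raise AttributeError
```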
@mazinzain712111 ай бұрын
I love your content, thanks for all your efforts ❤
@alejandro_ao11 ай бұрын
i appreciate it! let me know what you want to see next
@tanzeelmohammed91579 ай бұрын
Hi Alejandro, I have been trying to do the same thing. The problem is that I don't have OpenAI credits, and I wanted to know if there is a way to achieve the same result using open-source models, especially in LangChain. Is there any other way?
@noustelo11 ай бұрын
Awesome. Looking forward to part 2....
@onurolceАй бұрын
Your code halts the kernel and does not work. So there is a problem with prompt.format(schema="my schema", question="how many users are there?"). What is "my schema" here? The kernel halts when I try to run the database connection functions.
@onurolceАй бұрын
I've found the solution: use the pymysql library instead of the mysql-connector library. First install pymysql, then change the db_uri to: db_uri = "mysql+pymysql://root:password@localhost:3306/Chinook"
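Putting it together, roughly (credentials are placeholders):

```python
# pip install pymysql
from langchain_community.utilities import SQLDatabase

db_uri = "mysql+pymysql://root:password@localhost:3306/Chinook"
db = SQLDatabase.from_uri(db_uri)
print(db.get_table_info())  # if the connection works, this prints the schema instead of hanging
```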
@bwilliams06011 ай бұрын
Your videos are the best!
@alejandro_ao11 ай бұрын
you are the best
@teddyperera85318 ай бұрын
This is a great tutorial. Thanks for explaining it in a way that's easy to understand
@alejandro_ao8 ай бұрын
thanks Teddy! i’m glad it was useful!
@bitcoinjc8 ай бұрын
Will this work with a Microsoft SQL relational database that is much bigger too?
@Rajat_Jaiswal7 ай бұрын
Thanks, works well. Amazing tutorial.
@alejandro_ao7 ай бұрын
Great to hear! All the best!
@chercm2 ай бұрын
Hi sir, I am not able to get the database schema output even though it is connected.
@wg592010 ай бұрын
Amazing Video. Great job
@arishasaeedАй бұрын
Can we add memory to this architecture, so the bot is conversational?
@rameshh38218 ай бұрын
Could you please let me know if it's possible to use LangChain to query multiple tables and generate data visualizations on the chatbot interface? I've seen solutions for a single CSV file using PandasAI or LIDA, but I haven't found anything that works with multiple tables stored in a database.
@Aidev787611 ай бұрын
Thanks. What about long- or short-term memory during the conversation?
@chercm2 ай бұрын
Hi, do you have anything for chatting with a Microsoft SQL DB?
@yashtrisha59197 ай бұрын
Thanks, but I followed the tutorial and I am getting an error at "except MySQLInterfaceError as err:". Do I need to replace " " in the query in the full_chain function?
@brunocarvalho32295 ай бұрын
Hey Alejandro! Amazing content, it helped a lot with the application we're building at my company right now. One question: let's say the user wants to give feedback on the output and regenerate it - would I have to run the full chain again? Do you already have another walkthrough on this memory context? Thanks!
@batigol_99 ай бұрын
I was wondering if your Calendly link for consultations is working? It seems to be down.
@sam-uw3gf11 ай бұрын
great as always bro
@alejandro_ao11 ай бұрын
thank you bro
@neelarahimi10534 ай бұрын
Thank you for the amazing video. I have seen approaches where people build "vector stores" from the DB schema using LangChain's vectorization and provide that to the LLM in the prompt, rather than the SQL-generated schema. Which practice is preferred?
@alejandrocano532310 ай бұрын
How do you handle the case where the first chain doesn't find the information in the SQL schema?
@artislove49111 ай бұрын
Great stuff! Many thanks 💪
@alejandro_ao11 ай бұрын
No problem 👍
@machinelearningzone.62309 ай бұрын
Awesome tutorial! I tried to implement the same thing on Colab, using a SQLite database (Chinook), but I consistently get the error "'NoneType' object has no attribute 'get_table_info'" when I try to get the table schema. Any workarounds?
@RedWhiteBlue2097 ай бұрын
Could you please post your requirements.txt? When running your code, I got this error: ImportError: cannot import name 'LangSmithParams' from 'langchain_core.language_models.chat_models'
@scratch-90978 ай бұрын
Hey, is it possible to run DML queries using chains or agents?
@tinytube4me-z8h7 ай бұрын
When running your code, I got this error: ImportError: cannot import name 'LangSmithParams' from 'langchain_core.language_models.chat_models'. Could you please post your requirements.txt? Thank you!
@chercm2 ай бұрын
Hi. It's a great video. Is it possible to use Azure OpenAI and Microsoft SQL with this?
@Soneone-kz4fp25 күн бұрын
I want to do the same thing for MongoDB - how can I do that?
@arunsnmimtimt7 ай бұрын
@alejandro_ao, can you kindly let me know whether this can be done without an API, i.e., running the same thing locally on a desktop?
@07-bmanohar7010 ай бұрын
Can this be implemented for large databases?
@alejandro_ao10 ай бұрын
absolutely, just be careful that these two things fall within your context window:
- the table schemas of your database (unless you have a humongous number of tables, it should be fine)
- the results from your query (as they will be sent back to the model for interpretation)
the second point is trickier than the first one. you may want to update your prompt to make sure it never queries more than X records at a time.
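for example, something along these lines in the sql prompt (the wording is just an illustration):

```python
from langchain_core.prompts import ChatPromptTemplate

template = """You are a data analyst. Based on the table schema below, write a SQL query
that answers the user's question. Never select more than 100 rows: always add a LIMIT
clause to SELECT statements.

{schema}

Question: {question}
SQL Query:"""

prompt = ChatPromptTemplate.from_template(template)
```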
@MrJaczes9 ай бұрын
@@alejandro_ao i thought about this too - using views to handle result sets from the large database and pre-aggregating the data there
@ishanagrawal3963 ай бұрын
Thanks! This is helpful! How can we build it with an open-source model instead of OpenAI, since that's paid? Also, is there a GitHub repo for this?
@Jaypatel51210 ай бұрын
Hey, thanks for the video. A couple of questions, if you don't mind:
1. Do we need to assign the sql_chain inputs again when building the full_chain? Won't it remember them from its own structure? I see that as complexity grows, we end up adding a whole bunch of assignments in the full chain.
2. For the full chain, is there a way to use the pipe operator to pass the sql_chain response to the next step where you run_query? Basically, RunnablePassthrough.assign(sql_chain) | run_query | prompt | llm | StrOutput... ?
Thanks again for your wonderful tutorials.
@reinerzufall31238 ай бұрын
you're just overthinking it 😉
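A rough sketch of what the wiring ends up looking like (names roughly follow the video; the lambda is where the generated query gets executed):

```python
from langchain_core.runnables import RunnablePassthrough
from langchain_core.output_parsers import StrOutputParser

# sql_chain, get_schema, db, prompt_response and llm are the objects from the video
full_chain = (
    RunnablePassthrough.assign(query=sql_chain).assign(
        schema=get_schema,
        response=lambda x: db.run(x["query"]),  # the generated query is executed right here
    )
    | prompt_response
    | llm
    | StrOutputParser()
)

full_chain.invoke({"question": "how many albums are there in the database?"})
```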
@AbhinavKumar-tx5er5 ай бұрын
Will it work with a large dataset? Let's say more than 20 GB?
@Matepediaoficial11 ай бұрын
I love you!! You are the best!!!
@alejandro_ao11 ай бұрын
i love you more
@RicardoIturra-w8y11 ай бұрын
Excellent! ♥
@alejandro_ao11 ай бұрын
i really need to update my video on memory. i'll look into it!
@frankfromthebulb61814 ай бұрын
Great job! Your Calendly link in the description isn't working. How can I get in contact with you for a project?
@isvic00710 ай бұрын
Could we use an Oracle DB for this test?
@chibuzoemelike640311 ай бұрын
Thank you so much for this video, this is really helpful!! Looking forward to using huggingface models
@alejandro_ao11 ай бұрын
Great! Which models in particular are you interested in?
@palanikumarmsc8 ай бұрын
Mistral AI
@legilord37867 ай бұрын
thanks for the great video. does it matter how big the database is? or can a database be too big? in my case i have a database of about 300 MB
@abhaypkyek11 ай бұрын
You really made my day. I had been trying to figure out this code from the LangChain templates for many days, but you gave perfect clarity with a step-by-step explanation. Thanks a lot for that. Could you further enhance this in your next video, as mentioned at the end, with Ollama and visualization of the response using PandasAI or LIDA or something better, please?
@alejandro_ao11 ай бұрын
it's great to hear this! thank you for letting me know. and congrats for finally getting through it! keep it up 👍 that's actually a great idea. i'll see if i can put it in the next video or make a dedicated video about this!
@natawebmaster10 ай бұрын
Thanks for the useful video :)
@alejandro_ao10 ай бұрын
it's my pleasure :)
@v.svishnu23805 ай бұрын
I have a doubt: is it safe to connect LangChain to a production database?
@alejandro_ao5 ай бұрын
I would recommend that you only do this after thorough testing. and very importantly, do not give write privileges to the MySQL user that you are using to access the database. You don’t want your LLM to be able to update or delete data in your database 👍
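roughly, on the mysql side (user name and password are placeholders):

```python
from langchain_community.utilities import SQLDatabase

# run these as an admin in MySQL (plain SQL, shown as a string here for reference):
read_only_setup = """
CREATE USER 'chat_readonly'@'%' IDENTIFIED BY 'a-strong-password';
GRANT SELECT ON Chinook.* TO 'chat_readonly'@'%';
"""

# then connect LangChain with that restricted user:
db = SQLDatabase.from_uri(
    "mysql+mysqlconnector://chat_readonly:a-strong-password@localhost:3306/Chinook"
)
```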
@TS-ml4dp5 ай бұрын
Hey Alejandro, your guides are great, the best on the net. I would like to ask whether you think replacing the schema with a knowledge graph is a good idea. The reason is that in practice, in large DBs and DWHs, the metadata on table and field names is not aligned with business terms, and the idea is to leverage that metadata via a knowledge graph. What do you think? Can you do a guide about KGs?
@warrenmarkham889110 ай бұрын
Thanks for the content. The link to your blog post doesn't work for me.
@alejandro_ao10 ай бұрын
hey there, can you try again? i think my dns server was giving me trouble last week
@warrenmarkham889110 ай бұрын
@@alejandro_ao Still not working I'm afraid.
@alejandro_ao10 ай бұрын
@@warrenmarkham8891 just refreshed the DNS settings on netlify! Should be up now!!
@warrenmarkham889110 ай бұрын
@@alejandro_ao Yep, you punched the right ticket that time. It is now working great.
@lucasgrandini17764 ай бұрын
Would it be possible to chat with the SQLite database of a zabbix-server?
@drummermike515011 ай бұрын
Great tutorial as usual, Alejandro! Is it possible to do this with SQL Server? I looked at the documentation and it doesn't appear so, but maybe I'm missing something.
@alejandro_ao11 ай бұрын
Thanks! I am not sure how it would work with MS SQL Server. I suppose you would need a driver to connect it. Since SQLAlchemy supports it, I suppose that it can be done. Maybe if you add the driver to the URI like we did here, but instead of adding the MySQL driver, you add one for MS SQL Server? I checked and this driver might work, but I am not on Windows, so I have no way of testing it right away:

```python
from langchain_community.utilities import SQLDatabase

db_uri = "mssql+pyodbc://username:password@hostname:port/DatabaseName?driver=SQL+Server"
db = SQLDatabase.from_uri(db_uri)
```
@drummermike515011 ай бұрын
@@alejandro_ao I'll give that a try. Thanks much!!
@dswithanand9 ай бұрын
I made the SQL DB chatbot using FastAPI and everything is working fine except the chat memory/history. Can you suggest how we can implement memory with FastAPI?
@dswithanand9 ай бұрын
Can you please answer this
@dr.aravindacvnmamit377011 ай бұрын
Hey, very nice. I had one query: an LLM-based application to assess the quality of language being used by parents and give practice sessions to improve it - can you show us something like that? It is a kind of therapy for special children, to help them understand words.
@av174310 ай бұрын
Thanks for posting this. Very helpful. Is there any open-source LLM which can convert natural language to SQL? Would Llama, Flan-T5, etc. work instead of GPT?
@batyratamamedov76334 ай бұрын
Thank you for the video. Can you please make a video or guide on implementing it with Ollama and a Postgres database? Thanks in advance!
@MrYoriiАй бұрын
Great! Thanks
@monishamonisha-zt3uy10 ай бұрын
When I execute full_chain.invoke({"question": "how many albums are there in the database?"}), it just returns {'question': 'how many albums are there in the database?'}
@kartiksaini-xn6ke9 ай бұрын
i am facing the same issue - did you find a solution?
@guanjwcn11 ай бұрын
Awesome. Could you include streaming in part 2 as well?
@alejandro_ao11 ай бұрын
streaming is coming very soon 😎
@fbravoc974810 ай бұрын
Amazing video!! Thanks for creating it!! Is there a way to apply the same principles but with other (open-source) LLMs? If I have gone over the whole database I am working with and identified the queries related to the most frequent questions, how can I fine-tune my queries for those frequent questions? Should I think about adding RAG logic to it?
@uplifting_sounds4 ай бұрын
Love the idea. I tried it but ran into constraints from OpenAI.
@MADMAX-rw7jx10 ай бұрын
There is no password on my SQL server - what should I put in the db_uri, sir?
@MuhammadAhmad-o4q2 ай бұрын
Will this method work with multiple DBs and 500+ tables? If not, what would be a way to do it?
@alejandro_ao2 ай бұрын
as long as the db ERD schema fits into the context window, you should be good!
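and if it doesn't fit, one option is to narrow what gets sent, e.g. with include_tables (the table names here are just placeholders):

```python
from langchain_community.utilities import SQLDatabase

db = SQLDatabase.from_uri(
    db_uri,  # same connection string as before
    include_tables=["customers", "orders", "order_items"],  # only these tables go into the schema
    sample_rows_in_table_info=2,
)
```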
@SandhiniGopinathan10 ай бұрын
It's a wonderful session, but the link to your blog post isn't working.
@alejandro_ao10 ай бұрын
just fixed it! DNS propagation problem after meddling with some records on netlify :S
@prashu259252 ай бұрын
Can I deploy this in my company? Or could it cause data breaches?
@alejandro_ao2 ай бұрын
there is some data going to the OpenAI servers. if you are fine with that, go for it! else, you can either use a secure deployment on Azure OpenAI or host your own model :)
@youngchrisyang10 ай бұрын
Thanks so much Alejandro! Great content. Btw, it seems your website is down today?
@alejandro_ao10 ай бұрын
hey there, thanks! can you check again? i think my dns has been struggling with some changes i did recently :S
@youngchrisyang10 ай бұрын
@@alejandro_ao Thank you Alejandro. All works fine now!
@dswithanand9 ай бұрын
I have a question: If the database belongs to an e-commerce website with a substantial product inventory, and a user query such as 'SHOW ME PRODUCTS' risks exceeding the ChatGPT token limit of 60000, how can this issue be effectively managed?
@alejandro_ao9 ай бұрын
Yeah, that will probably require a bit of prompt engineering to make sure you never retrieve more than X records. You could add something like "if you are selecting records, never return more than 100 records in a single query" or something like that. another alternative would be to use an agent, which would make it more flexible
@dswithanand9 ай бұрын
@@alejandro_ao Thanks for the reply. I tested this and it actually worked. Thanks bro
@geekyprogrammer483111 ай бұрын
Can you create a LangChain-based streaming chatbot? It would be very helpful. Thanks for creating high-quality content!
@alejandro_ao11 ай бұрын
coming very soon!
@matten_zero11 ай бұрын
Can you do this with NoSQL like MongoDB🙏?
@alejandro_ao11 ай бұрын
great idea
@KurskikhA10 ай бұрын
Hi, been following you for a long time, very cool content. Can you please tell me how to use Langchain for MSSQL or Postgres?
@alejandro_ao10 ай бұрын
hey there! thanks for following the content :) i'll be putting up a video about postgres soon. i actually haven't tested if this would work using a mssql driver. but look, apparently you can pass in a driver just like we did here with mysql, but for MSSQL: docs.sqlalchemy.org/en/20/dialects/mssql.html#module-sqlalchemy.dialects.mssql.pyodbc
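for postgres, the connection itself would look roughly like this (driver and credentials are placeholders):

```python
# pip install psycopg2-binary
from langchain_community.utilities import SQLDatabase

db = SQLDatabase.from_uri("postgresql+psycopg2://user:password@localhost:5432/mydatabase")
print(db.dialect)  # "postgresql"
```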
@AdhisashanJ8 ай бұрын
Nice. How do you do column mapping? For example, "How many creators are there" as the input question, but the column is called artist instead. I tried the following, but it still doesn't work:

sql_chain = (
    RunnablePassthrough.assign(schema=get_schema, column_mapping=get_column_mapping)
    | prompt
    | llm.bind(stop=["\nSQLResult:"])
    | StrOutputParser()
)

column_mapping = {
    r'creator|artists': 'artist',  # Add more mappings as needed
}

def get_column_mapping(_):
    column_mapping_str = " ".join([f"{key}: {value}" for key, value in column_mapping.items()])
    return column_mapping_str
@jimgsewell11 ай бұрын
Thank you. This is a really interesting idea. I wonder how complex a question you can ask it. Can it come up with a query that requires joining tables, or returning multiple values? You have me curious.
@mwdcodeninja11 ай бұрын
I have used this method to join across a db of 34 tables to form a master document archive for customer data. I was happy with it.
@alejandro_ao11 ай бұрын
absolutely, it all depends on your LLM's accuracy at executing the queries. in my experience, GPT3 16k is very good. as @sanjayojha1 mentioned, a coding-specific LLM might give you even better results. just don't forget to limit the scope of your MySQL user to avoid security problems!
@chibuzoemelike640311 ай бұрын
@@mwdcodeninja wow, it's great that you've been able to use this and that it performs joins. I'm working on a project with BigQuery that has a lot of data, and it doesn't give the desired output because it doesn't perform the join. I'd be glad if we could connect and work on it together. Thank you in advance.
@chibuzoemelike640311 ай бұрын
@@alejandro_ao please, how do we plug a code-specific LLM into the chain?
@mwdcodeninja11 ай бұрын
@@chibuzoemelike6403 would you be able to share the schema?
@Sunny-tk2fu6 ай бұрын
No doubt your tutorial is awesome, but I just want to suggest that you minimize the sibilant sounds with DaVinci Resolve or other video editing software to make the videos more compelling.
@alejandro_ao6 ай бұрын
this is a great suggestion. i actually use resolve but didn't know i could do that. thank you!
@marvinmarkham84058 ай бұрын
The videos are exceptional, just very high-level, and the mic seems too close when you're speaking, so it's not pleasant for me to watch the videos with that kind of mic calibration. Luckily the website is a better walkthrough for me. Thanks for your work.
@alejandro_ao8 ай бұрын
hey thank you for the feedback, glad you found the tutorial useful mate. i’ll improve the sound in future vids :)
@muneerAbro8 ай бұрын
Very informative video ❤. Can you create this video for PHP?
@alejandro_ao8 ай бұрын
hey, thanks! i would like to but unfortunately langchain does not have a PHP version that I know of :( if you are trying to create something like this for a PHP app, I would create an API that deals with these processes in python or js and then have my PHP app query this API
@muneerAbro7 ай бұрын
@@alejandro_ao You're welcome. I tried it in PHP/Laravel and it works fine, but sometimes it shows the response "Unable to generate query". I am worried about this: if I deploy this chat on a live server and get such a response, it will be a bad experience. Can you suggest how I can get rid of this error? It comes from GPT-4o (OpenAI).
@Astar-o5k10 ай бұрын
Is OpenAI compulsory?
@alejandro_ao10 ай бұрын
absolutely not. you can import any language model that langchain supports: python.langchain.com/docs/integrations/chat/
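for example, swapping in a local model through ollama would look roughly like this (the model name is just an example):

```python
# pip install langchain-community   (and run `ollama pull mistral` locally first)
from langchain_community.chat_models import ChatOllama

llm = ChatOllama(model="mistral", temperature=0)
# the rest of the chain stays the same, e.g. sql_chain = prompt | llm | StrOutputParser()
```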
@StnImg11 ай бұрын
❤❤ Can you apply RAG to the schema to accommodate bigger databases with huge tables in the next video?
@sanjayojha111 ай бұрын
Great as always. I personally find the RunnablePassthrough syntax ugly and confusing. Also, with a coding-specific LLM we might get better SQL queries with less hallucination.
@alejandro_ao11 ай бұрын
hey good to see you again. thanks! i totally agree with you that RunnablePassthrough could be better. and about the LLM, totally. not only better, but also faster, as it would be a smaller model 🤔
@chibuzoemelike640311 ай бұрын
Please, what do you mean by a coding-specific LLM?
@monica.b18111 ай бұрын
Awesome videos, really appreciate your efforts 👍 Could you please make a video on creating a chatbot for WordPress websites? Scraping the content from WordPress websites is a bit tricky, and passing it in to split into chunks is throwing errors... please help.
@monica.b18111 ай бұрын
@UC1oXUA7qgs0GZc_yk46K2OQ hi, I am grateful you replied to my comment 😊🙏 Actually, I don't have access to the database. The idea of my project is to create a web chatbot for dynamic WordPress websites, where I scrape all the content from sitemap.xml, divide it into chunks, store them in something like FAISS or another vector store, and finally, with Streamlit, be able to chat with the site's content. So I need your help with this, as I am a beginner and new to these technologies. Please.
@nazarmohammed568111 ай бұрын
Chatting with a website using Gemini Pro - please make a video on this?