Oh yeah! More to learn this weekend! Cool to see all of these technologies merging and synergizing. Keep up the great work!
@alejandro_ao · 10 months ago
Thank you Brandon! Always a pleasure to see you here!
@aristoz1986 · 10 months ago
Great!! I will try it out this week!! Keep on going with the good stuff🎉
@alejandro_ao · 10 months ago
Thanks!! Let me know how it goes!
@lookingaround1586 · 10 months ago
Thanks @alejandro_ao. Could you make a video on implementing graphs/charts alongside the NL response? Congrats on your diploma!
@jim02377 · 8 months ago
I just made it through the entire tutorial. As usual, your attention to detail is awesome. Now I want to try the agent version and see the difference!
@alejandro_ao · 8 months ago
thank you Jim, it's great to see you around here! i'll be putting a version with agents soon!
@ztamnaja · 28 days ago
Thanks! very good content!
@trideepsaha2594 · 10 months ago
This is what we have been waiting for for a while. No words, only 🎈🤩. We must all do our homework. Special thanks for GroqCloud. Congratulations on your diploma, AO.
@alejandro_ao · 10 months ago
thank you!! very glad to hear this was useful! let me know what else you would like to see!
@trideepsaha2594 · 10 months ago
@@alejandro_ao From this side, anything you want to teach. #AI #RAG #LangChain #FineTune
@AasherKamal · 7 months ago
The way you explain the details is impressive.
@shashankkumardubey6260 · 10 months ago
Great project. As a beginner, this project can help me learn better. Do make some more unique projects. Keep doing great work.
@alejandro_ao · 10 months ago
thank you mate, i will!
@oscarzamorapicazo9287 · 2 months ago
Thanks Alejandro! GREAT STUFF AND EFFORT
@scollin10 · 10 months ago
Love your walkthroughs! I’m using your method and approach to chat with a MySQL database of resumes. The eventual goal would be to build up the database of both resumes and job descriptions and have my team of recruiters be able to prompt it to optimize efficiency.
@alejandro_ao · 10 months ago
hey there, that's an awesome idea! just remember that, for a real application, you should not use the `root` user of your mysql database. create a new user that only has READ privileges. you wouldn't want your LLM to accidentally write anything into your database or delete some data!
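A minimal sketch of what that locked-down setup could look like. The user name `chatbot_ro`, the password, and the database name below are placeholders, not anything from the video; substitute your own values before running the printed statements as an admin:

```python
def read_only_user_sql(user: str, password: str, database: str, host: str = "%") -> list[str]:
    """Build the MySQL statements for a user that can only SELECT."""
    account = f"'{user}'@'{host}'"
    return [
        f"CREATE USER {account} IDENTIFIED BY '{password}';",
        # SELECT only: the LLM's connection gets no INSERT/UPDATE/DELETE/DROP
        f"GRANT SELECT ON `{database}`.* TO {account};",
        "FLUSH PRIVILEGES;",
    ]

for stmt in read_only_user_sql("chatbot_ro", "change-me", "chinook"):
    print(stmt)
```

Then point the app's connection string at the new user instead of `root`.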
@scollin10 · 10 months ago
@@alejandro_ao I use a .env for now just as good practice but it is all local as I’m encountering some issues. I would love to pick your brain and chat over coffee or a consultation session if that’s cool.
@scollin10 · 10 months ago
@@alejandro_ao Thanks for the tip on using root! I would like to pick your brain and seek your help on my app. I'll shoot you an email.
@Suryav12 · 4 months ago
Great idea
@Suryav12 · 4 months ago
Have you done it? And what's the approach you went with? Is it something like letting, say, HR upload the resumes, extracting the necessary information, and storing it in a database, and then doing the stuff in this video? Is it like that?
@vinayakmane7569 · 9 months ago
I am loving your work, bro. Don't stop. Keep making such unique projects.
@alejandro_ao · 9 months ago
thank you! there is much more coming up :)
@RahulM-lm6qg · 9 months ago
Please make a video for this use case using only open-source LLM models.
@homerobaroni1655 · 4 months ago
A version using local Ollama would be nice!
@OwaisBinMushtaq · 10 months ago
Great .... Will try to implement this week 🎉🎉🎉🎉
@alejandro_ao · 10 months ago
let me know how that goes
@muhammadqasim6524 · 9 months ago
Congratulations on your Diploma. 🎊 Enjoying your videos.
@mlg4035 · 10 months ago
Congratulations on your diploma!! Great video!
@alejandro_ao · 10 months ago
thank you!
@peralser · 7 months ago
Great Work!! Thanks for sharing your time and knowledge with us.
@alejandro_ao · 7 months ago
i appreciate it man. it's my pleasure :)
@TYTennis · 8 months ago
Hi, this is great. Thank you so much; I'm learning a lot from this. I see you said this wasn't production-ready and that improvements could be made. What improvements do you think you would make, and why would they be improvements? Thanks again for this video, you're very ahead with tech :)
10 months ago
Thx Alejandro, great stuff as always... crystal clear 😎. I changed your code a bit to use it with a Postgres database, works like a charm! ✊❤🔥
@alejandro_ao · 10 months ago
that's so cool! which adapter did you use to connect? psycopg2?
10 months ago
@@alejandro_ao yes this one
10 months ago
I started to push the prompts a little further (formatting, tone, etc...) it's a really good base to work with. Thanks again for this starter kit 🙏💪
10 months ago
@@alejandro_ao did you ever try it with MongoDB?
@alejandro_ao · 10 months ago
@ so glad to hear this!
@babas5990 · 10 months ago
Congrats on earning your diploma!!! Thank you for your excellent video tutorials.
@alejandro_ao · 10 months ago
thank you!!
@matiasparouy · 10 months ago
Thanks Alejandro for sharing! Excellent video! 👏
@alejandro_ao · 10 months ago
thank you man :)
@NasserTabook · 10 months ago
Great tutorial, thank you for the hard work
@alejandro_ao · 10 months ago
thank you, i appreciate it!!
@avijitsamantaray8420 · 1 month ago
Great work ❤
@adityabhagwat6965 · 9 months ago
Great content! Appreciate you sharing. Excited to give it a go!
@MarioLopez-bm9mf · 5 months ago
Great, thanks for sharing. Quick question: Can it be used with large databases that have more than 100 tables? What do you recommend for handling large databases?
@Sanitiser254 · 5 months ago
Love your work man
@alejandro_ao · 5 months ago
i appreciate it :)
@TheAbanik · 10 months ago
Amazing video as usual, very helpful. Could you explain why you decided to use GPT-4 instead of 3.5 Turbo? Is GPT-4 better at crafting SQL queries from natural language? Also, could you try creating a video based on a LangChain agent, to showcase a use case where the user asks a question, the answer could be either in a document/PDF or in a database, and the agent needs to figure out the correct way to respond?
@alejandro_ao · 10 months ago
hey there, sure. i used gpt-4 because i needed its 128k token context window to be sure that it could ingest the entire database schema. but actually, when the schema could fit in GPT-3's context window, i found that it actually worked better. i will be doing videos on agents very soon!
@claudiogandolfo2627 · 4 months ago
Hi Alejandro! Thanks for the video and congrats on your achievement. Here is my little contribution to your great video. It seems that the example SQL query for the question "which 3 artists have the most tracks?" is not right (at minute 28:27). One solution could be "SELECT Header.Name, COUNT(*) AS track_count FROM (SELECT artist.Name, album.ArtistId, album.AlbumId FROM album JOIN artist ON album.ArtistId = artist.ArtistId) AS Header JOIN track ON Header.AlbumId = track.AlbumId GROUP BY Header.ArtistId ORDER BY track_count DESC LIMIT 3;" 😉
@AIWALABRO · 10 months ago
I love this video! Can I put it in my resume? At the end, you said that in real life we do things with "agents" instead of "chains", and here we did it with "chains". 1) So for a production-ready approach, should we use chains or agents? 2) Can you make a video using agents?
@alejandro_ao · 10 months ago
hey there, absolutely! feel free to add this to your cv :) 1) i would test both. the only difference is that an agent would probably be better at processing your query. it would be able to check the result from the mysql database and reformulate its sql query if the result does not answer the user's (your) question. best would be to test both and see which one performs better! 2) coming up!
@AIWALABRO · 10 months ago
@@alejandro_ao thanks for your response! Eagerly waiting for the agents video with MySQL; we can call it MySQL part 3.
@songfromktown · 5 months ago
Thanks for this great tutorial. In case I would like to have some plots (bar or line charts of any column of a table), how should I approach it?
@ritishajaiswal9918 · 10 months ago
Hey Alejandro, thanks a bunch for the helpful video. Could you make another one where you show us how to use a model we've downloaded locally (it can be a quantized one)? Also, it'd be awesome if you could include the Vertica database as the DB in the process. You don't have to go through the whole process again; just explain how to connect to the database using the local model. I've heard there are models like sqlcoder-7b that are designed for translating plain English into queries. It'd be really helpful if you could check out a few of these models and share your thoughts on which one would work best in this kind of situation. My organization is pretty cautious about sending sensitive data to LLM APIs, so it's important for us to be able to do this kind of thing locally without relying on APIs. Looking forward to the next video. Thanks again for all your helpful content!
@laurarusu1989 · 8 months ago
Hi, I did this with a local model; the code is otherwise the same as in the video. First, install the local model: in my case I pulled codellama:7b by running 'ollama pull codellama:7b' in the terminal, then 'ollama run codellama:7b' (you can go to the Ollama site and see the exact command). Then, when setting up the llm, I used 'from langchain_community.chat_models import ChatOllama' and 'llm = ChatOllama(model="codellama:7b")'. Everything else should be the same. LangChain supports many different LLMs; check the documentation to see if the LLM you're interested in is available and use the corresponding module. I hope this helps you :)
@bobcravens · 10 months ago
I’ve been enjoying your channel. I wonder at what temperature there is a probability of the LLM dropping tables from your DB ;-)
@alejandro_ao · 10 months ago
oh yeah, absolutely, please DO NOT use this with a mysql user that has write privileges
@openyard · 9 months ago
Thank you. Another video with great content.
@alejandro_ao · 9 months ago
thank you!
@tinsp253 · 4 months ago
What great work, thank you. If the database tables don't have proper names, I guess the system won't work, am I right? When I looked into my client's database, they used silly names that didn't represent the content of the tables. How can I resolve the problem? Thanks again.
@ethanxsun · 9 months ago
So cooooool! You are a hero. Talented!
@alejandro_ao · 9 months ago
glad this was useful! keep it up!!
@kingfunny4821 · 9 months ago
Thank you for your wonderful efforts and excellent explanation. Please, I would like to ask whether you have explained how to chat with files without using the internet, i.e., fully locally.
@AmirShafiq · 10 months ago
Good stuff and great explanation
@alejandro_ao · 10 months ago
thank you sir 🫡
@ArusaKhalfay · 6 months ago
I just built this; it does basic stuff and is great, but if you test it with the last user question, it actually gives the wrong response and does not correctly map the artist IDs. Do you have ways to improve this performance? It also makes mistakes when writing simple joins and throws an error.
@luismario6808 · 7 months ago
amazing video, thanks!
@alejandro_ao · 7 months ago
Glad you liked it!
@niyetheg · 10 months ago
I need your help. I followed all the steps and I am at the last part, where I ask questions and get responses, but I keep getting a validation error for ChatGroq (__root__) even though I have included the API key: "Did not find groq_api_key, please add an environment variable `GROQ_API_KEY` which contains it, or pass `groq_api_key` as a named parameter." Please, what can I do?
@gianni4302 · 10 months ago
Print under it to see if it's accessing the Groq API key, or just input the key directly and see if it works. If it does, it's an issue with the path to your .env; make sure to call 'load_dotenv()' at the start, that may help.
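One stdlib-only way to fail fast on a missing key is a small check right after `load_dotenv()`. The variable name matches the error above; the helper itself is just an illustrative pattern, not part of the tutorial's code:

```python
import os

def require_env(name: str) -> str:
    """Return an environment variable's value, or stop with a readable hint."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set: check that the .env file sits next to the "
            "script and that load_dotenv() runs before the chain is built"
        )
    return value

# after load_dotenv():
# groq_api_key = require_env("GROQ_API_KEY")
```

Failing here, with a message that names the variable, is much easier to debug than the validation error deep inside ChatGroq.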
@Jay_jadan · 10 months ago
Congrats on the diploma sir 🎓
@alejandro_ao · 10 months ago
thank you sir 🫡
@LukeS-e3m · 5 months ago
Are you able to create a video where you're searching through multiple tables? Or would this approach work for that too?
@MamirZhan-de7fv · 6 months ago
I have been following your tutorial and it has helped me a lot! Thanks for sharing. All these LLM models depend heavily on the semantic meaning of the tables and columns. But in my case, my database comes from the Open Way loan management system and most of my table and column names don't have semantic meaning. For example, the transaction table is called DOC instead of transactions, and inside the DOC table none of the field names are self-explanatory. As a result, most of the time my model can't answer my questions. What should I do?
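Since the model only ever sees the schema text you put in the prompt, one workaround is to append a hand-written glossary that maps the cryptic names to their business meanings. The table names and descriptions below are made-up examples, not from the video:

```python
# Hand-written notes for tables whose names carry no semantic meaning.
TABLE_NOTES = {
    "DOC": "financial transactions (one row per posting)",
    "ACNT": "customer accounts",
}

def annotate_schema(schema: str, notes: dict[str, str]) -> str:
    """Append a glossary so the LLM can map cryptic names to business concepts."""
    glossary = "\n".join(f"{table}: {meaning}" for table, meaning in notes.items())
    return f"{schema}\n\n-- Table glossary --\n{glossary}"
```

If I recall the API correctly, LangChain's `SQLDatabase` also accepts a `custom_table_info` mapping that serves a similar purpose.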
@AJITKUMAR-k9k2g · 7 months ago
Are you using any extension which gives you autocomplete code suggestions?
@andersonkoh1382 · 7 months ago
Do you have a series for beginners on how to build a custom theme and create pages with these? I'm used to building with the builder in WordPress.
@AyomideKazeem-g7n · 8 months ago
Hello, great tutorial as always. I am completely new to this, so after running the code and changing a few things I keep getting the error "AttributeError: st.session_state has no attribute "db"" and I do not know how to solve it.
@aadilgani5528 · 10 months ago
Congratulations on getting your diploma! Can you please make a video on chatting with a SAS (clinical data .bat files) database using an open-source LLM?
@alejandro_ao · 10 months ago
I have never used sas, but a senior technical architect there implemented an agent that interacts with sas data. Maybe this can be useful to you: www.linkedin.com/posts/bogdanteleuca_how-to-create-your-custom-langchain-agent-activity-7161603166952734720-jqRo/
@DoggoDebugger · 6 months ago
Great stuff!
@alejandro_ao · 6 months ago
Glad you enjoyed it!
@SibiP-b9x · 1 month ago
Is it possible to add visualizations?
@aniketdeshmukh5776 · 7 months ago
Thanks Alejandro for the excellent video! I have one doubt: when we use an OpenAI key with our SQL database, are they able to access all our SQL database data? If yes, please let me know how to prevent that. If no, then it's fine. Thanks in advance for the clarification!
@alejandro_ao · 7 months ago
hey there, great question! according to openai's docs and privacy policy, they don't keep logs of your prompts for more than 30 days, and they do not use these prompts to train or fine-tune the model. so if that is your concern, you should be fine. however, you may absolutely not want any of your data to go through openai's servers regardless of their promise of privacy. if that is the case, here is some info about the implementation in this video: openai's servers only see what you add to the prompts that you send to them, and that happens at two points in our process.
1. first, we ask the LLM to generate the SQL query. here, we send the schema of the database in our prompt. the schema itself does not include the data from your database, but it does say what kind of data your database contains, such as the names of the tables, the data types, etc. the LLM uses this schema to generate the SQL query.
2. second, we run the SQL query against our database. this operation happens only between ourselves and the database; openai is not involved here.
3. third, we take the results that our database returned and send them to openai for interpretation. this is the other step where we send data to openai.
so in short, in this tutorial we send two pieces of information to openai: the schema of your database and the results from the query. if you are not comfortable with this, you can always run a local model using ollama! this will work great :)
@aniketdeshmukh5776 · 7 months ago
Thanks @@alejandro_ao for the excellent clarification. Really appreciate it 👏👏
@RakshithML-vo1tr · 5 months ago
Any plans to make videos with the Llama 405B parameter model?
@MohitThehuman · 6 months ago
Hi, great video, thank you! I have one question: how will we deal with a schema that has, say, 100 tables? It will lead to context window issues. Do we need to add a vector DB to first search and filter out the relevant information, and then pass it to the LLM? Any comments?
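Before reaching for a vector DB, even a crude lexical pre-filter over the per-table DDL can keep the prompt small. This is only a sketch of the idea, with naive plural handling and no embeddings; a production version would likely embed the question and table descriptions instead:

```python
def pick_relevant_tables(question: str, table_schemas: dict[str, str], top_k: int = 5) -> list[str]:
    """Rank tables by word overlap between the question and each table's DDL,
    keeping only the top_k so the schema prompt stays within budget."""
    q_words = set()
    for w in question.lower().split():
        q_words.update({w, w.rstrip("s")})  # naive plural handling

    def score(item):
        _, ddl = item
        ddl_words = set(
            ddl.lower().replace("(", " ").replace(")", " ").replace(",", " ").split()
        )
        return len(q_words & ddl_words)

    ranked = sorted(table_schemas.items(), key=score, reverse=True)
    return [name for name, _ in ranked[:top_k]]
```

Only the schemas of the selected tables would then be passed to the SQL-generation prompt.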
@blackpanther3005 · 10 months ago
Hello, good day, excellent video. Sorry, how much VRAM did you use for the project?
@alejandro_ao · 10 months ago
hey there, all the completions of this app are done in remote servers, through the providers' APIs so they are the ones dealing with all the processing power. you could run this in an old raspberry pi tbh 😎
@showbikshowmma3520 · 7 months ago
Firstly, love you 3000 times! I have a question. I want to build a chatbot using an open-source language model like Llama 3 or another available LLM. How can I integrate the LLM with my MySQL database so it can answer questions based on the information it finds in the database? Additionally, I would like to integrate an API into the LLM, because I have a hosted Python backend server; through the API, the LLM would also be able to respond to user queries. I believe you can help me out here.
@juanjubia · 1 month ago
Would it be possible to do something similar but with MongoDB Community?
@delgrave4786 · 7 months ago
Hey Alejandro, amazing project! I implemented this and made a bunch of changes to it too. I had a request: do you have a video about creating and working with agents using open-source LLMs via Groq? If not, can you make a tutorial for that? Most videos I see always use OpenAI, and I just am not able to implement them with open-source LLMs. And OpenAI refuses to take my debit card.
@RaushanKumar-ut2ke · 9 months ago
Hey Alejandro, do we have any method to connect multiple schemas for database chatting?
@RlUpadhyay-c6o · 2 months ago
Will this work for a large database? I have 1200 tables in my schema.
@victorchrist9899 · 10 months ago
Congratulations on your diploma
@alejandro_ao · 10 months ago
thanks!!
@thiagomantuanidesouza136 · 6 months ago
What do I do if the column names contain underscores? Is there any configuration to solve it?
@RaushanKumar-ut2ke · 9 months ago
All the methods I've seen use a single schema; I haven't found any workaround for connecting multiple schemas for database chatting.
@ishanagrawal396 · 4 months ago
Thanks! How can we use multiple databases?
@arvindprajapath2315 · 7 months ago
How are you getting those code suggestions, the autocomplete feature, when you're typing code in VS Code? Can someone tell me which extension this is?
@Shivang0369 · 8 months ago
Hello brother, where have you used LangChain's API? You mentioned it in the .env file? Also, I am using the OpenAI 3.5 model, for which I am getting the error below:
openai.BadRequestError: Error code: 400 - {'error': {'message': "This model's maximum context length is 16385 tokens. However, your messages resulted in 16734 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
Kindly tell me what I should do. I am planning to build this chatbot at the enterprise level, with a larger database. Keep uploading, brother; great explanation.
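That error means the schema plus messages exceeded the model's 16,385-token window. The robust fixes are filtering tables or using a larger-context model, but a rough stopgap is to budget the schema before it enters the prompt, approximating tokens as about four characters each. A sketch, not the tutorial's code:

```python
def fit_schema_to_budget(table_ddls: list[str], max_tokens: int = 12000) -> str:
    """Keep whole table definitions until an approximate token budget
    (~4 characters per token) is exhausted; drop the rest."""
    budget_chars = max_tokens * 4
    kept, used = [], 0
    for ddl in table_ddls:
        if used + len(ddl) > budget_chars:
            break  # stop before the schema overflows the context window
        kept.append(ddl)
        used += len(ddl)
    return "\n\n".join(kept)
```

Dropping tables blindly can of course hide the one the question needs, so combining this with a relevance filter is the safer route.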
@thangarajerode7971 · 10 months ago
Could you please create a video about how to deploy the Streamlit app with LangServe?
@yugeshkaran7547 · 10 months ago
Thanks for the video, bro. I have built and hosted a similar chatbot on the Streamlit Community Cloud, but I encountered an error connecting to the MySQL database. I have checked my database server but still haven't resolved it. Could you please guide me on what the issue might be?
@djaz7347 · 10 months ago
Hi Alejandro, is it possible to connect to a DB2 database through JDBC?
@SatyendraJaiswalsattu · 10 months ago
Nice tutorial. Please make one video on how to interact with my Excel sheet data.
@alejandro_ao · 10 months ago
great idea!
@stanTrX · 9 months ago
Thanks, but traditional forms to pull data from DBs would probably be more efficient.
@Parthi97 · 8 months ago
Can you show the steps for when my Streamlit app is hosted on the cloud, so that when the user enters connection details my web app is able to connect?
@redo22011 · 8 months ago
Thanks for sharing, it's great work!!! Can you add a similar example for Amazon Bedrock Claude 3?
@danilabelokon4202 · 3 months ago
Is it possible to build a chatbot that can handle updates to queries? For example, I want to create a car parts inventory system where I can add and remove parts using voice and chat. Additionally, I want the chatbot to provide functionality for asking about available parts.
@alejandro_ao · 3 months ago
hey there, it sounds to me like you want an agent that is capable of performing CRUD operations on a database. that is possible, you just have to create some tools (python functions) and bind them to your agents. you can use langgraph for that!
@openyard · 2 months ago
I followed this video and put together a RAG solution that uses an LLM to query a PostgreSQL database (llm = ChatGroq(model="mixtral-8x7b-32768", temperature=0);) and it works well. But when I try to query a graph data implementation in PostgreSQL consisting of two tables, "node" and "edge", the generated SQL queries are error-prone. The query should use WITH RECURSIVE CTE queries, but somehow the LLM-generated queries always have SQL DML errors and mostly are not recursive queries. Is there a way I can send examples to the LLM through some hand-crafted pairs of prompts and PG query examples?
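Few-shot examples in the prompt are the usual lever for this. A sketch of hand-assembling question/SQL pairs into the prompt; the example pair below is invented for a hypothetical `edge(source, target)` table, and LangChain's `FewShotPromptTemplate` offers a more structured way to do the same thing:

```python
# Hand-crafted pairs showing the recursive pattern we want the model to imitate.
EXAMPLES = [
    {
        "question": "Which nodes are reachable from node 1?",
        "query": (
            "WITH RECURSIVE reach AS ("
            "  SELECT target FROM edge WHERE source = 1"
            "  UNION"
            "  SELECT e.target FROM edge e JOIN reach r ON e.source = r.target)"
            " SELECT * FROM reach;"
        ),
    },
]

def build_prompt(schema: str, question: str, examples: list[dict]) -> str:
    """Prepend example question/SQL pairs so the model imitates the
    WITH RECURSIVE pattern instead of inventing flat joins."""
    shots = "\n\n".join(
        f"Question: {ex['question']}\nSQL: {ex['query']}" for ex in examples
    )
    return (
        f"Schema:\n{schema}\n\n"
        f"Here are examples of correct queries for this schema:\n{shots}\n\n"
        f"Question: {question}\nSQL:"
    )
```

In this tutorial's chain, the result would replace the plain schema-plus-question prompt fed to the LLM.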
@1sanak · 10 months ago
Grats on the diploma! Can I use the same method to connect to an MSSQL DB?
@alejandro_ao · 10 months ago
thanks!! i heard some guys from the channel were trying to do this. i haven't tested myself. but look, apparently you can pass in a driver just like we did here with mysql, but for MSSQL: docs.sqlalchemy.org/en/20/dialects/mssql.html#module-sqlalchemy.dialects.mssql.pyodbc
@udaynj · 9 months ago
Does this mean that OpenAI and/or LangChain have access to your database schemas and also the generated data? That could be a huge security risk for most companies.
@JuanPalacios · 10 months ago
Excellent information!!!!
@alejandro_ao · 10 months ago
thanks!
@fozantalat4509 · 9 months ago
Can you create this app tutorial using agents instead of chains? That would be really awesome.
@alejandro_ao · 9 months ago
i will probably do something like this using CrewAI
@devizl · 3 months ago
Thanks, bro, for everything; this was amazing. I want to ask you a question: if I want to use my local Llama 3 model, how can I integrate it into the code? I use: from langchain_ollama import ChatOllama, llm = ChatOllama(model="l------", temperature=0). Do I need to import a URL, and if so, do I need to put it in the .env file? What is the correct way? Thanks again.
@alejandro_ao · 3 months ago
hey there, did you manage to get this working? as far as I can see, all you need to do is start up your local ollama service and you're good to go for using the langchain integration, right?
@zefur321 · 6 months ago
What is the name of the extension that you use to predict the next steps of coding (I see it shows up in gray) in VS Code?
@alejandro_ao · 6 months ago
that’s gh copilot
@nguyenduythai4585 · 6 months ago
If we use an OpenAI key, can the data in the database be leaked? How can it be secured?
@akshaysangwan2172 · 10 months ago
The code you taught works properly for small queries but fails on complex queries and provides incorrect information. Can you help? I'm using ChatGPT-3.
@JoEl-jx7dm · 10 months ago
Hey, I have an issue hosting on Streamlit Cloud. Could you try it?
@LieutenantGeneral · 7 months ago
I tried to train a BERT model on the Spider dataset but realized that I need a dataset that also contains DDL commands. Any suggestions, or do I need to make a custom dataset?
@LieutenantGeneral · 7 months ago
Also, to mention that my choice of BERT was not good, so now I am going to use a T5 model.
@gianni4302 · 9 months ago
Great tutorial, but I keep running into a 400 error due to token counts. Any real-world schema alone bumps up to almost 10k tokens. Has anyone found a workaround? A tutorial on this would be life-saving!
@VibhuSharma · 9 months ago
I am running into the same issue; I always get an exceeded-token-limit error. Were you able to figure out a workaround?
@tapanpati9452 · 10 months ago
Cool. Bro, I have a scenario to create a chatbot using 36,000 PDFs. Do you have any ideas? RAG is not giving accurate answers.
@PhamDuc8504 · 7 months ago
I want to ask how the VS Code IDE can suggest the remaining parts of the code? Thanks!!!
@alejandro_ao · 6 months ago
that's github copilot!
@CodexOdyssey · 5 months ago
please make a video on an AI manager API
@王君淑 · 10 months ago
Hello, greetings and congratulations on obtaining your diploma. Once again, I would like to ask you a question. I need to analyze MySQL data: the first step is to execute sql_chain and run SQL statements, and the second step is to add an analysis chain. The problem I encountered: when I add general conversational knowledge to the prompt and the user asks a general question, both chains run. How can I stop the first chain from executing, or otherwise run only one chain? That would make the program more robust. An example: User: hello! AI: hello, can I help you? For this exchange there is no need to execute a SQL statement, but right now it still does; hence the issue above. I hope you can understand my wording: I am Chinese and this question was translated with software, as I do not know English. Best wishes.
@alejandro_ao · 10 months ago
Hey there. If you want something more interactive, that is able to respond to other questions and not only those strictly related to generating a SQL query, you should probably implement an agent. LangChain makes it very straightforward to implement agents: python.langchain.com/docs/modules/agents/. You can also use CrewAI to create a team of agents: docs.crewai.com/ I hope this helps!
@王君淑 · 10 months ago
@@alejandro_ao Thank you very much. I have benefited a lot from watching your latest video.
@Ganitham-GenAI · 9 months ago
Will the test results and the data be exposed to the LLM, or only the metadata?
@alejandro_ao · 8 months ago
yes, all data goes through the LLM. if privacy is your concern, you can load an open source model locally, such as llama3
@who-kv6fe · 3 months ago
After deploying, how can I change the host from localhost to a public server/web host? Any solutions?
@alejandro_ao · 3 months ago
hey there, take a look at this: kzbin.info/www/bejne/bWXGZH6Xdr2DrM0si=EYXg71ydGQzZQZmc
@jimerlozano6523 · 7 months ago
Interested in more about SQL.
@alejandro_ao · 7 months ago
i will work on it!
@avinashnair5064 · 3 months ago
Hey, actually I want to create a RAG application which can do all the stuff: a code agent + SQL agent + a normal document retriever. How can we do that?
@alejandro_ao · 3 months ago
you would need to create an agent for that! then create the tools for code, sql and rag. or you can use a multi-agent system, like langgraph or crewai!
@avinashnair5064 · 3 months ago
@ I want to keep it completely open source and everything should be self-hosted. Could you please propose a simple solution?
@alejandro_ao · 3 months ago
@@avinashnair5064 both crewai and langgraph are open source. you really can’t go wrong with any of these! langgraph is great for more controlled workflows. crewai is better for more creative automations, like emulating a human team of specialists in different fields
@manikandanr1242 · 10 months ago
If we ask questions outside the DB, for example the user enters "hi", what will the response be?
@alejandro_ao · 10 months ago
hopefully the overall chain would be able to get that you were actually just saying "hi". but in order for your program to decide whether or not to call the database depending on the user question, you might need to create an agent or a bit more sophisticated chain
@tapanpati9452 · 10 months ago
Can you please make a video on how to use LangSmith?
@alejandro_ao · 10 months ago
definitely, i'm working on it!
@saibaldasgupta1 · 10 months ago
Can this code be used for PostgreSQL?
@vincentfernandez7145 · 10 months ago
You need to use psycopg2-binary instead of mysql-connector.
@tofani-pintudo · 9 months ago
Yes, you just need to change the db config.
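Concretely, only the scheme and driver package in the SQLAlchemy-style connection URI change between the two backends. A sketch with placeholder credentials; the resulting string goes wherever the video's MySQL URI went:

```python
def db_uri(dialect: str, user: str, password: str, host: str, port: int, name: str) -> str:
    """Build a SQLAlchemy-style connection URI; only the scheme differs per backend."""
    scheme = {
        "mysql": "mysql+mysqlconnector",      # pip install mysql-connector-python
        "postgresql": "postgresql+psycopg2",  # pip install psycopg2-binary
    }[dialect]
    return f"{scheme}://{user}:{password}@{host}:{port}/{name}"

# e.g. db_uri("postgresql", "admin", "change-me", "localhost", 5432, "chinook")
```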
@stanTrX · 9 months ago
How are those code tips popping up in VS Code? Does anyone know?
@alejandro_ao · 9 months ago
hey there, that's github copilot. saves a lot of time :)
@VoltVandal · 10 months ago
congrats!
@alejandro_ao · 10 months ago
thanks!
@akhil0227 · 8 months ago
Has anyone managed to get this working with PostgreSQL?
@LadiAsrith · 3 months ago
How do I set up LangSmith? Please help me with that.
@alejandro_ao · 3 months ago
hey there. just create an account and add the api keys. if you’re using langchain, it automatically starts tracking. if you’re using any other framework, you’ll need to manually specify what you will track
@mbanduk · 8 months ago
How would I go about this if Airtable was my database?
@alejandro_ao · 7 months ago
that's a cool idea. you can use this integration with langchain: python.langchain.com/v0.2/docs/integrations/document_loaders/airtable/ but you would have to build your own chain, which might look a bit different from what i show here. still, the implementation should not be too different. i hope this helps!
@hadjersa28 · 7 months ago
Why did you choose to use Mixtral and GPT-4?
@alejandro_ao · 7 months ago
i wanted to show one proprietary and one open-source llm. but you can use any supported integration with langchain!