Comments
@balajimani9366 8 days ago
Great, very helpful.
@balajimani9366 8 days ago
All your videos are very useful for AI beginners.
@abhilashmathew2506 17 days ago
Good explanation, thanks for the video. Where can I see the part 2 video?
@polly28-9 17 days ago
What about a huge database? Sending the huge schema of a huge database with the initial prompt costs a lot. What would the solution be for a huge database?
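Editor's aside: the thread never answers this, but one common mitigation (my own sketch, not from the video) is to send only the DDLs of tables relevant to the question instead of the whole schema. All names below are illustrative:

```python
# Hypothetical sketch: pick only the table DDLs relevant to a question,
# so a huge schema never travels with the prompt in full.

TABLE_DDLS = {
    "orders": "CREATE TABLE orders (id INT, customer_id INT, total DECIMAL)",
    "customers": "CREATE TABLE customers (id INT, name VARCHAR, email VARCHAR)",
    "inventory": "CREATE TABLE inventory (sku VARCHAR, qty INT)",
}

def relevant_ddls(question: str, ddls: dict) -> list:
    """Return DDLs whose table name appears in the question (naive keyword match)."""
    q = question.lower()
    return [ddl for name, ddl in ddls.items() if name.rstrip("s") in q or name in q]

prompt_schemas = relevant_ddls("What is the total of all orders per customer?", TABLE_DDLS)
```

A real system would fall back to embedding-based retrieval when keyword matching misses, but the principle is the same: the prompt carries a small, relevant slice of the schema.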
@ahmaddajani3639 26 days ago
Great video, but I have a question: what happens if multiple users ask at the same time? Say user A asks and memory is now filled with the first question and answer. If user B asks immediately after user A, will the conversation history be shared between both users, or is it kept separate since it is a local variable using session state? Do we need to implement a session ourselves?
@techlycan 26 days ago
A session is created by default. Different users get different sessions.
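Editor's aside: Streamlit's st.session_state is created per browser session, so each user's history is isolated. The behavior can be modeled with a plain dict keyed by session id (a toy stand-in, not Streamlit's actual implementation):

```python
# Hypothetical model of Streamlit's per-session state: each session id maps
# to its own independent dict, so user A never sees user B's messages.

class SessionStore:
    def __init__(self):
        self._sessions = {}

    def state(self, session_id: str) -> dict:
        # Lazily create an isolated dict per session, like st.session_state.
        return self._sessions.setdefault(session_id, {"chat_history": []})

store = SessionStore()
store.state("user_a")["chat_history"].append({"human": "hi", "AI": "hello"})
store.state("user_b")["chat_history"].append({"human": "hey", "AI": "hey there"})
```

Because each session id resolves to its own dict, the two histories never mix, which is the isolation Streamlit provides automatically.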
@suneelc 1 month ago
Good information
@aikomaemascarinas8698 1 month ago
❤❤
@chittaranjanpradhan5290 1 month ago
Nice video. Can you please paste the actual Streamlit Python code here? The video was blurry and it was difficult to see the lines.
@techlycan 1 month ago
Since it's only 3-4 lines of code I did not keep it. The rest of the code is generated automatically when you create your Streamlit app. Can you try the video at a higher quality? I could see it clearly at a higher resolution.
@chittaranjanpradhan5290 1 month ago
@@techlycan Thanks, I can see it. Can you make a few more on Streamlit with actual real-world scenarios?
@user-lh7mc3lo8t 2 months ago
Nice explanation
@anagai 2 months ago
How do you get it to display the result in a nice table format like that? I have JSON output and it just displays the JSON.
@techlycan 2 months ago
Convert the query output to a DataFrame and then simply display it. Note that my DB here is an RDBMS.
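Editor's aside: a minimal sketch of that suggestion (the rows and column names below are made up for illustration): wrap the cursor output in a pandas DataFrame, then hand it to st.dataframe, which renders a sortable table instead of raw JSON.

```python
import pandas as pd

# Hypothetical rows and column names, as a DB-API cursor would return them
# via cursor.fetchall() and cursor.description.
rows = [(1, "Alice", 120.50), (2, "Bob", 80.00)]
columns = ["id", "customer", "total"]

df = pd.DataFrame(rows, columns=columns)
# In the Streamlit app you would follow with:
# st.dataframe(df)   # renders a table view instead of printing JSON
```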
@vinodvb 2 months ago
Hi TechLycan, can you please share your email id? I have some queries regarding personal tutoring. Thanks.
@techlycan 2 months ago
Please write to [email protected]
@vinodvb 2 months ago
Getting the below error:
NameError: name 'dt_picker_date' is not defined
Traceback:
File "/usr/lib/python_udf/7b2315e442920a4c09924aee7d7c7f87110b9a5f6512b60408d246602b4ac48f/lib/python3.8/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script exec(code, module.__dict__)
File "/home/udf/91416072/streamlit_app.py", line 18, in <module> results = session.sql(sql.format(date_input=dt_picker_date.s
@vinodvb 2 months ago
Liked your videos on dbt. Please create a video on Snowflake Cortex and containers.
@sonalithakur8234 2 months ago
Can you provide the GitHub link for this project?
@balakrishnakoraganti341 2 months ago
Can you please let me know the training details? I would like to attend the sessions.
@saifmahin7425 2 months ago
Excellent video, brother. Considering the content, you should get thousands of views. Best wishes.
@techlycan 2 months ago
Thanks brother! Happy to know it helped you get a better understanding :)
@sachinprakash2525 2 months ago
Please share the link for the prompt creation and load process video.
@datasciencebyyogi623 3 months ago
Can you please share the code git repo?
@user-xx8xg5yf4d 1 month ago
import os
from app_secrates import OPENAI_API_KEY  # local module as named in the original comment
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain
import streamlit as st

os.environ["OPENAI_API_KEY"] = OPENAI_API_KEY

memory = ConversationBufferMemory()

st.title("Conversational Bot")
user_input = st.text_input("Enter your message here")

if 'chat_history' not in st.session_state:
    st.session_state.chat_history = []
else:
    # Rehydrate memory from the per-session history on each rerun
    for message in st.session_state.chat_history:
        memory.save_context({"input": message['human']}, {'output': message['AI']})

prompt_template = PromptTemplate(
    input_variables=['history', 'input'],
    template="""
You are a conversational bot. Maintain a formal tone in your responses.
History: {history}
Human: {input}
AI:
"""
)

llm = OpenAI(temperature=0.0)
conversation_chain = LLMChain(llm=llm, prompt=prompt_template, memory=memory, verbose=True)

if user_input:
    response = conversation_chain.run(input=user_input)  # run() returns the text directly
    message = {"human": user_input, "AI": response}
    st.session_state.chat_history.append(message)
    st.write(response)

with st.expander(label="Chat History", expanded=True):
    st.write(st.session_state.chat_history)
@tumbler8324 3 months ago
Brother, I really enjoyed it, and I appreciate your work. One request: can you make a frequently asked interview question series on the following topics: zero-copy cloning, pipeline optimization, time travel and fail-safe, streams and tasks, RBAC, and the difference between SYSADMIN and USERADMIN?
@kamilpatel3195 3 months ago
This works smoothly and very efficiently with a smaller database, but with a bigger, enterprise-level database this code breaks down even when executing simple queries. I tried this method earlier and it did not work well with a bigger database.
@techlycan 3 months ago
Yes, but I think you are missing the use case here. Tech people do not need this as they are already proficient in SQL; it's usually the business people who face challenges when they need to interact with tables, and for reporting the tables are usually summarized tables, limited in count.
@user-fi7zo7fp4u 3 months ago
TechLycan, nicely explained. The only thing is that you have used Snowpipe in the demo and not Snowpipe Streaming.
@kauserperwez1556 3 months ago
If one of the columns in the database has personal information that I want to mask before showing it in the results, can someone please explain how I do that?
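Editor's aside: this question goes unanswered in the thread. One simple client-side approach (my own sketch; server-side options such as Snowflake's dynamic masking policies are preferable when the raw value must never leave the database) is to mask the column before display:

```python
# Hypothetical sketch: mask a PII column (emails here) before showing
# query results in the app.

def mask_email(value: str) -> str:
    """Keep the first character of the local part and the domain, hide the rest."""
    local, _, domain = value.partition("@")
    if not domain:
        return "*" * len(value)  # not an email; mask everything
    return local[0] + "*" * (len(local) - 1) + "@" + domain

masked = [mask_email(e) for e in ["alice@example.com", "bob@test.org"]]
```

The same function can be applied to a DataFrame column with df["email"].map(mask_email) before the table is rendered.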
@leedsshri 3 months ago
Hello, is it possible to connect to a Teradata database from LangChain? If yes, could you please show that?
@benjamintousifar6522 3 months ago
I think the way you explained vertical and horizontal scaling is not entirely accurate. When we increase the size of a VW we are adding more nodes to the same cluster, not adding a cluster to it; that happens when we scale horizontally. Similarly, for horizontal scaling we are not adding another VW, but adding more clusters to the same VW, if the multi-cluster option is enabled in the Enterprise or higher edition. Hope that is clear.
@techlycan 3 months ago
Partially true; it depends on what you define as a cluster. For example, the small VW in SF is a single XS cluster, which has 8 compute nodes (CPUs) within. Be it vertical or horizontal scaling, you cannot add less than a single cluster (8 compute nodes) to a VW; that's why it is referred to as adding a cluster in vertical scaling. In the case of horizontal scaling, suppose you apply it to a VW of M size, i.e. a VW with 4 clusters. In that case what you get is another set of 4 clusters becoming active or suspended as a single unit. You cannot add a copy of a different cluster size in horizontal scaling; hence the additional unit of the same cluster size (4 clusters, i.e. 32 compute nodes) is what I referred to as a clone of the existing VW, because they will always be identical in size and behavior. Hope that makes sense to you!
@vinodvb 2 months ago
@@techlycan Can you confirm if this is true: an XS VWH has 1 node, 4 cores and 8 threads, and it can run a max of 8 queries in parallel? www.google.com/search?q=how+many+nodes+in+xs+snowflake+warehouse&rlz=1C1GCEA_enUS1012US1012&oq=how+many+nodes+in+xs+snowflake+warehouse&gs_lcrp=EgZjaHJvbWUyBggAEEUYOTIHCAEQIRigATIHCAIQIRigATIHCAMQIRigATIHCAQQIRigATIHCAUQIRigATIHCAYQIRifBTIHCAcQIRifBTIHCAgQIRifBTIHCAkQIRifBdIBCTIzNzQzajBqN6gCALACAA&sourceid=chrome&ie=UTF-8
@prashantmhatre9225 3 months ago
Nice
@venkateshmuppalla3032 3 months ago
@techlycan Could you please provide your mail id? I want to communicate with you.
@user-lk8rs4jn2u 4 months ago
@Techlycan Could you please provide your email id? I want to communicate with you.
@user-lk8rs4jn2u 4 months ago
Thank you for making this video. The best of its kind.
@rajivgandhik8422 4 months ago
Thank you so much. I am trying to set up Snowpipe Streaming and I am almost done. However, I cannot run the statement 'GRANT ROLE kafka_connector_role_1 TO USER securityadmin;'. It gives the error 'User 'SECURITYADMIN' does not exist or not authorized'. Please advise on this.
@techlycan 4 months ago
SECURITYADMIN is a role, not a user, so grant to it as a role instead: 'GRANT ROLE kafka_connector_role_1 TO ROLE securityadmin;'.
@champstark8974 4 months ago
How do you use tools with LLMChain?
@user-qu3vl2zg5q 4 months ago
Hello, thanks for sharing the video. I am currently working with MongoDB databases. Could you please create a similar video on MongoDB? Unfortunately, I haven't found a MongoDB agent in LangChain, so I was wondering if you could help me with that.
@GeriFitrah 4 months ago
I have a question: if the user asks something outside the database, for example the USD rate in January 2020, can the same prompt handle it, or do we need to make another prompt to handle questions outside the database? Thanks for the tutorial.
@techlycan 4 months ago
@newg1203 This prompt has a limited scope. If you increase the scope then the prompt needs to be modified accordingly. Moreover, you may need to pass further information to the LLM to help it write the queries; a well-defined catalogue or RAG could be used to enhance the functionality.
@cjthefinesse 4 months ago
Good stuff dude
@saiveeramalla5507 4 months ago
Also, can you make a separate tab in which the results can be plotted, maybe as a pie chart or graph, dynamically based on the user prompt?
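Editor's aside: one way to sketch that request (an assumption on my part, not something shown in the video) is to inspect the user's prompt for a chart keyword, then route the result DataFrame to the matching Streamlit call (st.bar_chart, st.line_chart, or a pie chart via matplotlib) inside a second tab:

```python
# Hypothetical router: pick a chart type from the user's prompt. In the app,
# a second st.tabs() tab would then call st.bar_chart(df), st.line_chart(df), etc.

CHART_KEYWORDS = {
    "pie": ("pie",),
    "line": ("line", "trend", "over time"),
    "bar": ("bar", "compare"),
}

def chart_type(prompt: str, default: str = "bar") -> str:
    """Return the first chart kind whose keyword appears in the prompt."""
    p = prompt.lower()
    for kind, words in CHART_KEYWORDS.items():
        if any(w in p for w in words):
            return kind
    return default

kind = chart_type("show sales as a pie chart by region")
```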
@renaudgg 4 months ago
When you wrote the line llm = OpenAI(temperature=0.9), what default model does it use? If instead I write llm = OpenAI(model="gpt-4", temperature=0.9) I get the error: Error code: 404 - {'error': {'message': 'This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?', 'type': 'invalid_request_error', 'param': 'model', 'code': None}}. How can we make it use GPT-4?
@techlycan 4 months ago
Your API key defines the model, which you would have generated in the OpenAI portal. This section only defines the properties, not the model.
@renaudgg 4 months ago
@@techlycan OK, I see, but when I create a new key in my portal it just asks me for a name; I don't even get a choice of model? I don't get it.
@techlycan 4 months ago
@@renaudgg I have not tried a GPT-4 key as of yet, but I think the portal provides documentation on how to generate it. It's not free, so you may need to subscribe first, and then you may get the option to create it.
@renaudgg 4 months ago
With the "memory", will every new question we ask the AI push the entire conversation as tokens every single time? For example, at the beginning there is no history, and say, for argument's sake, my first question is 10 tokens; then the AI answers and the total so far is 10 + 5 = 15. Then I ask a second question of, say, 20 tokens. When I press enter, will it "eat up" 15 + 20?
@techlycan 4 months ago
Yes, that's the context setting, but in a real case you won't need messages that far back. Realistically, no more than 3-5 previous messages should be passed, and even that depends on what you are using the LLM app for, so fix the number of previous messages.
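Editor's aside: the trimming suggested above can be sketched in a few lines (my own minimal version; LangChain's ConversationBufferWindowMemory with its k parameter does the same thing):

```python
# Hypothetical sketch: keep only the last k exchanges so prompt token usage
# stays bounded instead of growing with the whole conversation.

def windowed_history(history: list, k: int = 3) -> list:
    """Return only the last k (human, AI) exchanges."""
    return history[-k:]

history = [{"human": f"q{i}", "AI": f"a{i}"} for i in range(10)]
context = windowed_history(history, k=3)  # only q7..q9 go into the prompt
```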
@renaudgg 4 months ago
What is the difference between langchain_community.llms and langchain_openai.chat_models? When you use the first one, I see you don't specify the model; is gpt-4 the default?
@AliAlias 4 months ago
Thanks, very nice 🌹. How do you use an open-source LLM from Hugging Face locally, using the ctransformers or llama.cpp library with PandasAI in offline mode? For example a CodeLlama or Pandalytic GGUF model.
@renaudgg 4 months ago
Do you think we could put many table schemas into text files for a vector DB? My problem is that I followed your other video where I put some SQL tables in a .YAML file in the template. The problem is that if I add more tables, I get an error that I exceeded my tokens. How could we make the AI "remember" the table structures so that when we ask a question it finds the proper answer in the SQL database?
@techlycan 4 months ago
That's the point: you can't put the full schema DDL in a config file and pass it in the prompt. Usually for reporting there are limited tables, but not always. There are two ways to achieve it: train the model on your schema (highly unlikely), or store the DDL in a vector DB and then make two calls in iteration: first to extract the relevant DDLs from the vector DB, and second to build the SQL using the DDL and the question. This is what is shown here: how to extract data from a vector DB, aka RAG.
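Editor's aside: the two-call flow described above can be sketched with a toy retriever. The bag-of-words cosine similarity below is a stand-in for a real embedding model and vector DB, which are assumed, not shown:

```python
# Hypothetical two-step flow: (1) retrieve the DDLs most similar to the
# question, (2) build the SQL-generation prompt from only those DDLs.
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two token-count vectors."""
    num = sum(a[t] * b[t] for t in a)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve(question: str, ddls: list, k: int = 1) -> list:
    """Return the k DDL strings most similar to the question."""
    qv = Counter(question.lower().split())
    scored = sorted(ddls, key=lambda d: cosine(qv, Counter(d.lower().split())), reverse=True)
    return scored[:k]

ddls = [
    "CREATE TABLE sales (region TEXT, amount REAL)",
    "CREATE TABLE employees (name TEXT, dept TEXT)",
]
question = "total sales amount by region"
top = retrieve(question, ddls, k=1)
prompt = "Write SQL using only these tables:\n" + "\n".join(top) + "\nQuestion: " + question
```

A production version would swap the Counter vectors for real embeddings and the list for a vector store, but the call pattern (retrieve, then generate) is the same.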
@renaudgg 4 months ago
@@techlycan Yeah, OK, but I just thought of something: I can just create one big view of all the important fields I need, and there you go, problem solved.
@techlycan 4 months ago
@@renaudgg That's interesting. One difference: an RDBMS works based on exact matches whereas a VDB works on vectors, but you can try this approach and tell us if it works. If I had to go your way, I would pass the table names in the prompt to identify which tables are needed, extract the columns needed from the view, and then build the final prompt to pass to the LLM. But that need not be your approach. Eager to hear if it works for you.
@renaudgg 4 months ago
@@techlycan I made a big table that groups many columns from different tables. Then, once an hour, I append new data to the table (without deleting the rest), so the AI only knows that one table, which I think is perfect.
@channelShutter005 5 months ago
Hey, thanks for this video. Could you please tell me how you got the Analytics database in this video?
@techlycan 5 months ago
You can simply create the database and schemas using CREATE statements.
@user-xw3qh8pg9g 5 months ago
Can you share the Colab?
@manaskalra1570 5 months ago
Hi, great project. Can you explain how I can change the prompt? For example, what if I want it to generate queries for a different DB?
@manaskalra1570 5 months ago
Also, how do we import execute_sf_query? Is that a predefined function somewhere in the project?
@user-zd2xj8sw6c 5 months ago
Sir, do you have a paid course on dbt?
@techlycan 5 months ago
Yes, please join my group for details: chat.whatsapp.com/Jz2ltS009Zi9DqXgiQMG0U
@narasimhanmurugan5448 5 months ago
Good one, please upload another session.
@rajkishormahanada6223 6 months ago
Nice tutorial 👍
@techlycan 5 months ago
Thank you 👍
@user-lk8rs4jn2u 4 months ago
@@techlycan Could you please share your mail id?
@mohammedvahid5099 6 months ago
Excellent
@gouravsaini-ci4jb 6 months ago
Thanks a lot! This was an amazing tutorial.
@techlycan 5 months ago
Glad it was helpful!
@ashwinkumar5223 6 months ago
Awesome, sir.
@ashwinkumar5223 6 months ago
Awesome, sir. I will try parts 1 & 2 and let you know. Please help me if I face any issues. Thanks a lot for sharing this information.
@techlycan 5 months ago
Keep it up
@mohammedvahid5099 6 months ago
Make complete videos, bro ❤. Thank you so much.
@techlycan 6 months ago
Complete? What got left out of the agenda?