How to Use Llama 3 with PandasAI and Ollama Locally

13,521 views

Tirendaz AI

26 days ago

Today, we'll cover how to perform data analysis and visualization with local Meta Llama 3 using Pandas AI and Ollama for free. Happy learning.
▶ Subscribe: bit.ly/subscribe-tirendazai
▶ Join my channel: bit.ly/join-tirendazai
00:01 Introduction
01:32 Setup
03:02 Initialize the model
05:15 Initialize the app
08:10 Build the app
09:18 Inference
11:16 Data visualization
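For reference, here is a minimal sketch of the workflow the video walks through, assuming PandasAI's LocalLLM wrapper pointed at Ollama's OpenAI-compatible endpoint; the CSV file name and the example questions are placeholders, not the exact ones used in the video.

import pandas as pd
from pandasai import SmartDataframe
from pandasai.llm.local_llm import LocalLLM

# Point PandasAI at a local Llama 3 model served by Ollama
# (start the server with "ollama serve" and pull the model with "ollama pull llama3").
llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")

# Load any tabular dataset; "titanic.csv" is just a placeholder.
df = pd.read_csv("titanic.csv")
sdf = SmartDataframe(df, config={"llm": llm})

# Natural-language questions are turned into pandas code by the model.
print(sdf.chat("How many rows does the dataset have?"))
sdf.chat("Plot a histogram of the age column.")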
RELATED VIDEOS:
▶ PandasAI Tutorials: bit.ly/pandasai
▶ Ollama Tutorials: bit.ly/ollama-tutorials
▶ LangChain Tutorials: bit.ly/langchain-tutorials
▶ Generative AI for DS: bit.ly/genai-for-data-science
▶ HuggingFace Tutorials: bit.ly/hugging-face-tutorials
▶ LLMs Tutorials: bit.ly/llm-tutorials
FOLLOW ME:
▶ Medium: / tirendazacademy
▶ X: / tirendazacademy
▶ LinkedIn: / tirendaz-academy
Don't forget to subscribe and turn on notifications so you don't miss the latest videos.
▶ Project files: github.com/TirendazAcademy/Pa...
Hi, I am Tirendaz, PhD. I create content on generative AI & data science. My goal is to make the latest technologies understandable for everyone.
#ai #generativeai #datascience

Comments: 80
@DallasGraves
@DallasGraves 23 days ago
Holy crap! My mind is absolutely RACING with proof of concept projects I can deliver to our Finance team with this. Thank you so much for making this!
@TirendazAI
@TirendazAI 23 days ago
My pleasure 😊
@user-el8jv8hx2g
@user-el8jv8hx2g 20 days ago
What are you thinking of building? I'm a college student and I want to understand more use cases.
@MANONTHEMOON419
@MANONTHEMOON419 16 days ago
Dude, please do another video on this with more things you can do, or maybe explain it further. This is amazing.
@user-flashaction
@user-flashaction 25 days ago
This channel has not been discovered yet, thank you for the up-to-date and practical videos!
@TirendazAI
@TirendazAI 25 days ago
Thanks 🙏
@wvagner284
@wvagner284 23 days ago
Great, great video! You just got a new subscriber! Congrats and regards from Brazil!
@TirendazAI
@TirendazAI 23 days ago
Thanks 🙏
@ladonteprince
@ladonteprince 20 days ago
Undiscovered gem - the voice, the instruction, pure gold.
@TirendazAI
@TirendazAI 20 days ago
Thanks 🙏
@60pluscrazy
@60pluscrazy 24 days ago
Unbelievable, never knew about Pandas AI. THANKS VERY MUCH 🎉🎉🎉
@TirendazAI
@TirendazAI 24 days ago
Thanks 🙏
@WhySoBroke
@WhySoBroke 24 days ago
Excellent video and great methods!!
@TirendazAI
@TirendazAI 24 days ago
Glad you liked it!
@GhostCoder83
@GhostCoder83 25 days ago
Very precious. Keep it up.
@TirendazAI
@TirendazAI 25 days ago
Thanks 🙏
@mubasharsaeed6044
@mubasharsaeed6044 23 days ago
Please make a video on an agent that performs actions using Llama or LangChain. For example, if we give the prompt "bring the red box", Llama 3 generates the plan and the agent performs the action. Thanks!
@felipemorelli4059
@felipemorelli4059 19 days ago
Thanks!
@TirendazAI
@TirendazAI 19 days ago
You're welcome 🙏
@felipemorelli4059
@felipemorelli4059 19 days ago
Excellent video. Which CPU are you using?
@TirendazAI
@TirendazAI 19 days ago
Thanks! My system is an AMD Ryzen 5 7500F, 64GB RAM, and a 4070 Ti Super graphics card with 16GB VRAM.
@Moustafa_ayad
@Moustafa_ayad 24 days ago
Good work
@TirendazAI
@TirendazAI 24 days ago
Thanks 🙏
@teachitkh
@teachitkh 23 days ago
Such a great video!
@TirendazAI
@TirendazAI 23 days ago
Thank you 🤗
@teachitkh
@teachitkh 22 days ago
@@TirendazAI How about PDFs? Can you help with that?
@user-mv9ul9tz1c
@user-mv9ul9tz1c 23 days ago
Thank you for the tutorial. I have a question: do I need to apply for an API key before using PandasAI?
@TirendazAI
@TirendazAI 23 days ago
PandasAI is open source and free. If you are using any open source model, you do not need an API key.
@JohnBvoN
@JohnBvoN 17 days ago
I wanted to know: can I load the model directly from Hugging Face? Also, I have stored the model and tokenizer using save_pretrained. How can I use these?
@TirendazAI
@TirendazAI 17 days ago
To load the model from Hugging Face, you can use LangChain. Check this link: python.langchain.com/v0.1/docs/integrations/platforms/huggingface/
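A rough sketch of that approach, assuming the langchain-huggingface and transformers packages are installed; the model id and the ./my-llama3 directory are placeholders:

from langchain_huggingface import HuggingFacePipeline
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# Option 1: load directly from the Hugging Face Hub (example model id).
llm = HuggingFacePipeline.from_model_id(
    model_id="meta-llama/Meta-Llama-3-8B-Instruct",
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 256},
)

# Option 2: reuse a model and tokenizer saved earlier with save_pretrained
# ("./my-llama3" is a placeholder directory).
tokenizer = AutoTokenizer.from_pretrained("./my-llama3")
model = AutoModelForCausalLM.from_pretrained("./my-llama3")
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer, max_new_tokens=256)
llm = HuggingFacePipeline(pipeline=pipe)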
@varganbas427
@varganbas427 20 days ago
Great! Thanks! It works 😁 There was an error in Python!!!
@TirendazAI
@TirendazAI 20 days ago
You're welcome!
@user-pn6ey5dn4y
@user-pn6ey5dn4y 23 days ago
Thanks for your video. I've replicated your entire solution as a starting point. I keep getting errors from the LLM when trying the exact same queries, same dataset, same everything as in your video. I have an RTX 4070 12GB and tried multiple Llama and Dolphin-Llama models. It seems that every time we ask the LLM to write code to create a histogram or pie chart it produces an error - can you help? Here is an example: Query: create a heatmap of the numerical values. Result: Unfortunately, I was not able to answer your question, because of the following error: 'list' object has no attribute 'select_dtypes'
@TirendazAI
@TirendazAI 10 days ago
When a prompt does not work, try again with a reworded prompt, for example: "Plot a heatmap of the numerical values".
@stanTrX
@stanTrX 23 days ago
Congratulations, brother. I could tell right away from your accent. Greetings. I subscribed 🎉
@TirendazAI
@TirendazAI 23 days ago
Thanks 🙏
@user-mm1tt6oy7v
@user-mm1tt6oy7v 24 days ago
Thanks for this video. How can I integrate the Groq API to go faster?
@TirendazAI
@TirendazAI 24 days ago
You can leverage the langchain_groq library or use the OpenAI compatibility. I showed how to use PandasAI with the Groq API in this video: kzbin.info/www/bejne/eWe1an2Cfb93fpI
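For the OpenAI-compatibility route, a sketch like this may work, since Groq exposes an OpenAI-compatible endpoint; the api_key argument, model name, and endpoint URL below are assumptions to verify against the Groq and PandasAI docs:

from pandasai.llm.local_llm import LocalLLM

# Assumption: the same LocalLLM wrapper used for Ollama can target Groq's
# OpenAI-compatible endpoint. "YOUR_GROQ_API_KEY" is a placeholder.
llm = LocalLLM(
    api_base="https://api.groq.com/openai/v1",
    model="llama3-70b-8192",
    api_key="YOUR_GROQ_API_KEY",
)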
@HmzaY
@HmzaY 10 days ago
I have 64GB RAM and 8GB VRAM, and I want to run Llama 3 70B but it doesn't fit. How can I run it on system RAM (the 64GB) from Python? Can you make a video on that?
@TirendazAI
@TirendazAI 10 days ago
I also have 64GB RAM, and it worked for me. My system used about 58GB of RAM for llama3:70b. I'll show the RAM I use if I make a video with llama3:70b.
@lowkeylyesmith
@lowkeylyesmith 12 days ago
Is it possible to expand the limit per file? My CSV files are larger than 1GB.
@TirendazAI
@TirendazAI 11 days ago
This is possible, but you need to use a larger model, such as llama3:70b instead of llama3:8b.
@greg-guy
@greg-guy 22 days ago
I don't know how pandas works, so sorry for the dumb question. Is the entire CSV processed by the LLM, meaning a large dataset will be slow or even too big, or is the calculation/processing of the data done in pandas, with the LLM only creating formulas for pandas?
@TirendazAI
@TirendazAI 22 days ago
Yes, the LLM processes your data and generates an answer based on your question. You can think of this process as summarizing a text. If you have a small dataset, you may get a faster response.
@urajcic1
@urajcic1 22 days ago
Great tutorial! I was playing around with this dataset and I get strange errors for questions like "how many first-class female passengers survived?" - Unfortunately, I was not able to answer your question, because of the following error: 'list' object has no attribute 'loc', or list indices must be integers or slices, not str, or invalid syntax (line 3). Can anyone reproduce this and explain why it is happening?
@TirendazAI
@TirendazAI 21 days ago
Sometimes, when the LLM does not understand the prompt, it may not return the output you want. You can try changing the prompt or using a different one.
@RICHARDSON143
@RICHARDSON143 21 days ago
Hey buddy, I also want to integrate an additional feature so that the model can generate the SQL query for the result as well. I'm trying it using pandasql but with no success - can you help me?
@TirendazAI
@TirendazAI 10 days ago
I made a video about working with a MySQL database; take a look.
@scottmiller2591
@scottmiller2591 21 days ago
Is it possible to get PandasAI to show the code it used to generate the plots, etc.?
@TirendazAI
@TirendazAI 21 days ago
Yes, you can get the code if you ask for it in your prompt.
@scottmiller2591
@scottmiller2591 21 days ago
@@TirendazAI Thanks!
@TirendazAI
@TirendazAI 21 days ago
@@scottmiller2591 My pleasure!
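Besides asking for the code in the prompt, recent PandasAI releases also keep the last generated code on the dataframe object; the attribute name below is an assumption, so check the version you have installed:

import pandas as pd
from pandasai import SmartDataframe
from pandasai.llm.local_llm import LocalLLM

llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")
# "titanic.csv" is a placeholder; verbose=True prints the intermediate steps.
sdf = SmartDataframe(pd.read_csv("titanic.csv"), config={"llm": llm, "verbose": True})

sdf.chat("Plot a histogram of the age column.")
# Assumption: PandasAI 2.x exposes the code it ran like this.
print(sdf.last_code_executed)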
@user-kk1li5mk7q
@user-kk1li5mk7q 19 days ago
How are you able to get the response so fast? It takes me a few minutes to get a response. My CSV file has 7K records, running on Ubuntu 22 with an i7 and 32GB RAM.
@TirendazAI
@TirendazAI 19 days ago
The most important component for a fast response is the graphics card.
@varshakrishnan3686
@varshakrishnan3686 15 days ago
I'm getting the error "No module named 'pandasai.llm.local_llm'". Is there any way to solve it?
@TirendazAI
@TirendazAI 15 days ago
llm is a module in pandasai. Make sure pandasai is installed and your virtual environment is activated.
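A quick way to check, assuming a recent PandasAI 2.x release installed with pip:

# In the activated virtual environment:
#   pip install -U pandasai
# This import should then succeed without a ModuleNotFoundError:
from pandasai.llm.local_llm import LocalLLM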
@varganbas427
@varganbas427 21 days ago
Sorry, but my sample, rewritten from your code, returns the message "Unfortunately, I was not able to answer your question, because of the following error: Connection error", and then the Python code raises an error: raise APIConnectionError(request=request) from err - openai.APIConnectionError: Connection error.
@TirendazAI
@TirendazAI 21 days ago
Did you start Ollama using the "ollama serve" command?
@sahilchipkar9761
@sahilchipkar9761 24 days ago
Can it handle large data? Connecting through SQL?
@TirendazAI
@TirendazAI 24 days ago
Yes, it can. You can work with data sources such as CSV, XLSX, PostgreSQL, MySQL, BigQuery, Databricks, and Snowflake.
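As a rough sketch of the SQL route, assuming the MySQL connector shipped with recent PandasAI releases (it may require an extra driver such as pymysql); every connection value below is a placeholder:

from pandasai import SmartDataframe
from pandasai.connectors import MySQLConnector
from pandasai.llm.local_llm import LocalLLM

# All connection details are placeholders.
connector = MySQLConnector(
    config={
        "host": "localhost",
        "port": 3306,
        "database": "mydb",
        "username": "user",
        "password": "password",
        "table": "sales",
    }
)

llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")
sdf = SmartDataframe(connector, config={"llm": llm})
print(sdf.chat("What is the total revenue by month?"))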
@spkgyk
@spkgyk 24 days ago
Why use the default Llama rather than the Llama Instruct version?
@TirendazAI
@TirendazAI 24 days ago
The instruct models are fine-tuned to follow prompted instructions. That version is usually used for building a chatbot, implementing RAG, or using agents.
@sebastianarias9790
@sebastianarias9790 13 days ago
I'm not getting a response from the chat; it keeps saying "Generating the prompt". What could be a reason for that? Thanks!
@TirendazAI
@TirendazAI 13 days ago
Did you get any error? If so, can you share it? I can say more if I see the error.
@sebastianarias9790
@sebastianarias9790 13 days ago
@@TirendazAI There's no error, my friend. It just takes a very long time to get the output. Any ideas?
@TirendazAI
@TirendazAI 13 days ago
Which large language model do you use?
@sebastianarias9790
@sebastianarias9790 13 days ago
@@TirendazAI llama3!
@vinaya68vinno1
@vinaya68vinno1 23 days ago
Can I do this using MySQL in Streamlit for visualization? Can you send the code?
@TirendazAI
@TirendazAI 23 days ago
I am planning to implement a project using MySQL.
@vinaya68vinno1
@vinaya68vinno1 23 days ago
@@TirendazAI I need this quickly, can you send the code?
@majukanumi9639
@majukanumi9639 24 days ago
It is very slow to answer. I have 48GB of RAM, but asking a simple question takes ages...
@TirendazAI
@TirendazAI 24 days ago
A powerful graphics card is important for a fast response. My card is a 4070 Ti Super with 16GB VRAM. For small or medium datasets, I can get answers in a short time.
@ccc_ccc789
@ccc_ccc789 22 days ago
amazingngngngngng!
@stanTrX
@stanTrX 23 days ago
I hadn't heard of Pandas AI before.
@TirendazAI
@TirendazAI 13 days ago
So many AI tools have come out recently that it has become hard to keep up.