
Unlocking The Power Of AI: Creating Python Apps With Ollama!

25,963 views

Matt Williams

1 day ago

It's amazing how easy the Python library for Ollama makes it to build AI into your apps. You can be up and running in minutes. This video gives you a nice overview of what is possible.
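For anyone skimming before watching, here is a minimal sketch of what that looks like with the ollama Python package (the model name llama3 is just an example; use whatever model you have pulled locally):

import ollama

# Single chat request against a locally running Ollama server
response = ollama.chat(
    model='llama3',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
)
print(response['message']['content'])

# The same call can stream tokens as they are generated
for chunk in ollama.chat(
    model='llama3',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
    stream=True,
):
    print(chunk['message']['content'], end='', flush=True)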
It seems I forgot to push the code to the repo before disappearing for the weekend to go camping. But it's there now. On a side note, Crescent Beach on the north coast of the Olympic Peninsula of Washington State is gorgeous....
Code for all the videos can be found at github.com/tec...
Be sure to sign up to my monthly newsletter at technovangelis...
And if interested in supporting me, sign up for my patreon at / technovangelist

Comments: 54
@tal7atal7a66 · 4 months ago
This guy is a professor, all my respects ❤
@bonolio · a month ago
Straight up, I am not a programmer, but I know enough to hack together tools with Python when I need to. I had some tasks that I wanted to add some AI functionality to, but I had never really coded against an LLM or run a local LLM. This video and your Up and Running with Ollama video were brilliant. It was like a condensed info dump that gave me just what I needed to go from zero to some functional solutions in just a few hours. I know I have so much more to learn, but in truth, with just the few examples you have provided, the possibilities are endless. I am subscribed and will be trawling through the rest of your content as a priority.
@KOSMIKFEADRECORDS · 23 days ago
Please point me to the video... what real-world use case have you now found?
@HashirAbdi · 4 months ago
Best non-intimidating and clear explanation of how to leverage the Ollama Python library. Tazeem (respects)
@randomdude12370 · 4 months ago
I love the way you describe things. Great work!
@matschbiem2082 · 3 months ago
Great video to get started with the Python API, thanks a lot! :) Your way of explaining, presenting and so on is really great
@utawmuddy5940 · 3 months ago
You have no idea how good this info is for me! Wish I had seen this a while ago!
@lundril · 4 months ago
I am just getting started with LLMs and this video from you really helps. Thank you :-)
@GonzoGonschi · 2 months ago
Finally, someone with explaining skills :)
@mwkoti · 4 months ago
Thanks for your work. Your videos are always spot-on
@technovangelist · 4 months ago
Glad you like them!
@TorsTechTalk · a month ago
wonderful video, subbed! Thank you
@NLPprompter · 4 months ago
OMG, I love this one too. I really hope Ollama keeps growing.
@cavalrycome · 4 months ago
Ollama also provides an OpenAI-compatible API, which is more convenient to code for because you can more easily switch between model servers that are compatible with that standard (OpenAI, Mistral, Groq, LM Studio, etc.). That version of the API is available under /v1 instead of /api.
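For anyone who wants to try that route, a minimal sketch using the official openai package pointed at a local Ollama server (the model name is whatever you have pulled; the API key is ignored by Ollama but required by the client):

from openai import OpenAI

# Ollama exposes an OpenAI-compatible endpoint under /v1
client = OpenAI(base_url='http://localhost:11434/v1', api_key='ollama')

completion = client.chat.completions.create(
    model='llama3',
    messages=[{'role': 'user', 'content': 'Why is the sky blue?'}],
)
print(completion.choices[0].message.content)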
@technovangelist · 4 months ago
Using the OpenAI API should be an option of last resort. It's there for compatibility, but it will always have fewer features than the native API. The OpenAI API was poorly designed, as stated by many of the OpenAI devs in various articles; they didn't expect ChatGPT to be useful to anyone. Many newer APIs are more consistent and well thought out. Using the OpenAI API will definitely hold you back.
@actellimQT · 2 months ago
Does this still apply? A new OpenAI API is out, and as far as I can tell the response can't pass tools.
@joetrades2472 · 3 months ago
I'd love to see in more detail how to set up Ollama on a cloud server as you did at the end, and also how to make it safe by establishing headers and certificates.
@qbitsday3438 · 3 months ago
Hi Matt, thank you so much for all the great videos. It would be great if you could do a video on how to embed AI into a chip (can be narrow AI).
@bruhaspati560 · 4 days ago
In chat completion, is there something like a chat request with a PDF file?
@stuartedwards3622 · 3 months ago
Great stuff Matt. Was having a play with this today and have a question. In your script 7.py, you explain how to train the model with system / user / assistant messages and then get an output for a new user message (Amsterdam). How could we extend this to cover multiple messages (London, Brussels, Madrid etc) without having to reprompt the model with the system and assistant messages each time? I could only get a single output from each call to chat despite putting in several user messages.
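One way to do that (a hedged sketch, not from the video; the example messages, city list, and model name are placeholders) is to keep the few-shot messages in one fixed list and append only the new user message, making one chat call per city. The chat endpoint is stateless, so the system/assistant examples are still sent with every request, but you only write them once:

import ollama

# Fixed few-shot prompt: a system instruction plus one worked example
base_messages = [
    {'role': 'system', 'content': 'Answer with the country of the given city.'},
    {'role': 'user', 'content': 'Paris'},
    {'role': 'assistant', 'content': 'France'},
]

for city in ['Amsterdam', 'London', 'Brussels', 'Madrid']:
    response = ollama.chat(
        model='llama3',
        messages=base_messages + [{'role': 'user', 'content': city}],
    )
    print(city, '->', response['message']['content'])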
@nicolastecher577 · 4 months ago
Fantastic video! Would you mind explaining how to export the app for others to use? I bet what makes the model work doesn’t get bundled when using py2app…..
@jimlynch9390 · 4 months ago
I don't see the intro-dev-python... in your videoprojects repo.
@technovangelist · 4 months ago
Doh. Thought I got it on before my internet went out.
@henrijohnson7779 · 4 months ago
Yep - I was also searching and could not find it until now.
@arjunarihant7283 · 4 months ago
Yeah, it's still not there.
@technovangelist · 4 months ago
Yes it is. There is a link in the description.
@vanderleigoncalves5184 · 2 months ago
Very good
@technovangelist · 2 months ago
Thanks.
@MuhammadAzhar-eq3fi · 4 months ago
Thank you Sir
@userou-ig1ze · 4 months ago
How do I invest in you or your company? This guy is going places 😊
@timtensor6994 · 4 months ago
Hi Matt, do you also have a video on tree summarization? For example, I am currently looking into the Reddit API, where I can extract posts and post descriptions. Summarizing a post description is relatively straightforward, since it mostly fits within the context length. However, getting a good understanding of the comments is a bit more difficult. Is there a way to do that?
@technovangelist · 4 months ago
Interesting. That's the way I just assumed you would summarize a long doc; never knew someone had put a name on it. Dealing with comments is tough though. I assume Reddit is like Slack and Discord, where figuring out threads is hard.
@timtensor6994 · 4 months ago
@technovangelist True, but with some preprocessing I could extract the comments, description, title, and posts as a pandas dataframe. The problem, as mentioned, is the comments: how to summarize them together while keeping context, basically keeping the LLM on track, kind of like "given this summary of the description, please do a recursive summary of the comments."
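A rough sketch of that recursive idea (entirely hypothetical: the model name, chunk size, and prompts are assumptions, not anything from the video). Batches of comments are summarized with the post summary as context, and the partial summaries are then summarized once more:

import ollama

def summarize(text, model='llama3'):
    # One summarization call against the local Ollama server
    response = ollama.generate(model=model, prompt=f'Summarize concisely:\n\n{text}')
    return response['response']

def summarize_comments(post_summary, comments, chunk_size=20, model='llama3'):
    # Level 1: summarize each batch of comments, with the post summary as context
    partials = []
    for i in range(0, len(comments), chunk_size):
        chunk = '\n'.join(comments[i:i + chunk_size])
        partials.append(summarize(
            f'Post summary: {post_summary}\n\nComments:\n{chunk}', model))
    # Level 2: summarize the partial summaries into one final summary
    return summarize('\n'.join(partials), model)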
@roborob2570 · 4 months ago
Hi Matt, fascinating to learn that you can interface with Ollama like this. Could you please point me to the 1.py and other files? I cannot find the files on GitHub, and I tried :-). Much obliged, Rob
@akhilreddygogula8884 · 4 months ago
Hi Matt, great content. I'm really curious about Ollama services deployed on k8s; scaling multiple instances on a local cluster for a multi-agent application is a good use case. I mostly prefer working with AI locally with open-source models, as most of the community does. Hope you'll look at this. Thanks
@technovangelist · 4 months ago
K8s is great for a lot of things but I don’t think this is one of them.
@akhilreddygogula8884 · 4 months ago
@technovangelist Thanks for the reply. I have one more question: is it because Ollama doesn't support parallel inference (as far as I know) to serve multiple users, or is there another reason? Otherwise, can you suggest a better inference server that works with k8s?
@technovangelist · 4 months ago
It's just a use case that isn't well suited to something like k8s. That's great for tools that don't require a huge load. Ollama and similar tools will use 100% of a server, all the CPU and all the GPU, so k8s just takes some of that away. The Ollama team believes in Kubernetes; we formed to create a tool for RBAC on k8s. But AI is not well suited to the platform.
@johnkoeck3702 · 4 months ago
Can Ollama run locally with speech recognition and text to speech? I'm also curious whether it could be run locally on a Raspberry Pi 4 or 5.
@technovangelist · 4 months ago
No, but I have seen some folks with interesting add-ons that they mention in the Discord.
@johnkoeck3702 · 4 months ago
@technovangelist Thank you for replying so quickly.
@SailAway33 · 4 months ago
I love the pause and drink at the end. I know it kills people.
@jimlynch9390 · 4 months ago
I'm a bit confused about base64 encoding of image data. Since the context size defaults to 2K on Ollama (I think I read that somewhere), how can you include image data that is encoded? I suspect even a trivial .png file will generate more than 2K bytes. What am I missing? By the way, thanks for this video, it really helps.
@chrisBruner · 4 months ago
I added an example to the GitHub repo, but it hasn't been incorporated into the readme yet. Just do this type of thing: ollama.generate(model='llava', prompt='What is this image', images=['IMG_8798.JPG'])
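A slightly fuller, runnable version of that call (the file name is a placeholder; any local image and any multimodal model such as llava will do):

import ollama

# images accepts file paths or raw bytes; the library handles the base64 encoding
response = ollama.generate(
    model='llava',
    prompt='What is in this image?',
    images=['IMG_8798.JPG'],
)
print(response['response'])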
@dinoscheidt · 4 months ago
Just a way to transport the data (in a poor way). Imagine you can only write chat messages with another person and want to send an image. You would come up with some weird string-character-based combination as well to describe each pixel in the image you originally want to send. On the other side, that string is then turned into what it should be: a binary bitmap. But if your protocol can't send bitmaps, you need to go a level higher and encode it in text, or emojis, or whatever. Welcome to the world of Python and the engineering craftsmanship level of "data scientists".
@cavalrycome · 4 months ago
First, context lengths are specified in terms of the number of tokens rather than bytes. Tokens are words or parts of words in the case of language models. Second, if you're using the models curated by Ollama, they will have whatever context length those models support. Llava 1.6 has a context length of 32K for example. When creating your own modelfile, you need to specify the context length as one of the parameters in it using the num_ctx parameter.
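If you do need a larger window, a hedged sketch of raising it per request from the Python library (the value is arbitrary; num_ctx is the parameter the reply above refers to):

import ollama

# Override the context window for this request via options;
# the same parameter can be set in a Modelfile with: PARAMETER num_ctx 8192
response = ollama.chat(
    model='llama3',
    messages=[{'role': 'user', 'content': 'Summarize this long document ...'}],
    options={'num_ctx': 8192},
)
print(response['message']['content'])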
@Delchursing · 4 months ago
Would using Ollama allow us to run a free LLM on a local machine?
@technovangelist · 4 months ago
Yes, absolutely
@Delchursing · 4 months ago
@technovangelist Wow, just getting into AI and a bit worried about tokens. Thank you.
@FloodGold · 4 months ago
What, no April Fools joke? Too bad you didn't flip the awkward end to the start, haha.
@armandCodes · 4 months ago
Am I the only one who can't find the .py files?
@technovangelist · 4 months ago
Go to the repo for all the video projects: github.com/technovangelist/videoprojects. They are sorted by the date of the video, with the name of the topic included as well.
@technovangelist · 4 months ago
The location is also in the description.