Design Your Own Ollama Model Now!

24,540 views

Matt Williams

1 day ago

Comments: 42
@TheFullTimer 2 months ago
I wanted to say that I loved this video and subscribed after watching. I believe my search was "Ollama Model File template" or something similar; I scrolled past some thumbnails and channels I didn't like, but liked the preview when I hovered over yours. I had watched some others first, from large-name channels, which weren't as helpful as they first appeared. However, I immediately found that your content was right on target for both your title and my search. It was everything I needed to provide some insight to someone starting with Ollama. I had no idea who you are, didn't look at other things on your channel, and didn't read the details or comments beforehand: just your video and your presentation at face value. Not to harp on that, but as this platform can be fickle, I wanted to pass along details that may be helpful. Again, it was excellent and I truly appreciate the time you took to script and produce content to educate the audience in a natural and respectful manner. Thank you.
@wilson_joe 8 months ago
I really appreciate your videos; you have a simple, understandable, friendly approach to teaching which keeps me coming back for more.
@technovangelist 7 months ago
Thanks.
@user-wr4yl7tx3w 9 months ago
Maybe a dummies' guide would also be helpful. The content is a bit advanced, though very useful.
@karthickdurai2157 5 months ago
This would be the dummies' guide.
@jrnmadsen2710 3 months ago
You're right. Ignore the comment from @karthickdurai2157; they're just trying to give an illusion of competence.
@arthurtempest3824 3 months ago
@karthickdurai2157 oh... uhm... an EVEN MORE dummy video then!! 😭😭😭
@tharun2003 7 months ago
You saved my day. Thank you Matt.
@userou-ig1ze 9 months ago
Please keep doing what you're doing; at this point I would guess job offers pour in from all over the world. Thanks for your continuous videos! I went through this with the meditron model, which I suspect still doesn't have a fully correct prompt format but which I couldn't fix; maybe with this video I will be more successful. PS: Let us know in case you sell merch :)
@technovangelist 9 months ago
No merch, but there is a Patreon at patreon.com/technovangelist and a newsletter at technovangelist.com/newsletter.
@francescobassignana4211 8 months ago
Hi! Thanks for the video. I have a question about using the Ollama model with LangChain: When I run the .invoke method with a simple prompt, does the Ollama library automatically insert the prompt into the pre-configured template in the model file, or do I need to manually include it in the LangChain prompt template?
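For reference, the behavior this question hinges on is server-side: assuming LangChain's Ollama integration calls the standard /api/generate or /api/chat endpoints, it is the Ollama server that wraps the incoming prompt in the Modelfile's TEMPLATE, so the client only sends the plain prompt. A minimal sketch with curl (the model name is illustrative):

```
# The server applies the Modelfile TEMPLATE around "prompt" before inference.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?"
}'

# With "raw": true the template is skipped, and the client must send the
# fully formatted prompt itself.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "<already fully formatted prompt>",
  "raw": true
}'
```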
@tecnopadre 9 months ago
I love your endings
@xspydazx 7 months ago
Very good, well understood! Quick advice: temperature is related to the training as well. Things that were not trained deeply will need a higher temperature, while things that are deeply embedded will be fine with the lowest temperature. How do people train their models, and what are their acceptable loss levels? Some aim for 0.5 and under, while others don't care and just let the model complete an epoch on a large dataset and assume the data took, as long as the final output was preferable, when in fact all the data that did not go in at a loss below 0.5 did not take and is not retrievable (perhaps it is there ephemerally). It is like pretraining: it is only used for next-word prediction. But we are doing tasks, which is whole-sequence prediction and recall, so when we train for a task we expect the whole dataset to fit in range, and a low temperature around 1 should give acceptable results. Some say this affects the softmax of the possibilities chosen by top-k sampling, as well as the top-p cutoff percentage, but that is when many samples are chosen. It also reflects the values that were trained at that rate of loss, so the model will be collecting samples from below a temperature of 1 (a lot), and this will need constraining with top-p (selecting the highest probabilities), though the softmax will also spread them, allowing for more randomness when the model has been overtrained. So an overtrained model can be loosened by raising the temperature, and a wild model can be tamed! lol
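For anyone who wants to experiment with these sampling settings in Ollama, here is a minimal Modelfile sketch that overrides them; the base model and the values are illustrative assumptions, not recommendations:

```
# Modelfile: sampling-parameter sketch (values are illustrative)
FROM llama3

# Lower temperature makes output more deterministic; higher makes it more random.
PARAMETER temperature 0.7
# top_k restricts sampling to the k most likely tokens.
PARAMETER top_k 40
# top_p keeps the smallest set of tokens whose cumulative probability reaches p.
PARAMETER top_p 0.9
```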
@xspydazx 7 months ago
I would like to see a video on publishing a model, really!
@redgeciccone8218 8 days ago
Thank you for this video, but I still don’t understand the concept. My company is asking to integrate an AI with their own data or a system that processes incoming emails. I need the model to analyze the email and return a JSON with data from its analysis. I tried integrating this into a prompt, but the AI makes mistakes. I played with the temperature, setting it to 0. The result is almost perfect, but it can still make errors. In short, I’d like to be able to train the bot with the company’s data. I’m a bit lost about what I can do.
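One pattern that sometimes helps in this situation, sketched under the assumption of a Modelfile-based setup (the base model and field names are illustrative): bake the extraction instructions into a SYSTEM prompt, pin the temperature, and constrain the output to JSON.

```
# Modelfile sketch for structured email analysis (illustrative)
FROM llama3

# Deterministic sampling for repeatable extraction.
PARAMETER temperature 0

SYSTEM """You are an email analysis assistant. For every email you are given,
reply with only a JSON object containing the fields sender_intent, urgency,
and summary. Do not write anything outside the JSON object."""
```

Ollama's /api/generate and /api/chat endpoints also accept "format": "json", which forces syntactically valid JSON; whether that is accurate enough, or whether retrieval over the company's data or fine-tuning is needed, depends on the error rate the workflow can tolerate.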
@eyeseethru 9 months ago
Thanks for all the helpful videos on Ollama! I've since located the answers, but these are a few questions I was always left with whenever I saw mention of making a model file. Asking so it may help other new Ollama users: What kind of file is it? What program should be used to create it? Is it saved in a specific file format or location?
@technovangelist 9 months ago
I created it in vscode. It’s just a text file like everything else in a code editor. And put it anywhere you like. Once you run ollama create, blobs and manifests are generated in a specific place.
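To make that concrete, here is a minimal sketch of such a text file and the create step; the file name, base model, and prompt are illustrative assumptions.

```
# Modelfile — a plain text file, saved anywhere you like
FROM llama3
PARAMETER temperature 0.8
SYSTEM """You are a concise assistant that answers in plain English."""
```

Then build and run a named model from it:

```
# Blobs and manifests are written to Ollama's own storage location.
ollama create my-assistant -f ./Modelfile
ollama run my-assistant
```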
@wardehaj 9 months ago
Great video, very useful! I have a request for you: please make a video about making an Ollama model of the DBRX and/or Grok 1.5 vision models.
@atrocitus777 9 months ago
I see that you can use your own Docker registry with Ollama as a way of hosting model files. I would love to see a video on this for users running Ollama on closed networks.
@technovangelist 9 months ago
It’s not actually the same as the docker registry. It was written by the same person that created the docker registry though.
@technovangelist 9 months ago
It had to be modified because layers in a docker image are tiny whereas models are huge.
@twinnie38 8 months ago
So helpful, so interesting, thanks 👍 After generating my model, I noticed that I have to specify the number of layers to use in Ollama (--n-gpu-layers) even though my GPU has enough memory. If I use fewer layers, what does this mean in practice?
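For context, a sketch assuming the standard Ollama parameter name: the number of layers offloaded to the GPU is controlled by num_gpu, and any layers that are not offloaded run on the CPU, which is slower but needs less VRAM. In practice, offloading fewer layers trades speed for memory.

```
# Modelfile sketch: offload a fixed number of layers to the GPU (value is illustrative)
FROM llama3
PARAMETER num_gpu 35
```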
@KhanaKhala1 9 months ago
Extremely useful but what if there is no template in the readme?
@technovangelist 9 months ago
Then look in that file I showed. And if it's not there, look at how the model was trained or fine-tuned.
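As an illustration of what such a template looks like once it is copied into a Modelfile, here is a sketch using a generic ChatML-style format; the special tokens depend entirely on how the specific model was trained, so treat this as the shape of a template rather than the correct one for any given model.

```
# Modelfile TEMPLATE sketch (ChatML-style tokens are illustrative)
FROM ./my-model.gguf

TEMPLATE """<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""
```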
@explorer945 9 months ago
Thank you for the short and sweet video. How do you get such good audio quality on your videos? Step 0: have a great voice. What is step 1 (gear, setup in OBS/plugins)?
@technovangelist 9 months ago
I think I need a video on it. I don’t use OBS though.
@explorer945 9 months ago
@technovangelist Yes, a video please. You could add affiliate links to the gear as well. Really loved the bass in the audio.
@technovangelist 9 months ago
kzbin.info/www/bejne/goLIZHd8n7KMqKcsi=R4u3h6yPtbUaHeDh
@mpesakapoeta 5 months ago
Any tutorial on creating a model from custom data, like PDFs? Like for companies?
@AyushSharma-qd1lq 5 months ago
Yes please, I've been looking for this. If you find anything, please share; any help is appreciated.
@AliAlias 9 months ago
Thanks ❤ Very helpful 😊
@Hemanthkumar-zz6fb 5 months ago
Can I train a model by chatting with it? How do I do that?
@UTubeGuyJK 9 months ago
My coworker and I set up a Windows machine to run Ollama. It works great but occasionally seems to crash. Could it be the keep_alive setting? If I want others to be able to hit it via the API, should I set keep_alive to “forever”? (I don’t remember the flag for that off the top of my head.) Thanks for your work on Ollama!
@technovangelist 9 months ago
In most cases you shouldn’t need to worry about keep alive.
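For anyone who does want to control it, a sketch of the two usual ways keep_alive is set (values are illustrative; a negative value is generally understood to keep the model loaded indefinitely):

```
# Per-request: keep the model loaded after this call
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "hello",
  "keep_alive": -1
}'

# Server-wide default, set before starting the server
OLLAMA_KEEP_ALIVE=-1 ollama serve
```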
@Cloud_Dude 9 months ago
There is a folder based on the date of this video. Do you have a gist containing the content of the template for each model?
@technovangelist 9 months ago
No. It was just a few lines that you can grab from the same sources I did, so I didn't bother with it.
9 months ago
Excellent thank you!
@sanjaybhatikar 6 months ago
I fine-tuned a HuggingFace embedding model locally and got a set of files with safetensors on disk. I tried to convert it to GGUF with llama.cpp, but it fails. Ollama requires the model as GGUF. Any suggestions for how to integrate the fine-tuned model in Ollama? 😢
@technovangelist 6 months ago
I’m not sure. Your best bet is to ask on the ollama discord. Discord.gg/ollama
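For reference, the usual conversion path looks roughly like the sketch below, assuming a recent llama.cpp checkout; the converter script has been renamed across versions, and not every embedding architecture is supported, which may be why the conversion fails.

```
# Get llama.cpp and its Python requirements
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
pip install -r requirements.txt

# Convert a Hugging Face safetensors directory to GGUF
# (older checkouts name this script convert-hf-to-gguf.py)
python convert_hf_to_gguf.py /path/to/fine-tuned-model --outfile my-model.gguf

# Point a Modelfile at the result and register it with Ollama
echo "FROM ./my-model.gguf" > Modelfile
ollama create my-finetune -f Modelfile
```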
@SonGoku-pc7jl 9 months ago
thanks!!!
@florentflote 9 months ago