Crack Ollama Environment Variables with Ease - Part of the Ollama Course

  3,799 views

Matt Williams

A day ago

Comments: 24
@DodiInkoTariah
@DodiInkoTariah A month ago
Thanks, Matt, for your videos. Could you do one with instructions for installing Llama 3.2 11B? It would be very helpful for many people, but no pressure.
@sadboi5672
@sadboi5672 A month ago
There's no 11B model for 3.2, is there? 3.2 only has 1B and 3B variants.
@technovangelist
@technovangelist A month ago
When it works I will. But there isn’t anything special with it.
@ShinyBlueBoots
@ShinyBlueBoots 19 days ago
@@technovangelist Is there an env variable for the locale/language setting? For example, I want my AI responses to come back in UK English.
@christopherditto
@christopherditto A month ago
Thank you for this video! Question: how do I include environment variables in Ollama responses to prompts? For example, is there something I can add to my modelfile to append the number of tokens used and the general.name value (e.g. "Llama 3.2 3B Instruct") to the end of each response?
@emil8367
@emil8367 A month ago
Thanks, Matt! Is there a list of all the env variables, with a description for each, in the Ollama docs?
@jimlynch9390
@jimlynch9390 A month ago
I think OLLAMA_HOST needs a bit more explanation. On the server, the variable that lets you use Ollama from another system looks like this: Environment=OLLAMA_HOST=0.0.0.0. Then you can reach that server by setting a local environment variable, e.g. "export OLLAMA_HOST=192.168.2.41:11434" if the server is on 192.168.2.41. Without the 0.0.0.0, the server will reject any attempt to connect to port 11434.
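The two-sided setup described in that comment can be sketched as follows (0.0.0.0 and 192.168.2.41 are the example values from the comment; substitute your server's address):

```shell
# Server side (Linux systemd install): run `sudo systemctl edit ollama.service`
# and add, under [Service]:
#   Environment="OLLAMA_HOST=0.0.0.0"
# then apply it with:
#   sudo systemctl daemon-reload && sudo systemctl restart ollama

# Client side: point the local ollama CLI at the remote server.
export OLLAMA_HOST=192.168.2.41:11434
# ollama list   # would now talk to the server at 192.168.2.41
```

The 0.0.0.0 matters because Ollama binds to 127.0.0.1 by default, which only accepts connections from the same machine.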
@ShinyBlueBoots
@ShinyBlueBoots 19 days ago
Is there an env variable for the locale/language setting? For example, I want my AI responses to come back in UK English.
@technovangelist
@technovangelist 19 days ago
No. If anything, that would be part of the prompt.
@ShinyBlueBoots
@ShinyBlueBoots 18 days ago
@@technovangelist Thanks, Matt! That's what I do now... it works for a short while, then it forgets :-)
@alx8439
@alx8439 A month ago
After feature request 4361, the Ollama team added all the previously missing configuration options to the output of `ollama serve -h`.
@technovangelist
@technovangelist A month ago
Yup, most are there.
@alexlee1711
@alexlee1711 A month ago
I run Ollama on macOS, but it only uses the CPU. In "System Monitoring", the GPU sits at 0%. Environment: macOS 14 + Radeon RX 570 (Metal is supported) and AMD Radeon Pro VII (Metal 3 is supported).
@technovangelist
@technovangelist A month ago
GPU on Mac is only supported on Apple Silicon Macs, unfortunately. Since Intel Macs are getting older every day, I don't see that changing.
@alexlee1711
@alexlee1711 A month ago
@@technovangelist Thank you for your guidance. It seems I'll have to use Ubuntu or Windows.
@technovangelist
@technovangelist A month ago
But even if you install Ubuntu or Windows on that machine, the GPU isn't supported. I think your best bet is a newer Mac.
@MaxJM74
@MaxJM74 A month ago
Thanks 👍
@emmanuelgoldstein3682
@emmanuelgoldstein3682 A month ago
I'm subscribed with all notifications turned on but I didn't get this one for some reason... ☹
@M24Tom
@M24Tom A month ago
What about Ollama running in a Docker container?
@technovangelist
@technovangelist A month ago
What about it? That's the easy one. Just add them to the docker command.
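For a Docker install, "just add them to the docker command" means passing the variables with `-e` flags. A minimal sketch using the official ollama/ollama image (OLLAMA_KEEP_ALIVE and OLLAMA_NUM_PARALLEL are documented Ollama variables; the values here are illustrative, not recommendations):

```shell
# Run Ollama in Docker with environment variables set via -e.
docker run -d \
  -e OLLAMA_KEEP_ALIVE=24h \
  -e OLLAMA_NUM_PARALLEL=2 \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama
```

With docker compose, the same variables go under the service's `environment:` key instead.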
@marcusk7855
@marcusk7855 A month ago
I just tried to change the temp directory on Linux yesterday. It does not work.
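For context, the usual way to set a variable like this for a systemd install of Ollama is a drop-in override, so the service process (not just your shell) sees it. A sketch, assuming the OLLAMA_TMPDIR variable and an illustrative path:

```shell
# Create a systemd drop-in for the ollama service (path is illustrative).
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf <<'EOF'
[Service]
Environment="OLLAMA_TMPDIR=/mnt/big-disk/ollama-tmp"
EOF
sudo systemctl daemon-reload && sudo systemctl restart ollama
```

A common failure mode is that the directory doesn't exist or isn't writable by the user the ollama service runs as, in which case the setting silently appears not to work.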
@toadlguy
@toadlguy A month ago
Ha, ha, ha. We understand how YouTube works: we either pay for YT Premium or we watch ads, and you get paid based on views. You don't need to announce that it's FREE at the beginning of your video. (Thanks for the content, though 😊)
@technovangelist
@technovangelist A month ago
If that were true I wouldn't be asked so often whether it will stay free. Lots of creators put a teaser on YouTube, then move the rest to a paid platform.