Thanks, Matt, for your videos. Are you able to do one with instructions for installing Llama 3.2 11B? It would be very helpful for many people, but no pressure.
@sadboi5672 • 1 month ago
There's no 11B model for 3.2, is there? 3.2 only has the 1B and 3B variants.
@technovangelist • 1 month ago
When it works I will. But there isn't anything special about it.
@christopherditto • 1 month ago
Thank you for this video! Question: how do I include environment variables in Ollama responses to prompts? For example, is there something I can add to my modelfile to append the number of tokens used and the general.name environment variable (e.g. "Llama 3.2 3B Instruct") to the end of each response?
@emil8367Ай бұрын
thanks Matt ! Is there any list of all env variables with description for each in the Ollama docs ?
@jimlynch9390 • 1 month ago
I think OLLAMA_HOST needs a bit more explanation. On the server, the variable that lets you use Ollama from another system looks like this: Environment=OLLAMA_HOST=0.0.0.0. Then you can reach that server by setting a local environment variable on the client, e.g. "export OLLAMA_HOST=192.168.2.41:11434" if the server is at 192.168.2.41. Without the 0.0.0.0, the server system will reject any attempt to connect to port 11434 from another machine.
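For anyone who wants to try this, here is a minimal sketch of that setup, assuming a systemd-managed Linux install and a server at 192.168.2.41 (the address is just an example):

```sh
# On the server: make Ollama listen on all interfaces instead of only loopback.
# Open a systemd override and add, under [Service]:
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl edit ollama.service
sudo systemctl restart ollama

# On the client: point the local ollama CLI at the remote server.
export OLLAMA_HOST=192.168.2.41:11434
ollama list   # now talks to 192.168.2.41:11434 instead of localhost
```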
@ShinyBlueBoots • 19 days ago
Is there an EV for a locale/language setting? For example, I want my AI responses to come back in UK English.
@technovangelist • 19 days ago
No. If anything, that would be part of the prompt.
@ShinyBlueBoots • 18 days ago
@technovangelist Thanks Matt! That's what I do now... it works for a short period of time, then it forgets :-)
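If the instruction keeps getting forgotten, one way to make it stick is to bake it into the model's system prompt with a Modelfile. A sketch, assuming llama3.2 as the base model (the model name and wording are just examples):

```sh
# Pin the instruction in the system prompt so it applies to every
# conversation, instead of repeating it in each chat.
cat > Modelfile <<'EOF'
FROM llama3.2
SYSTEM Always respond using UK English spelling and vocabulary (e.g. "colour", "organise").
EOF

ollama create llama3.2-uk -f Modelfile
ollama run llama3.2-uk
```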
@alx8439 • 1 month ago
After feature request 4361, the Ollama team added all the previously missing configuration options so they show up via `ollama serve -h`.
@technovangelist • 1 month ago
Yup, most are there.
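So the quickest way to get a current list of the environment variables your own install understands is to ask the server binary itself:

```sh
# Prints server usage, including the OLLAMA_* environment variables
# recognised by the installed version.
ollama serve -h
```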
@alexlee1711 • 1 month ago
I run Ollama on macOS, but it only uses the CPU. In "System Monitoring", the GPU sits at 0%. Environment: macOS 14 + Radeon RX 570 (Metal is supported) and AMD Radeon Pro VII (Metal 3 is supported).
@technovangelist • 1 month ago
GPU on Mac is only supported on Apple Silicon Macs, unfortunately. Since the Intel Macs are getting older every day, I don't see that changing.
@alexlee1711 • 1 month ago
@technovangelist Thank you for your guidance. It seems that I have to use Ubuntu or Windows.
@technovangelist • 1 month ago
But even if you do install Ubuntu or Windows on that machine, the GPU isn't supported. I think your best bet is a newer Mac.
@MaxJM74 • 1 month ago
Thanks 👍
@emmanuelgoldstein3682 • 1 month ago
I'm subscribed with all notifications turned on, but I didn't get this one for some reason... ☹
@M24Tom • 1 month ago
What about Ollama running in a Docker container?
@technovangelist • 1 month ago
What about it? That's the easy one. Just add them to the docker command.
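A minimal sketch of what that looks like, assuming the official ollama/ollama image (the specific variables are just examples):

```sh
# Environment variables are passed with -e flags on docker run.
docker run -d \
  -e OLLAMA_HOST=0.0.0.0 \
  -e OLLAMA_KEEP_ALIVE=30m \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama
```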
@marcusk7855 • 1 month ago
Just tried to change the temp directory yesterday on Linux. It does not work.
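One common gotcha, in case it's the cause here: on a systemd-managed Linux install, variables exported in your shell never reach the server process, so they have to go into the service unit instead. A sketch, assuming OLLAMA_TMPDIR is the variable being set and the default ollama.service:

```sh
# Shell exports don't affect the systemd-managed server; use an override.
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_TMPDIR=/path/to/tmp"
sudo systemctl restart ollama
```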
@toadlguy • 1 month ago
Ha, ha, ha. We understand how YouTube works. We either pay for YT Premium or we watch ads, and you get paid based on views. You don't need to announce it is FREE at the beginning of your video. (Thanks for the content, though 😊)
@technovangelist • 1 month ago
If that were true, I wouldn't be asked so often whether it will stay free. Lots of creators put a teaser on YouTube and then move the rest to a paid platform.