epic ending, Matt. Thanks, amazing job you're doing over here/there.
@FrankSchwarzfree • 3 days ago
Matt, this video was incredibly helpful! I was one of those poor souls lost in the Discord trying to figure out updates. You just saved me (and probably a lot of others) a ton of time and frustration. The step-by-step for each platform was clear and concise - even *I* could understand it! Keep up the amazing work, and looking forward to more Ollama wisdom in your course. Subscribed! 😄 👍
@gemini1q • 2 days ago
First introduction, presentation quality is like Tim Cook, video content is even better...
@brinkoo7 • 3 days ago
Idk why I didn’t think about just using docker 😂 thank you 🙏
@saurav1096 • 2 days ago
using Ollama in Docker makes it slowww
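For anyone following the Docker route mentioned above, Ollama publishes an official image. A minimal CPU-only sketch (add `--gpus=all` with the NVIDIA Container Toolkit installed if you want GPU acceleration; the model name below is just an example):

```shell
# Pull and run the official Ollama image, persisting models in a named volume
docker run -d \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama

# Run a model inside the container (swap in whichever model you use)
docker exec -it ollama ollama run llama3
```

Updating then becomes `docker pull ollama/ollama` followed by recreating the container; the named volume keeps your downloaded models.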
@Francotujk • 1 day ago
Hi Matt! Some time ago I asked you about how to integrate LLMs directly in an app — for example, how to include an LLM inside an Electron (desktop) app. You helped me a lot by pointing me to a post on Discord about using the binaries, the way Llamacpp does. However, I haven't achieved the performance that Ollama has. Do you think it's possible to use Ollama as something like a "package/container" and include it directly in an Electron app? So instead of dealing with recreating Ollama, just use it directly. Thanks for all the content you post on your channel, it's really helpful!
@jamesspo • 3 days ago
Very clear and helpful content. Thank you.
@enriquebruzual1702 • 2 days ago
I have it installed on Win 11, and I really like how simple the app is. I have had some setbacks with the automatic updates, but so far they've always been fixable.
@emil8367 • 2 days ago
Thanks Matt. Is there a way to do the update without losing the ollama.service configuration each time, e.g. on Ubuntu? (Of course I have a backup and I paste my old config back in, but I'd really like to keep what's inside ollama.service 🙂)
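One way to keep custom settings across updates on Ubuntu is a systemd drop-in override instead of editing ollama.service directly: upgrades replace the unit file, but drop-ins in `ollama.service.d/` are left alone. A sketch (the environment variables shown are just examples):

```shell
# Opens an editor and saves your changes as
# /etc/systemd/system/ollama.service.d/override.conf
sudo systemctl edit ollama

# In the editor, add only the settings you want to keep, e.g.:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Apply the change
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

Because the override lives in a separate file, re-running the installer can replace `ollama.service` itself without touching your customizations.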
@FahtheGamer • 2 days ago
Hi, I am running Ollama on Android and it seems to run fine for models of 8B and smaller.
@LawrenceOrsini • 1 day ago
@technovangelist Why not use WSL? You didn't give a reason why native is faster.
@technovangelist • 1 day ago
Using WSL means you are running Ollama in an Ubuntu container in the WSL VM on top of Windows. Native is 20-30% faster for most folks.
@diggajupadhyay • 3 days ago
Thanks Matt ❤
@TheTechnicalMystique • 2 days ago
Do you happen to know how to retrieve or change your WebUI password on Linux? I've been trying, and asked on Discord, but to no avail.
@JanezNovak-u6y • 3 days ago
How do I configure Ollama for Windows to use a proxy? My PC is behind a corporate proxy, but Ollama for Windows isn't using it.
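Ollama reads the standard `HTTPS_PROXY` environment variable for model pulls. One approach on Windows (a sketch, not verified against every setup; the proxy URL is a placeholder) is to set it persistently for your user and then fully quit and restart the Ollama app so it picks up the new value:

```shell
:: Set HTTPS_PROXY persistently for the current user (cmd.exe).
:: Replace the URL with your actual corporate proxy.
setx HTTPS_PROXY "http://proxy.example.com:8080"

:: Then quit Ollama from the system tray and start it again
:: so the new environment variable takes effect.
```

If the proxy does TLS interception, you may also need its CA certificate installed in the Windows certificate store.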
@brinkoo7 • 3 days ago
Funny, this pops up just as I'm trying to figure out how to build the newest version on NixOS 😂😝
@AliAlias • 2 days ago
How can I make Ollama auto-download and update?
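On macOS and Windows the desktop app updates itself; on Linux there is no built-in auto-updater. One hedged workaround is a scheduled job that re-runs the official install script, which upgrades an existing install in place (the cron schedule below is just an example):

```shell
# Re-running the official install script upgrades an existing Linux install
curl -fsSL https://ollama.com/install.sh | sh

# Example crontab entry to do this automatically every Sunday at 03:00:
# 0 3 * * 0 curl -fsSL https://ollama.com/install.sh | sh
```

Note this briefly restarts the Ollama service, so schedule it for a quiet time.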
@HaydonRyan • 2 days ago
I do wish the Ollama team would build and distribute via the standard package managers. That said, I understand it's not their core work.