thank you! it seems much lighter than using Docker... how can I update it? Through git pull?
@patrickdali4063 3 months ago
thank you so much, this tutorial was extremely clear and useful ! you won a sub !
@MarkAnthonyfernandez-j9h 21 days ago
I happened to install Ollama on Docker and I want to try this. Can I clean up my existing Ollama and then follow this method?
@luisfernandogarciagonzales9566 a month ago
Amazing, thanks from Honduras.
@esmail88 a month ago
Perfect, clear and to the point, funny.
@vsurp 4 months ago
'open-webui' is not recognized as an internal or external command, operable program or batch file. :/
@Natlamir 4 months ago
Sounds like something might have gone wrong during the pip install open-webui?
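If it did install but the command still isn't recognized, it is often a PATH issue. A quick way to check (just a sketch; run this in the same environment you ran pip in):

```shell
# Check whether the package actually installed into this environment;
# if not, the message below is printed instead
pip show open-webui || echo "open-webui is not installed in this environment"

# Print the directory where pip puts console scripts like open-webui;
# if that folder is not on your PATH, Windows cannot find the command
python -c "import sysconfig; print(sysconfig.get_path('scripts'))"
```

If `pip show` finds nothing, the install failed; if the scripts directory is missing from your PATH, adding it (or reactivating the conda environment) usually fixes the "not recognized" error.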
@vsurp 4 months ago
@@Natlamir I honestly don't know, I've tried setting different paths for the Python installation and all that.
@youtubeccia9276 4 months ago
best instruction guy on yt :D
@Natlamir 4 months ago
Thank you for the kind words!
@RKFlago 2 months ago
Is it possible to install open-webui & ollama on another drive (like the D drive)?
@kidsstoriezhub 12 hours ago
Bro, where did you find this narrator? That's more interesting than the entire process xD
@NoobMad5551 a month ago
I run into a problem when running the command open-webui serve
@MarcoAWilcox619 4 months ago
Thank you! Pretty efficient way to save memory. In my case saved 5GB compared to running open webui with docker.
@Natlamir 4 months ago
That is great to hear, saving you 5GB! Thanks for sharing your experience.
@Tech-era 4 months ago
How do I run the model offline? Once I disable the internet, the whole process just stops.
@Natlamir 4 months ago
Is that so? Interesting. I will try that. Does that happen for you when using just ollama through the command prompt, or only when using it with WebUI?
@Tech-era 4 months ago
@@Natlamir It only happens when using Ollama through WebUI. It works fine on the command prompt, however, whether in offline or online mode.
@Natlamir 4 months ago
@@Tech-era Thanks for the info, I will try that out and will see if the same happens for me when trying through webui.
@Natlamir 4 months ago
@@Tech-era I think I found a solution. You will need to add an environment variable (a user variable). Add this: Variable name: OLLAMA_BASE_URLS, Variable value: 127.0.0.1:11434, then relaunch the conda prompt and try again.
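If you prefer doing it from the Windows command prompt instead of the System Properties dialog, this sketch should do the same thing (assumes Ollama is on its default port 11434):

```shell
REM Persist the variable for the current user; it only takes effect
REM in newly opened prompts, so relaunch the conda prompt afterwards
setx OLLAMA_BASE_URLS "127.0.0.1:11434"

REM Or set it just for the current session
set OLLAMA_BASE_URLS=127.0.0.1:11434
```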
@brug6998 3 months ago
how can i update it without uninstalling then reinstalling?
@Natlamir 3 months ago
Good question! You should be able to use the upgrade flag when you want to update it, something like this: pip install --upgrade open-webui
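To spell that out for anyone who followed the conda setup from the video (the environment name here is just an example; use whatever you created):

```shell
# Activate the environment that open-webui was installed into first
conda activate openwebui   # environment name is an assumption; use your own

# Then upgrade the package in place, no uninstall needed
pip install --upgrade open-webui
```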
@brug6998 3 months ago
@@Natlamir Thanks so much!
@kingofmayy 4 months ago
Hey, can you make a video next about how to integrate the Fabric GitHub project into the mix? It would be super cool if I could use custom prompts inside WebUI. ❤
@Natlamir 4 months ago
Thanks for the suggestion! That is a great idea, I will look into integrating fabric patterns for different types of prompts.
@kingofmayy 4 months ago
@@Natlamir I can't wait to see the vid, trying to figure it out for myself as well.
@DarrenAddy 3 months ago
I second the Fabric integration!
@altf4urlife 4 months ago
If I use this method, will it run on the CPU or GPU?
@Natlamir 4 months ago
Should run on the GPU. When I run it, it is utilizing GPU / CUDA.
@JoshVonhauger 4 months ago
Great video! I am new to conda. How do you set up an SSL cert for the environment?
@Natlamir 4 months ago
Thanks! I am not too familiar with that, but they seem to have some detail in the Hosting section: docs.openwebui.com/tutorial/apache/