Installing Open WebUI Ollama Local Chat with LLMs and Documents without Docker

7,892 views

Natlamir • 1 day ago

Comments: 37
@marcocinalli755 • 1 day ago
Thank you! It seems much lighter than using Docker... how can I update it? Through git pull?
@patrickdali4063 • 3 months ago
Thank you so much, this tutorial was extremely clear and useful! You won a sub!
@MarkAnthonyfernandez-j9h • 21 days ago
I happened to install Ollama on Docker and I want to try this. Can I clean up my existing Ollama and then follow this method?
@luisfernandogarciagonzales9566 • 1 month ago
Amazing, thanks from Honduras.
@esmail88 • 1 month ago
Perfect, clear, to the point, funny.
@vsurp • 4 months ago
'open-webui' is not recognized as an internal or external command, operable program or batch file. :/
@Natlamir • 4 months ago
Sounds like something might have gone wrong during the pip install of open-webui?
@vsurp • 4 months ago
@Natlamir I honestly don't know, I've tried setting different paths for the Python installation and all that.
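For anyone hitting the same error: it usually means the install failed, or the console script landed in a Scripts folder that is not on PATH. A minimal troubleshooting sketch at the conda/command prompt (the Scripts path below is hypothetical; substitute your own):

    :: Confirm the package actually installed into the active environment
    pip show open-webui

    :: Locate the active Python; its Scripts subfolder holds the open-webui entry point
    where python

    :: Add that Scripts folder to PATH for the current session (hypothetical path)
    set PATH=%PATH%;C:\Users\you\miniconda3\envs\openwebui\Scripts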
@youtubeccia9276 • 4 months ago
Best instruction guy on YT :D
@Natlamir • 4 months ago
Thank you for the kind words!
@RKFlago • 2 months ago
Is it possible to install open-webui & Ollama on another drive (like the D: drive)?
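A hedged sketch of one way to do this on Windows: put the Python environment (and therefore the open-webui package) on D:, and redirect Ollama's model folder with the OLLAMA_MODELS environment variable. The paths below are hypothetical:

    :: Create and activate a conda environment on the D: drive, then install there
    conda create -y --prefix D:\envs\openwebui python=3.11
    conda activate D:\envs\openwebui
    pip install open-webui

    :: Persist a user variable so Ollama stores its downloaded models on D: as well
    setx OLLAMA_MODELS "D:\ollama\models"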
@kidsstoriezhub • 12 hours ago
Bro, where did you get this narrator from? That's more interesting than the entire process xD
@NoobMad5551 • 1 month ago
I got a problem when running the command open-webui serve.
@MarcoAWilcox619 • 4 months ago
Thank you! Pretty efficient way to save memory. In my case it saved 5GB compared to running Open WebUI with Docker.
@Natlamir • 4 months ago
That is great to hear, saving you 5GB! Thanks for sharing your experience.
@Tech-era • 4 months ago
How do I run the model offline? Once I disable the internet, the whole process just stops.
@Natlamir • 4 months ago
Is that so? Interesting, I will try that. Does that happen for you when using just Ollama through the command prompt, or only when using it with the WebUI?
@Tech-era • 4 months ago
@Natlamir It only happens when using Ollama through the WebUI. It works fine on the command prompt, whether offline or online.
@Natlamir • 4 months ago
@Tech-era Thanks for the info, I will try that out and see if the same happens for me through the WebUI.
@Natlamir • 4 months ago
@Tech-era I think I found a solution: you need to add a user environment variable. Variable name: OLLAMA_BASE_URLS, variable value: 127.0.0.1:11434. Then relaunch the conda prompt and try again.
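For reference, a minimal sketch of setting that variable from a Windows prompt (Open WebUI's docs typically write the value with an http:// scheme, so that form is shown here):

    :: One-off for the current session
    set OLLAMA_BASE_URLS=http://127.0.0.1:11434

    :: Or persist it as a user environment variable (takes effect in newly opened prompts)
    setx OLLAMA_BASE_URLS "http://127.0.0.1:11434"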
@brug6998 • 3 months ago
How can I update it without uninstalling and then reinstalling?
@Natlamir • 3 months ago
Good question! You should be able to use the upgrade flag when you want to update it, something like this: pip install --upgrade open-webui
@brug6998 • 3 months ago
@Natlamir Thanks so much!
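Putting the whole update flow together, a minimal sketch assuming the package lives in a conda environment (the environment name is hypothetical):

    :: Activate the environment that open-webui was installed into
    conda activate openwebui

    :: Upgrade the package in place, then relaunch the server
    pip install --upgrade open-webui
    open-webui serve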
@kingofmayy • 4 months ago
Hey, can you make a video next about how to integrate the Fabric GitHub project into the mix? It would be super cool if I could use custom prompts inside the WebUI. ❤
@Natlamir • 4 months ago
Thanks for the suggestion! That is a great idea, I will look into integrating Fabric patterns for different types of prompts.
@kingofmayy • 4 months ago
@Natlamir I can't wait to see the vid, trying to figure it out for myself as well.
@DarrenAddy • 3 months ago
I second the Fabric integration!
@altf4urlife • 4 months ago
If I use this method, will it run on the CPU or GPU?
@Natlamir • 4 months ago
It should run on the GPU. When I run it, it is utilizing the GPU / CUDA.
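One hedged way to verify this yourself, assuming an NVIDIA card and a recent Ollama build:

    :: Shows loaded models and whether they are running on GPU or CPU
    ollama ps

    :: Watch GPU memory and utilization while a chat is generating
    nvidia-smi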
@JoshVonhauger • 4 months ago
Great video, I am new to conda. How do you set up an SSL cert for the environment?
@Natlamir • 4 months ago
Thanks! I am not too familiar with that, but they seem to have some detail in the Hosting section: docs.openwebui.com/tutorial/apache/
@receps.8396 • 2 months ago
What is the name of the narrator (the male voice)?
@codingchannel6263 • 12 days ago
That stupid AI voice is annoying.