For macOS/Linux users, please check out this tutorial: kzbin.info/www/bejne/d5vQYWhteM2Ip9k
@rayrwyr · 8 hours ago
Thanks for your video, full of useful knowledge. How do I enable web browsing support for a local installation of DeepSeek R1? I am not worried about my local DeepSeek R1 installation accessing the internet. I asked what today's date is and it answered "The date today is October 29, 2023." It says it does not have access to web browsing. I am running it via Ollama and Chatbox.
@himaya-relaxsleep1410 · 3 days ago
I was watching your geospatial analysis tutorials for my thesis. Now I have graduated and am working as a Data Analyst, and I'm back here again.
@giswqs · 3 days ago
Welcome back!
@Bittermandl · 3 days ago
Thx a lot for the video, works great!
@ONYXELIZER · 3 days ago
THANK YOUUUUUUU YOU’RE THE BOSS
@zhaokane · 2 days ago
But when I upload local files while chatting with the model, I get an error like "HTTP Error 403: Forbidden". The file is already uploaded in the data folder, but the Ollama model just can't read it :(
@storytime_adventures_lsp · 4 days ago
Great video tutorial thank you sir
@dheerajj7496 · 2 days ago
Great video! Should the GPU's TDP be considered while selecting the models?
@XD-cr3du · 2 days ago
Amazing tutorial, thank you! One thing I'm missing: after I've installed everything and I restart my computer, what is the fastest way to open up the DeepSeek model with Open WebUI again? Do I need to go to PowerShell every time, launch the model and Open WebUI from there, and then manually go to the localhost address? Or is there a faster way, like a bookmark to localhost?
@KjelldonMoore · 2 days ago
Wait, isn't Open WebUI under construction?
@giswqs · 1 day ago
See this new tutorial on how to autostart webui: kzbin.info/www/bejne/oIXHlaumrah0gNk
@XD-cr3du · 20 hours ago
@giswqs Great! Thank you!
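For reference, until autostart is set up, relaunching after a reboot is just a couple of PowerShell commands. A minimal sketch, assuming the same DATA_DIR path used in the video:

ollama serve   # usually unnecessary, since Ollama on Windows keeps running in the background
$env:DATA_DIR="C:\open-webui\data"; uvx --python 3.11 open-webui@latest serve

Then browse to localhost:8080 (a bookmark to it works fine once the server is running).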
@MohamedAshik-t3l · 3 days ago
I have run it successfully, thank you.
@emanuelv5934 · 3 days ago
Thank you!
@zhenlu6611 · 3 days ago
Many thanks for your contribution... DeepSeek Janus Pro just came out... Is it possible to make a video to guide us through installing and testing it locally?
@giswqs · 3 days ago
Will look into it
@zhenlu6611 · 2 days ago
@giswqs Appreciate it, and happy new year
@mohammedadnan2450 · 2 days ago
All these Ollama models are downloaded and installed on drive C... is there any way to store them on another drive?
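One option, assuming a reasonably recent Ollama build, which reads its model storage location from the OLLAMA_MODELS environment variable; a rough sketch for Windows, with D:\ollama\models as a hypothetical target folder (already-downloaded models would need to be moved or re-pulled):

setx OLLAMA_MODELS "D:\ollama\models"   # persist the new model directory for your user account
# restart Ollama (and any open terminals) so the new location takes effect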
@gurjotgrewal007 · 4 days ago
Can you please make a video on how we can use DeepSeek locally as a NotebookLM-style tool for our private files and folders?
@v3nkattrader684 · 2 days ago
Awesome ....
@MohamedAshik-t3l · 3 days ago
Well explained
@michisuper1650 · 2 days ago
Thanks for the video. However, if I try to use the web UI without an internet connection, I get an error. How can I solve it?
@giswqs · 1 day ago
I just uploaded a new tutorial on how to use Open WebUI and DeepSeek without an internet connection. Check it out: kzbin.info/www/bejne/rZWrg6iaYspsrLs
@onemoment7368 · 2 days ago
Can we train the AI? If so, how? Thanks
@zhenlu6611 · 1 day ago
Do I need to change a setting in Firefox? Is it because of the proxy setting? My setting right now is "Use system proxy settings". I also tried Chrome, same problem... should I change it to "No proxy"? I already set up an admin account when I first entered the web UI. Many thanks
@giswqs · 1 day ago
It runs locally. No need to use a proxy.
@zhenlu6611 · 1 day ago
@giswqs I keep getting: "Firefox can't establish a connection to the server at localhost:8080". I only had luck once, right after the first install when I set up the admin account; after I closed the website, the problem began when I tried to reopen it... any idea what I should do? Do I need to open a terminal and run the DeepSeek model with Ollama first?
@zhenlu6611 · 1 day ago
@giswqs This is the rejection when using Chrome: "This site can't be reached. localhost refused to connect. Try: checking the connection, checking the proxy and the firewall. ERR_CONNECTION_REFUSED"
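ERR_CONNECTION_REFUSED on localhost:8080 usually just means nothing is listening on that port, i.e. the Open WebUI server is not running, rather than a browser or proxy problem. A quick check and relaunch, assuming Windows PowerShell and the DATA_DIR path from the video:

Test-NetConnection localhost -Port 8080   # TcpTestSucceeded: False means the server is down
$env:DATA_DIR="C:\open-webui\data"; uvx --python 3.11 open-webui@latest serve   # start it again, then reload the page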
@akkir6707 · 4 days ago
Thanks sir
@UMERFAROOQ-cl2nd · 2 days ago
How do I open it next time? Do I need to run the command again and again on Windows?
@giswqs · 1 day ago
See this new tutorial on how to autostart webui: kzbin.info/www/bejne/oIXHlaumrah0gNk
@ThereIsProbablyNoGod · 3 days ago
Is there an easy way to uninstall everything again? I want to remove Open WebUI from a Windows 11 PC so that nothing is left on the PC.
@giswqs · 3 days ago
See the instructions here to remove uv from your computer: docs.astral.sh/uv/getting-started/installation/#uninstallation
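Beyond removing uv itself, a rough sketch of the rest of the cleanup, assuming the data folder from the video and that the model was pulled as deepseek-r1:7b (adjust the tag to whatever you actually downloaded):

Remove-Item -Recurse -Force C:\open-webui\data   # Open WebUI data: accounts, chats, uploads
uv cache clean                                   # drops the cached open-webui environments that uvx created
ollama rm deepseek-r1:7b                         # optionally remove the downloaded model too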
@onemoment7368 · 3 days ago
Hi, I have everything set up as per your instructions; however, I noticed the response time is very slow, even for a simple hello. Do you know why that is? I have an AMD Ryzen 5950X CPU and an AMD 6800 XT GPU. For the 8b model it took 17 seconds to respond, and the 32b model is not even responding after more than a minute.
@giswqs · 3 days ago
Open the Performance tab in Task Manager and check whether the GPU is being utilized while running the model.
@onemoment7368 · 3 days ago
@giswqs Shared GPU memory goes up to 8 GB, dedicated less than that, and memory usage is 31/32 GB, almost 100%
@onemoment7368 · 3 days ago
@giswqs I have downloaded Llama 3.2, and it's using 8 GB of dedicated GPU memory and it's faster than DeepSeek. I wonder why DeepSeek is using RAM instead of the GPU.
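A quick way to confirm where a model is actually loaded, assuming a reasonably recent Ollama release: "ollama ps" lists the running models and reports how they are split between CPU and GPU, which should show whether the 32b model has spilled out of VRAM into system RAM:

ollama run deepseek-r1:32b "hello"   # hypothetical tag; use whichever model is slow for you
ollama ps                            # shows the loaded model and whether it sits on the CPU, the GPU, or both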
@endbeginner5520 · 3 days ago
If I close PowerShell and want to reopen it again, do I have to wait this long again to run the model?
@giswqs · 3 days ago
The package installation is one-time only. It should take a few seconds to reopen the web UI.
@giswqs · 1 day ago
See this new tutorial on how to autostart webui: kzbin.info/www/bejne/oIXHlaumrah0gNk
@TheDroidan · 2 days ago
Can we use a GeForce RTX 3070 Ti 8GB for this?
@giswqs · 2 days ago
Yes, you should be able to use the 1.5b and 7b models
@lahtin3n · 4 days ago
Each time I start Open WebUI after closing it, it asks me to go through the admin creation again. Am I doing something wrong? Should I not launch it using "uvx --python 3.11 open-webui@latest serve"?
@giswqs · 4 days ago
You need to use the following command; the user info is saved in the DATA_DIR:
$env:DATA_DIR="C:\open-webui\data"; uvx --python 3.11 open-webui@latest serve
@lahtin3n · 4 days ago
@giswqs Thank you! And thank you for this video!
@giswqs · 4 days ago
My pleasure
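If typing the DATA_DIR prefix on every launch gets tedious, here is a minimal sketch of making it persistent, assuming Open WebUI also picks the variable up from the user environment (it is a plain environment variable):

setx DATA_DIR "C:\open-webui\data"   # one-time: store DATA_DIR for your user account
# afterwards, in any new PowerShell window, the short form is enough:
uvx --python 3.11 open-webui@latest serve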
@johnnydeeep · 4 days ago
web access to deepseek
4 days ago
The only model equivalent to ChatGPT is the 671B, so by going for a subpar version you are not running a local AI like ChatGPT, and the smaller models are far less capable.
@lahtin3n · 4 days ago
Nothing is preventing you from using the 671b model.
4 days ago
@lahtin3n The 404 GB of required memory is.
@himaya-relaxsleep1410 · 3 days ago
You are right, but that Qwen2.5-Coder at 3B sure is useful.
3 days ago
@himaya-relaxsleep1410 Not really, to be honest; the other models are not good. DeepSeek R1 claims to provide access to an open-source, ChatGPT-like model, which this title plays on, but even the 32B model falls far short of that promise and only the 671B will do. Otherwise, we already had plenty of models to run locally.
@himaya-relaxsleep1410 · 3 days ago
I don't really care which equals which; I'm looking for useful tools I can run locally. And as I said, a fast, useful 3B model that can run on a GTX 1660 is useful to me. Besides, what do you expect? What do you hope for? To imitate OpenAI's power on a cheap PC? Those beasts run in data centers, not on some budget gaming PC.
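As a rough sanity check on the 404 GB figure mentioned above, assuming it refers to a roughly 4.8-bit quantized build of the 671B model, the back-of-envelope arithmetic lines up (PowerShell, or any calculator):

671e9 * 4.8 / 8 / 1e9   # parameters × bits per weight ÷ 8 bits per byte ≈ 403 GB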