So this "guide" is missing some key things, like how to change the IP for WSL and how to run Ollama as a service. Even his written guide doesn't explain how to do this.
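For anyone else stuck on the same two gaps, here is a minimal sketch. It assumes Ollama's default port 11434 and a WSL2 distro with systemd enabled; the WSL address is a placeholder you must look up yourself, since it changes between reboots:

```shell
# On the Windows host, in an elevated PowerShell:
# find the WSL2 VM's current address
wsl hostname -I

# forward Windows port 11434 to the WSL2 VM
# (re-run after reboots, because the WSL IP changes)
netsh interface portproxy add v4tov4 listenaddress=0.0.0.0 listenport=11434 connectaddress=<WSL_IP> connectport=11434

# Inside WSL: the Linux installer creates a systemd unit,
# so Ollama can be started and enabled like any other service
sudo systemctl enable --now ollama
```

This is a sketch of one common setup, not the video author's exact procedure.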
@thiesenf • 2 months ago
GPT4ALL is another good locally running chat interface... it can run on both the CPU and the GPU using Vulkan...
@klovvin • 2 months ago
This would be better content if done by an AI
@thiesenf • 2 months ago
At least we got the usual extremely boring stock videos as B-roll... *sigh*...
@kderectorful • 2 months ago
I am accessing the OpenAI-compatible server via a Mac, and my guess is that the netsh command applies to the Windows workstation you are accessing the server from. Is there a similar command that would need to be run, or if I do this on my Linux server via Firefox, will I still have the same issue? I cannot seem to get llama3:latest installed for Open WebUI. Any insight would be greatly appreciated, as this was the most concise video I have seen on the topic.
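A sketch of the Linux case, assuming netsh is only needed for the Windows/WSL port-forwarding scenario: on a Linux server there is no equivalent step; you just need Ollama to listen on all interfaces instead of loopback, which its documented `OLLAMA_HOST` variable controls:

```shell
# Make the Ollama service listen on all interfaces (default is 127.0.0.1 only)
sudo systemctl edit ollama
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama

# Pull the model so Open WebUI can see it
# (note: the model tag is llama3, not ollama3)
ollama pull llama3:latest
```

Treat this as a sketch; firewall rules on the server may still need opening for port 11434.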
@romayojr • 3 months ago
This is awesome and I can't wait to try it. Is there a mobile app for Open WebUI?
@jjaard • 3 months ago
I suspect that technically it can easily run in any browser.
@trucpham9772 • 3 months ago
How do I run llama3 on macOS? I want to expose localhost publicly to use nextchatgpt. Can you share the commands for this solution?
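A minimal sketch for macOS, assuming Homebrew is installed and that the goal is to make Ollama reachable from other machines (exposing it to the public internet is risky and not shown here). `launchctl setenv OLLAMA_HOST` is the documented way to change the listen address for the macOS app:

```shell
# Install and start Ollama on macOS
brew install ollama

# Make Ollama listen on all interfaces instead of only localhost,
# then restart the Ollama app (or `ollama serve`) so it takes effect
launchctl setenv OLLAMA_HOST "0.0.0.0"
ollama serve &

# Download and chat with the model (the tag is llama3, not ollama3)
ollama pull llama3
```

Any chat front end pointed at http://<your-mac-ip>:11434 should then reach it on the local network.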
@mjes911 • 2 months ago
How many concurrent users can this support for business cases?
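There is no fixed answer, since it depends on model size and GPU memory, but Ollama does expose documented environment variables for concurrency. A sketch with illustrative values (size them to your VRAM):

```shell
# OLLAMA_NUM_PARALLEL: parallel requests served per loaded model
# OLLAMA_MAX_LOADED_MODELS: how many models may be resident at once
# Requests beyond these limits are queued rather than rejected.
OLLAMA_NUM_PARALLEL=4 OLLAMA_MAX_LOADED_MODELS=2 ollama serve
```

For real business loads you would benchmark with your actual model and hardware rather than trust any generic number.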
@kenmurphy4259 • 3 months ago
Thanks Brandon, nice review of what's out there for local LLMs.
@SteheveRodriguez • 3 months ago
It's a great idea, thanks Brandon. I will test it on my homelab.
@fermatdad • 3 months ago
Thank you for the helpful tutorial.
@LibyaAi • 3 months ago
Nobody explained how to install Ollama and run it the proper way. It should be in steps. Is Docker required before installing Ollama? I tried to install Ollama alone and it didn't install completely!! I don't know why.
@kironlau • 3 months ago
1. You should mention what your OS is.
2. Read the official documentation.
3. If you run Windows, just download the exe/msi file and install with one click (and click Yes...).
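To answer the Docker question directly: Docker is not required for Ollama itself; it is typically only used to run the Open WebUI front end. A sketch of the documented install paths, assuming Linux/WSL for the first command:

```shell
# Ollama on Linux/WSL: official one-line installer (no Docker needed)
curl -fsSL https://ollama.com/install.sh | sh

# First run downloads the model, then drops into a chat prompt
ollama run llama3

# Open WebUI (optional browser front end), run via Docker:
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

On Windows and macOS, the installer from ollama.com replaces the curl step; the Docker command stays the same.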