Local LLM with Ollama, LLAMA3 and LM Studio // Private AI Server

  14,113 views

VirtualizationHowto

A day ago

Comments
@kenmurphy4259 6 months ago
Thanks Brandon, nice review of what’s out there for local LLMs
@SteheveRodriguez 6 months ago
It's a great idea, thanks Brandon. I will test it on my homelab.
@romayojr 6 months ago
This is awesome and I can't wait to try it. Is there a mobile app for Open WebUI?
@jjaard 6 months ago
I suspect that technically it can easily run via any browser.
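A minimal sketch of that idea, assuming Docker is installed on the server: Open WebUI ships as a container, and once it is running, any browser on the LAN (including a phone browser) can reach it, so no separate mobile app is needed. The port mapping and volume name below follow the commonly documented defaults.

```shell
# Run Open WebUI in a container and point it at an Ollama server
# running on the same host (reachable via host.docker.internal).
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main

# Then browse to http://<server-ip>:3000 from any device on the LAN.
```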
@kderectorful 5 months ago
I am accessing the Open WebUI server via a Mac, and my guess is that the netsh command applies to the Windows workstation you are accessing the server from. Is there a similar command that would need to be run, or if I do this on my Linux server via Firefox, will I still have the same issue? I cannot seem to get llama3:latest installed for Open WebUI. Any insight would be greatly appreciated, as this was the most concise video I have seen on the topic.
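A hedged sketch addressing this question: netsh is Windows-only and is typically needed when Ollama runs inside WSL2, because WSL2 sits behind its own virtual network. A native Linux or macOS server has no such boundary, so no equivalent command is required there; binding Ollama to all interfaces is enough. The `<wsl-ip>` placeholder is whatever address WSL reports.

```shell
# Windows only (elevated PowerShell/cmd): forward the Windows host's
# port 11434 into the WSL2 VM where Ollama listens.
# Replace <wsl-ip> with the address from: wsl hostname -I
netsh interface portproxy add v4tov4 listenaddress=0.0.0.0 listenport=11434 connectaddress=<wsl-ip> connectport=11434

# On native Linux/macOS, just bind Ollama to all interfaces instead:
OLLAMA_HOST=0.0.0.0 ollama serve
```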
@mjes911 5 months ago
How many concurrent users can this support for business cases?
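There is no fixed answer, since capacity depends on GPU VRAM, model size, and quantization, but as a sketch: recent Ollama releases expose environment variables for tuning parallelism, which is where you would start for a multi-user deployment. The values below are illustrative, not recommendations.

```shell
# Tune Ollama's request handling before starting the server.
export OLLAMA_NUM_PARALLEL=4        # parallel requests per loaded model
export OLLAMA_MAX_LOADED_MODELS=2   # models kept in memory at once
export OLLAMA_MAX_QUEUE=512         # requests queued before rejecting
ollama serve
```

Load-test with realistic prompts before committing to a business use case; throughput drops sharply once requests queue behind a saturated GPU.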
@trucpham9772 6 months ago
How do I run llama3 on macOS? I want to expose localhost publicly to use nextchatgpt; can you share the commands for this solution?
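A minimal sketch for macOS, assuming Homebrew is installed: install Ollama, bind the API to all interfaces so a chat front end on another device can reach it, and pull the llama3 model.

```shell
# Install and start Ollama, exposing the API beyond localhost.
brew install ollama
OLLAMA_HOST=0.0.0.0:11434 ollama serve &   # API at http://<mac-ip>:11434
ollama pull llama3
ollama run llama3
```

A chat UI can then be pointed at `http://<mac-ip>:11434` as an OpenAI-compatible or Ollama endpoint. Exposing the port beyond your LAN should go through a reverse proxy with authentication rather than a raw port forward.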
@fermatdad 6 months ago
Thank you for the helpful tutorial.
@LibyaAi 6 months ago
Nobody explained how to install Ollama and run it in the proper way; it should be in steps. Is it important to install Docker before Ollama? I tried to install Ollama alone and it didn't install completely! I don't know why.
@kironlau 6 months ago
1. You should mention what your OS is. 2. Read the official documentation. 3. If you run on Windows, just download the exe/msi file and install with one click (and click yes...).
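To spell out those steps for Linux, a hedged sketch: Docker is not required for Ollama itself (it is only needed for extras like Open WebUI). The official one-line installer sets up the binary and, on systemd distributions, a background service.

```shell
# Official Linux installer (no Docker required for Ollama itself).
curl -fsSL https://ollama.com/install.sh | sh

# Verify the install, then pull and chat with a model:
ollama --version
ollama run llama3
```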
@SyamsQbattar 5 months ago
Is LMStudio better than Ollama?
@camsand6109 5 months ago
No, but it's a good option.
@SyamsQbattar 5 months ago
@camsand6109 Then, is Ollama better?
@TheTricou 4 months ago
So this "guide" is missing some key things, like how to change the IP for WSL and how to run Ollama as a service. Even his written guide does not explain how to do this.
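For the service part, a sketch assuming a recent WSL2 with systemd enabled (`systemd=true` under `[boot]` in `/etc/wsl.conf`): a small unit file keeps `ollama serve` running in the background and restarts it on failure. The `User=ollama` line assumes the install script created that account; adjust if yours runs as a different user.

```shell
# Create a systemd unit so Ollama runs as a background service.
sudo tee /etc/systemd/system/ollama.service >/dev/null <<'EOF'
[Unit]
Description=Ollama Server
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
Environment="OLLAMA_HOST=0.0.0.0"
Restart=always
User=ollama

[Install]
WantedBy=multi-user.target
EOF

sudo systemctl daemon-reload
sudo systemctl enable --now ollama
```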
@thiesenf 5 months ago
GPT4All is another good locally running chat interface... it can run on both the CPU and the GPU using Vulkan...
@nobody-P 6 months ago
😮I'm gonna try this now
@klovvin 5 months ago
This would be better content if done by an AI
@thiesenf 5 months ago
At least we got the usual extremely boring stock videos as B-roll... *sigh*...
Local LLM Challenge | Speed vs Efficiency
16:25
Alex Ziskind
116K views
host ALL your AI locally
24:20
NetworkChuck
1.5M views
Run a GOOD ChatGPT Alternative Locally! - LM Studio Overview
15:16
MattVidPro AI
52K views
Container vs VM: Hypervisor War is Over!
13:42
VirtualizationHowto
16K views
All You Need To Know About Running LLMs Locally
10:30
bycloud
192K views
I Analyzed My Finance With Local LLMs
17:51
Thu Vu data analytics
501K views
FREE Local LLMs on Apple Silicon | FAST!
15:09
Alex Ziskind
218K views
Ollama AI Home Server ULTIMATE Setup Guide
26:06
Digital Spaceport
36K views