AI Unleashed: Install and Use Local LLMs with Ollama - ChatGPT on Steroids! (FREE)

5,679 views

Linux Tex

1 day ago

Comments: 34
@乾淨核能 5 months ago
hardware requirement?
@m4saurabh 5 months ago
Nvidia H100
@乾淨核能 5 months ago
@@m4saurabh seriously? that's not for everyone T_T
@LinuxTex 5 months ago
For Phi-3 3.8B, 8 GB RAM; no GPU needed. For Llama 3.1 8B, 16 GB RAM; most consumer GPUs will suffice. An H100 is not necessary.😉
@乾淨核能 5 months ago
@@LinuxTex thank you!
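The models discussed above can be fetched and run with Ollama's CLI; as a sketch, assuming the tag names from Ollama's model library:

```shell
# Phi-3 Mini (3.8B) -- runs in ~8 GB RAM, CPU-only is fine
ollama pull phi3
ollama run phi3 "Explain what a shell is in one sentence."

# Llama 3.1 8B -- more comfortable with 16 GB RAM or a consumer GPU
ollama pull llama3.1
ollama run llama3.1 "Explain what a shell is in one sentence."
```

Both commands download the model on first use, so expect a few GB of traffic per model.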
@go0ot 4 months ago
Do more local LLM install videos!
@nothingbutanime4975 5 months ago
Do make a video on local models for AI image generation.
@LinuxTex 5 months ago
Definitely bro👍
@obertscloud 21 days ago
Thanks! How do I get models that are not censored?
@MrNorthNJ 5 months ago
I have an old file/media server which I am planning on rebuilding as a future project. Would I be able to run this on that server and still access it with other computers on my network, or would it just be available on the server itself?
@LinuxTex 5 months ago
That's a great idea, actually. You could make it accessible to other devices on your network, as Ollama supports that.
@MrNorthNJ 5 months ago
@@LinuxTex Thanks!
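A minimal sketch of the network setup: Ollama binds to 127.0.0.1 by default, so to reach it from other machines on the LAN you would set OLLAMA_HOST to listen on all interfaces (the exact service setup may vary by distro):

```shell
# One-off: serve on all interfaces instead of loopback only
OLLAMA_HOST=0.0.0.0 ollama serve

# For the systemd service install, set the variable via an override:
#   sudo systemctl edit ollama.service
# then add under [Service]:
#   Environment="OLLAMA_HOST=0.0.0.0"
# and restart:
#   sudo systemctl restart ollama

# From another computer on the network (replace <server-ip>):
curl http://<server-ip>:11434/api/tags
```

Note that this exposes the API to everyone on the network with no authentication, so keep it behind your firewall.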
@shubhamshandilya6160 4 months ago
How do you make API calls to these offline LLMs, for use in projects?
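Ollama exposes a local REST API on port 11434 that projects can call directly. A minimal sketch using only the Python standard library (the model name llama3.1 is an assumption — use whichever model you have pulled):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str, url: str = OLLAMA_URL) -> str:
    """Send a prompt to a locally running Ollama server and return the reply text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Usage (requires `ollama serve` running and the model pulled):
#   print(generate("llama3.1", "Why is the sky blue?"))
```

With `"stream": False` the server returns one JSON object; leaving streaming on would instead yield a sequence of JSON lines to read incrementally.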
@MuhammadMuaazAnsari-l1b 5 months ago
0:39 The Holy Trinity 😂😂🤣
@nehalmushfiq141 5 months ago
Okay, that's something good; the content is appreciated.
@LinuxTex 5 months ago
Thanks Nehal. 👍
@kennethwillis8339 4 months ago
How can I have my local LLM work with my files?
@LinuxTex 4 months ago
Sir, you need to set up RAG for that. In the MSTY app linked in the description below, you can create knowledge bases easily by just dragging and dropping files. Then you can interact with them using your local LLMs.
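As a rough sketch of what RAG does under the hood (MSTY automates all of this): embed your file's chunks, find the chunks most similar to the question, and prepend them to the prompt. The /api/embeddings endpoint and the nomic-embed-text model name follow Ollama's API, but treat the details as assumptions:

```python
import json
import math
import urllib.request

EMBED_URL = "http://localhost:11434/api/embeddings"


def embed(text: str, model: str = "nomic-embed-text") -> list:
    """Get an embedding vector for `text` from a local Ollama server."""
    data = json.dumps({"model": model, "prompt": text}).encode("utf-8")
    req = urllib.request.Request(
        EMBED_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["embedding"]


def cosine(a: list, b: list) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def top_chunks(question_vec, chunk_vecs, chunks, k=2):
    """Return the k chunks whose embeddings are closest to the question."""
    ranked = sorted(
        zip(chunks, chunk_vecs),
        key=lambda cv: cosine(question_vec, cv[1]),
        reverse=True,
    )
    return [c for c, _ in ranked[:k]]


# Usage sketch (requires a running Ollama server with an embedding model pulled):
#   vecs = [embed(c) for c in chunks]               # index the file's chunks once
#   context = top_chunks(embed(question), vecs, chunks)
#   prompt = "Context:\n" + "\n".join(context) + "\n\nQuestion: " + question
```

The selected chunks then go into a normal generation request, so the local LLM answers grounded in your files.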
@atharavapawar3272 5 months ago
Which Linux is best for my Samsung NP300E5Z with 4 GB RAM and an Intel Core i5-2450M processor? Please reply.
@caststeal 4 months ago
Go with MX Linux and the Xfce desktop environment.
@decipher365 4 months ago
Yes, we need many more such videos.
@Soth0w76 4 months ago
I already use Ollama on my Galaxy A54 with Kali NetHunter.
@nageswaraopatha5445 5 months ago
Do more videos like this ❤ good apps
@LinuxTex 5 months ago
Will do. Thanks for the comment👍
@Mr_Ravee 5 months ago
Quality content dude..👍 got a sub👆👍
@MarimoBakaa 5 months ago
Helpful video!
@LinuxTex 5 months ago
Thank you Syed👍
@janetmartha3261 5 months ago
Useful information 💯 🔥🔥🔥🔥🔥
@LinuxTex 5 months ago
Thank you👍
@rohankshetrimayum1049 5 months ago
Wow
@gamalieltonatiuh1924 4 months ago
😮👍