How to Run a Local LLM on Raspberry Pi: Step-by-Step Guide to Deploy AI Models Locally

3,191 views

Alessandro Crimi

1 day ago

Comments: 6
@yucelaytacakgun9928 9 days ago
How can we use RAG? I want to use a local LLM for coding and to run the code. Finally, a working robot.
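A minimal sketch of how retrieval-augmented generation could be wired up against a local model, assuming an Ollama server is already running on the Pi at localhost:11434 with tinyllama and an embedding model such as nomic-embed-text pulled; the model names and example documents are placeholders, not the video's actual setup:

import math
import requests

# Minimal retrieval-augmented generation against a local Ollama server.
# Assumes `ollama serve` is running and both models have been pulled;
# model names and documents below are placeholders.
OLLAMA = "http://localhost:11434"
EMBED_MODEL = "nomic-embed-text"  # assumption: any pulled embedding model works here
CHAT_MODEL = "tinyllama"          # small enough for a 4 GB Pi 4

documents = [
    "The robot gripper is driven over GPIO pin 18 using PWM.",
    "Camera frames are captured with picamera2 at 640x480.",
]

def embed(text):
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": EMBED_MODEL, "prompt": text})
    return r.json()["embedding"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def answer(question):
    # Retrieve: pick the document most similar to the question.
    q = embed(question)
    best = max(documents, key=lambda d: cosine(q, embed(d)))
    # Generate: hand the retrieved document to the LLM as context.
    prompt = f"Context:\n{best}\n\nQuestion: {question}\nAnswer:"
    r = requests.post(f"{OLLAMA}/api/generate",
                      json={"model": CHAT_MODEL, "prompt": prompt, "stream": False})
    return r.json()["response"]

print(answer("Which pin drives the gripper?"))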
@Lp-ze1tg 2 months ago
Raspberry Pi 4 or 5? 4 GB or 8 GB of RAM?
@alecrimi 2 months ago
In this video it is an RPi 4 with 4 GB; if you have a Pi 5 and 8 GB of RAM, things will definitely go better. Also, you cannot run the Phi LLM or Llama 2 with 4 GB of RAM; with 4 GB you can only use TinyLlama and TinyDolphin.
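A rough way to encode that rule of thumb in code; reading MemTotal from /proc/meminfo is standard, but the thresholds and model names below are illustrative, not exact limits:

# Pick a model that plausibly fits in the Pi's RAM, following the rule of
# thumb above; the thresholds and model names are rough guesses.
def total_ram_gb():
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemTotal:"):
                return int(line.split()[1]) / 1024 / 1024  # kB -> GB
    return 0.0

ram = total_ram_gb()
if ram > 7:              # 8 GB Pi 4 / Pi 5
    model = "llama2"
elif ram > 5:
    model = "phi"
else:                    # 4 GB Pi 4: stick to the tiny models
    model = "tinyllama"  # or "tinydolphin"

print(f"{ram:.1f} GB RAM detected, suggested model: {model}")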
@LivingLinux 2 months ago
@alecrimi You can use a swap file. Better make sure you have some fast storage. But the larger the model, the slower it gets (even without swap).
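A small sketch of the trade-off described here, assuming the third-party psutil package is installed and using a placeholder model path: if the model file is larger than the free RAM, the Pi keeps re-reading pages from swap or disk, which is where the slowdown comes from.

import os
import psutil  # third-party: pip install psutil

MODEL_PATH = "/home/pi/models/tinyllama.gguf"  # placeholder: point at your model file

model_gib = os.path.getsize(MODEL_PATH) / 2**30
free_gib = psutil.virtual_memory().available / 2**30
swap_gib = psutil.swap_memory().total / 2**30

print(f"model {model_gib:.1f} GiB, free RAM {free_gib:.1f} GiB, swap {swap_gib:.1f} GiB")

if model_gib > free_gib:
    # The model cannot stay resident in RAM, so pages will keep coming back
    # from swap or from the file on disk: expect much slower token generation.
    print("Model larger than free RAM: expect heavy swapping and slow output.")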
@alecrimi 2 months ago
@LivingLinux Swap, or an external HAT like the Coral or the Hailo-8L; there are plenty of possibilities. Though the Hailo HAT works only on the RPi 5, and given the price it makes more sense to buy a Jetson.
@Levince36 9 days ago
@LivingLinux May I ask why my Pi couldn't run an SSD while my HDD worked? Is it a support issue with the Pi 4 (4 GB), a lack of power, or something else?
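One way to narrow down the "lack of power" guess is to ask the firmware whether it has ever seen under-voltage; a sketch assuming the stock vcgencmd tool that ships with Raspberry Pi OS:

import subprocess

# Ask the firmware for its throttle flags and decode the power-related bits.
out = subprocess.run(["vcgencmd", "get_throttled"],
                     capture_output=True, text=True, check=True).stdout
flags = int(out.strip().split("=")[1], 16)

BITS = {
    0: "under-voltage detected right now",
    16: "under-voltage has occurred since boot",
    18: "throttling has occurred since boot",
}
problems = [msg for bit, msg in BITS.items() if flags & (1 << bit)]
print("\n".join(problems) if problems else "no under-voltage or throttling reported")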