LOCAL LLM QWEN2:0.5b RUNNING ON A HOME SERVER (I3-8100T) THAT ALSO RUNS HOME ASSISTANT AND MORE
So I find it quite amazing. The A.I. assistant you see is running on my home server alongside Home Assistant (which runs in a VirtualBox VM with 2 cores and 6 GB of memory), plus Plex, proxies, cloud file sharing, and plenty of other stuff. It's not the fastest, since it has to generate all the text before posting the answer. All this on an old i3-8100T (a tiny Lenovo, ultra quiet, hidden under my TV) that I bought 3 or 4 years ago for a little more than a hundred dollars. We'll soon be able to control smart devices via Home Assistant and an Ollama LLM. Cheap all-in-one devices with wake word, voice assistant, and A.I., all running locally, are just around the corner.
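For anyone curious how little it takes to reproduce this, here's a minimal sketch of the LLM side of the setup. It assumes Ollama is already installed and its daemon is running on the server (the example prompt is just an illustration); the model tag is the one from the title:

```shell
# Pull the small 0.5B-parameter Qwen2 model used here
ollama pull qwen2:0.5b

# Chat with it interactively from the terminal
ollama run qwen2:0.5b "Say hello in one sentence."

# Or query the local HTTP API (default port 11434), which is
# what integrations like Home Assistant's Ollama integration use
curl http://localhost:11434/api/generate \
  -d '{"model": "qwen2:0.5b", "prompt": "Hello", "stream": false}'
```

With `"stream": false` the API returns the full answer in one response, which matches the "generate all the text before posting" behaviour described above; leaving streaming on would show tokens as they're produced instead.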