@therobotstudio 10 months ago
Great stuff as always! Will be trying this as soon as I’m back in the workshop…
@Hardwareai 10 months ago
Please do! :)
@exploring-electronic 10 months ago
More no-bullsh**t tutorials! 👍
@Hardwareai 10 months ago
More to come!
@ggerganov 10 months ago
Very good tutorial - well done! Btw, following the chat format of the model (i.e. adding the model's special chat-template tokens) should improve the accuracy, though at the moment there is not an easy way to do it with "server". It's possible to do it with "main", but it takes some extra command-line arguments.
@Hardwareai 10 months ago
Appreciate your work on llama.cpp! Thanks for the info, it will definitely be useful for people watching.
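To illustrate the chat-template point from the thread above, here is a minimal sketch of wrapping a prompt yourself before sending it to llama.cpp's "server". The template shown is the Llama-2 chat format, and the endpoint URL is an assumption for illustration; other models use different special tokens, so check your model's card.

```python
def build_llama2_prompt(system: str, user: str) -> str:
    """Wrap a system and user message in a Llama-2-style chat template.

    llama.cpp's "server" (at the time of the comment) did not apply the
    model's chat template for you, so the prompt must be formatted by
    the client before it is sent.
    """
    return (
        "<s>[INST] <<SYS>>\n"
        f"{system}\n"
        "<</SYS>>\n\n"
        f"{user} [/INST]"
    )

prompt = build_llama2_prompt(
    "You are a helpful assistant running on a Raspberry Pi.",
    "What is quantization?",
)
print(prompt)

# The formatted prompt can then be POSTed to the server, e.g.:
#   requests.post("http://localhost:8080/completion",
#                 json={"prompt": prompt, "n_predict": 128})
```

This mirrors what "main" does internally when given the extra chat-format arguments; with "server", the client takes on that responsibility.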
@sleetible 4 months ago
Does the new Hailo AI module offer any improvement for running any of these LLMs? I know it speeds up the vision side of things, but I haven't seen anyone use it for LLMs yet.
@sleetible 4 months ago
Oh, and would it speed up Whisper at all, allowing a larger model to be run?
@Hardwareai 4 months ago
re: LLMs, not really. The question has been asked many times in different places; here is one of the replies: www.reddit.com/r/LocalLLaMA/comments/1d7shcr/comment/l71q04c re: Whisper: given that it is a transformer as well, Hailo accelerators are not geared towards this type of NN. But I remember seeing a paper about modifying BERT to run on the Google Coral USB, so... your mileage may vary, but it is going to be very far from plug-and-play.
@paulkolesnikov1441 3 months ago
Would an AI hat help with the model? I am pretty new to this, but I wanted a local LLM on a Raspberry Pi.
@Hardwareai 3 months ago
Hello! Check the other comments; some people have already asked about the Hailo AI module (which is what is inside the Pi AI hat). Short answer: no, it can't be used for LLM acceleration.
@Hardwareai 3 months ago
But on the bright side, you don't really need a special AI hat. If you watched the video, you can see that small language models run pretty fast on the Pi.
@JayPixie1113 4 months ago
What Raspberry Pi version did you use?
@Hardwareai 3 months ago
Raspberry Pi 4. I only had the 4 at the time, but I have since bought a 5 as well.