Thanks a lot for the detailed explanation in the video! I have a question regarding Ollama. Is it possible to use Ollama and the models available on it in a production environment? I would love to hear your thoughts or any experiences you might have with it. Thank you!
@serychristianrenaud (5 months ago)
Thanks
@ssswayzzz (4 months ago)
Do you think you will need more than an M3 Max with 36 GB and a 14-core CPU / 30-core GPU to run all these models? I'm thinking of getting one myself to try them out; all I know is that I need a capable device. By the way, thank you so much for your videos. Once I get this device, I'm sure I'll be around your channel a lot, benefiting from your experience. Thank you!
@AIDevBytes (4 months ago)
You don't have to have the M3 Max, but it will definitely help. I'd say the minimum RAM you'll need on Apple Silicon is 16 GB to run the smaller models. With the M3 Max you'll be able to run most open-source models up to about 40 billion parameters, just not the really large ones like the 70-billion-parameter models. Also, happy you're finding the content useful.
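To put rough numbers behind the sizing advice above, here is a back-of-the-envelope sketch. The figures of 0.5 bytes per parameter (typical for 4-bit quantization) and a ~20% runtime overhead for the KV cache and framework are assumptions for illustration, not measured values:

```python
def estimated_ram_gb(params_billion: float,
                     bytes_per_param: float = 0.5,
                     overhead: float = 1.2) -> float:
    """Rule-of-thumb RAM estimate for a quantized LLM.

    params_billion  -- model size in billions of parameters
    bytes_per_param -- ~0.5 for 4-bit quantization (assumed)
    overhead        -- ~20% extra for KV cache and runtime (assumed)
    """
    return params_billion * bytes_per_param * overhead

# A 40B model at 4-bit: ~24 GB, so it fits in 36 GB of unified memory.
print(estimated_ram_gb(40))  # 24.0

# A 70B model at 4-bit: ~42 GB, which exceeds a 36 GB machine.
print(estimated_ram_gb(70))  # 42.0
```

Under these assumptions, a 40B model fits comfortably on a 36 GB M3 Max while a 70B model does not, which lines up with the advice above.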
@kannansingaravelu (3 months ago)
How do we fine-tune the model? Do you have a video on it?
@AIDevBytes (3 months ago)
I don't have a video covering that yet, but I'm looking to create one in the future.