Thank you guys for all the love and support! Hope you enjoy this video on Ollama and running models locally.
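For anyone who wants to try this right away, here is a minimal sketch of calling a locally running Ollama server from Python over its REST API. It assumes Ollama is installed and serving on its default port 11434, and that a model has already been pulled; the "llama3" tag below is just an example.

```python
# Minimal sketch: query a locally running Ollama server over its REST API.
# Assumes Ollama is serving on the default port 11434 and that the "llama3"
# model (an example tag) has already been pulled with `ollama pull`.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",          # any locally pulled model tag works here
        "prompt": "Explain what Ollama does in one sentence.",
        "stream": False,            # return the full completion as one JSON object
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])  # the generated text
```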
@phantomslayer9714 • 2 months ago
Bro, your content is underrated. Absolute W. Appreciate it. Literally helped me so much!!!!
@leonvanzyl • 2 months ago
Wow! Thank you! Glad I could help.
@regman25 • 4 months ago
Leon, all your videos are always amazingly simple to understand! Thanks for your work!
@leonvanzyl • 4 months ago
You are very welcome
@Rhannmah • 2 months ago
Very well made beginner tutorial! Much appreciated!
@leonvanzyl • 2 months ago
You're welcome
@muratcanyuksel1959 • 4 months ago
Great video: clear, structured, and to the point. Thank you, Leon!
@leonvanzyl • 4 months ago
Thank you 😊
@maniecronje • 4 months ago
Great content as always, Leon ❤ Looking forward to more FW videos 😊
@leonvanzyl • 4 months ago
Thank you!
@shadowrebel9182 • 1 month ago
THANK U VERY MUCH MAN. IT HELPED ME SO MUCH
@leonvanzyl • 1 month ago
You're welcome 🤗
@jgz2 • 4 months ago
Thanks Leon
@leonvanzyl • 4 months ago
You're welcome
@LucasLaino • 4 months ago
It would be amazing to see an integration with WhatsApp!
@jacobtyo7805 • 4 months ago
Is there a reason you guys went with Ollama vs vLLM?
@leonvanzyl • 4 months ago
Hey there. Ollama integration is available in many of the platforms and frameworks covered on my channel, like LangChain, CrewAI, Flowise and Langflow. I'm not sure how well vLLM is supported, though. I'll check it out 👍
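As a rough illustration of that kind of integration, here is a minimal sketch using LangChain's Ollama wrapper. It assumes `pip install langchain-ollama`, a local Ollama server, and a pulled "llama3" model (the model name is an example); exact package and class names can differ between LangChain releases.

```python
# Minimal sketch of Ollama inside LangChain, one of the frameworks mentioned above.
# Assumes `pip install langchain-ollama` and a local Ollama server with "llama3"
# pulled; package/class names may vary between LangChain versions.
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3", temperature=0)  # talks to localhost:11434 by default
reply = llm.invoke("Name two benefits of running LLMs locally.")
print(reply.content)
```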
@Aliexpress71 • 4 months ago
Could you refine and improve the use of Flowise with Ollama? For example: Flowise, Redis, PostgreSQL, and PostgreSQL with pgvector. I hope it brings good things...
@leonvanzyl • 4 months ago
That's exactly the plan. I'll be using local models in some of the tutorials going forward, not only for Flowise but for other platforms as well.
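Most of the stack the commenter describes (Flowise, Redis, Postgres) is wired together in the Flowise UI rather than in code, but the pgvector part comes down to storing and searching embeddings in Postgres. Here is a minimal sketch of that piece, assuming a local Ollama server with the "nomic-embed-text" embedding model pulled and a Postgres database with the pgvector extension available; the connection string and table name are placeholders.

```python
# Minimal sketch of the pgvector piece of that stack: embed text with a local
# Ollama embedding model and store/search the vectors in PostgreSQL.
# Assumptions: `pip install requests psycopg2-binary`, a local Ollama server with
# the "nomic-embed-text" model pulled, and a Postgres database with the pgvector
# extension installed. The DSN and table name below are placeholders.
import requests
import psycopg2

def embed(text: str) -> str:
    """Return a pgvector literal ('[0.1,0.2,...]') for the given text."""
    r = requests.post(
        "http://localhost:11434/api/embeddings",
        json={"model": "nomic-embed-text", "prompt": text},
        timeout=60,
    )
    r.raise_for_status()
    vec = r.json()["embedding"]
    return "[" + ",".join(str(x) for x in vec) + "]"

conn = psycopg2.connect("dbname=demo user=postgres password=postgres")  # placeholder DSN
with conn, conn.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
    # nomic-embed-text produces 768-dimensional vectors.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS docs "
        "(id serial PRIMARY KEY, body text, embedding vector(768))"
    )
    doc = "Ollama runs large language models locally."
    cur.execute(
        "INSERT INTO docs (body, embedding) VALUES (%s, %s::vector)",
        (doc, embed(doc)),
    )
    # Nearest-neighbour search; <=> is pgvector's cosine-distance operator.
    cur.execute(
        "SELECT body FROM docs ORDER BY embedding <=> %s::vector LIMIT 3",
        (embed("local LLM runtime"),),
    )
    print([row[0] for row in cur.fetchall()])
```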