Simply put, this is how you can deploy LLM/SLM models such as Mistral, Microsoft Phi, or Llama on your preferred environment, whether that is your local machine, a Docker container, or Kubernetes.
The demo uses Ollama as the platform for deploying and managing the models.
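As a minimal sketch (not taken from the video itself), assuming Ollama is running locally on its default port and a model has already been pulled with `ollama pull mistral`, you can query it through its HTTP API like this:

```python
# Minimal sketch: query a model served by a local Ollama instance.
# Assumes Ollama is running on the default port 11434 and that the
# "mistral" model has already been pulled with `ollama pull mistral`.
import json
import urllib.request

payload = json.dumps({
    "model": "mistral",
    "prompt": "Explain Kubernetes in one sentence.",
    "stream": False,  # request a single JSON response instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read().decode("utf-8"))
    print(body["response"])  # the model's generated text
```

The same request works unchanged whether Ollama runs natively, in a Docker container, or behind a Kubernetes Service, since only the host and port change.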
Disclaimer: This is part of my Udemy course: www.udemy.com/...
Follow me on Twitter for more content: @houssemdellai