In this video, I'll show you how I used Ollama in my project. Ollama allows you to run open-source large language models, such as Llama 3, locally. It optimizes setup and configuration details, including GPU usage.
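As context for the video, Ollama serves models over a local REST API (by default on port 11434, with a `/api/generate` endpoint). Here's a minimal sketch of calling it from Python with only the standard library — the model name `llama3` and the prompt are just placeholders, and this assumes Ollama is already running locally:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks for a single JSON response instead of streamed chunks
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The completed text is returned under the "response" key
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("llama3", "Why is the sky blue?"))
```

There is also an official `ollama` Python client package that wraps this same API, if you'd rather not build requests by hand.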
Here are the links to the previous videos:
1. • I Built an Interactive...
2. • I Built an Interactive...
3. • I Built an interactive...
My system:
Dell Alienware Aurora R10
- Ryzen 7 5800
- Nvidia RTX 3090
- 64 GB DDR4 RAM