Nice. May I know what configuration is required to run the above video's setup locally? Like RAM and SSD.
@DataMagicAI 13 days ago
I am running on 32 GB RAM with a 512 GB SSD. The machine also has 2 GB of GPU memory. But you can run it on any machine with more than 16 GB RAM, since Ollama uses quantized/smaller-variant LLMs.