How to Run Ollama Docker FastAPI: Step-by-Step Tutorial for Beginners

  4,916 views

Bitfumes

1 day ago

Comments: 20
@Bitfumes 5 months ago
Please subscribe to Bitfumes channel to level up your coding skills. Do follow us on other social platforms: Hindi Channel → www.youtube.com/@code-jugaad LinkedIn → www.linkedin.com/in/sarthaksavvy/ Instagram → instagram.com/sarthaksavvy/ X → x.com/sarthaksavvy Telegram → t.me/+BwLR8bKD6iw5MWU1 Github → github.com/sarthaksavvy Newsletter → bitfumes.com/newsletters
@ano2028 5 months ago
Thanks a lot! The whole tutorial is really easy to follow. I had been trying to dockerize my FastAPI container and get it to interact with my Ollama container for the last two days; your video helped me a lot.
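A minimal sketch of the two-container setup described above, assuming a Compose file with services named `ollama` and `api` (both names, the ports, and the env variable are assumptions, not taken from the video):

```yaml
# docker-compose.yml - minimal sketch of a FastAPI + Ollama pairing
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"          # Ollama's default API port
  api:
    build: .                   # FastAPI app built from the local Dockerfile
    ports:
      - "8000:8000"
    environment:
      # inside the Compose network, reach Ollama by service name, not localhost
      - OLLAMA_HOST=http://ollama:11434
    depends_on:
      - ollama
```

Both services share Compose's default network, so the `api` container can resolve the hostname `ollama` directly.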
@Bitfumes 5 months ago
Wow, that's nice to know, and thanks for this amazing comment. Please subscribe to my newsletter bitfumes.com/newsletters
@mochammadrevaldi1790 5 months ago
Helpful, thanks man!
@Bitfumes 5 months ago
Cool, cool. Please subscribe to my newsletter bitfumes.com/newsletters
@NikolaosPapathanasiou 5 months ago
Hey, nice video man! Since Ollama is running in a Docker container, is it using the GPU?
@Bitfumes 5 months ago
Not in my case; it uses the CPU. But you need to specify the runtime if you want to use the GPU in Docker, so yes, you can.
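The runtime detail mentioned above can be sketched in Compose. This assumes an NVIDIA GPU on a Linux host with the NVIDIA Container Toolkit installed; it is not the exact setup from the video:

```yaml
# docker-compose.yml - sketch of reserving a GPU for the Ollama service
services:
  ollama:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all            # or an integer to limit GPU count
              capabilities: [gpu]
```

Without a reservation like this (or the equivalent `docker run --gpus all`), the container falls back to CPU inference.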
@karthikb.s.k.4486 5 months ago
Nice. May I know how you are getting suggestions in VS Code? When you type a Docker command, suggestions appear. What setting enables this? Please let me know.
@Bitfumes 5 months ago
Thanks bhai. By the way, I am using GitHub Copilot, so maybe that's why I get suggestions.
@HennuhoMifirinso 2 months ago
Thanks man! This works well, but it downloads the model every time you run docker-compose... is there a way to persist the model?
@Bitfumes 2 months ago
You can use a Docker volume and install the model there, so you don't have to download it every time you create the container.
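The volume approach can be sketched like this. The volume name is an assumption; `/root/.ollama` is where the official `ollama/ollama` image stores downloaded models:

```yaml
# docker-compose.yml - sketch of persisting Ollama's model store
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama_models:/root/.ollama   # models survive container recreation

volumes:
  ollama_models:                      # named volume managed by Docker
```

With the named volume in place, `docker-compose down` followed by `docker-compose up` reuses the already-downloaded models instead of pulling them again.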
@shreyarajpal4212 3 months ago
So I can directly host this as a website then, right?
@Bitfumes 3 months ago
Yes and no. You obviously can, but it's not recommended as-is. That said, you can use AWS ECS to set up Docker and then deploy the same application.
@mat15rodrig 3 months ago
Thanks for the video!! Do you know how to resolve this error? ERROR:root:Error during query processing: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/chat (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
@Bitfumes 3 months ago
Make sure your Ollama is running properly, and you must use 0.0.0.0 as the host for Docker, not localhost.
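The error above typically means the FastAPI container is calling `localhost`, which refers to itself, not to the Ollama container. A minimal sketch of pointing the client at the right host, assuming a Compose service named `ollama` and a hypothetical `OLLAMA_HOST` environment variable (neither is confirmed by the video):

```python
import os

# Inside a container, "localhost" is the container itself, so a request to
# localhost:11434 is refused. Read the Ollama base URL from the environment
# and fall back to the Compose service name (an assumption).
OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://ollama:11434")


def chat_url(host: str = OLLAMA_HOST) -> str:
    """Build the Ollama chat endpoint URL from the configured host."""
    return host.rstrip("/") + "/api/chat"
```

The FastAPI app would then post to `chat_url()` (e.g. with `requests` or `httpx`), and the hostname resolves over the shared Compose network.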
@iainmclean6095 4 months ago
Just so you know, this does not work on Apple Silicon.
@Bitfumes 4 months ago
How much RAM do you have in your Mac?
@iainmclean6095 4 months ago
@@Bitfumes I have 128 GB of RAM and an M3 Max. I think the error is related to Docker and Ollama running on Apple Silicon.