Please subscribe to Bitfumes channel to level up your coding skills. Do follow us on other social platforms: Hindi Channel → www.youtube.com/@code-jugaad LinkedIn → www.linkedin.com/in/sarthaksavvy/ Instagram → instagram.com/sarthaksavvy/ X → x.com/sarthaksavvy Telegram → t.me/+BwLR8bKD6iw5MWU1 Github → github.com/sarthaksavvy Newsletter → bitfumes.com/newsletters
@ano2028 · 5 months ago
Thanks a lot! The whole tutorial is really easy to follow. I had been trying to dockerize my FastAPI container and get it to interact with my Ollama container for the last two days, and your video helped me a lot.
@Bitfumes · 5 months ago
Wow, that's nice to know. Thanks for this amazing comment. Please subscribe to my newsletter: bitfumes.com/newsletters
@mochammadrevaldi1790 · 5 months ago
Very helpful, thanks man!
@Bitfumes · 5 months ago
Cool, cool. Please subscribe to my newsletter: bitfumes.com/newsletters
@NikolaosPapathanasiou · 5 months ago
Hey, nice video man! Since Ollama is running in the Docker container, is it using the GPU?
@Bitfumes · 5 months ago
Not in my case; it uses the CPU. But yes, you can use the GPU in Docker, you just need to specify the runtime.
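For reference, the GPU setup mentioned above can be sketched as a docker-compose fragment. This is a minimal sketch, assuming an NVIDIA GPU with the NVIDIA Container Toolkit installed on the host; the service name `ollama` is an assumption, not from the video:

```yaml
# Hypothetical docker-compose.yml fragment: request GPU access for the
# Ollama service via the device reservation syntax (Compose v2+).
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all          # or an integer to limit the number of GPUs
              capabilities: [gpu]
```

Without the toolkit installed, `docker compose up` will fail to reserve the device, so CPU-only hosts should omit the `deploy` block.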
@karthikb.s.k.4486 · 5 months ago
Nice. May I know how you are getting suggestions in VS Code? When you type a Docker command, the suggestions appear. What setting enables this? Please let me know.
@Bitfumes · 5 months ago
Thanks bhai. Btw, I am using GitHub Copilot, so maybe that's why I get suggestions.
@HennuhoMifirinso · 2 months ago
Thanks man! This works well, but it downloads the model every time you run docker-compose... is there a way to persist the model?
@Bitfumes · 2 months ago
You can use a Docker volume and install the model there, so you don't have to download it every time you create the container.
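The volume approach above can be sketched as a Compose fragment. A minimal sketch, assuming the official `ollama/ollama` image, which stores downloaded models under `/root/.ollama`; the volume name `ollama_data` is an assumption:

```yaml
# Hypothetical docker-compose.yml fragment: a named volume keeps pulled
# models across container recreations, so `ollama pull` runs only once.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama   # model files live here inside the container

volumes:
  ollama_data:
```

After the first `ollama pull`, recreating the container with `docker compose up` reuses the models stored in the volume instead of downloading them again.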
@shreyarajpal4212 · 3 months ago
So I can directly host this as a website then, right?
@Bitfumes · 3 months ago
Yes and no. You obviously can, but it's not recommended as-is. You can use AWS ECS to set up Docker and then deploy the same application.
@mat15rodrig · 3 months ago
Thanks for the video!! Do you know how to resolve this error? ERROR:root:Error during query processing: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/chat (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
@Bitfumes · 3 months ago
Make sure your Ollama is running properly, and you must use 0.0.0.0 as the host for Docker.
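A common cause of this exact error: inside a Compose network, `localhost` in the FastAPI container refers to that container itself, not to the Ollama container, so the app must call Ollama by its service name. A minimal sketch using only the standard library, assuming a Compose service named `ollama` and an `OLLAMA_URL` environment variable (both hypothetical names, not from the video), with `llama3` as a placeholder model:

```python
import json
import os
import urllib.request

def ollama_base_url() -> str:
    # Inside Docker Compose, reach Ollama via its service name, not localhost;
    # 11434 is Ollama's default port. Override with OLLAMA_URL if needed.
    return os.environ.get("OLLAMA_URL", "http://ollama:11434")

def chat(prompt: str, model: str = "llama3") -> str:
    # POST a single-turn, non-streaming request to Ollama's /api/chat endpoint.
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        f"{ollama_base_url()}/api/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())["message"]["content"]
```

On the Ollama side, setting `OLLAMA_HOST=0.0.0.0` (as the reply suggests) makes the server listen on all interfaces so other containers can reach it.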
@iainmclean6095 · 4 months ago
Just so you know, this does not work on Apple Silicon.
@Bitfumes · 4 months ago
How much RAM do you have in your Mac?
@iainmclean6095 · 4 months ago
@@Bitfumes 128 GB, M3 Max
@iainmclean6095 · 4 months ago
@@Bitfumes I have 128 GB RAM and an M3 Max; the error, I think, is related to Docker and Ollama running on Apple Silicon.