Hello, the video was nicely and clearly explained, step by step. I would like to see the same Ollama setup but on a serverless architecture. Could you please post a video on the Ollama serverless setup?
@ScaleUpSaaS 5 months ago
Thanks for sharing. Appreciated. Can you elaborate more…
@pushkarsawant9789 5 months ago
@ScaleUpSaaS I mean setting up Ollama on serverless technology on AWS, using Lambda or other services. Or maybe on Google Cloud Functions for serverless.
@ScaleUpSaaS 5 months ago
We don’t know if it’s possible. But we will check and let you know 🫡
@ScaleUpSaaS 4 months ago
We tried to look for a solution for you. Unfortunately, we haven't found one yet. We will let you know if something comes up...
@pushkarsawant9789 4 months ago
@ScaleUpSaaS Thank you
@moizamjad1330 3 months ago
Can I link the WebUI with my domain, so people can access the WebUI through the domain, and of course with SSL? I'd be really thankful if you could explain or make a video on it.
@ScaleUpSaaS 3 months ago
I will make a video about this very soon.
@sudarshanggouda 3 months ago
I recently came across your video on installing and running Llama3 (or any LLM) using Ollama on AWS Linux. I was wondering if it's possible to interact with the deployed model programmatically by calling it as an API in code. Could you provide insights or a brief guide on how to achieve this? Thank you for the great content!
@ScaleUpSaaS 3 months ago
Yes, you can. You can call it as an API. All you need to do is implement a Python FastAPI app, and once you get requests to FastAPI you can make an inner request to your local Ollama. Do you want us to make a video about it?
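As a rough illustration of the approach described in this reply, here is a minimal FastAPI sketch that forwards a prompt to a local Ollama instance. It assumes Ollama is listening on its default port 11434 and that the llama3 model has already been pulled; the /ask route name and request shape are made up for the example.

```python
# minimal sketch: FastAPI endpoint that proxies prompts to a local Ollama server
# assumes: Ollama running on the same host (default port 11434), model "llama3" already pulled
from fastapi import FastAPI
from pydantic import BaseModel
import requests

app = FastAPI()

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default generate endpoint


class Prompt(BaseModel):
    prompt: str


@app.post("/ask")  # route name is arbitrary, chosen only for this example
def ask(body: Prompt):
    # forward the prompt to the local Ollama instance (non-streaming for simplicity)
    resp = requests.post(
        OLLAMA_URL,
        json={"model": "llama3", "prompt": body.prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return {"answer": resp.json().get("response")}
```

Running this with uvicorn and POSTing a JSON body like {"prompt": "hello"} to /ask would then return Ollama's answer to the caller.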
@sudarshanggouda 3 months ago
@ScaleUpSaaS Can you please make a video or explain how we can do it? I have written the FastAPI code but I'm not getting how to call the local Ollama API.
@sudarshanggouda 3 months ago
@ScaleUpSaaS Yes please. If there were a video, that would be helpful.
@ScaleUpSaaS 3 months ago
Sure. We will be happy to share that with you.
@sudarshanggouda 3 months ago
@ScaleUpSaaS Thank you
@cjoshy 4 months ago
I followed the entire tutorial, but when I type 'llama3' in 'Select a model', the 'Pull "llama3" from Ollama' option is not appearing.
@ScaleUpSaaS 4 months ago
Please try the tutorial again from scratch. We have tried it many times with users, and it worked each time.
@squ34ky 13 days ago
I was also facing the same issue. I solved it by logging into the container directly using docker exec -it open-webui bash, then running ollama pull. Then refreshing Open WebUI did the trick. The models were listed.
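As an alternative to exec'ing into the container as described above, the same pull can be done over Ollama's HTTP API from the host. A rough sketch, assuming Ollama's default port 11434 is reachable and using the tutorial's llama3 model as the example:

```python
# rough sketch: pull a model through Ollama's HTTP API instead of the CLI
# assumes Ollama is reachable on its default port 11434
import requests

resp = requests.post(
    "http://localhost:11434/api/pull",
    json={"name": "llama3", "stream": False},  # "name" is the model to pull
    timeout=None,  # pulling a model can take a while
)
resp.raise_for_status()
print(resp.json())  # expect something like {"status": "success"}
```

Refreshing Open WebUI afterwards should list the newly pulled model, just as with the CLI approach.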
@cjoshy 12 days ago
@squ34ky Thanks, bro. I was planning to install it again tomorrow; you came at the right time 🫂
@ywueeee 5 months ago
How do I run it privately? Could someone looking for those endpoints find them on the clear web?
@ScaleUpSaaS 5 months ago
You can run it on your computer using Docker, as we showed in the tutorial. Or, the next best thing, do what we did in the video and restrict access to the server to your IP only (configure the security group).
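For reference, restricting the security group to your current public IP can also be scripted. A minimal sketch with boto3, where the security group ID and the port Open WebUI is exposed on are placeholders you would replace with your own values:

```python
# minimal sketch: allow only your current public IP into the instance's security group
# assumes AWS credentials are configured; group ID and port below are placeholders
import boto3
import requests

SECURITY_GROUP_ID = "sg-0123456789abcdef0"  # placeholder: your instance's security group
PORT = 3000  # placeholder: port Open WebUI is exposed on in this example

# look up the public IP of the machine running this script
my_ip = requests.get("https://checkip.amazonaws.com", timeout=10).text.strip()

ec2 = boto3.client("ec2")
ec2.authorize_security_group_ingress(
    GroupId=SECURITY_GROUP_ID,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": PORT,
        "ToPort": PORT,
        "IpRanges": [{"CidrIp": f"{my_ip}/32", "Description": "my current IP only"}],
    }],
)
print(f"Allowed {my_ip}/32 on port {PORT}")
```

If your IP changes, you would need to re-run something like this (and remove the stale rule), which is also what the follow-up question below is getting at.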
@ywueeee 5 months ago
@ScaleUpSaaS But what if your WiFi IP is not static and keeps changing, and you want access to the LLM from any device and any network, but still want to keep it safe and accessible only to you?
@ScaleUpSaaS 5 months ago
@wagmi614 In that case you can use an Elastic IP address. In this video you can see how we set up an Elastic IP address in AWS: Full Node.js Deployment to AWS - FREE SSL, NGINX | Node js HTTPS Server kzbin.info/www/bejne/r5nMpolsmNaehNU
@ScaleUpSaaS 5 months ago
Watch this. Full Node.js Deployment to AWS - FREE SSL, NGINX | Node js HTTPS Server kzbin.info/www/bejne/r5nMpolsmNaehNU
@ywueeee 5 months ago
@ScaleUpSaaS Wait, I don't get it. How does an AWS Elastic IP help when it's my IP that's changing and I want access to be accepted from any IP?
@jimjohn1719 5 months ago
Is this free to run on AWS? If not, can you comment on the AWS cost incurred to run this application?
@ScaleUpSaaS 5 months ago
Thanks for sharing. Ollama, llama3, or any other LLM that you can pull are free to use. But the server, because we are not using a free-tier instance type, will cost you money on AWS.