Install & Run Ollama on AWS Linux: Easily Install Llama3 or Any LLM Using Ollama and WebUI

1,682 views

Scale-Up SaaS

1 day ago

Comments: 36
@pushkarsawant9789 5 months ago
Hello, the video was nicely and clearly explained, step by step. I would like to see the same Ollama setup, but on a serverless architecture. Could you please post a video on an Ollama serverless setup?
@ScaleUpSaaS 5 months ago
Thanks for sharing. Appreciated. Can you elaborate more?
@pushkarsawant9789 5 months ago
@ScaleUpSaaS I mean setting up Ollama on serverless technology on AWS, using Lambda or other services, or maybe on Google Cloud Functions.
@ScaleUpSaaS 5 months ago
We don't know if it's possible, but we will check and let you know 🫡
@ScaleUpSaaS 4 months ago
We tried to look for a solution for you. Unfortunately, we haven't found one yet. We will let you know if something comes up...
@pushkarsawant9789 4 months ago
@ScaleUpSaaS Thank you
@moizamjad1330 3 months ago
Can I link the WebUI with my domain, so people can access the WebUI through the domain, and of course with SSL? I'd be really thankful if you could explain or make a video on it.
@ScaleUpSaaS 3 months ago
I will make a video about this very soon.
@sudarshanggouda 3 months ago
I recently came across your video on installing and running Llama3 (or any LLM) using Ollama on AWS Linux. I was wondering if it's possible to interact with the deployed model programmatically by calling it as an API in code. Could you provide insights or a brief guide on how to achieve this? Thank you for the great content!
@ScaleUpSaaS 3 months ago
Yes, you can call it as an API. All you need to do is implement a Python FastAPI app, and once FastAPI receives a request, make an inner request to your local Ollama. Would you like us to make a video about it?
@sudarshanggouda 3 months ago
@ScaleUpSaaS Can you please make a video or explain how we can do it? I have written the FastAPI code but am not getting how to call the local Ollama API.
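A minimal sketch of the bridge described above, assuming Ollama is listening on its default port 11434, the llama3 model is already pulled, and the /generate route name is our own choice:

# FastAPI app that forwards prompts to a local Ollama instance.
# Assumes Ollama runs on its default port 11434 with llama3 pulled.
import requests
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Prompt(BaseModel):
    prompt: str

@app.post("/generate")
def generate(body: Prompt):
    # Forward the prompt to Ollama's REST API; stream=False returns one JSON object.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": body.prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return {"response": resp.json()["response"]}

Run it with uvicorn main:app --host 0.0.0.0 --port 8000, then POST {"prompt": "Hello"} to /generate.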
@sudarshanggouda 3 months ago
@ScaleUpSaaS Yes please, a video would be helpful.
@ScaleUpSaaS 3 months ago
Sure. We will be happy to share that with you.
@sudarshanggouda 3 months ago
@ScaleUpSaaS Thank you
@cjoshy 4 months ago
I followed the entire tutorial, but when I type 'llama3' in 'Select a model', the 'Pull "llama3" from Ollama' option does not appear.
@ScaleUpSaaS 4 months ago
Please try the tutorial again from scratch. We have tried it many times with users, and it worked each time.
@squ34ky 13 days ago
I was also facing the same issue. I solved it by logging into the container directly with docker exec -it open-webui bash and then running ollama pull llama3. Refreshing Open WebUI then did the trick: the models were listed.
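An alternative to exec'ing into the container, sketched on the assumption that the Ollama API is reachable from the host at its default port 11434, is to trigger the pull through Ollama's REST endpoint:

# Pull a model via Ollama's REST API instead of docker exec.
# Assumes Ollama is reachable at localhost:11434 (the default).
import requests

resp = requests.post(
    "http://localhost:11434/api/pull",
    json={"name": "llama3", "stream": False},  # stream=False: single final status object
    timeout=600,
)
resp.raise_for_status()
print(resp.json())  # expect {"status": "success"} once the pull completes

Refreshing Open WebUI afterwards should list the model, as described above.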
@cjoshy 12 days ago
@squ34ky Thanks, bro. I was planning to install it again tomorrow; you came at the right time 🫂
@ywueeee 5 months ago
How do I run this privately? Can someone looking for these endpoints find them on the clear web?
@ScaleUpSaaS 5 months ago
You can run it on your computer using Docker, as we showed in the tutorial. Or, as we did in the video, restrict access to the server to your IP only (configure the security group).
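A minimal sketch of that security-group restriction using boto3; the group ID, region, port, and IP below are all placeholders for your own values:

# Allow inbound access to the Open WebUI port from a single IP only.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",  # hypothetical security group ID
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 3000,  # port Open WebUI is commonly mapped to
        "ToPort": 3000,
        "IpRanges": [{"CidrIp": "203.0.113.7/32"}],  # your public IP, /32 only
    }],
)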
@ywueeee 5 months ago
@ScaleUpSaaS But what if your Wi-Fi IP is not static and keeps changing, and you want access to the LLM from any device and any network while still keeping it accessible only to you?
@ScaleUpSaaS 5 months ago
@wagmi614 In that case you can use an Elastic IP address. In this video you can see how we set up an Elastic IP address in AWS: Full Node.js Deployment to AWS - FREE SSL, NGINX | Node js HTTPS Server kzbin.info/www/bejne/r5nMpolsmNaehNU
@ScaleUpSaaS 5 months ago
Watch this: Full Node.js Deployment to AWS - FREE SSL, NGINX | Node js HTTPS Server kzbin.info/www/bejne/r5nMpolsmNaehNU
@ywueeee 5 months ago
@ScaleUpSaaS Wait, I don't get it. How does an AWS Elastic IP help when it's my IP that's changing and I want requests to be accepted from any IP?
@jimjohn1719 5 months ago
Is this free to run on AWS? If not, can you comment on the AWS cost incurred to run this application?
@ScaleUpSaaS 5 months ago
Thanks for sharing. Ollama, Llama3, or any other LLM you can pull is free to use. But the server will cost you money on AWS, because we are not using a free-tier instance type.