Run any AI model remotely for free on Google Colab

20,102 views

Tech with Marco

1 day ago

Comments: 90
@FilipeBento · 9 months ago
Great stuff!! ngrok is now asking for auth -- solved this by adding

    await asyncio.gather(
        run_process(['ngrok', 'config', 'add-authtoken', ''])
    )

before:

    await asyncio.gather(
        run_process(['ollama', 'serve']),
        run_process(['ngrok', 'http', '--log', 'stderr', '11434']),
    )
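Putting the fix together, a notebook cell combining both steps might look like this (a sketch: `run_process` here is a minimal stand-in for the notebook's helper, and the auth token placeholder is yours to fill in):

```python
import asyncio

async def run_process(cmd):
    # Minimal stand-in for the notebook's helper: spawn the command
    # and echo its output until it exits.
    proc = await asyncio.create_subprocess_exec(
        *cmd,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT,
    )
    async for line in proc.stdout:
        print(line.decode(), end="")
    await proc.wait()

async def start_services(ngrok_token):
    # Register the ngrok auth token first (a one-off config write) ...
    await run_process(["ngrok", "config", "add-authtoken", ngrok_token])
    # ... then run the Ollama server and the tunnel side by side.
    await asyncio.gather(
        run_process(["ollama", "serve"]),
        run_process(["ngrok", "http", "--log", "stderr", "11434"]),
    )

# In the notebook: await start_services("<your-ngrok-token>")
```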
@lamechemohh9113 · 9 months ago
Please, I use Windows -- how do I use ngrok with it?
@FilipeBento · 9 months ago
@lamechemohh9113 You mean Ollama? You will need to run WSL2 (if you have any Windows version that is not the Home edition).
@techwithmarco · 9 months ago
Thanks a lot for adding this! Just pinned the comment
@techwithmarco · 8 months ago
And by the way, I updated the GitHub repository to reflect your proposals :)
@burncloud-com · 13 days ago
@techwithmarco Your video is cool, I still have many questions, can I contact you?
@SethuIyer95 · 8 months ago
Thank you so much. I was killing my Intel Mac with the LLM questions xD. This gives it a good rest.
@techwithmarco · 8 months ago
Perfect!
@techwithmarco · 10 months ago
If you want to learn more about ollama.ai, head over to my initial video about it :) kzbin.info/www/bejne/rIbbcp55mMaaa9U
@d3mist0clesgee12 · 9 months ago
great stuff bro, keep them coming, thanks again.
@techwithmarco · 9 months ago
Thanks! I will :)
@mobilesales4696 · 1 month ago
Tell me, how can I add the Tele-FLM-1T local LLM model, installing it directly in Google Colab? How do I host it on a server using Colab, and how can I use that address in any framework? I mean, how do I configure it? Please, kindly tell me the instructions.
@iamderrickfoo · 4 months ago
This is awesome stuff! Would like to know: once this is up, can we connect it to WebUI or AnythingLLM?
@thoufeekbaber8597 · 5 months ago
Thank you. I could run this successfully in the terminal, but how can I access the model or the Colab through a Jupyter notebook instance?
@renega991 · 3 months ago
Hi, amazing stuff! Is there a way to connect ngrok to a Jupyter notebook? Thanks!
@jeffsanaraujo · 8 months ago
That's a fantastic video! Do you know if Ollama has OpenAI-API-compliant endpoints? Then we could use Google Colab as a "Backend-as-a-Service" for some time in our chatbots :) One way I saw people keeping the session alive is to create a long audio file (like 12 hours of silence), load it in the Colab, and hit play. It seems to keep the session open for longer.
@techwithmarco · 8 months ago
There is currently an issue at the Ollama GitHub project, so feel free to check that out and track the progress :) github.com/jmorganca/ollama/issues/305 And good tip with the audio, never thought of that ... 😄
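Update: Ollama has since shipped OpenAI-compatible endpoints under /v1, so the ngrok URL can act as a drop-in chat backend. A minimal sketch using only the standard library (the URL is a placeholder for your tunnel address):

```python
import json
import os
import urllib.request

# Placeholder: the public URL printed by the Colab/ngrok cell
OLLAMA_URL = os.environ.get("OLLAMA_HOST", "http://localhost:11434")

def chat(prompt, model="llama2"):
    # POST to Ollama's OpenAI-compatible chat completions endpoint.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        OLLAMA_URL + "/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# chat("Why is the sky blue?")  # requires a running server
```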
@tuliomop · 8 months ago
great tip
@CharlesDubois-f7p · 2 months ago
How can I make this work with the ollama library in a Python script? This works well when typing the prompts directly in the terminal, but my script still seems to run against my local instance.
@CharlesDubois-f7p · 2 months ago
For anyone running into the same issue, I figured it out. I had to set the environment variable in the script with os.environ["OLLAMA_HOST"] = ngrok_url BEFORE importing ollama.
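A minimal sketch of that ordering (ngrok_url is a placeholder; as the comment notes, the variable must be set before the ollama package is imported, so the client picks it up):

```python
import os

# Placeholder for the tunnel URL printed by the Colab notebook
ngrok_url = "https://example.ngrok-free.app"

# Set the host BEFORE importing ollama, so the client reads it.
os.environ["OLLAMA_HOST"] = ngrok_url

# import ollama  # imported only now, resolves to the remote host
# print(ollama.chat(model="llama2",
#                   messages=[{"role": "user", "content": "hi"}]))
```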
@QHawk7 · 7 days ago
Great! Thanks, can you do it with Kaggle? And with a local notebook/VS Code? Any update on this?
@ralfrath699 · 1 month ago
I have Windows 10. How can I start the model?
@barskaracadag3923 · 1 month ago
Hi, I am just curious what is going to happen once Colab kicks us off the GPU. Restart it all?
@Shivam-bi5uo · 8 months ago
How do I save the progress? Every time I run it, it downloads the model from scratch.
@WhyWouldHeSayThat · 8 months ago
Use your Google Drive, bro; pay for 100 GB. It's worth it if you're an AI guy.
@omerfarukagtoprak2398 · 2 months ago
Thank you, wonderful video!!
@SethuIyer95 · 8 months ago
Thank you!
@إضاءةذهبية · 2 months ago
Thanks very much, you helped me a lot! 😍
@yanncotineau · 6 months ago
I got a 403 Forbidden error, but replacing

    run_process(['ngrok', 'http', '--log', 'stderr', '11434'])

with

    run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header="localhost:11434"'])

fixed it for me.
@tiagosanti3 · 6 months ago
Fixed it for me too, thanks
@MR-kh8ve · 6 months ago
It worked for me too, thank you!
@nicholasdunaway2605 · 6 months ago
THANK YOU
@Kursadysr · 5 months ago
You are a life saver!!!
@techwithmarco · 5 months ago
Great spot! I already updated the script on GitHub :)
@pathsvivi · 4 months ago
Thanks for the video. One question though: how can I avoid downloading the language models every time I run the Colab notebook? Can I save Ollama and its models in Google Drive and retrieve them when running the notebook?
@kunalbhooshan9667 · 15 hours ago
Can you add code for adding a model from Colab rather than Ollama?
@aryanflory · 5 months ago
Hey, how to do the export step on Windows? I have Ollama installed.
@biological-machine · 4 months ago
Just use "set OLLAMA_HOST=the_url"
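For reference, the variable the Ollama CLI reads is OLLAMA_HOST, and the syntax differs per shell (the URL below is a placeholder):

```shell
# macOS / Linux (bash, zsh):
export OLLAMA_HOST=https://example.ngrok-free.app
# Windows cmd.exe:     set OLLAMA_HOST=https://example.ngrok-free.app
# Windows PowerShell:  $env:OLLAMA_HOST = "https://example.ngrok-free.app"
echo "$OLLAMA_HOST"
```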
@jameschan6277 · 2 months ago
Please help: if I use a Windows desktop PC, how can I open terminals like on a Mac?
@mellio19 · 6 months ago
But can't you run Stable Diffusion this way?
@bennguyen1313 · 7 months ago
I imagine it's costly to run LLMs. Is there a limit on how much Google Colab will do for free? I'm interested in creating a Python application that uses AI. From what I've read, I could use the ChatGPT-4 Assistant API, and I as the developer would incur the cost whenever the app is used. Alternatively, I could host a model like Ollama on my own computer or in the cloud (Beam Cloud / Replicate / Streamlit / Replit)? As a third option, could Google Colab work in my situation? Is OpenAI's Assistant API totally different from the API to programmatically interact with llama2, mistral, etc.?
@AnonymousAccount514 · 23 days ago
Has this stopped working? Have they caught on to us?
@vg2812 · 6 months ago
"Error: something went wrong, please see the ollama server logs for details" -- I am getting this error after running export OLLAMA_HOST= ... What should I do?
@techwithmarco · 5 months ago
See the other latest comments or check out the new version on GitHub. It should resolve the issue :)
@vg2812 · 5 months ago
@techwithmarco Okay, I will check.
@vg2812 · 5 months ago
@techwithmarco Thank you for the reply.
@asdfg1346on · 2 months ago
Can such an LLM model be used in a web app, not just locally in a terminal, and how?
@attilavass6935 · 7 months ago
How can we keep our downloaded LLMs permanently, e.g. on a mounted Google Drive? It would speed up the start of inference when a new Ollama server starts.
@techwithmarco · 6 months ago
Yes, that's a brilliant idea! You can save them in Google Drive with this snippet, for example:

    # Mount Google Drive
    from google.colab import drive
    drive.mount('/content/drive')

    # Create a folder in the root directory
    !mkdir -p "/content/drive/My Drive/My Folder"

    # Start Ollama with the path where models are stored
    OLLAMA_MODELS="/content/drive/My Drive/My Folder" ollama serve
@attilavass6935 · 6 months ago
@techwithmarco That's great, thank you! :)
@Codescord · 10 months ago
Can we just make it an API endpoint and create a good frontend on top of it?
@techwithmarco · 10 months ago
Yes, kind of. The URL exposed via ngrok is also usable in frontends built especially for ollama.ai. Check out my other Ollama video; in the last section I show how to start up a frontend for it.
@MultiverseMayhemtoyou · 9 months ago
This is fire! Can you help me connect Open Interpreter like this, so I can give it access to my computer without loading my PC that much?
@py_man · 8 months ago
You can
@DCS-um9oc · 4 months ago
I got a Windows machine, do I need Ollama locally too?
@abhishekratan2496 · 6 months ago
Very useful video, and the code too. By the way, I can't get it running on Windows. What would be the way to set the OLLAMA_HOST variable on Windows? set OLLAMA_HOST= "--" doesn't seem to work; it still runs on the local machine.
@techwithmarco · 5 months ago
I think it depends on the terminal and shell you are using. Are you using the standard Windows terminal?
@TirthSheth108 · 5 months ago
Hi @techwithmarco, thanks for chiming in. I'm actually experiencing the same issue as @abhishekratan2496, but I'm running it in the Ubuntu terminal. Setting the OLLAMA_HOST variable doesn't seem to work for me either. Any insights on how to resolve this would be greatly appreciated! Thanks.
@techwithmarco · 5 months ago
@TirthSheth108 Okay, that's weird. I just used it a few days ago and it worked perfectly. I'll investigate and let you know :)
@AllMindControl · 4 months ago
Did anyone figure this out? It just tells me that export is not a recognized command.
@AfnanQasim-wk8nq · 4 months ago
Can we load a 70B model with this same technique?
@harsh9558 · 9 months ago
4:33 Is the model downloading on Colab or locally? Also, can you please tell what command changes are needed if we are using the Windows terminal?
@techwithmarco · 9 months ago
The model is being downloaded on the remote machine (Colab). The commands stay the same if you use WSL2 on Windows with Ollama.
@groshatc · 5 months ago
Awesome, man.
@khushalsharma2031 · 9 months ago
Thanks for the video. You mentioned disconnecting the runtime, so I am assuming Google will itself shut down the running notebook after a few hours. Do you know for how many hours we can run this continuously?
@techwithmarco · 9 months ago
I just googled it because I did not know, but apparently 90 minutes if you do not interact, or 12 hours absolute.
@khushalsharma2031 · 9 months ago
@techwithmarco So if we leave the server running and the Colab tab idle, I am assuming it will auto-shut in 90 minutes.
@techwithmarco · 9 months ago
Honestly, I am not sure because I haven't used it for that long in one run. I would assume it stays up for 12 hours, because the tunnel works in the background and the Jupyter notebook is still running :)
@clashgamers4072 · 9 months ago
It will ask for an "Are you a robot?" captcha if you're inactive for a while. You could write a small JavaScript function in the browser to randomly click some UI elements, but yeah, 12 hours is the hard limit; after that you can't connect to a GPU instance for another day or so.
@stargate-s8 · 4 months ago
Found a gem 💎
@asir9129 · 1 month ago
Missed opportunity to say "say less" as opposed to "say no more", I think it sounds funnier.
@techwithmarco · 28 days ago
I really don't get it, as I am not a native speaker 😂
@AlexandreCastanet · 8 months ago
Good! Do you have an idea how to benchmark Mixtral on Colab?
@techwithmarco · 8 months ago
No, sorry, I am not deep enough into AI to know how to benchmark the performance 🥲
@thepsych3 · 6 months ago
I get an error like 403 Forbidden.
@ricardomorim9444 · 6 months ago
Replace

    run_process(['ngrok', 'http', '--log', 'stderr', '11434'])

with

    run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header="localhost:11434"'])

That fixed it for me.
@paulopatto8283 · 5 months ago
@ricardomorim9444 Thanks very much guys, that solved my issue.
@lamechemohh9113 · 9 months ago
Please, what about Windows users?
@techwithmarco · 9 months ago
You can use Ollama with WSL2; it is not available natively on Windows yet.
@AhmedEssam_eramax · 9 months ago
Fantastic!