Ollama on Google Colab: A Game-Changer!

5,648 views

TechXplainator

1 day ago

Comments: 42
@ahmetozkaya6387 1 month ago
Clear, crisp and straightforward videos, awesome, thanks a lot ❤💯👋
@СобственникиРиверсайд 11 days ago
Thanks for the code and the interesting video. Does anyone know how to add a CORS setting to this code so that requests can be sent from a web page? Currently, requests through the browser are blocked.
@UnknownDigitalCreator 1 month ago
Very nice... good explanation, deserves respect ❤❤❤❤
@archilecteur 2 months ago
Instead of the Terminal, can the call to the model running in Google Colab be made from an app?
@marcelocruzeta8822 3 months ago
Great, thank you for the class. I installed Open WebUI from its GitHub repo, without Docker. Can I configure it to run with the remote Ollama? Never mind, I found it: you have to change it in the settings.
@fabriciocincunegui5332 4 months ago
Thanks for your patience
@bnermine9780 4 months ago
Thank you for the great video! Could the model then be used inside local Python code? I am writing a classification script using an LLM, but running it on my CPU takes ages. Can I edit my local Python code so that the classification is done with the model running on Google Colab but the results are stored locally? This would also help me apply the same model to different use cases. Thank you!!
@TechXplainator 4 months ago
Thank you so much for your kind words! And yes, you can definitely do that. Here is how that could work:
1. Keep the Colab notebook running with Ollama and Ngrok set up as shown in the tutorial.
2. In your local Python script, use the 'requests' library to send classification requests to the Ollama model via the Ngrok URL.
3. Process the responses and store the results locally.
I hope that helps. Happy coding ☺️
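A minimal sketch of those three steps in a local Python script could look like this. The Ngrok URL, model name, and prompt below are placeholder assumptions (replace them with your own); the `/api/generate` endpoint and the `stream`/`response` fields follow Ollama's standard REST API:

```python
import requests

# Placeholder: the public URL Ngrok prints in the Colab notebook.
OLLAMA_URL = "https://example.ngrok-free.app"  # hypothetical, replace with yours


def build_payload(model: str, text: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": f"Classify the sentiment of this text as positive or negative: {text}",
        "stream": False,  # ask for a single JSON response instead of a stream
    }


def classify(text: str, model: str = "llama3.1") -> str:
    """Send one classification request to the remote Ollama instance."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json=build_payload(model, text),
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]


# Example usage (requires the Colab notebook and Ngrok tunnel to be running):
#   label = classify("I loved this movie!")
#   with open("results.txt", "w") as f:   # store the result locally
#       f.write(label + "\n")
```

Because the heavy lifting happens on Colab's GPU, the local script only pays the network round-trip, and the same `classify` helper can be reused for other prompts or models.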
@chillscripter 5 months ago
I did exactly what you said in the video, but I got an error from Ollama: "the parameter is incorrect". How can I solve that?
@TechXplainator 5 months ago
Hey there! To help me figure out what's going wrong, could you please tell me:
1. Are you using a fixed Ngrok link or letting Colab create a new one each time?
2. Did you open the link from the notebook in a browser? Does it say "Ollama is running"?
3. Have you correctly linked your local Ollama to Colab-Ollama by setting the OLLAMA_HOST environment variable to your Ngrok URL? (You can usually do this in your terminal with a command like export OLLAMA_HOST=)
4. When you run a model locally (like typing ollama run llama3.1), does the model download to your computer or to Colab? Can you see the download happening in your Colab notebook?
@ardasemsettinoglu 5 months ago
When I write export OLLAMA_HOST, it says "export : The term 'export' is not recognized as the name of a cmdlet". Is it because I am using Docker?
@TechXplainator 5 months ago
The error message you're encountering is not related to Docker, but rather to the command shell you're using. The "export" command is specific to Unix-like systems (such as Linux and macOS) and is not recognized in Windows PowerShell or Command Prompt. To set an environment variable on Windows, you should use the "set" command instead of "export".
Here's how you can set the OLLAMA_HOST variable in PowerShell:
$env:OLLAMA_HOST = "your_value_here"
Or in Command Prompt:
set OLLAMA_HOST=your_value_here
Hope this helps ☺️
@jameschan6277 5 months ago
Please help: if I use a Windows desktop PC, how can I open a terminal like on a Mac?
@TechXplainator 5 months ago
To open a terminal on a Windows PC, similar to how you would on a Mac, you can use the following methods:
Option 1: PowerShell
1. Press `Windows+X` and select "Windows PowerShell" or "Windows PowerShell (Admin)" from the menu.
2. Alternatively, press `Windows+R`, type `powershell`, and press Enter to open a PowerShell window.
Option 2: Command Prompt
1. Press `Windows+R`, type `cmd`, and press Enter to open a Command Prompt window.
2. You can also search for "Command Prompt" in the Start menu, right-click the result, and select "Run as Administrator" if you need elevated privileges.
Hope this helps ☺️
@wowfielder101 3 months ago
Help please, the "export OLLAMA_HOST=" command is not working in cmd.
@TechXplainator 3 months ago
You mean on Windows? This is how it should work (I have not verified it - I'm using a Mac):
1. Open Command Prompt as Administrator.
2. Run the command below, replacing `` with your Ngrok URL:
setx OLLAMA_HOST ""
3. Close and reopen Command Prompt to apply the changes.
@HunterJuniorX 6 months ago
Is there a way to use models from Hugging Face?
@TechXplainator 6 months ago
Yes, there is - if they are available as quantized models (GGUF files). I made a video on how to import GGUF files from Hugging Face and use them in Ollama - feel free to check it out: kzbin.info/www/bejne/rKSUpmywZ7pnkKM
@andreabaffascirocco2934 5 months ago
I have tried, but it seems that the command ollama run llama3.1 downloads the model to my laptop instead of Colab.
@TechXplainator 5 months ago
Try running the command export OLLAMA_HOST= (first check that the URL says "Ollama is running"). Then, in the same terminal window, run "ollama run llama3" again. Hope this helps ☺️
@andreabaffascirocco2934 5 months ago
@TechXplainator Thanks, I'll try.
@andreabaffascirocco2934 5 months ago
@TechXplainator Now everything works fine. The problem was that I had installed Ollama on my Ubuntu machine using snap. With that installation, the PC tried to download Llama 3.1 locally instead of on Colab.
@TechXplainator 5 months ago
I'm glad it works now ☺️
@fabriciocincunegui5332 4 months ago
How do I export OLLAMA_HOST in my cmd? I'm on Windows 11.
@TechXplainator 4 months ago
I can't verify this on a Windows PC since I don't have one, but based on my research, here's how to export the `OLLAMA_HOST` variable on Windows 11 using Command Prompt:
1. Open Command Prompt as Administrator.
2. Run the command below, replacing `` with your Ngrok URL:
setx OLLAMA_HOST ""
3. Close and reopen Command Prompt to apply the changes.
@levantoi2934 10 days ago
Why not a Google Cloud notebook?
@TechXplainator 9 days ago
No reason, just familiarity with Google Colab, but I'm sure it will work in a Google Cloud notebook as well.
@levantoi2934 6 days ago
@TechXplainator Great. Do you have a solution for scheduling the Colab task? I tried with Google Cloud Scheduler, but it didn't seem to work and was complicated.
@TechXplainator 5 days ago
Sorry - no, I don't.
@levantoi2934 12 days ago
Hello!
@MarkSmith-ho5ij 4 months ago
Coders using Apple, lol. Please use Linux and stop this...
@rajarshisen5905 5 months ago
Please help: I can run Ollama in Colab, but while running it from Docker as Open WebUI, I get the following error when trying to chat with llama3 in the web browser: Ollama: 404, message='Not Found', url=URL('/api/chat')
@TechXplainator 5 months ago
Does Ollama work from the terminal? I mean, when running export OLLAMA_HOST= and ollama run llama3, can you interact with llama3 in your terminal? And do you see any activity in your Colab notebook (you should see it downloading a model and responding to the chat)?
@merocky5 5 months ago
Yes, I do. Ollama executes on Colab when I call it from my local computer's terminal. Only when using Open WebUI, following the last command of your Python notebook, do I get the error described above. The front-end web app starts, but when I try to chat with the Ollama installed in Colab, I get the error mentioned in the message above. I did some internet searching, and it appears that the word "api" may or may not be included in the latest version of Ollama? Please help me resolve this. Thanks a lot.
@TechXplainator 5 months ago
To make sure we're on the same page, I just want to summarize your setup:
1. You're using a static Ngrok URL.
2. You've successfully connected your local Ollama instance with the one hosted on Colab by running an export command.
3. You've installed Open WebUI using Docker and replaced the example Ngrok URL with your own static Ngrok URL, as indicated by this command:
`docker run -d -p 4000:8080 -e OLLAMA_BASE_URL=example.com -v open-webui:/app/backend/data --name test --restart always ghcr.io/open-webui/open-webui:main`
4. The Docker container was created, but trying to access the Ollama WebUI at `localhost:4000/` results in an error.
Please confirm that this summary is accurate so I can help you troubleshoot the issue ☺️
@rajarshisen5905 5 months ago
@TechXplainator Yes, the summary is spot on. I followed all of the bullet points above and got the error on the last one, while trying to post a chat to Ollama using the web UI.
@TechXplainator 5 months ago
I was not able to replicate the error, but based on my research, here are a few things you could try:
1. Verify the Open WebUI settings: Access the Open WebUI settings page (click on your avatar on the bottom left) and verify that the Ollama server URL is correctly set to your Ngrok URL: go to "Connections"; under "Ollama Base URL" you should see your static Ngrok URL.
2. Network configuration: Ensure that the Docker container can communicate with the Ollama server. Use the --network=host flag to allow the Docker container to use the host network:
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL= --name open-webui --restart always ghcr.io/open-webui/open-webui:main
I hope this helps. If not, please check out the troubleshooting page from Open WebUI: docs.openwebui.com/troubleshooting/