Clear, crisp and straightforward videos, awesome, thanks a lot ❤💯👋
@СобственникиРиверсайд · 11 days ago
Thanks for the code and the interesting video. Does anyone know how to add a CORS setting to this code so that requests can be sent from a web page? Currently, requests through the browser are blocked.
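Not verified end-to-end, but Ollama itself honors an OLLAMA_ORIGINS environment variable as its CORS allow-list, so one approach is to set it in the Colab cell before launching the server. A minimal sketch (the `ollama serve` launch line stands in for however the notebook actually starts Ollama):

```python
import os
import shutil
import subprocess

# Ollama reads OLLAMA_ORIGINS to decide which browser origins may call
# its HTTP API (its CORS allow-list). "*" allows any origin; for a real
# deployment, prefer your page's exact URL, e.g. "https://mysite.example".
os.environ["OLLAMA_ORIGINS"] = "*"

# Start (or restart) the Ollama server so the setting takes effect.
# Guarded so the snippet is harmless where Ollama isn't installed.
if shutil.which("ollama"):
    server = subprocess.Popen(["ollama", "serve"])
```

The variable must be set before the server process starts; changing it afterwards has no effect on an already-running instance.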
@UnknownDigitalCreator · a month ago
Very nice... good explanation, deserves respect ❤❤❤❤
@archilecteur · 2 months ago
Instead of the Terminal, can the call to the model running in Google Colab be embedded in an app?
@marcelocruzeta8822 · 3 months ago
Great, thank you for the class. I installed Open WebUI from the GitHub repo, no Docker. Can I configure it to run with the remote Ollama? Edit: I found it. You have to change it in the settings. Never mind.
@fabriciocincunegui5332 · 4 months ago
Thanks for your patience
@bnermine9780 · 4 months ago
Thank you for the great video! Could the model then be used inside local Python code? I am writing a classification script using an LLM, but running it on my CPU takes ages. Can I edit my local Python code so that the classification is done with the model running on Google Colab, but the results are stored locally? This would also help me apply the same model to different use cases. Thank you!!
@TechXplainator · 4 months ago
Thank you so much for your kind words! And yes, you can definitely do that. Here is how that could work:
1. Keep the Colab notebook running with Ollama and Ngrok set up as shown in the tutorial.
2. In your local Python script, use the 'requests' library to send classification requests to the Ollama model via the Ngrok URL.
3. Process the responses and store the results locally.
I hope that helps. Happy coding ☺️
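As a rough sketch of those three steps, assuming a placeholder Ngrok URL, the llama3.1 model from the video, and Ollama's standard /api/generate endpoint:

```python
import json
import requests

NGROK_URL = "https://your-ngrok-url.ngrok-free.app"  # placeholder

def build_payload(text: str, model: str = "llama3.1") -> dict:
    """Build a non-streaming request body for Ollama's /api/generate."""
    return {
        "model": model,
        "prompt": "Classify this text as positive or negative:\n" + text,
        "stream": False,  # return one JSON object instead of a stream
    }

def classify(text: str) -> str:
    """Send the text to the Colab-hosted model and return its answer."""
    resp = requests.post(f"{NGROK_URL}/api/generate",
                         json=build_payload(text), timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]

# With the notebook running, results could be stored locally like this:
# label = classify("I loved this movie!")
# with open("results.jsonl", "a") as f:
#     f.write(json.dumps({"text": "I loved this movie!", "label": label}) + "\n")
```

Swapping the prompt template lets the same setup serve other use cases without touching the Colab side.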
@chillscripter · 5 months ago
I did exactly what you said in the video, but I got this error from Ollama: "the parameter is incorrect". How can I solve that?
@TechXplainator · 5 months ago
Hey there! To help me figure out what's going wrong, could you please tell me:
1. Are you using a fixed Ngrok link or letting Colab create a new one each time?
2. Did you open the link from the notebook in a browser? Does it say "Ollama is running"?
3. Have you correctly linked your local Ollama to Colab-Ollama by setting the OLLAMA_HOST environment variable to your Ngrok URL? (You can usually do this in your terminal with a command like export OLLAMA_HOST=)
4. When you run a model locally (like typing ollama run llama3.1), does the model download to your computer or to Colab? Can you see the download happening in your Colab notebook?
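The check in point 2 can also be scripted from Python. A small sketch (the URL is a placeholder), with the response parsing kept separate so it can be reused:

```python
import requests

def looks_healthy(body: str) -> bool:
    """Ollama's root endpoint replies with the plain text 'Ollama is running'."""
    return "Ollama is running" in body

def check(url: str) -> bool:
    """Fetch the Ngrok URL and report whether Ollama answered."""
    resp = requests.get(url, timeout=10)
    return resp.ok and looks_healthy(resp.text)

# check("https://your-ngrok-url.ngrok-free.app")
```

If this returns False, fix the tunnel or server before debugging anything on the local side.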
@ardasemsettinoglu · 5 months ago
When I ran export OLLAMA_HOST, it said "export : The term 'export' is not recognized as the name of a cmdlet". Is it because I am using Docker?
@TechXplainator · 5 months ago
The error message you're encountering is not related to Docker, but rather to the command shell you're using. The "export" command is specific to Unix-like systems (such as Linux and macOS) and is not recognized in Windows PowerShell or Command Prompt. To set an environment variable on Windows, use the "set" command instead of "export". Here's how you can set the OLLAMA_HOST variable in PowerShell:
$env:OLLAMA_HOST = "your_value_here"
Or in Command Prompt:
set OLLAMA_HOST=your_value_here
Hope this helps ☺️
@jameschan6277 · 5 months ago
Please help: if I use a Windows PC desktop, how can I open terminals like on a Mac?
@TechXplainator · 5 months ago
To open terminals on a Windows PC desktop, similar to how you would on a Mac, you can use the following methods:
Option 1, PowerShell:
1. Press `Windows+X` and select "Windows PowerShell" or "Windows PowerShell (Admin)" from the menu.
2. Alternatively, press `Windows+R`, type `powershell`, and press Enter to open a PowerShell window.
Option 2, Command Prompt:
1. Press `Windows+R`, type `cmd`, and press Enter to open a Command Prompt window.
2. You can also search for "Command Prompt" in the Start menu, right-click the result, and select "Run as Administrator" if you need elevated privileges.
Hope this helps ☺️
@wowfielder101 · 3 months ago
HELP PLEASE: the "export OLLAMA_HOST=" command is not working in cmd. Please help.
@TechXplainator · 3 months ago
You mean on Windows? This is how it should work (I have not verified it, since I'm using a Mac):
1. Open Command Prompt as Administrator.
2. Run the command below, replacing `` with your Ngrok URL:
setx OLLAMA_HOST ""
3. Close and reopen Command Prompt to apply the changes.
@HunterJuniorX · 6 months ago
Is there a way to use models from Hugging Face?
@TechXplainator · 6 months ago
Yes, there is, as long as they are available as quantized models (GGUF files). I made a video on how to import GGUF files from Hugging Face and use them in Ollama; feel free to check it out: kzbin.info/www/bejne/rKSUpmywZ7pnkKM
@andreabaffascirocco2934 · 5 months ago
I have tried, but it seems the command ollama run llama3.1 downloads the model onto my laptop instead of Colab.
@TechXplainator · 5 months ago
Try running the command export OLLAMA_HOST= (check that the URL says "Ollama is running" first). Then, in the same terminal window, run "ollama run llama3.1" again. Hope this helps ☺️
@andreabaffascirocco2934 · 5 months ago
@@TechXplainator Thanks, I'll try.
@andreabaffascirocco2934 · 5 months ago
@@TechXplainator Now everything works fine. The problem was that I had installed Ollama on my Ubuntu machine using snap. With that installation, the PC tried to download Llama 3.1 locally instead of on Colab.
@TechXplainator · 5 months ago
I'm glad it works now ☺️
@fabriciocincunegui5332 · 4 months ago
How do I export OLLAMA_HOST in cmd? I'm on Windows 11.
@TechXplainator · 4 months ago
I can't verify this on a Windows PC since I don't have one, but based on my research, here's how to set the `OLLAMA_HOST` variable on Windows 11 using Command Prompt:
1. Open Command Prompt as Administrator.
2. Run the command below, replacing `` with your Ngrok URL:
setx OLLAMA_HOST ""
3. Close and reopen Command Prompt to apply the changes.
@levantoi2934 · 10 days ago
Why not a Google Cloud notebook?
@TechXplainator · 9 days ago
No reason, just familiarity with Google Colab, but I'm sure it will work in a Google Cloud notebook as well.
@levantoi2934 · 6 days ago
@@TechXplainator Great. Do you have a solution for scheduling the Colab task? I tried with Google Cloud Scheduler, but it didn't seem to work and was complicated.
@TechXplainator · 5 days ago
Sorry, no, I don't.
@levantoi2934 · 12 days ago
Hello!
@MarkSmith-ho5ij · 4 months ago
Coders using Apple, lol. Please use Linux and stop this...
@rajarshisen5905 · 5 months ago
Please help. I can run Ollama in Colab, but when running it from Docker as Open WebUI, I get the following error while trying to chat with llama3 in the web browser: Ollama: 404, message='Not Found', url=URL('/api/chat')
@TechXplainator · 5 months ago
Does Ollama work from the terminal? I mean, when running export OLLAMA_HOST= and ollama run llama3, do you get to interact with llama3 in your terminal? And do you see any activity in your Colab? (You should see the notebook downloading a model and responding to chat.)
@merocky5 · 5 months ago
Yes, I do. Ollama executes on Colab when I call it from my local computer's terminal. Only when using Open WebUI, following the last command of your Python notebook, do I get the error described above. The front-end web app starts, but when trying to chat with the Ollama installed in Colab, I get the error mentioned in the message above. I did some internet searching, and it appears the word "api" may or may not be included in the latest version of Ollama? Please help me resolve this. Thanks a lot
@TechXplainator · 5 months ago
To make sure we're on the same page, I just want to summarize your setup:
1. You're using a static Ngrok URL.
2. You've successfully connected your local Ollama instance with the one hosted on Colab by running an export command.
3. You've installed Open WebUI using Docker and replaced the example Ngrok URL with your own static Ngrok URL, as in this command:
`docker run -d -p 4000:8080 -e OLLAMA_BASE_URL=example.com -v open-webui:/app/backend/data --name test --restart always ghcr.io/open-webui/open-webui:main`
4. The Docker container was created, but trying to access the Ollama WebUI at `localhost:4000/` results in an error.
Please confirm that this summary is accurate so I can help you troubleshoot the issue ☺️
@rajarshisen5905 · 5 months ago
@@TechXplainator Yes, the summary is spot on. I followed all of the above bullet points and got the error on the last one, while trying to post a chat to Ollama using the web UI.
@TechXplainator · 5 months ago
I was not able to replicate the error, but based on my research, here are a few things you could try:
1. Verify the Open WebUI settings: open the Open WebUI settings page (click your avatar on the bottom left) and check that the Ollama server URL is correctly set to your Ngrok URL. Go to "Connections"; under "Ollama Base URL" you should see your static Ngrok URL.
2. Check the network configuration: ensure that the Docker container can communicate with the Ollama server. Use the --network=host flag to let the Docker container use the host network:
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL= --name open-webui --restart always ghcr.io/open-webui/open-webui:main
I hope this helps. If not, please check out the troubleshooting page from Open WebUI: docs.openwebui.com/troubleshooting/