Comments
@greatjobbuddy
@greatjobbuddy 3 days ago
Super helpful!!!! Thanks for making this. Going to watch more of your 'actions' videos for sure.
@thom1218
@thom1218 5 days ago
wait... you're on localstack's 3rd party (non-local) website managing your local resources? Something isn't right with this picture - that UI should be hosted locally.
@cgc2300
@cgc2300 10 days ago
Hello
@vanhussen
@vanhussen 13 days ago
It works! Thank you from Indonesia
@exogeo
@exogeo 14 days ago
Thanks for making these videos. Your videos are super helpful & awesome. You deserve success here!!
@cgc2300
@cgc2300 14 days ago
Good evening. Could you help me understand how each part works and when to use it? I'd also like to clearly understand how to use the workflows that are made available as examples on the site - there is only very little explanation and I don't understand them.
@Gr4ph1xZ
@Gr4ph1xZ 15 days ago
Can I also use traefik to expose not a container but an internal IP (another VM), and put HTTPS on it externally? :)
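A hedged sketch for the question above (not answered in the thread): traefik's file provider can route to any reachable address, not just containers, so a dynamic-configuration file along these lines should work - vm.example.com, myresolver, and the IP are hypothetical placeholders.
# dynamic configuration, loaded via traefik's file provider
http:
  routers:
    vm-router:
      rule: "Host(`vm.example.com`)"
      service: vm-service
      tls:
        certResolver: myresolver   # hypothetical resolver name
  services:
    vm-service:
      loadBalancer:
        servers:
          - url: "http://192.168.1.50:8080"   # the other VM's internal address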
@renega991
@renega991 18 days ago
Hi, amazing stuff! Is there a way to connect ngrok to a Jupyter notebook? Thanks!
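A hedged sketch for the question above: ngrok tunnels any local port, and Jupyter listens on 8888 by default - keep Jupyter's token auth enabled, since the forwarding URL is public.
jupyter notebook        # serves on localhost:8888 by default
ngrok http 8888         # prints a public forwarding URL to open in the browser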
@BrazenNL
@BrazenNL 21 days ago
When presenting, enlarging type (your VS Code window) is not a bad thing. Lots of people consume media on a smaller screen nowadays.
@Ankush.8
@Ankush.8 21 days ago
Just wondering... Are you using Oh-my-zsh or any other plugin manager?
@ThisIsTheSan
@ThisIsTheSan 22 days ago
I can run it remotely in the terminal, but unfortunately all tools that use ollama as a backend seem to be unable to connect to it if OLLAMA_HOST is set
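A hedged debugging note for the comment above: some tools read OLLAMA_HOST while others assume localhost:11434, so it can help to first confirm the remote endpoint itself answers - the ngrok URL below is a hypothetical placeholder.
# if this returns a JSON list of models, the tunnel and server are fine
# and the problem is in the tool's own configuration
curl https://abc123.ngrok-free.app/api/tags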
@iamderrickfoo
@iamderrickfoo 1 month ago
This is awesome stuff! Would like to know: after this is up, can we connect it to WebUI or AnythingLLM?
@shrishubhyadav05
@shrishubhyadav05 1 month ago
Found a Gem 💎
@joekustek2623
@joekustek2623 1 month ago
How can I run this on my website or in a browser instead of a terminal window?
@pathsvivi
@pathsvivi 1 month ago
Thanks for the video. One question though: how can I avoid downloading the language models every time I run the Colab notebook? Can I save Ollama and its models in Google Drive and retrieve them when running the notebook?
@AfnanQasim-wk8nq
@AfnanQasim-wk8nq 1 month ago
Can we load a 70B model with this same technique?
@all_tutorials609
@all_tutorials609 1 month ago
This is awesome. Would look forward to watching how self-hosting for n8n is done.
@DCS-um9oc
@DCS-um9oc 1 month ago
I got a Windows machine - do I need Ollama locally too?
@kylelaker539
@kylelaker539 2 months ago
The A record you make for dev - is that a public or private IP? Does it matter?
@Justin_Jay
@Justin_Jay 1 month ago
Public IP
@justhackerthings
@justhackerthings 2 months ago
Thanks for the great video! It helped me a lot!
@prashlovessamosa
@prashlovessamosa 2 months ago
very helpful
@thoufeekbaber8597
@thoufeekbaber8597 2 months ago
Thank you. I could run this successfully in the terminal, but how can I access the model or the Colab through a Jupyter notebook instance?
@user-em7se7bm7v
@user-em7se7bm7v 2 months ago
awesome man
@phamlehaibang
@phamlehaibang 2 months ago
Hi bro, can you please clarify or give more details about exporting OLLAMA_HOST? When I run export OLLAMA_HOST=…. and then ollama -h, I get: zsh: command not found: ollama. Do we need to run the following command line? ngrok http 11434 --host-header="localhost:11434" Please let me know - I am not clear about "export OLLAMA_HOST=…" or how to reach the remote ollama service from my local terminal at all. All in all, your video is awesome. Thanks, bro.
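A hedged sketch of the overall flow for anyone equally confused - the URL is a hypothetical placeholder, and "command not found" means the ollama CLI is not installed locally at all; OLLAMA_HOST only tells an already-installed client which server to talk to.
# on the machine running the ollama server (e.g. the Colab notebook)
ngrok http 11434 --host-header="localhost:11434"

# on your local machine (the ollama CLI must be installed here too)
export OLLAMA_HOST=https://abc123.ngrok-free.app   # URL printed by ngrok
ollama run llama2                                  # example model name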
@asdflkjasdfasdlfkj
@asdflkjasdfasdlfkj 2 months ago
Awesome ❤ Thanks
@aryanflory
@aryanflory 3 months ago
Hey, how do I do the export step on Windows? I have Ollama installed.
@biological-machine
@biological-machine 1 month ago
just use "set OLLAMA_HOST=the_url"
@techwithmarco
@techwithmarco 3 months ago
Want some more GitHub Actions action? kzbin.info/www/bejne/ZnOxqHR7rL6NoqM
@Steven-wl5wi
@Steven-wl5wi 3 months ago
os.environ.update({'LD_LIBRARY_PATH': '/usr…'}) - great for Linux, but what about Windows machines?
@techwithmarco
@techwithmarco 3 months ago
The notebook is being executed on a Linux machine
@vg2812
@vg2812 3 months ago
"Error: something went wrong, please see the ollama server logs for details" - I am getting this error after running export OLLAMA_HOST= ... What should I do?
@techwithmarco
@techwithmarco 3 months ago
See the other latest comments or check out the new version on github. Should resolve the issue :)
@vg2812
@vg2812 3 months ago
@techwithmarco okay, I will check
@vg2812
@vg2812 3 months ago
@techwithmarco thank you for the reply
@yanncotineau
@yanncotineau 3 months ago
I got a 403 Forbidden error, but replacing run_process(['ngrok', 'http', '--log', 'stderr', '11434']) with run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header="localhost:11434"']) fixed it for me.
@tiagosanti3
@tiagosanti3 3 months ago
Fixed it for me too, thanks
@MR-kh8ve
@MR-kh8ve 3 months ago
It worked for me too, thank you!
@nicholasdunaway2605
@nicholasdunaway2605 3 months ago
THANK YOU
@Kursadysr
@Kursadysr 3 months ago
You are a life saver!!!
@techwithmarco
@techwithmarco 3 months ago
great spot! I already updated the script on github :)
@mellio19
@mellio19 3 months ago
But you can't run Stable Diffusion this way?
@abhishekratan2496
@abhishekratan2496 3 months ago
Very useful video - and the code too, btw. I can't get it running on Windows, though. What would be the way to set the OLLAMA_HOST variable on Windows? set OLLAMA_HOST="--" doesn't seem to work; it still runs on the local machine.
@techwithmarco
@techwithmarco 3 months ago
I think it depends on the terminal and shell you are using. Are you using the standard Windows terminal?
@TirthSheth108
@TirthSheth108 3 months ago
Hi @techwithmarco, thanks for chiming in. I'm actually experiencing the same issue as @abhishekratan2496, but I'm running it in the Ubuntu terminal. Setting the OLLAMA_HOST variable doesn't seem to work for me either. Any insights on how to resolve this would be greatly appreciated! Thanks.
@techwithmarco
@techwithmarco 3 months ago
@TirthSheth108 Okay, that's weird. I just used it a few days ago and it worked perfectly. I'll investigate and let you know :)
@AllMindControl
@AllMindControl 2 months ago
Did anyone figure this out? It just tells me that export is not a recognized command.
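A hedged note for this Windows thread: export is a Unix shell built-in, so cmd.exe and PowerShell need their own syntax - the URL is a hypothetical placeholder, and note that in cmd.exe any quotes or spaces around = become part of the value, which may be why the quoted form above still hit the local machine.
:: cmd.exe - no spaces around =, no quotes
set OLLAMA_HOST=https://abc123.ngrok-free.app

# PowerShell
$env:OLLAMA_HOST = "https://abc123.ngrok-free.app"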
@ironnerd2511
@ironnerd2511 3 months ago
What did he do to open the vault at 3:45?
@techwithmarco
@techwithmarco 3 months ago
I entered the password, and then pressed the unlock button :)
@thepsych3
@thepsych3 3 months ago
I get an error like 403 Forbidden
@ricardomorim9444
@ricardomorim9444 3 months ago
Replace run_process(['ngrok', 'http', '--log', 'stderr', '11434']) with run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header="localhost:11434"']). That fixed it for me.
@paulopatto8283
@paulopatto8283 2 months ago
@ricardomorim9444 Thanks very much guys, that solved my issue.
@davidk.3450
@davidk.3450 3 months ago
Can you give me some examples of how to back up volumes that are managed in another docker-compose file? (And maybe how to create a weekly backup AND a monthly backup within the same configuration.) Thanks a lot!
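A hedged sketch for the question above, not from the video: a named volume from any compose project can be archived with a throwaway container, and the same line can then be scheduled weekly or monthly via cron - myvolume and the paths are hypothetical.
# back up a named volume into the current directory
docker run --rm \
  -v myvolume:/data:ro \
  -v "$(pwd)":/backup \
  alpine tar czf /backup/myvolume-$(date +%F).tar.gz -C /data .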
@alitokii
@alitokii 4 months ago
Hi Marco, thanks for this video, definitely going to try out starship! Also, I was wondering what text editor/IDE you're using to view your .zshrc? Thank you!
@general_wcj9438
@general_wcj9438 3 months ago
To me it looks like a JetBrains product
@techwithmarco
@techwithmarco 3 months ago
This is JetBrains IntelliJ with the 'Dark' theme. Have fun checking out Starship :)
@bobsmithy3103
@bobsmithy3103 4 months ago
Do you guys not get banned? I got a warning for using ngrok :/
@techwithmarco
@techwithmarco 3 months ago
I am not using it very often. But maybe there are some alternatives you could check out, like pgrok: github.com/pgrok/pgrok
@michaelwilliams5092
@michaelwilliams5092 4 months ago
Where did you store the .ipynb file so ollama could access it?
@techwithmarco
@techwithmarco 3 months ago
See your other comment :) it is just a local file
@bennguyen1313
@bennguyen1313 4 months ago
I imagine it's costly to run LLMs... is there a limit on how much Google Colab will do for free? I'm interested in creating a Python application that uses AI. From what I've read, I could use the ChatGPT4 Assistant API, and I as the developer would incur the cost whenever the app is used. Alternatively, I could host a model like Ollama on my own computer or in the cloud (Beam Cloud/Replicate/Streamlit/Replit)? As a third option, could Google Colab work in my situation? Is OpenAI's Assistant API totally different from the API for programmatically interacting with llama2, mistral, etc.?
@attilavass6935
@attilavass6935 4 months ago
How can we keep our downloaded LLMs permanently, e.g. on a mounted Google Drive? It would speed up the start of inference on a fresh ollama server start.
@techwithmarco
@techwithmarco 3 months ago
Yes, that's a brilliant idea! You can save those in Google Drive with this snippet, for example:
import os
# Mount Google Drive
from google.colab import drive
drive.mount('/content/drive')
# Create a folder in the root directory
!mkdir -p "/content/drive/My Drive/My Folder"
# Start Ollama with a path where models are stored
!OLLAMA_MODELS="/content/drive/My Drive/My Folder" ollama serve
@attilavass6935
@attilavass6935 3 months ago
@techwithmarco that's great, thank you! :)
@dalgardnerd
@dalgardnerd 4 months ago
I have watched so much content on traefik and authelia and struggled so hard until now. Your two videos on the subject are so great. Thanks!
@techwithmarco
@techwithmarco 3 months ago
Glad to hear that! Hope you have fun configuring your instances!
@danr2513
@danr2513 4 months ago
My issue is this error:
time="2024-02-20T21:16:41Z" level=debug msg="No default certificate, fallback to the internal generated certificate" tlsStoreName=default
time="2024-02-20T21:16:41Z" level=debug msg="Added outgoing tracing middleware noop@internal" middlewareName=tracing middlewareType=TracingForwarder entryPointName=web routerName=web-to-websecure@internal
@nanaarhin5567
@nanaarhin5567 4 months ago
I don't see pricing. Is it the same as support? And the license that will be given - can I use it on multiple devices, such as my laptop and phones, at the same time?
@techwithmarco
@techwithmarco 3 months ago
The license is actually only needed for phones. Android and iPhone have different apps, so you need to buy it twice. There is no need to have a license to use it on your laptop.
@michaelwilliams5092
@michaelwilliams5092 4 months ago
Great tutorial. Runs super fast compared to other methods. How would we ingest a document and query its contents? @techwithmarco
@techwithmarco
@techwithmarco 3 months ago
You can either use a cat command to pass the content to ollama, like at minute 5:44, or use langchain, for example, and write some custom code to read specific files. There are cool examples on GitHub, like this one: github.com/ollama/ollama/tree/main/examples/langchain-python-rag-privategpt
@michaelwilliams5092
@michaelwilliams5092 3 months ago
@techwithmarco Thank you for the reply. I am on Windows using the beta version of Ollama, and $(cat file) doesn't work from the command prompt, and PowerShell only uses the local Ollama, not the remote one. Do you know if there is an equivalent command for cmd?
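A hedged sketch for the question above: cmd.exe has no $(...) substitution, but PowerShell does, and the client should honor OLLAMA_HOST set via $env: in the same session - file.txt, the URL, and llama2 are placeholders.
# PowerShell
$env:OLLAMA_HOST = "https://abc123.ngrok-free.app"   # hypothetical remote URL
ollama run llama2 "$(Get-Content .\file.txt -Raw) Summarize this text."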
@paulopatto8283
@paulopatto8283 2 months ago
Thanks!
@primenetwork27
@primenetwork27 4 months ago
How do I reverse proxy to something outside of Docker?
@primenetwork27
@primenetwork27 4 months ago
Great video. But I have a problem: how do I reverse proxy to something outside of Docker?
@lamamemes
@lamamemes 4 months ago
Nice video, short and on point
@techwithmarco
@techwithmarco 4 months ago
Thanks :)
@parthmadhubhai689
@parthmadhubhai689 4 months ago
seems like you now have to pay for n8n, not sure when that change went through :(
@techwithmarco
@techwithmarco 4 months ago
Ah okay. You can self-host the software because the source code is available :) As I said in the video, there will be a video about how to do this in the next few weeks :)
@liammcgarrigle
@liammcgarrigle 2 months ago
You can self-host it in the cloud or on your local computer/server completely free. (Hosting in the cloud costs $5-$10 a month-ish, but that goes to the cloud provider, not n8n.) It takes IT knowledge to do that, especially if the service is going to be relied on.
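For anyone who wants to try before the promised video, a minimal self-hosting sketch assuming Docker is installed - n8nio/n8n is the official image and 5678 its default port, but check n8n's own docs for production-grade settings.
docker run -it --rm \
  -p 5678:5678 \
  -v n8n_data:/home/node/.n8n \
  n8nio/n8n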
@joanandestin4201
@joanandestin4201 4 months ago
Hi, how are you doing? My instance is on-prem and I followed your example, but it won't work for me. I removed the line tagged "# staging environment of LE, remove for real certs", but I still get "Certificates": null. I am not sure why it is not working. Are we able to connect, or can you help me?
@techwithmarco
@techwithmarco 4 months ago
Sounds like a timing issue when getting the certificates. I'd recommend enabling traefik's debug logs and checking what's in there! Drop me a mail - you can find it on my channel's About page.
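In case it helps others debugging the same thing - a hedged sketch of enabling debug logging in traefik v2; log.level is from traefik's docs, and the file name follows the usual static-configuration convention.
# traefik.yml (static configuration)
log:
  level: DEBUG

# or as a startup argument, e.g. in the docker-compose command section:
# --log.level=DEBUG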
@dontworry7127
@dontworry7127 5 months ago
Hey Marco, thank you for the hint. I am trying to combine it with your traefik + crowdsec tutorial, which runs into this error:
crowdsec | time="2024-02-07T22:08:42+01:00" level=error msg="UnmarshalJSON : invalid character 'i' in literal true (expecting 'r')" line="time=\"2024-02-07T22:08:42+01:00\" level=error msg=\"Failed to retrieve information of the docker client and server host: Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?\" providerName=docker"
In the docker-compose.yml of kzbin.info/www/bejne/kGOWc32oh7KIg5Y there are labels available, and crowdsec wants to connect to docker.sock too. The traefik documentation is a jungle for me at the moment.
@techwithmarco
@techwithmarco 4 months ago
Sad to hear that it didn't work out so far. Have you tried playing around with the access rights for the docker-socket-proxy? It seems like crowdsec is reading traefik's access logs, and traefik is not able to get information from the docker socket. Maybe set the rights less restrictively, then go back and see where it fails: github.com/Tecnativa/docker-socket-proxy?tab=readme-ov-file#grant-or-revoke-access-to-certain-api-sections
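A hedged sketch of what "less restrictive" could look like for the docker-socket-proxy service in a compose file - the environment flags are from the Tecnativa README; exactly which ones this setup needs is an assumption to narrow down step by step.
services:
  docker-socket-proxy:
    image: tecnativa/docker-socket-proxy
    environment:
      CONTAINERS: 1   # traefik's docker provider reads container labels
      NETWORKS: 1
      SERVICES: 1     # swarm mode only
      TASKS: 1        # swarm mode only
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro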