Nice videos, very useful. Is there a way to get AI assistance similar to OpenAI's GPT, along with the ability to use its API?
@cheyannehutson2412 · 2 months ago
Thank you for the video! One question, though: how do you change the context length of your model when running it this way? Normally you would set the context length when using “ollama run [model]”, but it seems you don't get that chance with this configuration. Any help would be appreciated, thank you!
@loganhallucinates · 2 months ago
You can set the context length manually with `/set parameter num_ctx 32768`; don't forget to `/save` as well.
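If you'd rather not set it interactively every session, Ollama also supports baking parameters into a Modelfile. A minimal sketch, assuming a recent Ollama install (the base model and the `llama3-32k` name here are placeholders):

```shell
# Write a Modelfile that pins the context window (placeholder base model)
cat > Modelfile <<'EOF'
FROM llama3
PARAMETER num_ctx 32768
EOF

# Create a named variant with the larger context window baked in
ollama create llama3-32k -f Modelfile

# Run it; every request to this model now uses the 32k context
ollama run llama3-32k
```

The created model is then addressable by name through the API as well, so whatever you point at Ollama picks up the context length automatically.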
@0057beast · a month ago
Man, I needed this for coding, thanks @loganhallucinates
@DevJonny · 15 days ago
If it's local, why do I need ngrok? I was looking for something offline.
@loganhallucinates · 14 days ago
It’s for exposing your local API to the internet so Cursor's server can access it. Their logic runs on their server, not on your machine.
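A minimal sketch of that setup, assuming ngrok v3 and Ollama on its default port 11434 (the `--host-header` flag rewrites the Host header so Ollama accepts the forwarded requests):

```shell
# Start Ollama locally (listens on 127.0.0.1:11434 by default)
ollama serve

# In another terminal, tunnel the local API to a public URL
ngrok http 11434 --host-header="localhost:11434"
```

You would then paste the generated https URL into Cursor's custom base-URL setting so its server can reach your local models.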
@MondoBoricua · a month ago
Still working?
@boiserunner · a month ago
Thanks for the video. Why would anyone want to do this?
@loganhallucinates · a month ago
Some local models are quite capable now.
@moresignal · 4 months ago
This is ridiculously dangerous advice, given that Ollama has no authentication and you are suggesting making it open to the entire Internet.
@loganhallucinates · 3 months ago
ngrok has authentication you can set up.
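For example, assuming ngrok v3, the tunnel can require HTTP basic-auth credentials so the endpoint isn't open to anyone who finds the URL (the username and password here are placeholders):

```shell
# Require basic auth on the tunnel; requests without credentials get 401
ngrok http 11434 --basic-auth "user:strong-password" --host-header="localhost:11434"
```

This protects the tunnel itself; Ollama behind it still has no auth of its own, so the credentials are the only gate.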
@sneedtube · 21 days ago
Doesn't work anymore. To me it looks like the Cursor devs are actively sabotaging every effort by the open source community to democratize that IDE. Shameless merchants.