How to Use Local LLM in Cursor

  4,089 views

Logan Hallucinates

A day ago

Comments: 15
@Jake-ky2ed A month ago
Thank you. Very valuable video for me.
@amrsalem6655 A month ago
Nice videos, and very useful. Is there a way to get AI assistance similar to the one in OpenAI's GPT, and to use its API?
@cheyannehutson2412 2 months ago
Thank you for the video! One question, though: how do you change the context length of the model when running it this way? Usually you would set the context length when using `ollama run [model]`, but it seems you don't get that chance with this configuration. Any help with this would be appreciated, thank you!
@loganhallucinates 2 months ago
You can set the context length manually using something like `/set parameter num_ctx 32768`; don't forget to `/save` as well.
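An equivalent way to bake the context length into a reusable model is a Modelfile (a sketch; `qwen2.5-coder` is just a placeholder for whichever model you pulled):

```shell
# Define a variant of the base model with a 32k context window.
cat > Modelfile <<'EOF'
FROM qwen2.5-coder
PARAMETER num_ctx 32768
EOF

# Register it under a new name, then select that name in Cursor.
ollama create qwen2.5-coder-32k -f Modelfile
```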
@0057beast A month ago
Man, I needed this for coding. Thanks @loganhallucinates
@DevJonny 15 days ago
If it's local why do I need ngrok? I was looking for something offline
@loganhallucinates 14 days ago
It's for exposing your local API to the internet so the Cursor server can access it; their logic runs on their server.
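A minimal sketch of that setup, assuming ngrok v3 and Ollama on its default port 11434:

```shell
# Start Ollama's local API server (listens on 127.0.0.1:11434 by default).
ollama serve &

# Tunnel it to a public URL. Rewriting the Host header is needed because
# Ollama rejects requests whose Host header it doesn't recognize.
ngrok http 11434 --host-header="localhost:11434"
```

The public `https://...` URL ngrok prints is what you point Cursor's base-URL override at.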
@MondoBoricua A month ago
Still working?
@boiserunner A month ago
Thanks for the video. Why would anyone want to do this?
@loganhallucinates A month ago
Some local models are quite capable now
@moresignal 4 months ago
This is ridiculously dangerous advice, given that Ollama has no authentication and you are suggesting opening it to the entire internet.
@loganhallucinates 3 months ago
ngrok has authentication you can set up.
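For example, ngrok v3 can require HTTP basic auth on the tunnel (the credentials below are placeholders):

```shell
# Every request through the tunnel must now carry these credentials,
# so the bare Ollama API is no longer open to anonymous scanners.
ngrok http 11434 --host-header="localhost:11434" --basic-auth "user:a-long-password"
```

Note that whatever client calls the tunnel then has to send those credentials too.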
@sneedtube 21 days ago
Doesn't work anymore. To me it looks like the Cursor devs are actively sabotaging every effort from the open-source community to democratize that IDE. Shameless merchants.
@nielsdebont270 5 months ago
The `ollama_origins=*` command is not recognized for me.
@AndrewQuardex 2 months ago
`SET OLLAMA_ORIGINS=*` if on Windows.
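The error above usually comes from the shell syntax: `OLLAMA_ORIGINS` is an environment variable, not a command, and it must be set in the same session before the server starts. A sketch for the common shells:

```shell
# macOS/Linux (bash/zsh): allow browser/cross-origin requests from anywhere,
# then start the Ollama server in the same session.
export OLLAMA_ORIGINS="*"
ollama serve

# Windows cmd:        set OLLAMA_ORIGINS=*
# Windows PowerShell: $env:OLLAMA_ORIGINS="*"
```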