Thanks! And with Bolt.diy? Another video with DeepSeek V3?
@maniksahdev4292 · 2 days ago
Cannot integrate this into bolt.diy as of yet, I've tried lol
@blenderdad · 4 days ago
I don't think I've ever been so early to a video. I love DeepSeek because of the code quality and the price, though the designs it produces for me sometimes look a bit "old".
@haraldwolte3745 · 2 days ago
What are the settings? It's hard to see in the video. The Hyperbolic URL and model name, etc.
@brentpope1497 · 4 days ago
Is there any advantage to using this over Cline? What happens when you hit the context window max?
@ChefBrianCooks · 2 days ago
Not really, Cline is better. Roo Cline is better too.
@buggi666 · 4 days ago
Thanks!
@AICodeKing · 4 days ago
Thanks for the support!
@jacquesdupontd · 4 days ago
I get a "No body" error in Cline when using DeepSeek with the Hyperbolic base URL, anyone else? Thanks
@AICodeKing · 4 days ago
It happens when the API is down. Try a bit later.
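For anyone hitting the same "No body" error and wanting to check whether the provider itself is up, here is a minimal sketch of calling the Hyperbolic endpoint directly via its OpenAI-compatible API. The base URL and model id below are assumptions, so confirm them against your Hyperbolic dashboard:

```python
# Minimal check of an OpenAI-compatible endpoint (assumed Hyperbolic base URL / model id).
# pip install openai
from openai import OpenAI

client = OpenAI(
    base_url="https://api.hyperbolic.xyz/v1",  # assumed base URL; confirm in Hyperbolic's docs
    api_key="YOUR_HYPERBOLIC_API_KEY",
)

try:
    resp = client.chat.completions.create(
        model="deepseek-ai/DeepSeek-V3",       # assumed model id; confirm in Hyperbolic's model list
        messages=[{"role": "user", "content": "ping"}],
        max_tokens=10,
    )
    print("API is up:", resp.choices[0].message.content)
except Exception as e:
    # Empty/"No body" style failures usually surface here when the API is down or overloaded.
    print("API problem:", e)
```

If this works but Cline still fails, the issue is likely the extension settings rather than the API.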
@jacquesdupontd · 4 days ago
@AICodeKing thanks
@silviudx · 4 days ago
@AICodeKing Is it also working in Roo-Cline?
@Augmented_AI · 4 days ago
Please ask them to make it a VS Code extension
@d.d.z. · 3 days ago
Another honest review ❤
@gani2an1 · 4 days ago
Is Cline not better?
@felixgraphx · 3 days ago
How do you use DeepSeek V3 locally with Ollama?
@waytae1 · 4 days ago
How do you revert a change if it turns out to be wrong?
@lancemarchetti8673 · 4 days ago
Always have a backup directory in your root folder and save the sessions: 1.html, 2.html, 3.html, etc.
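A minimal sketch of that manual versioning approach, assuming a working file called index.html and a backups/ folder (both names are just examples):

```python
# Copy the current file into backups/ under the next free version number (1.html, 2.html, ...).
# File and directory names are illustrative examples of the manual approach described above.
import shutil
from pathlib import Path

def backup(src: str = "index.html", backup_dir: str = "backups") -> Path:
    src_path = Path(src)
    out_dir = Path(backup_dir)
    out_dir.mkdir(exist_ok=True)
    n = 1
    while (out_dir / f"{n}{src_path.suffix}").exists():  # find the next unused number
        n += 1
    dest = out_dir / f"{n}{src_path.suffix}"
    shutil.copy2(src_path, dest)                          # keep metadata along with contents
    return dest

if __name__ == "__main__":
    print("Saved backup to", backup())
```

Run it before each AI edit session so you can roll back by copying a numbered file over the original.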
@carinebruyndoncx5331 · 4 days ago
Not sure why you would use this over Aider, since you will need some tech skill to deploy or use any code or app it generates.
@syvern7197 · 3 days ago
I'm sorry, I'm new to this stuff, but I wanted to ask: if DeepSeek is open source (free), then why are people calling it cheap, as if it isn't free and just comes at a cheap price?
@mubashir3 · 3 days ago
You are free to download and run it. If you have the required hardware. And that is a massive if. Most people do not. So, they pay someone else to run it for them.
@syvern7197 · 3 days ago
@mubashir3 Interesting. And let's say I do want to download it, what are the PC requirements for doing so?
@mubashir3 · 3 days ago
@syvern7197 About 350-700 GB of RAM. And if you don't have a lot of CPU cores, it will run very, very slowly. In other words, you could spend US $10K on a PC using state-of-the-art components and it would still not be enough to run this model at a usable speed.
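That 350-700 GB range lines up with a rough back-of-envelope estimate from the model's size (DeepSeek V3 is roughly 671B parameters); the figures below are an approximation for the weights only, not official requirements:

```python
# Rough memory estimate: parameter count x bytes per parameter (weights only,
# ignoring KV cache and runtime overhead). Values are approximate.
params_billion = 671  # DeepSeek V3 has ~671B total parameters

for label, bytes_per_param in [("FP8 / 8-bit", 1.0), ("4-bit quantized", 0.5)]:
    gb = params_billion * bytes_per_param  # 1e9 params * 1 byte is roughly 1 GB
    print(f"{label}: ~{gb:.0f} GB just for the weights")

# -> roughly 671 GB at 8-bit and ~335 GB at 4-bit, which is where the
#    "350-700 GB of RAM" figure in the comment above comes from.
```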
@syvern7197 · 3 days ago
@mubashir3 My! So that model is out of the question locally. Well, is it true that if you run it locally you can use it offline? Also, to use a model locally, should we use Ollama, or are there other methods too?
@sm373 · 3 days ago
Also, the electricity cost of running the PC and the wear on the hardware make it impractical.
@haraldwolte3745 · 2 days ago
Are there other providers that host the DeepSeek model, if we don't trust DeepSeek themselves with our data?
@DemocracyDecoded · 4 days ago
Wow, I cannot use Gemini 2.0 or DeepSeek V3, they keep going down. Gemini says the limit is reached in Cline, and I tried Roo-Cline too, same problem. I tried resetting the API key, same issue, and DeepSeek crashes... 😂😂😂
@thesaltyone4400 · 3 days ago
It's the Gemini API: it's hard-capped at 8,000 tokens on the free tier. As for DeepSeek, if you use the API key it's not free, unless you use Ollama and can run the ~1 TB model yourself.
@DemocracyDecoded · 3 days ago
@thesaltyone4400 Yeah, but the advertisement says free usage and people are reviewing it as if it's free, but really it's capped like you said. Which Ollama model is 1 TB? Damn, that's a lot.
@thesaltyone4400 · 3 days ago
@DemocracyDecoded That's the DeepSeek V3 model. It is free if you can fit the 670 GB download and have the hardware to run it. That's probably what King usually does.
@thesaltyone4400 · 3 days ago
I thought I replied a second ago, but to answer your question: the DeepSeek model is 670 GB! If you can download it and run it on your system, then it would be free. I suspect King usually downloads the models and uses them when he shows them off.
@TomHimanen · 3 days ago
Can we use DeepSeek V3 with Ollama locally?
@syvern7197 · 3 days ago
Good question, I would like to hear the answer to that too.
@cryptotrader5072 · 3 days ago
You can, but that would cost you a lot of money, like thousands of dollars.
@TomHimanen · 3 days ago
@cryptotrader5072 You mean because of the GPU requirements? I think there will be a DeepSeek mini that will be runnable on about 24 GB of VRAM. Lots of open source models get scaled down dramatically by the open source community.
@syvern7197 · 2 days ago
@cryptotrader5072 The money is for the hardware requirements only, not for API calls or the model itself, right?
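For the Ollama questions in this thread: once a model tag is pulled locally, the call pattern is the same regardless of model size, and it works fully offline with no API key. A minimal sketch against Ollama's local REST API, assuming a model tag named "deepseek-v3" has been pulled (the tag name is an assumption; check what `ollama pull` actually offers):

```python
# Minimal call to a locally running Ollama server (default port 11434).
# The model tag "deepseek-v3" is an assumed example; use whatever tag you pulled.
import json
import urllib.request

payload = {
    "model": "deepseek-v3",
    "messages": [{"role": "user", "content": "Write a hello-world in Python."}],
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",           # Ollama's local chat endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())
    print(body["message"]["content"])            # runs entirely on your machine, offline
```

The cost discussed above is purely the hardware needed to hold and run the weights; there are no per-call charges once the model is local.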
@1voice4all · 3 days ago
This one seems similar to Nexa, which has a Gradio UI. I need to merge FrameWise and Nexa.
@hasanaqeelabd-alabbas3180 · 3 days ago
I have zero programming knowledge and don't know anything about the programs you're showing, I really don't know what to do.