In the next video, I'll show how Claude, with the right hints (and that's the interesting part: how to start the chat, what design clues to provide), single-handedly (code-wise) implemented UTF-8 support inside Kilo.
@RiccardoCosenza 2 days ago
From Campobello to Catania is a looooooong viaggiu (journey)… 😅😅😅
@waynetrout6977 1 hour ago
Nice comparison, thank you for some insight into DeepSeek; I have not used this model yet. You may already know this, but for modifications I might suggest prompting the AI to use CLI functions for quick and accurate modification/replacement in longer code. This works well with Claude 3.5; I am not sure about DeepSeek. Also, if you are having issues with the AI accepting your code submission, sometimes this is due to the file type: just change it to .txt and it should read it just fine. For example, some models will not accept .ahk as a file type but will help you with the code if it is submitted as a .txt.
@GianlucaB73 1 day ago
This format is really interesting and engaging... keep it up!
@halemm 1 day ago
This kind of video is wonderful!
@Thelegends96 1 day ago
Hi Salvatore, thanks so much for this! Loving these recent videos. Some constructive feedback, since you are looking to make more (please do!):
- While showing how the model can fail was interesting, the back-and-forth interactivity took up some of the time. Perhaps you could go through previous / pre-prepared chat histories with commentary on your thought process? This way you get to pick interesting cases!
- It would be great to see examples like using the model for code reviews, brainstorming, or handling bigger projects with long context windows and multiple files (if you use them for these purposes).
P.S. I can relate to the "fotta" (the itch) to make the videos, but get some rest! :p
@antirez 1 day ago
Thank you! In the next video I am doing exactly what you suggested, using a pre-built history. Everything is still authentic, but much more information-dense and easier to follow :) Thanks.
@DomenicoLupo 2 days ago
Another difference I saw is that DeepSeek cannot execute the code, while Claude can. I hope they add this feature soon. Great video, thank you!
@antirez 2 days ago
Yep. In Claude they did something very smart: the model writes the code in JavaScript using console.log() for output, so it can run in the user's browser. No backend is needed for code execution. With WebAssembly they should be able to do the same for Python fairly easily, I guess.
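For the curious, here is a minimal sketch of how that kind of in-browser execution can work: redirect console.log, evaluate the generated snippet, and collect whatever it printed. This is my own illustration of the idea, not Anthropic's actual implementation, and the name runGeneratedCode is hypothetical.

```javascript
// Minimal sketch (an assumption, not Anthropic's actual implementation) of
// running model-generated JavaScript in the browser and capturing its output.
function runGeneratedCode(source) {
  const captured = [];
  const originalLog = console.log;
  // Redirect console.log so anything the snippet prints is collected.
  console.log = (...args) => captured.push(args.map(String).join(" "));
  try {
    // new Function() evaluates the code in the page; a real product would
    // sandbox it (e.g. in an isolated iframe or a Web Worker).
    new Function(source)();
  } catch (err) {
    captured.push("Error: " + err.message);
  } finally {
    console.log = originalLog; // always restore the original logger
  }
  return captured.join("\n");
}

// Example: the "model-written" snippet runs entirely client-side, no backend.
console.log(runGeneratedCode('for (let i = 0; i < 3; i++) console.log("tick", i);'));
```

Capturing console output on the client is what makes a code-execution backend unnecessary in this scheme.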
@nonefvnfvnjnjnjevjenjvonej3384 2 days ago
thank you thank you thank you for doing this.
@ConteMascetti1971 2 days ago
OOC question: are gentle words necessary for the models? Or is it mandated by your Sicilian upbringing (like "please")?
@antirez 2 days ago
@ConteMascetti1971 It comes naturally to me for some odd reason :)
@ConteMascetti1971 2 days ago
@antirez This evening I will check how much GPU time the gentle words cost in an LLM like Llama.
@ninadsachania3652 6 hours ago
Is it Ghostty?
@antirez 4 hours ago
@ninadsachania3652 Yes.
@ciao1307 2 days ago
Hi Salvatore, is there no way to remove the black bars on the sides? Good content anyway :)
@antirez 2 days ago
@ciao1307 Next time I'll make sure to use the correct aspect ratio :) Thanks.
@Techonsapevole 2 days ago
Qwen-coder-32B wasn't bad either.
@erminiottone 2 days ago
Why do you use overpriced computers like Apple?
@antirez 2 days ago
Unified memory shared between GPU and CPU. Unix, but without the issues of Linux.