Comments
@limlimhooiyahoocom · 9 hours ago
How do I use big.js with AssemblyScript?
@Crygd-utre1 · 3 days ago
And it's faster than Crystal, so I like it slightly more, except for the syntax: I prefer a Ruby-like syntax to Nim's Pascal-like one.
@mrRambleGamble · 3 days ago
The music is loud and the volume is chaotic in the intro
@juliussakalys9600 · 3 days ago
So this is what prompt engineering really is
@maskedvillainai · 4 days ago
You really need to explain why a model's ability has nothing to do with its size. It has to do with the human's ability to guide those models. GPT went through that already. Now you do it with a new model and boom.
@skittlexenu6510 · 5 days ago
Please contact me
@Leo-ph7ow · 5 days ago
I like your videos a lot! Thanks!
@Forwardknowlege · 5 days ago
Can I fine-tune Meta's Llama 3 as well? Example >>> meta-llama/Meta-Llama-3-8B-Instruct
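(For anyone with the same question: a minimal, hedged sketch of loading that checkpoint for LoRA fine-tuning with Hugging Face transformers and peft. The model id is the one from the comment above; the LoRA settings and everything else are illustrative assumptions, not necessarily the recipe used in the video.)

    # Sketch only: load Meta-Llama-3-8B-Instruct and wrap it with a LoRA adapter.
    # Assumes you have accepted Meta's licence on Hugging Face and have enough GPU memory.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import LoraConfig, get_peft_model

    model_id = "meta-llama/Meta-Llama-3-8B-Instruct"  # id taken from the comment above
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Illustrative LoRA hyperparameters, not tuned values.
    lora = LoraConfig(r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
    model = get_peft_model(model, lora)
    model.print_trainable_parameters()  # shows how few parameters the adapter actually trains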
@munkhbayarbyambadelger954 · 6 days ago
when i run my code, it does not print "hello world". What could be the cause?
@chonkey · 8 days ago
disliked
@mardix · 9 days ago
Good stuff. Thanks!
@maizizhamdo · 9 days ago
"Hey boss chris , I just wanted to say that your tutorial on creating an AI agent was super helpful and easy to follow! I'm not tech-savvy at all, but even I was able to understand the steps. Would you consider doing another tutorial using local OLLAMa models? That would be really cool and something I'd definitely want to learn more about. Thanks for sharing your knowledge with us!"
@chrishayuk · 9 days ago
Do you mean fine-tuning your own model and getting it to run in Ollama?
@maizizhamdo · 9 days ago
@@chrishayuk using Ollama with LangChain in place of OpenAI
@chrishayuk · 9 days ago
@@maizizhamdo yes, that's part of the video on using open-source models that I've prepared but have yet to record.
@maizizhamdo · 9 days ago
@@chrishayuk "Ahaha, no worries! I'm glad you're planning on recording a video on using open-source models. Looking forward to seeing it when you're done recording! Thanks for the update"
@AT-mx3hn · 9 days ago
I like to guess accents. What is your accent?! There is an obvious primary Scottish element, but there are also strong hints of American and weaker hints of possibly English and/or Australian... did you move around a lot, or are you just trying to sound more American so YouTube can understand you better?!
@chrishayuk · 9 days ago
I'm like a fine wine, with lots of elements of different accents. I'm a Scot who lives in England, used to live in Ireland, has spent a lot of time in India and the US, and travels a lot.
@AT-mx3hn · 9 days ago
Amazing, thanks for taking the time to answer!
@kenchang3456 · 9 days ago
Wow, how fortunate am I?! I was looking for an example of fine-tuning to change the behavior of a model to act like a counter clerk at an auto parts store and I think I have found it and it's synthetic too. THANK YOU VERY MUCH!
@ruslankhomenko2586 · 10 days ago
the best video on the topic
@chrishayuk · 9 days ago
very kind of you to say
@dabunnisher29 · 11 days ago
I think your channel is really GREAT for Rust learners like me. Thank you.
@JonathanDeCollibus · 11 days ago
subscribed.
@chrishayuk · 11 days ago
Awesome, thank you
@JonathanDeCollibus · 11 days ago
chris, fantastic video. i've been looking for this exact answer.
@chrishayuk · 11 days ago
Super glad this was useful. This vid is a little more raw than normal, as I'm purposely pointing out the errors in the dataset rather than fixing them, but I think it's useful for understanding.
@Bayzon08 · 13 days ago
Great video but with all due respect, having a transparent screen showing you in the background with that blue lighting is an unnecessary punch in the eyes.
@manuelbojorge5840 · 13 days ago
Thank you bro
@luisa6511 · 16 days ago
I ran the npx ipull command and it created a GGUF file, but with HTML inside. Is that correct?
@HistorIAsImposibles776AC · 16 days ago
Wooowowwwowo
@chrishayuk · 9 days ago
lol
@federicoottomano8619 · 17 days ago
I guess it's pretty straightforward to use another (possibly open-source) LLM, like Llama, instead of GPT in your examples?
@chrishayuk · 9 days ago
i have a follow up video in the pipe on this very thing, i'm hoping to record this weekend, i prepped it a few weeks ago, it's actually a pretty interesting topic
@federicoottomano8619 · 7 days ago
@@chrishayuk thanks Chris! Looking forward to it!
@chrishayuk · 7 days ago
@@federicoottomano8619 and recorded
@v1Broadcaster · 22 days ago
why are you not using d8?
@nikolatanev3293 · 23 days ago
Thank you!!! A good video is one where you have a takeaway. Mine is the ability to assign stop keywords, which was the missing link in how ReAct actually works in practice.
@chrishayuk · 9 days ago
yeah, that's the actual trick
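(A minimal sketch of the stop-keyword trick being discussed: in a hand-rolled ReAct loop you stop generation at the observation marker, run the tool yourself, then feed the real observation back in. The client, model name and prompt here are illustrative assumptions, not the exact code from the video.)

    # Sketch only: cut generation at "Observation:" so the model cannot hallucinate the tool result.
    from openai import OpenAI

    client = OpenAI()
    prompt = "Question: what is 2 + 2?\nThought:"  # placeholder ReAct-style prompt

    resp = client.completions.create(
        model="gpt-3.5-turbo-instruct",  # illustrative model choice
        prompt=prompt,
        stop=["Observation:"],           # the stop keyword: generation halts before the observation
        max_tokens=200,
    )
    print(resp.choices[0].text)  # ends just before "Observation:", so the caller runs the tool and appends the result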
@i-am-the-slime · 24 days ago
Turbopack just never works, that's the problem.
@ernestuz · 24 days ago
The funny thing is that the more complete the vocabulary, the less pressure on the upper layers, so it's not only cheaper because of fewer tokens but also in processing. I wonder if somebody has prepared a semi-handcrafted tokenizer where, let's say, the first 30K tokens come from a dictionary and the rest are generated.
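(A rough sketch of that idea using the Hugging Face tokenizers library: train a small BPE vocabulary and then graft dictionary words on as whole tokens. The corpus, vocabulary size and word list are placeholders, and appending the dictionary afterwards only approximates the "dictionary first, learned rest" split being suggested.)

    # Sketch only: learned BPE merges plus hand-picked dictionary words as whole tokens.
    from tokenizers import Tokenizer, models, pre_tokenizers, trainers

    tokenizer = Tokenizer(models.BPE(unk_token="[UNK]"))
    tokenizer.pre_tokenizer = pre_tokenizers.Whitespace()

    trainer = trainers.BpeTrainer(vocab_size=2000, special_tokens=["[UNK]"])  # placeholder size
    corpus = ["ada lovelace wrote the first published algorithm"] * 100       # placeholder corpus
    tokenizer.train_from_iterator(corpus, trainer=trainer)

    dictionary_words = ["lovelace", "algorithm"]  # placeholder dictionary
    tokenizer.add_tokens(dictionary_words)        # whole words appended after the learned vocabulary
    print(tokenizer.encode("ada lovelace").tokens)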
@MichaelNortonFG2SW · 25 days ago
How do I configure Visual Studio Code so that "code ." works from the terminal?
@MichaelNortonFG2SW · 25 days ago
I ran into some problems with linking on my M2 Studio with Sonoma 14.4.1. Reading through the threads, this worked for me: ld hello.o -o hello -lSystem -syslibroot `xcrun -sdk macosx --show-sdk-path` -e _main -arch arm64
@rluijk · 25 days ago
OK, that is all very concrete! Awesome, thanks for this. This seems like a lot of quick wins that are easy to discover, or does it only look that way in hindsight because you explain it so clearly? Anyway, it's all a bit new to me. Perhaps a country like, let's say, Norway would be wise to run this with their own tokeniser? Or is that too simplistic?
@goodtothinkwith · 26 days ago
Great stuff... no-nonsense presentation style, clear and technical, as it should be 😅. Question: is there a reason why it's not better to have common English syllables in the vocabulary? I understand "lov" being there, but I can't imagine that "el" is a very useful token as part of "Lovelace". Intuitively, I would think it should simply be tokenized as "love" and "lace".
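(The quickest way to check that intuition is to ask the tokenizer directly; a small sketch with tiktoken, where the choice of the cl100k_base encoding is an assumption about which vocabulary is being discussed.)

    # Sketch only: inspect how a BPE vocabulary actually splits "Lovelace".
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")  # assumed encoding; swap in whichever one you care about
    ids = enc.encode("Lovelace")
    print([enc.decode([i]) for i in ids])       # prints the individual token strings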
@charbakcg · 26 days ago
Excellent demonstration Chris , thanks for sharing!
@leeme179 · 26 days ago
What are your thoughts on including spaces in the tokenizer? I tried it once and the LLM was optimising to predict spaces, as those are easy wins for the LLM, but I like the way tiktoken keeps the space attached to the word rather than as a token on its own...
@chrishayuk · 26 days ago
I'm okay with it. If you watch my visualizing-the-embeddings-layer video, you'll see that words with spaces and words without spaces are so closely correlated in the initial embeddings layer that it's basically a non-issue. The cost, however, is the size of the vocabulary and therefore the size of the embeddings layer. It does, however, make the model much more efficient not to handle spaces separately. So having words with spaces as their own tokens makes much more sense.
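(To see what is being described, compare a word with and without its leading space; another small tiktoken sketch, with the encoding choice again an assumption.)

    # Sketch only: leading spaces are folded into word tokens rather than kept as separate tokens.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")  # assumed encoding
    for text in ["hello", " hello", "say hello"]:
        ids = enc.encode(text)
        print(repr(text), [enc.decode([i]) for i in ids])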
@leeme179 · 26 days ago
great video, thank you
@chrishayuk · 26 days ago
Thank you, glad it was useful
@rogerc7960 · 26 days ago
Why is there some PyTorch? Do the fine-tuned or merged versions need it?
@aaravsethi6070 · 27 days ago
I'm super excited to see the `llama.cpp`, `llama2.c`, etc. category implemented for Llama 3!
@chrishayuk · 26 days ago
Agree
@ArseniyPotapov · 26 days ago
llama.cpp already supports Llama3
@sergeziehi4816 · 27 days ago
Dataset creation is the main heavy and critical task in the full process, I think. How did you manage it?
@prashantkowshik5637 · 28 days ago
Thanks a lot Chris.
@sumandawnmobile · a month ago
This is an amazing video Chris :) Thanks for the tutorial
@chrishayuk · 9 days ago
thank you
@IdPreferNot1 · a month ago
Excellent explanation as you go along. Providing a GitHub repo would perfect it, thx!
@chrishayuk · a month ago
Examples are in GitHub.com/chrishayuk
@chrishayuk · a month ago
And super glad it was useful
@simon_greig · a month ago
Awesome video Chris. Really clear and easy to follow. I also understand Langchain a lot more than I did 45 mins ago 🙂
@chrishayuk · a month ago
Thanks Simon, appreciate it. It wasn’t really meant to teach langchain originally, but it’s kinda hard to talk about the ReAct pattern without covering the basics of langchain. Glad it was useful
@joebarhouch2742 · a month ago
Thank you for providing such a clear explanation!!! No BS, no blindly following the tutorials given in the documentation, but an actually carefully crafted tutorial with an explanation of what's going on. Hats off!!
@chrishayuk · a month ago
Thank you, glad it was useful. Thanks for calling out “carefully crafted”. Most of the time is spent figuring out the narrative. Very kind of you
@cammyjee · a month ago
It's Craig.
@chrishayuk · a month ago
I thought this video made it obvious it wasn’t. Must try harder
@mindurownbussines · a month ago
Thank you so much, Chris. I truly believe that if one has a great understanding of a subject, he can teach it clearly, and you simply did that! God bless you.
@chrishayuk · a month ago
You are too kind, and thank you. Glad it was useful
@enlightenment5d · a month ago
Good! Where can I find your programs?
@ramsuman6945 · a month ago
Great video. Can't this be achieved using RAG instead of training?
@MarlonEnglemam · a month ago
Amazing video! I feel like I just want to use bun for everything 😅😂
@feniyuli · a month ago
It is very helpful to understand how the tokenization works. Thanks! Do you think data that we encode using tiktoken will be sent to the AI?
@marymary5494 · a month ago
I remember playing Fizzbuzz at school.
@JaxonPetersen-cu8jt · a month ago
Great video. Great explanations.