and it's faster than Crystal, so I like it slightly more. Except for the syntax: I prefer Ruby-like syntax over Nim's (Pascal-like).
@mrRambleGamble 3 days ago
The music is loud and the volume is chaotic in the intro
@juliussakalys9600 3 days ago
So this is what prompt engineering really is
@maskedvillainai 4 days ago
You really need to explain why a model's ability has nothing to do with its size. It has to do with the human's ability to guide those models. GPT went through that already. Now you do it with a new model and boom
@skittlexenu6510 5 days ago
Please contact me
@Leo-ph7ow 5 days ago
I like your videos a lot! Thanks!
@Forwardknowlege 5 days ago
Can I fine-tune Meta's Llama-3 as well? Example: meta-llama/Meta-Llama-3-8B-Instruct
@munkhbayarbyambadelger954 6 days ago
when i run my code, it does not print "hello world". What could be the cause?
@chonkey 8 days ago
disliked
@mardix 9 days ago
Good stuff. Thanks!
@maizizhamdo 9 days ago
"Hey boss chris , I just wanted to say that your tutorial on creating an AI agent was super helpful and easy to follow! I'm not tech-savvy at all, but even I was able to understand the steps. Would you consider doing another tutorial using local OLLAMa models? That would be really cool and something I'd definitely want to learn more about. Thanks for sharing your knowledge with us!"
@chrishayuk 9 days ago
do you mean fine-tuning your own model and getting it to run in Ollama?
@maizizhamdo 9 days ago
@@chrishayuk using ollama with langchain in place of openai
@chrishayuk 9 days ago
@@maizizhamdo yes, that's part of the video on using open-source models that I've prepared but have yet to record.
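For anyone wanting to experiment before that video lands, a minimal sketch of what swapping OpenAI for a local Ollama model involves, assuming Ollama is running locally and serving its standard /api/generate endpoint (the model name and prompt here are just placeholders):

```python
import json

# Build the request body for Ollama's /api/generate endpoint.
# "llama3" is a placeholder model name; use whatever you've pulled locally.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,  # return one JSON object instead of a token stream
}
body = json.dumps(payload).encode("utf-8")

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])

print(body.decode("utf-8"))
```

LangChain's community integrations wrap this same local endpoint, so the drop-in swap the commenter asks about is mostly a change of chat-model class plus a local model name.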
@maizizhamdo 9 days ago
@@chrishayuk "Ahaha, no worries! I'm glad you're planning on recording a video on using open-source models. Looking forward to seeing it when you're done recording! Thanks for the update"
@AT-mx3hn 9 days ago
I like to guess accents. What is your accent?! There is an obvious primary Scottish element, but there are also strong hints of American and weaker hints of possibly English and/or Australian... did you move around a lot, or are you just trying to sound more American so YouTube can understand you better?!
@chrishayuk 9 days ago
i'm like a fine wine with lots of elements of different accents. i'm a scot that lives in england that used to live in ireland, spent a lot of time in india, us and travels a lot.
@AT-mx3hn 9 days ago
Amazing, thanks for taking the time to answer!
@kenchang3456 9 days ago
Wow, how fortunate am I?! I was looking for an example of fine-tuning to change the behavior of a model to act like a counter clerk at an auto parts store and I think I have found it and it's synthetic too. THANK YOU VERY MUCH!
@ruslankhomenko2586 10 days ago
the best video on the topic
@chrishayuk 9 days ago
very kind of you to say
@dabunnisher29 11 days ago
I think your channel is really GREAT for Rust learners like me. Thank you.
@JonathanDeCollibus 11 days ago
subscribed.
@chrishayuk 11 days ago
Awesome, thank you
@JonathanDeCollibus 11 days ago
chris, fantastic video. i've been looking for this exact answer.
@chrishayuk 11 days ago
Super glad this was useful. This vid is a little more raw than normal as I'm purposely pointing out the errors in the dataset rather than fixing them, but I think it's useful to understand
@Bayzon08 13 days ago
Great video but with all due respect, having a transparent screen showing you in the background with that blue lighting is an unnecessary punch in the eyes.
@manuelbojorge5840 13 days ago
Thank you bro
@luisa6511 16 days ago
I ran the npx ipull command and it created a .gguf file, but with HTML inside. Is that correct?
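That usually means the download saved an error or redirect page rather than the model itself. A valid GGUF file starts with the ASCII magic bytes GGUF, so a quick sanity check is possible; a minimal sketch (the two filenames here are placeholders standing in for a real download):

```python
def looks_like_gguf(path: str) -> bool:
    """Return True if the file starts with the GGUF magic bytes."""
    with open(path, "rb") as f:
        return f.read(4) == b"GGUF"

# Demo with two fake files: one with the GGUF magic, one that is
# really an HTML error page saved with a .gguf-style name.
with open("good.bin", "wb") as f:
    f.write(b"GGUF" + b"\x00" * 16)
with open("bad.bin", "wb") as f:
    f.write(b"<!DOCTYPE html><html>...</html>")

print(looks_like_gguf("good.bin"))  # True
print(looks_like_gguf("bad.bin"))   # False
```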
@HistorIAsImposibles776AC 16 days ago
Wooowowwwowo
@chrishayuk 9 days ago
lol
@federicoottomano8619 17 days ago
I guess it is pretty straightforward to use another (possibly open-source) LLM instead of GPT, like Llama, in your examples?
@chrishayuk 9 days ago
i have a follow up video in the pipe on this very thing, i'm hoping to record this weekend, i prepped it a few weeks ago, it's actually a pretty interesting topic
@federicoottomano8619 7 days ago
@@chrishayuk thanks Chris! Looking forward to it!
@chrishayuk 7 days ago
@@federicoottomano8619 and recorded
@v1Broadcaster 22 days ago
why are you not using d8?
@nikolatanev3293 23 days ago
Thank you!!! A good video is when you have a take-away. Mine is the possibility to assign stop keywords: the only missing link in how ReAct actually works in practice.
@chrishayuk 9 days ago
yeah, that's the actual trick
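The trick in question can be sketched with a scripted fake model: generation is cut at the "Observation:" stop keyword so the model cannot invent the tool result, then the real tool output is appended. Everything below (the canned model reply, the toy calculator) is a stand-in, not the video's code:

```python
def fake_llm(prompt: str) -> str:
    # Stand-in for a real LLM: always "reasons", calls the calculator,
    # and then tries to hallucinate an observation of its own.
    return ("Thought: I need to compute this.\n"
            "Action: calculator: 2 + 2\n"
            "Observation: (model would hallucinate a result here)")

def run_react_step(prompt: str) -> str:
    text = fake_llm(prompt)
    # Stop keyword: discard everything from "Observation:" onwards so the
    # model's made-up result never survives.
    text = text.split("Observation:")[0]
    # Parse the requested action and run the real tool instead.
    action_line = [l for l in text.splitlines() if l.startswith("Action:")][0]
    expr = action_line.split("calculator:")[1].strip()
    observation = str(eval(expr))  # toy calculator; never eval untrusted input
    return text + "Observation: " + observation

print(run_react_step("What is 2 + 2?"))
```

In a real agent loop the combined text is fed back to the model for the next Thought/Action round; this shows only a single step.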
@i-am-the-slime 24 days ago
Turbopack just never works, that's the problem.
@ernestuz 24 days ago
The funny thing is, the more complete the vocabulary, the less pressure on the upper layers, so it's not only cheaper because of fewer tokens, but also in processing. I wonder if somebody has prepared a semi-handcrafted tokenizer where, let's say, the first 30K tokens come from a dictionary and the rest are generated.
@MichaelNortonFG2SW 25 days ago
How do I configure visual studio for the "code ." to work from the terminal?
@MichaelNortonFG2SW 25 days ago
I ran into some problems with linking on my M2 Studio with Sonoma 14.4.1. Reading through the threads, this worked for me: ld hello.o -o hello -lSystem -syslibroot `xcrun -sdk macosx --show-sdk-path` -e _main -arch arm64
@rluijk 25 days ago
Ok, that is all very concrete! Awesome, thanks for this. This seems like a lot of quick wins that are easy to discover, or does it only look that way in hindsight because you explain it so clearly? Anyway, it's all a bit new to me. Perhaps, let's say, Norway would be wise to run this with their own tokenizer? Or is that too simplistic thinking?
@goodtothinkwith 26 days ago
Great stuff.. no nonsense presentation style, clear and technical, as it should be 😅.. question: is there a reason why it’s not better to have common English syllables in the vocabulary? I understand “lov” being there, but I can’t imagine that “el” is a very useful token as part of “Lovelace”.. intuitively, I would think that it should simply be tokenized as “love” and “lace”
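Which split the word gets depends entirely on which pieces made it into the learned vocabulary, not on what reads naturally to a human. A toy greedy longest-match splitter makes this concrete (both mini-vocabularies are invented, and real BPE merges pairs of tokens rather than matching prefixes, but the vocabulary-dependence is the same):

```python
def greedy_split(word: str, vocab: set[str]) -> list[str]:
    """Split a word by repeatedly taking the longest prefix found in vocab,
    falling back to single characters when nothing matches."""
    pieces, i = [], 0
    while i < len(word):
        for l in range(len(word) - i, 0, -1):
            if word[i:i + l] in vocab:
                pieces.append(word[i:i + l])
                i += l
                break
        else:
            pieces.append(word[i])
            i += 1
    return pieces

vocab_a = {"love", "lace"}      # syllable-ish, human-friendly vocabulary
vocab_b = {"lov", "el", "ace"}  # frequency-driven vocabulary
print(greedy_split("lovelace", vocab_a))  # ['love', 'lace']
print(greedy_split("lovelace", vocab_b))  # ['lov', 'el', 'ace']
```

In a trained BPE vocabulary, "lov" and "el" earn their slots by being frequent across the whole corpus, not by being useful inside any one word.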
@charbakcg 26 days ago
Excellent demonstration Chris , thanks for sharing!
@leeme179 26 days ago
What are your thoughts on including spaces in the tokenizer? I tried it once and the LLM was optimizing to predict spaces, as those are easy wins for the LLM, but I like the way tiktoken keeps the space while not making the space a token on its own....
@chrishayuk 26 days ago
I’m okay with it, if you watch my visualizing embeddings layer video you’ll see that words with spaces and words without spaces are so closely correlated on the initial embeddings layer that it’s basically a non issue. The cost however is the size of the vocabulary and therefore the embeddings layer size. It does however make the model much more efficient not having spaces handled separately. So having words with spaces as its own token makes so much more sense
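A tiny illustration of the efficiency point, using a made-up sentence and no real tokenizer: attaching the space to the following word (the tiktoken-style convention) roughly halves the sequence length versus emitting every space as its own token:

```python
text = "the cat sat on the mat"
words = text.split(" ")

# Option A: every space is its own token.
option_a = [t for w in words for t in (w, " ")][:-1]

# Option B: the space rides along as part of a leading-space token.
option_b = [words[0]] + [" " + w for w in words[1:]]

print(len(option_a), option_a)  # 11 tokens
print(len(option_b), option_b)  # 6 tokens
```

Shorter sequences mean less attention computation per forward pass, which is the "more efficient" part; the cost, as noted above, is a larger vocabulary and embeddings layer.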
@leeme179 26 days ago
great video, thank you
@chrishayuk 26 days ago
Thank you, glad it was useful
@rogerc7960 26 days ago
Why is there some PyTorch? Do fine-tuned or merged versions need it?
@aaravsethi6070 27 days ago
I'm super excited to see the `llama.cpp`, `llama2.c`, etc. category be implemented for llama3!
@chrishayuk 26 days ago
Agree
@ArseniyPotapov 26 days ago
llama.cpp already supports Llama3
@sergeziehi4816 27 days ago
Dataset creation is the main heavy and critical task in the full process, I think. How did you manage it?
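Since the video's angle is synthetic data, a toy sketch of templating out instruction/response pairs into JSONL, the shape most fine-tuning tooling expects. The templates and the auto-parts theme (borrowed from an earlier comment) are invented, not the video's actual generation code:

```python
import itertools
import json

# Invented slot values for an auto-parts-clerk style dataset.
parts = ["brake pad", "oil filter", "spark plug"]
cars = ["2014 Honda Civic", "2019 Ford F-150"]

records = []
for part, car in itertools.product(parts, cars):
    records.append({
        "prompt": f"Do you have a {part} for a {car}?",
        "response": f"Let me check stock on a {part} for the {car} for you.",
    })

# One JSON object per line.
jsonl = "\n".join(json.dumps(r) for r in records)
print(jsonl.splitlines()[0])
print(f"{len(records)} records")
```

In practice the "generation" step is usually an LLM filling these slots with varied phrasings rather than fixed templates, which is what makes it heavy: you then have to review the output for the kinds of errors the video deliberately leaves in.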
@prashantkowshik5637 28 days ago
Thanks a lot Chris.
@sumandawnmobile a month ago
This is an amazing video Chris :) Thanks for the tutorial
@chrishayuk 9 days ago
thank you
@IdPreferNot1 a month ago
Excellent explanation as you go along. Providing a github would perfect it, thx!
@chrishayuk a month ago
Examples are in GitHub.com/chrishayuk
@chrishayuk a month ago
And super glad it was useful
@simon_greig a month ago
Awesome video Chris. Really clear and easy to follow. I also understand Langchain a lot more than I did 45 mins ago 🙂
@chrishayuk a month ago
Thanks Simon, appreciate it. It wasn’t really meant to teach langchain originally, but it’s kinda hard to talk about the ReAct pattern without covering the basics of langchain. Glad it was useful
@joebarhouch2742 a month ago
Thank you for providing such a clear explanation!!! No BS, no following the tutorials given in the documentation blatantly, but actually carefully crafted tutorial with explanation of what's going on. Hats off!!
@chrishayuk a month ago
Thank you, glad it was useful. Thanks for calling out “carefully crafted”. Most of the time is spent figuring out the narrative. Very kind of you
@cammyjee a month ago
It's Craig.
@chrishayuk a month ago
I thought this video made it obvious it wasn’t. Must try harder
@mindurownbussines a month ago
Thank you so much Chris. I truly believe that if one has a great understanding of a subject he can teach it clearly, and you simply did that! God bless you
@chrishayuk a month ago
You are too kind, and thank you. Glad it was useful
@enlightenment5d a month ago
Good! Where can I find your programs?
@ramsuman6945 a month ago
Great video. Can’t this be achieved using RAG instead of training?
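The rough difference: RAG injects facts into the prompt at query time, while fine-tuning bakes behaviour and style into the weights, so RAG suits changing facts and fine-tuning suits changing how the model responds. A toy keyword-overlap retriever shows the RAG mechanics (the documents and scoring are invented; real systems use embedding similarity, not word overlap):

```python
docs = [
    "The store opens at 9am on weekdays.",
    "Brake pads for a Civic are in aisle 4.",
    "Returns need a receipt within 30 days.",
]

def retrieve(query: str, k: int = 1) -> list[str]:
    # Toy relevance score: count shared lowercase words.
    q = set(query.lower().split())
    scored = sorted(docs,
                    key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

query = "Which aisle are brake pads in"
context = retrieve(query)[0]
prompt = f"Answer using this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```

Nothing here changes the model itself, which is exactly why RAG cannot reproduce the behavioural change (acting like a counter clerk) that the video's fine-tuning targets.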
@MarlonEnglemam a month ago
Amazing video! I feel like I just want to use bun for everything 😅😂
@feniyuli a month ago
It is very helpful to understand how the tokenization works. Thanks! Do you think data that we encode using tiktoken will be sent to the AI?