More Resources!
Review of math concepts: kzbin.info/www/bejne/hYqtmXWgo7GjZqs
Gradient Descent: kzbin.info/www/bejne/h33Pga2Eq9yMia8
Linear Regression: kzbin.info/www/bejne/aKeoZHenjMl4jtE
PyTorch and Neural Networks: kzbin.info/www/bejne/iabUgY2crMd0rKM
First-Principles Framework (Learn Fundamentals): www.gptlearninghub.ai/first-principles-framework
Beginner's Blueprint (Build Projects): www.gptlearninghub.ai/beginner-blueprint
@tankapadia · 3 months ago
Sorry, I'm a PyTorch Peter type of guy
@gptLearningHub · 3 months ago
Haha same
@basil9633 · 3 months ago
This is why I rebuild projects with concepts I've already somewhat mastered
@gptLearningHub · 3 months ago
You're on the right track 🤜
@dappavidz · 2 months ago
1. Gradient descent and PyTorch
2. Translate ML papers to code
3. Jump into step 2, but do two things first: a) fine-tune Llama, b) hack together a basic RAG workflow
@gptLearningHub · 2 months ago
Best of luck mate!
@เจษฎาอิ่มจิตร · 2 months ago
Bro, Daniel's PyTorch tutorial is one of the best on YouTube. You guided everyone to learn PyTorch, but you never actually watched the tutorial yourself.
@gptLearningHub · 2 months ago
Daniel's "PyTorch in a day" video is fantastic! Wasn't bashing on any specific creators, just emphasizing the need to learn the fundamentals as fast as possible, so that one can start getting their hands dirty with papers sooner as well.
@OPGAMING-rq5up · 3 months ago
Should I not learn linear algebra, stats, calculus, and probability first?
@adityapatil318 · 3 months ago
you should
@zeusolympus1664 · 3 months ago
Just the basics should do it. You don't have to cover advanced concepts right off the bat.
@gptLearningHub · 3 months ago
Linear algebra, calculus, and prob/stat are definitely essential for ML. That being said, you don't have to master them before getting started with the actual ML concepts! Just the basics, like matrix multiplication and basic derivatives, should be enough to get started.
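To make that concrete, here's a tiny sketch (my own example, not from the video) of how those basics show up directly in PyTorch: one matrix multiplication and one derivative computed by autograd.

```python
import torch

# Matrix multiplication: a (2x2) matrix times a (2x1) column vector.
A = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
x = torch.tensor([[5.0], [6.0]])
print(A @ x)   # tensor([[17.], [39.]])

# A basic derivative via autograd: y = w^2, so dy/dw = 2w.
w = torch.tensor(3.0, requires_grad=True)
y = w ** 2
y.backward()
print(w.grad)  # tensor(6.)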
@gptLearningHub · 3 months ago
@@zeusolympus1664 Couldn't have said it better myself!
@aafimalek · 2 months ago
Where do I learn PyTorch?
@nemesis2477 · 3 months ago
I think the fine-tuning part has been replaced by automation. On the Google playground you can literally fine-tune any Google LLM with your data in just a few clicks.
@gptLearningHub · 3 months ago
Yup, there are no-code tools for fine-tuning now. I still advocate for fine-tuning projects though, since understanding each line of code teaches a ton!
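For anyone curious what "each line of code" looks like, here's a rough, hedged outline of a manual fine-tuning loop with Hugging Face transformers; the model name and training texts are placeholders, and a real project would add batching, evaluation, and checkpointing.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "distilgpt2"   # placeholder: any small causal LM works for practice
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name)
model.train()
opt = torch.optim.AdamW(model.parameters(), lr=5e-5)

texts = ["Example training sentence one.", "Example training sentence two."]  # placeholder data
for epoch in range(3):
    for t in texts:
        batch = tok(t, return_tensors="pt")
        out = model(**batch, labels=batch["input_ids"])  # causal LM loss on the same tokens
        out.loss.backward()   # backprop through the whole model
        opt.step()            # gradient update
        opt.zero_grad()
```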
@adityashukla9840 · 2 months ago
I know TensorFlow and I've created many projects with it. Should I switch to PyTorch?
@SusBinggusNigusImpostorus · 3 months ago
This is not a K-pop video or Cocomelon... don't use 2-3 second cut edits.
@picklenickil · 3 months ago
Time for self-reflection, maybe?
@gptLearningHub · 3 months ago
What sort of reflection are you thinking?
@picklenickil · 3 months ago
@@gptLearningHub The kind that involves the self. Maybe go back to basics... build a shallow network, layer by layer? A simple RL network maybe, or take a moonshot and try solving ARC or something... idk mate. I used to like your videos; now they're just the same as the others in my feed.
@gptLearningHub · 3 months ago
@@picklenickil Building a shallow network layer by layer, with just NumPy, was highly instructive for me when getting started. I appreciate the honest feedback, mate.
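As a rough illustration of that exercise, here's a minimal NumPy-only shallow network (one hidden layer, manual backprop) trained on XOR; the layer size, seed, and learning rate are just my own illustrative choices.

```python
import numpy as np

# XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros((1, 8))   # hidden layer (8 units, arbitrary)
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros((1, 1))   # output layer
sigmoid = lambda z: 1 / (1 + np.exp(-z))
lr = 1.0

for step in range(10000):
    # Forward pass, layer by layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass (cross-entropy + sigmoid gives out - y at the output).
    d_out = (out - y) / len(X)
    dW2, db2 = h.T @ d_out, d_out.sum(0, keepdims=True)
    d_h = (d_out @ W2.T) * h * (1 - h)
    dW1, db1 = X.T @ d_h, d_h.sum(0, keepdims=True)
    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(out.round().ravel())   # should approach [0, 1, 1, 0]
```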
@picklenickil · 3 months ago
@@gptLearningHub You are very welcome. Try your luck on the ARC Prize. It's fun, to say the least, and most probably you'll fail like the rest of us, but it's damn fun.
@Chadpritai · 3 months ago
I would recommend implementing autograd in Python ❤
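In that spirit, here's a bare-bones sketch of what a scalar autograd engine can look like (micrograd-style); the Value class and its ops are illustrative, not a reference implementation.

```python
class Value:
    def __init__(self, data, parents=()):
        self.data = data          # the scalar value
        self.grad = 0.0           # d(output)/d(this value), filled in by backward()
        self._parents = parents   # Values this one was computed from
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topological order so every node's grad is complete before its parents run.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x, w = Value(2.0), Value(3.0)
y = x * w + x            # y = w*x + x, so dy/dx = w + 1, dy/dw = x
y.backward()
print(x.grad, w.grad)    # 4.0 2.0
```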
@elievanazokom4809 · 3 months ago
All your videos = PyTorch
@gptLearningHub · 3 months ago
After learning topics like gradient descent, linear regression, and feedforward neural networks, I think learning PyTorch is incredibly beneficial!
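As a hedged example of how those topics come together once you pick up PyTorch, here's a small feedforward network trained with gradient descent on synthetic data; the architecture and hyperparameters are arbitrary choices for illustration.

```python
import torch
from torch import nn

# Synthetic noisy linear data (made up for illustration).
X = torch.rand(64, 1)
y = 3 * X + 1 + 0.05 * torch.randn(64, 1)

model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for step in range(500):
    opt.zero_grad()
    loss = loss_fn(model(X), y)   # mean squared error, as in linear regression
    loss.backward()               # gradients via autograd
    opt.step()                    # one gradient descent step

print(loss.item())                # should end up near the noise level
```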
@elievanazokom4809 · 3 months ago
You're confusing ML and deep learning.
@gptLearningHub · 3 months ago
Classical ML is definitely still critical in 2024. However, I believe beginners can get started with deep learning before classical ML, as long as they review the required math and master topics like gradient descent, linear regression, etc. They can come back to classical ML topics like SVMs and k-means later on.