NOTE: Gradient Boost uses regression trees, which are explained in this StatQuest: kzbin.info/www/bejne/nWrGZ2mKit6fkJY
Corrections:
4:27 The sum on the left-hand side should be in parentheses to make it clear that the entire sum is multiplied by 1/2, not just the first term.
15:47 It should be R_jm, not R_ij.
16:18 The leaf in the script is R_1,2 and it should be R_2,1.
21:08 With regression trees, the sample will only go to a single leaf, and this summation simply isolates the one output value of interest from all of the others. However, when I first made this video I was thinking that, because Gradient Boost is supposed to work with any "weak learner", not just small regression trees, this summation was a way to add flexibility to the algorithm.
24:15 The header for the residual column should be r_i,2.
Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
@giuseppefasanella54465 жыл бұрын
Hi, the video is great and gives a detailed insight into the algorithm, so thanks for your work. I have a note on 15:47. I think the way the output gamma is defined has the wrong indices in the summation. To my understanding, for gamma_jm you don't want to sum over R_ij but over all the x_i which belong to R_jm, the same terminal region. Otherwise, if you sum over x_i belonging to R_ij, you are jumping from one terminal region to another, while you want R_jm to be fixed and just pick up the different x_i in there. Hope I managed to explain myself. Cheers.
@statquest5 жыл бұрын
@@giuseppefasanella5446 You are correct! That's another typo. One day, when StatQuest is making the Big Bucks, I'm going to hire an editor. That's the dream! :)
@giuseppefasanella54465 жыл бұрын
@@statquest It's a beautiful dream! If you want, from time to time, depending on my working constraints, I could do it for free. You can contact me in private if you want. Cheers!
@statquest5 жыл бұрын
@@giuseppefasanella5446 That would be awesome. I have one on XGBoost math coming up in mid-January. Contact me through my website and I'll send it to you in advance. statquest.org/contact/
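To make the corrected index concrete: with squared-error loss, the output value for each leaf works out to the average of the pseudo-residuals of the samples that fall in that one terminal region R_jm. A minimal sketch (the residual values and region assignments below are made up for illustration):

```python
# Hedged sketch: for squared-error loss L(y, F) = (1/2)(y - F)^2, the leaf
# output gamma_{j,m} minimizes the loss summed over only the samples x_i
# that landed in terminal region R_{j,m}, which works out to the mean of
# their pseudo-residuals. All numbers here are made up for illustration.

def leaf_output(residuals, region):
    """gamma_{j,m}: the average pseudo-residual of the samples in one leaf."""
    members = [residuals[i] for i in region]
    return sum(members) / len(members)

residuals = [16.8, 14.8, -15.2, 1.8, -18.2]  # hypothetical pseudo-residuals r_{i,m}
region_1 = [2, 4]      # indices of the samples that fell in leaf R_{1,m}
region_2 = [0, 1, 3]   # indices of the samples that fell in leaf R_{2,m}

print(leaf_output(residuals, region_1))  # ≈ -16.7
print(leaf_output(residuals, region_2))  # ≈ 11.13
```

Summing over x_i in R_jm keeps the region fixed, exactly as described above; each leaf only ever sees its own samples.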
@cosworthpower51473 жыл бұрын
@@statquest Hi, I still wonder about the similarity between gradient descent and gradient boosting when trees are involved. Apparently there is no partial derivative with respect to a parameter in gradient boosting, simply because a decision tree has no internal model parameters, in contrast to a regression model, where it is obvious that the betas have to be iteratively tweaked in order to lower the applied loss function. It would be great if you could help me out there :)
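One way to see the connection the question above asks about (a sketch, not a definitive answer): the gradient is taken with respect to the predicted values F(x_i) themselves, treated as the "parameters", not with respect to anything inside the tree. For squared-error loss, the negative gradient is just the residual, which is what the next tree is fit to. The numbers below are made up:

```python
# For L(y, F) = (1/2)(y - F)^2, differentiating with respect to the
# prediction F (not a tree parameter) gives dL/dF = -(y - F), so the
# negative gradient -dL/dF = y - F is exactly the pseudo-residual.

def negative_gradient(y, f):
    """-dL/dF for squared-error loss: the pseudo-residual r_{i,m}."""
    return y - f

observed = [88.0, 76.0, 56.0]    # hypothetical targets y_i
predicted = [71.2, 71.2, 71.2]   # some current prediction F_{m-1}(x_i)

pseudo_residuals = [negative_gradient(y, f) for y, f in zip(observed, predicted)]
print(pseudo_residuals)  # ≈ [16.8, 4.8, -15.2]
```

Each boosting round then takes a small step (scaled by the learning rate) in this negative-gradient direction, which is what makes it "gradient" boosting even though the trees themselves have no tweakable betas.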
@romans44363 жыл бұрын
You have what many others lack: clarity and simplicity. The visualization is very good. Thank you!
@statquest3 жыл бұрын
Wow, thank you!
@ulrichwake16565 жыл бұрын
They said "Give a Man a Fish, and You Feed Him for a Day. Teach a Man To Fish, and You Feed Him for a Lifetime." Thank you very much for your video. I really like when you try to explain the algorithm and the math notation. I hope you keep doing that. :)
@statquest5 жыл бұрын
Thank you! Yes, I plan on doing more algorithms for machine learning.
@daniyalahmed44404 жыл бұрын
@@statquest Thanks a lot for these videos, these are simply amazing and super helpful.
@magus32673 жыл бұрын
Looks familiar
@marcellusorlando34143 жыл бұрын
I realize it's kind of random to ask, but does anyone know of a good website to stream new series online?
@arlodamian45653 жыл бұрын
@Marcellus Orlando flixportal :D
@gunnvant5 жыл бұрын
The visual description where you are adding consecutive models is the best summary of the gradient boosting description that I have seen so far.
@statquest5 жыл бұрын
Thank you very much! :)
@HuyLe-nn5ft2 жыл бұрын
This explanation cannot be found anywhere else. You won't ever know how thankful i am, dude. Keep up the good work!
@statquest2 жыл бұрын
Thank you!
@madatbrahma43895 жыл бұрын
Josh, you are the best . Master in simplifying complex topics .
@statquest5 жыл бұрын
Thank you very much! :)
@adityanjsg994 жыл бұрын
@@statquest I know a madat brahma from Bangalore who runs a food business.! You that Brahma?
@jasonfaustino88153 жыл бұрын
Timestamps!!
6:30 - Step 1 - Initialize model with a constant value. Comes out to be the average of the target values. Cool math trick
9:10 - Step 2.0 - Set M for the number of iterations
10:02 - Step 2.A - Compute residuals
12:47 - Step 2.B - Fit a regression tree
14:40 - Step 2.C - Calculate output values (I recommend jotting down notes, as a lot is happening in this step)
20:39 - Step 2.D - Make predictions; if m == M, proceed to Step 3, else repeat Step 2
Step 3 - Output F_M(x)
Thanks Josh!! Really smoothed out my knowledge of Gradient Boosting methods.
@statquest3 жыл бұрын
Awesome!!!
@thomashirtz3 жыл бұрын
@@statquest If you put it in the description youtube will create chapters for you :)
@statquest3 жыл бұрын
@@thomashirtz Great idea! BAM!
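The time-stamped steps above can be sketched end to end in a few lines. This is a hedged toy implementation, not the video's exact procedure: it uses squared-error loss and a one-split stump as the weak learner, and the data, learning rate, and M are all made up for illustration:

```python
def fit_stump(x, residuals):
    """Steps 2.B/2.C: fit a one-split tree to the residuals; each leaf
    outputs the mean residual of the samples that fall in it."""
    best = None
    for threshold in x:
        left = [r for xi, r in zip(x, residuals) if xi <= threshold]
        right = [r for xi, r in zip(x, residuals) if xi > threshold]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, threshold, lmean, rmean)
    _, threshold, lmean, rmean = best
    return lambda xi: lmean if xi <= threshold else rmean

def gradient_boost(x, y, M=10, learning_rate=0.1):
    f0 = sum(y) / len(y)                    # Step 1: F_0 = average of targets
    preds = [f0] * len(y)
    trees = []
    for m in range(M):                      # Step 2: for m = 1 .. M
        residuals = [yi - p for yi, p in zip(y, preds)]      # Step 2.A
        tree = fit_stump(x, residuals)      # Steps 2.B and 2.C
        trees.append(tree)
        preds = [p + learning_rate * tree(xi)                # Step 2.D
                 for p, xi in zip(preds, x)]
    return f0, trees

x = [1.6, 1.6, 1.5, 1.8, 1.5, 1.4]          # made-up predictor (e.g. height)
y = [88.0, 76.0, 56.0, 73.0, 77.0, 57.0]    # made-up targets (e.g. weight)
f0, trees = gradient_boost(x, y, M=50)

# Step 3: F_M(x) = F_0 + learning_rate * (sum of all tree outputs)
predict = lambda xi: f0 + 0.1 * sum(t(xi) for t in trees)
```

With the learning rate at 0.1, each round only moves the predictions a small step toward the residuals, so the training error shrinks gradually over the M rounds rather than in one jump.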
@soumendas5922 жыл бұрын
You are the best, when every shortcut to understanding ML algorithm fails, you come at last as our savior with all the necessary details.
@statquest2 жыл бұрын
Thank you!
@StackhouseBK4 ай бұрын
The content of this channel is what makes internet great
@statquest4 ай бұрын
Thanks!
@lokeshmadasu41465 жыл бұрын
You are one of the best teachers I have ever seen; the visualization gives me a clear understanding of the concept and the math behind it. Every time, I wish the video had been a few minutes longer.
@statquest5 жыл бұрын
Thank you very much! :)
@adarshtiwari7395 Жыл бұрын
This channel is a blessing to prospective machine learning engineers. I am tired after the entire video, but a sense of pride in my efforts and a sense of gratitude towards you, Joshua, made this ride worthwhile!
@statquest Жыл бұрын
Awesome! :)
@pranavraj30245 жыл бұрын
This is the best explanation for GB regression that i have ever seen/read. Thank you so much explaining it in such simple terms!
@statquest5 жыл бұрын
Thank you very much!
@Shubhamkumar-ng1pm4 жыл бұрын
I have no words for Josh Starmer. Teachers like him deserve a special place in heaven. Thank you Josh.
@statquest4 жыл бұрын
Thank you! :)
@flavialan45443 жыл бұрын
@@statquest he really does
@statquest3 жыл бұрын
@@flavialan4544 Thanks!
@meysamamini94733 жыл бұрын
100 % agreeed
@varun05055 жыл бұрын
There are blogs explaining the gradient boosting on a dataset, there are blogs explaining the maths. I was facing difficulty in connecting those two. Hands down! Best video I came across in a long time. Thanks a lot. Please keep up the great work.
@statquest5 жыл бұрын
Thank you! :)
@Sorararawr2 жыл бұрын
Probably the best explanation of this complex statistical method I have ever found in the entire semester. Thank you for all your hard work sir!!!
@statquest2 жыл бұрын
Wow, thank you!
@tomaszbaczkun85727 күн бұрын
I greatly appreciate the effort you put into all the animations. Despite how tedious it must be to create them, they do immensely help to clarify the concepts. Thank you so much!
@statquest6 күн бұрын
Glad you enjoy it!
@hubert1990s5 жыл бұрын
it's unbelievable how well you explain it all. following this, I can even imagine spending a Friday evening learning ML :)
@statquest5 жыл бұрын
Wow! That's quite a compliment. :)
@heitornunes62253 жыл бұрын
I'm literally doing this right now hahah
@gunjantoora8633 жыл бұрын
Can't thank god (and you) enough for these videos. All those textbook chapters with just formulas and notations were driving me crazy. YOUR VIDEOS ARE AMAZING!!!!
@statquest3 жыл бұрын
bam! :)
@aimenslamat126410 ай бұрын
from Algeria, u are the best.. none can explain ML like you Master
@statquest10 ай бұрын
Thank you!
@dungnintengtung84179 ай бұрын
bro this is the best explanation on YouTube. I love u man. You explain everything and make complex things so simple with simple word choice
@statquest9 ай бұрын
Thank you!
@matthewmiller36535 жыл бұрын
Absolutely fantastic. I graduated college "on the verge" of higher math knowledge, but never quite put in the work for the courses. I've now jumped into ML research, but have found notation to consistently be the hold-up in a lot of my understanding, despite that the equations often express intuitive concepts. Being able to "translate" as you've done with this video connects many dots in a world that's often unnecessarily thought of as sink or swim. Awesome!
@statquest5 жыл бұрын
I’m so glad to read that you like this video. I want to make more like it, where we just go through complicated sounding, and looking, algorithms step-by-step and show that they are simple things in the end.
@navyasailu184 жыл бұрын
@@statquest Hence the world needs you
@kaicheng97662 жыл бұрын
I don't think I have ever enjoyed this much for a math-intensive video. You are Godsend!
@statquest2 жыл бұрын
Wow, thank you!
@S2ReviewsS23 жыл бұрын
You are a Gem Josh, with so many new and old comments, you have replied to almost all of them. Can't believe such a great person and teacher actually exists. :)
@statquest3 жыл бұрын
Thank you very much! :)
@fgfanta5 жыл бұрын
First explanation of all the GB details I find on-line which is actually easier than reading the original paper. Thanks!
@statquest5 жыл бұрын
Hooray! That was my goal. :)
@davidcho88773 жыл бұрын
I am studying all the videos in the Machine Learning playlist to prepare for my interviews. These videos are all awesome, but this one is especially awesome. I majored in Statistics and occasionally study papers to catch up on some recent ML skills. I always had a hard time understanding the steps of algorithms even though I also minored in Mathematics. I have never seen a professor who can teach the steps of an algorithm this easily and clearly. Thank you Josh for this amazing video. I would really appreciate it if you could make more videos about the fundamental details of ML techniques (and, if you have time, some interesting papers too)! From: Biggest fan of StatQuest
@statquest3 жыл бұрын
Wow!!! Thank you very much! :)
@angels80503 жыл бұрын
Best simplified and visual explanations I haver ever seen on algorithms. I am definitely recommending your channel to anyone who is getting started on ML or that needs some refreshing. Keep on with the awesome work!
@statquest3 жыл бұрын
Wow, thanks!
@jokmenen_2 жыл бұрын
I keep getting amazed by how good your videos are! You are truly a blessing
@statquest2 жыл бұрын
Thank you! :)
@samerrkhann4 жыл бұрын
Holy Smoke! I literally had to take small pauses to double-check if I am really living in reality. My God, how easily he explained all those intimidating math equations and notations. A BIG THANK YOU JOSH!!
@statquest4 жыл бұрын
Hooray! I'm glad the video was helpful.
@sameershah1414 жыл бұрын
There can not be a better and simpler explanation. Kudos for the efforts put in to make the presentation and the video.. (y)
@statquest4 жыл бұрын
Thanks a lot!
@thilinikalpana72064 жыл бұрын
This is awesome, the best I've seen so far that simplifies all the complex algorithms and math. Good job and keep doing more videos like this to simplify complex problems.
@statquest4 жыл бұрын
Thank you very much! :)
@heyim38545 жыл бұрын
Thank you So much for your video. You are the 'Mozart' of the ML. Simple but infinitely subtle! 😊
@shangauri3 жыл бұрын
If the intention is to clearly explain a complex topic, then start with an example and then get into the equations step by step. Most academicians make the mistake of scaring people by showing the equations at the start itself. You are doing this perfectly Josh. Many thanks.
@statquest3 жыл бұрын
Thank you! :)
@charlesstrickland88395 жыл бұрын
I like Josh's videos before even watching them. I've watched a bunch of Josh's videos; all of them are really helpful and easy to understand. Thx a lot!
@statquest5 жыл бұрын
Thanks! :)
@saurabhkale44954 жыл бұрын
best explanation available for gradient boost on the PLANET!!!!!!
@statquest4 жыл бұрын
Thank you very much! :)
@Joshua-xz3gm5 жыл бұрын
at 16:05, shouldn't R_i,j under the sum be R_j,m ? Additionally, I don't really understand how one sample could end up in multiple leaves as stated at 21:20.
@sandeepchamunni92644 жыл бұрын
Right. And how can x_i be an element of R_j,m? r_i,m can be an element of R_j,m, but not x_i.
@joacosanh4 жыл бұрын
Same doubt here. How is it possible that a sample ends up in multiple leaves, for the tree at iteration m?
@nandlalmishra44354 жыл бұрын
One sample does not end up in multiple leaves, but a leaf may have multiple samples, and those will be averaged.
@Arbmosal3 жыл бұрын
@@nandlalmishra4435 But that is already taken care of in the definition of the gamma_jm, is it not?
@ravindrapotar96333 жыл бұрын
I just checked Wikipedia; it is R_j,m only. I think the indicator I's value is 1 for the leaf where x is present and 0 for the other leaves.
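A small sketch of the point this thread settles (regions, leaf outputs, and the learning rate nu below are all made up): since a sample falls in exactly one terminal region, the indicator I(x in R_jm) is 1 for one leaf and 0 for the rest, so the sum collapses to a single leaf output:

```python
# Update rule under discussion:
#   F_m(x) = F_{m-1}(x) + nu * sum_j gamma_{j,m} * I(x in R_{j,m})
# Regions here are half-open intervals on a single predictor; all values
# are hypothetical, chosen only to illustrate the indicator.

regions = {1: (0.0, 1.55), 2: (1.55, 2.0)}   # terminal regions R_{j,m}
gammas = {1: -14.2, 2: 9.9}                  # leaf outputs gamma_{j,m}

def indicator(x, region):
    lo, hi = region
    return 1 if lo <= x < hi else 0

def update(f_prev, x, nu=0.1):
    # Only one indicator is 1, so the sum isolates one leaf's output.
    return f_prev + nu * sum(gammas[j] * indicator(x, regions[j]) for j in regions)

print(update(71.2, 1.6))  # ≈ 72.19 (only region 2 contributes: 71.2 + 0.1 * 9.9)
```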
@k.y82742 жыл бұрын
This YouTube channel is damn amazing. Can't find any other videos with that kind of clear explanation around the globe.
@statquest2 жыл бұрын
Thanks! :)
@nguyendavid63965 жыл бұрын
"The chainnnn ruleeeee" LOL
@phungtruong66984 жыл бұрын
haha "The chainnnn rulleeeee " :v :v
@himanshutalegaonkar25224 жыл бұрын
By far the best video i've seen across all the platforms for machine learning !! I haven't come across anyone who goes to this extent into explaining the complicated maths behind such algorithms !! Please do more of such mathematical breakdown for famous research papers in ML and DL.
@statquest4 жыл бұрын
Wow, thanks!
@harshvardhanr50623 жыл бұрын
Legends say that Josh is so cool that he replies to comments even after 2 years
@statquest3 жыл бұрын
Bam
@anjulkumar91834 жыл бұрын
Never seen a better video tutorial than yours... I love you man... a lot of respect for you... you really are doing a great job... I am going to recommend that everyone watch your videos, and I hope you keep helping in the form of these videos to teach ML in the most fascinating and beautiful way...
@statquest4 жыл бұрын
Thank you very much!!!! I'm glad you liked the StatQuest! :)
@2050techgeek5 жыл бұрын
Excellent video! You are the best! Can you please make one on how XGBoost achieves superior performance ?
@7pri25 жыл бұрын
Yes, please !
@MugiwaraSuponji Жыл бұрын
man the way you sound like a preschool teacher is making me emotional, you really made the first trauma-free math class 👍🏻👍🏻👍🏻👍🏻👍🏻
@statquest Жыл бұрын
BAM! :)
@LiviaOhana75 жыл бұрын
Thank you for making me laugh watching this mind blowing algorithm (I was reading the article and I was hopeless)
@statquest5 жыл бұрын
Hooray! :)
@abhijeetmhatre97543 жыл бұрын
I have become a fan of yours after going through your first ML video. I haven't seen anyone explain topics better than you. You explain any complex topic such that, after watching, the viewer sees it as a simple topic. I started learning ML and deep learning 6 months ago, and I am learning a lot from your videos; they have given me a lot of boost and confidence to learn more. I saw multiple study materials explaining gradient boosting, but it's only your video that helped me fully understand it in a single go. A very big thank you, sir, for such a wonderful video course on ML.
@statquest3 жыл бұрын
Thank you! I'm glad my videos are helpful! :)
@yiqiwang45065 жыл бұрын
Anyone watching his amazing videos on accelerated playback speed? Like 1.75?
@samtine48125 жыл бұрын
1.5 here !!
@InsightsWithAkshay5 жыл бұрын
2.0 here !
@robertomontalti3064 Жыл бұрын
Insane content and very well explained! I really appreciated your correction in the description for 21:08: "With regression trees, the sample will only go to a single leaf, and this summation simply isolates the one output value of interest from all of the others. However, when I first made this video I was thinking that because Gradient Boost is supposed to work with any 'weak learner', not just small regression trees, that this summation was a way to add flexibility to the algorithm." Thank you!
@statquest Жыл бұрын
Glad it was helpful!
@jimip6c124 жыл бұрын
When I am ready to yell "the chain rule" together with Josh Josh: ....the chain rule....
@statquest4 жыл бұрын
BAM! :)
@АлександраРыбинская-п3л Жыл бұрын
Special thanks for the correction at 21:08. I was thinking about it and was preparing to ask how it was possible that one sample ended up in multiple leaves. Now there is no need to ask :)
@statquest Жыл бұрын
bam!
@youngcheong2121Ай бұрын
It was hard for me to get the idea behind GB. But the combination of the LR concept and the visualization of GB together made it very easy to understand the idea clearly. I deeply appreciate your efforts!
@statquestАй бұрын
Thanks!
@rickandelon93745 жыл бұрын
Holy I finished this and actually understood everything you tried to make me understand!! The best man on youtube! Deeply grateful, Thanks a lot!!
@rickandelon93745 жыл бұрын
It was like a Quest in a beautiful puzzling game, just what the name 'StatQuest' implies!
@statquest5 жыл бұрын
Awesome! This a hard video to get through, so congratulations!!!
@pyarepiyush5 жыл бұрын
You're making math interesting for me. I have a love-hate relationship with math, but because of the work I do (data scientist), I have to keep coming back to the math behind the algorithms. Your videos are a joy to watch... please continue to make these awesome videos
@statquest5 жыл бұрын
Hooray! I'm glad you find my videos useful. :)
@milay65275 жыл бұрын
I can't believe how clearly this guy explains everything
@statquest5 жыл бұрын
Thank you very much!!! :)
@debabrotbhuyan48124 жыл бұрын
Thank you so much for this video Josh. I never thought Boosting algorithms could be explained so clearly. Wish I had known about your channel one year back.
@statquest4 жыл бұрын
Thanks! :)
@carazhang74163 жыл бұрын
I wish the lecturers at uni were half as good as you. This is just treasure.
@statquest3 жыл бұрын
Thanks!
@ineedtodothingsandstuff90224 жыл бұрын
I have never seen a clearer explanation (literally). Thank you so much!
@statquest4 жыл бұрын
Great to hear!
@markaitkin5 жыл бұрын
easily the best video on youtube, can't wait for part 3 and 4.
@statquest5 жыл бұрын
Thank you!
@viswanathpotladurthy33834 жыл бұрын
WOW!!! How can it be so simple. I understand you take a lot of time to make it simple. Thanks on behalf of the learning community!!
@statquest4 жыл бұрын
Thank you very much! :)
@manojtaleka95411 ай бұрын
The best video tutorial for Gradient Boosting. Thank you very much.
@statquest11 ай бұрын
Thanks!
@edkaprost36232 жыл бұрын
After watching some of your videos, I understand why your material is so simple to understand compared with other sources. Most of them just give the theory without examples; you show an example and then the theory (use of induction). I hope the next generation of statistics lecturers will use your videos as the state of the art in the teaching field
@statquest2 жыл бұрын
Thank you! :)
@koshrai7080 Жыл бұрын
It took some time, but I think I was able to figure out how (or why) this works: we basically just make a base prediction, then compute a step (the pseudo-residual) in the direction of the actual value. Then we model these steps with a decision tree and use that model to slowly improve upon our previous prediction, over and over. Great video. Very intuitive.
@statquest Жыл бұрын
bam!
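The intuition in the comment above can be checked with a toy loop (numbers made up, and assuming a weak learner that could output each pseudo-residual exactly): repeatedly adding learning_rate × residual walks a prediction toward its target, which is just gradient descent on the prediction itself.

```python
# Toy check: each boosting round steps the prediction a fraction of the way
# toward the target, shrinking the residual by a factor of (1 - lr) per round.

target, pred, lr = 88.0, 71.2, 0.1
for _ in range(50):
    pred += lr * (target - pred)   # step in the direction of the residual
print(round(pred, 2))  # ≈ 87.91, nearly the target
```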
@nsp75372 жыл бұрын
excellent to see someone making a video of both the concepts, followed by the math concepts. Will subscribe for more of those
@statquest2 жыл бұрын
Thanks!
@lenkahasova94284 жыл бұрын
I love the way you present this, it's exactly what my brain needs!
@statquest4 жыл бұрын
Hooray! :)
@enicay75623 ай бұрын
Clear and easy. When I saw the algorithm for the first time, I was like: WHAT THE HECK IS THAT. Now I'm like: that's all? Thank you so much!
@statquest3 ай бұрын
bam! :)
@abhasupadhayay64204 жыл бұрын
Just started watching your videos and I am extremely glad I found you. The explanation is simply as detailed as it can get. Sometimes I wonder if you are overfitting our minds, lol..Thanks a lot
@statquest4 жыл бұрын
Bam! :)
@SteveCamilleri5 жыл бұрын
Finally, a mathematical explanation that can be understood! Than You
@statquest5 жыл бұрын
Thanks! :)
@SourabhSomvanshi4 жыл бұрын
You, Sir, are just awesome!!! Saying awesome is an understatement. You make learning fun and interesting. I found these topics so difficult to understand from other sources; you make them so simple. Many people know these things, but it's really an art to teach these topics with so much ease. Take a bow!!! A big fan of yours. Hope to see more such videos in the times to come :) BAM!!!
@statquest4 жыл бұрын
Wow, thanks!
@AdityaSingh-yp9jn8 ай бұрын
Best BEST BESTESTTTTT Lecture I have ever seen and heard. Literally, this is so engaging and maths seems so funny. I am from maths background and really loved the way of explanation. Bro HATS-OFF. Please continue making such content. Especially the core maths concept and its intuition are really missing now-a-days from a lot of explanations. KEEP it UP Man! Press 'F'
@statquest8 ай бұрын
Wow, thank you!
@zhenli19654 жыл бұрын
This is the best explanation that I have ever seen. Thank you so much, Josh!
@statquest4 жыл бұрын
Thanks! :)
@sharanchhibbar70473 жыл бұрын
Hats off to your way of teaching. Wish you the best!
@statquest3 жыл бұрын
Thank you! :)
@honza8939 Жыл бұрын
In schools that teach data science and other statistics, I would play your videos. Because I don't know a teacher who can explain it that simply.
@statquest Жыл бұрын
Thank you very much! :)
@kalpaashhar65224 жыл бұрын
Beautifully simple explanation for a complicated algorithm ! Thank you!
@statquest4 жыл бұрын
Thank you very much! :)
@taochen7462 жыл бұрын
Really appreciated your hard work, this is the best videos for stats and machine learning ever!
@statquest2 жыл бұрын
Glad you think so!
@silentsuicide45442 жыл бұрын
I love this, thank you! I find learning algorithms through math the best way to understand them, but sometimes the math behind them looks awful even though the idea and calculations are simple, and this is what I needed, to be honest. The same goes for other algorithms; I can take a "math recipe" and go through it with your explanation in the background, like I did with AdaBoost. Thank you!
@statquest2 жыл бұрын
bam! :)
@justfoundit5 жыл бұрын
Thanks for clarifying the tree-building logic for me. Using a simple regression tree looked illogical to me, but using it on the gradient AND providing values for the leaves based on the actual loss function: now it makes sense :)
@statquest5 жыл бұрын
Awesome! :)
@aniketdatir26334 жыл бұрын
Wonderful video Josh......very clearly explained !!!! I appreciate it...Please keep posting such lectures. Thanks
@statquest4 жыл бұрын
Thank you! :)
@pratibhasingh89194 жыл бұрын
Great work! The way you explained was outstanding. It can be easily understood by a layman.
@statquest4 жыл бұрын
Thank you! :)
@trisa_halder10 ай бұрын
i'm so glad i found this channel, thankyou so much!
@statquest10 ай бұрын
Glad you enjoy it!
@musasall57404 жыл бұрын
Best explanation on Gradient boosting!
@statquest3 жыл бұрын
Wow, thanks!
@emirhankartal12305 жыл бұрын
that's the best explanation that I've seen so far...
@statquest5 жыл бұрын
Thank you! :)
@veronikaberezhnaia2483 жыл бұрын
thank you for (much!) clearer explanations than my professors in the ML faculty give
@statquest3 жыл бұрын
Glad I can help! :)
@RaviShankar-jm1qw4 жыл бұрын
Words evade me while praising Josh !!!
@statquest4 жыл бұрын
Thank you! :)
@marryrram Жыл бұрын
Excellent way of explaining each and every step. Thank you very much
@statquest Жыл бұрын
Thank you!
@trillerperviu27525 жыл бұрын
Bro, I am from Russia and I barely understand English, but I understood all the stuff in this video, enjoyed it, and you made me laugh a few times. I think I would even understand the math of quantum physics if you explained it. YOU ARE THE BEST, THANK YOU!!!
@statquest5 жыл бұрын
Awesome! Thank you so much!
@jaivratsingh99665 жыл бұрын
I wonder why someone would dislike this video. This is great stuff!
@statquest5 жыл бұрын
Thank you! I often wonder the same thing. What's not to like? I'm not sure.
@SimoneIovane3 жыл бұрын
Really really good tutorials. I always watch them when I feel I want to revise some concepts. Thanks!
@statquest3 жыл бұрын
BAM! :)
@SimoneIovane3 жыл бұрын
@@statquest you mean... Triple Bam 💣
@statquest3 жыл бұрын
@@SimoneIovane YES!
@Misha-yh9wd5 жыл бұрын
Great work! The best explanation in the internet!!!
@jaikishank4 жыл бұрын
It was an awesome explanation, down to the granular level. Kudos for your great effort...
@statquest4 жыл бұрын
Thanks a ton!
@devran41692 жыл бұрын
statquest > my university machine learning courses TRIPLE BAMM!!
@statquest2 жыл бұрын
Thanks!
@meysamamini94733 жыл бұрын
U ARE THE BEST TEACHER EVER!
@statquest3 жыл бұрын
Thank you! :)
@aracelial91883 жыл бұрын
You are a really good teacher, thanks a lot for your videos!!!
@statquest3 жыл бұрын
Thank you! 😃
@luattran53184 жыл бұрын
Much appreciated for your thorough and detailed explanation, wish u all the best!
@statquest4 жыл бұрын
Thank you very much! :)
@elnurazhalieva12624 жыл бұрын
I do appreciate the time and effort you spent making this awesome StatQuest. I wish my college professors were as good as you :). Thanks!
@statquest4 жыл бұрын
Thank you very much! :)
@teelee35433 жыл бұрын
your college professor will never be as good as Josh in terms of machine learning teaching skills
@deepranjan34742 жыл бұрын
best explanation till now for me.
@statquest2 жыл бұрын
Thank you!
@mathematicalmusings4293 жыл бұрын
this is amazing, you are a gifted teacher Josh.
@statquest3 жыл бұрын
Thank you! :)
@15Nero922 жыл бұрын
I was struggling with this, and you are helping me a lot. Thank you so much!
@statquest2 жыл бұрын
Happy to help!
@sandeepm6255 жыл бұрын
awesome. connecting the reasoning with math = applied math. thanks for the education.
@statquest5 жыл бұрын
I'm glad you're enjoying StatQuest! :)
@NA-rq5dw5 жыл бұрын
Great video! I found the explanation of the mathematical notation to be very helpful and would love to see more examples for other machine learning concepts. Thanks
@statquest5 жыл бұрын
I'm glad to hear you appreciated the attention to the mathematical notation. I'll try to do more videos like this.
@ИванКравцов-в4б5э3 жыл бұрын
This is very great to explain the math like you do! It is awesome! Thank you!
@statquest3 жыл бұрын
Glad it was helpful!
@ОсинцевМихаил3 ай бұрын
The best explanation I've ever heard, thanks so much!
@statquest3 ай бұрын
Thank you!
@bevansmith32105 жыл бұрын
Thank you so much Josh, I was going through these algorithms in Elements etc. and it was so difficult to figure out. Awesome explanation!