Let's get mathematical. SVM Intuition Video: • Support Vector Machine...
Comments: 186
@stanlukash33 · 3 years ago
This guy is underrated for real. YouTube - throw him into recommendations.
@jmspiers · 2 years ago
I know... I recommend him all the time on Reddit.
@backstroke0810 · 2 years ago
True! He deserves way more subscribers. He should prepare a booklet like StatQuest did, but of his own. Would definitely buy it!
@aravind_selvam · 2 years ago
True!!
@supersql8406 · 3 years ago
This guy is super smart, and he takes sophisticated concepts and explains them in a way that's digestible without mocking the theory! What a great teacher!
@ragyakaul6027 · 2 years ago
I can't explain how grateful I am for your channel! I am doing an introductory machine learning course at uni, and it's extremely challenging, as it's full of complex concepts and the basics aren't explored thoroughly. Many videos I came across on YouTube were too oversimplified and only briefly helped me make sense of my course. However, your videos offer the perfect balance: you explore the complex maths and don't oversimplify it, but do so in a way that's easy to understand. I read through this concept several times before watching your video, but only now do I feel as if I TRULY understand it. I HIGHLY appreciate the work you do and look forward to supporting your channel.
@maged4087 · 2 years ago
same
@velevki · 2 years ago
You answered all the questions I had in mind without me even having to ask. This was an amazing walkthrough. Thank you!
@pavelrozsypal8956 · 2 years ago
Another great video on SVM. As a mathematician, I appreciate your succinct yet accurate exposition that doesn't play around with irrelevant details.
@FPrimeHD1618 · 1 year ago
Just to add onto all the love, I'm a data scientist in marketing and you are my number one channel for reviewing concepts. You are a very talented individual!
@srivatsa1193 · 3 years ago
This is the best and most intuitive explanation of SVM. It is really hard for me to read research papers and understand the story each line of an equation is telling, but you made it so intuitive. Thanks a ton! Please, please make more videos like this.
@shusrutorishik8159 · 3 years ago
This has been simultaneously the simplest, most detailed and yet most concise explanation of this topic I've come across so far. Much appreciated! I hope you keep making awesome content!
@ritvikmath · 3 years ago
Glad it was helpful!
@vedantpuranik8619 · 2 years ago
This is the best and most comprehensible math video on hard-margin SVM I have seen to date!
@KARINEMOOSE · 2 years ago
I'm a PhD student studying data mining, and I just wanted to commend you for this SUPERB explanation. I can't thank you enough for explaining this so clearly. Keep up the excellent work!!
@nikkatalnikov · 3 years ago
Great video as usual! A possible side note: I find a 3D picture even more intuitive. Add a z-direction, which can basically be shrunk to [-1, 1], as the class-prediction dimension, with x1 and x2 as the feature dimensions. The margin hyperplane then "sits" exactly on (x1, x2, 0). This is also helpful for further explaining what SVM kernels are and why a kernel alters the norms (e.g. distances) between data points, but not the data points themselves.
@stephonhenry-rerrie3997 · 1 year ago
I think this might be a top-5 explanation of SVM mathematics of all time. Very well done.
@yangwang9688 · 3 years ago
Very easy to follow! Thanks for this wonderful video! Looking forward to the next one!
@nickmillican22 · 2 years ago
Question on the notation. The image shows that the vector between the central line and the decision line is w. So, I think, w is the length of the decision boundary. But then we go on to show that the length of the decision boundary is k = 1/||w||. So I'm not clear on what w (or k, for that matter) actually represents.
@WassupCarlton · 1 month ago
I too expected k to equal the length of that vector w :-/
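A note on the thread above: w is not a segment lying along the boundary; it is the normal vector of the hyperplane, and k = 1/||w|| is the perpendicular distance from the central line w·x - b = 0 to the margin line w·x - b = 1. A minimal NumPy check, with w, b, and x0 made up purely for illustration:

    import numpy as np

    w = np.array([3.0, 4.0])    # hypothetical normal vector, ||w|| = 5
    b = 2.0
    x0 = np.array([2.0, -1.0])  # w.x0 - b = 0, so x0 lies on the central line

    # distance from x0 to the margin line w.x - b = 1 is |w.x0 - b - 1| / ||w||
    k = abs(np.dot(w, x0) - b - 1) / np.linalg.norm(w)
    print(k, 1 / np.linalg.norm(w))  # both print 0.2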
@usmanabbas7 · 2 years ago
You and StatQuest are the perfect combination :) Thanks for all of your hard work.
@luchomame1 · 9 months ago
Dude, thank you! Now these equations don't feel like they were pulled out of thin air, and the best part is I can work them out too! I haven't done linear algebra in almost a decade, so I got stuck on the ||w||/(w·w) part for a good bit, but this pushed me to refresh some concepts and figure it out! Thank you!
@lisaxu1848 · 2 years ago
Studying for my master's in data science, and this is a brilliant, easy-to-understand explanation tying together graphical and mathematical concepts - thank you!
@polarbear986 · 2 years ago
I finally get SVM after watching a lot of tutorials on YouTube. Clever explanation. Thank you!
@clifftondouangdara6249 · 1 year ago
Thank you so much for this video! I am learning about SVM now and your tutorial perfectly breaks it down for me!
@Shaan11s · 2 months ago
Your videos are what allowed me to take a spring break vacation, bro, saved me so much time. Thank you!
@ritvikmath · 2 months ago
Great to hear!
@mindyquan3141 · 2 years ago
So simple, so clear!!! Wish all teachers were like this!
@honeyBadger582 · 3 years ago
That's what I've been waiting for! Thanks a lot. Great video!
@ritvikmath · 3 years ago
Glad it was helpful!
@nishanttailor4786 · 1 year ago
Just amazing clarity of topics!!
@gdivadnosdivad6185 · 6 months ago
I love your channel. You explain difficult concepts in a way that could be understood by my dear grandmother, who never went to college. Excellent job, sir! You should become a professor one day. You would be good.
@lakhanpal1987 · 1 year ago
Great video on SVM. Simple to understand.
@aashishkolhar8155 · 3 years ago
Great, thanks for this lucid explanation about the math behind SVM
@jaibhambra · 2 years ago
Absolutely amazing channel! You're a great teacher
@zz-9463 · 3 years ago
Very informative and helpful video for understanding SVM! Thanks for such a great video! You deserve more subscribers.
@houyao2147 · 3 years ago
It's so easy to understand this math stuff! Best explanation ever in such a short video.
@TheWhyNotSeries · 3 years ago
At 5:10, I don't get how you obtain k from the last simplification. Can you/someone please explain? Btw, beautiful video!
@ritvikmath · 3 years ago
Thanks! I did indeed kind of skip a step. The missing step is that the dot product of a vector with itself is the square of its magnitude, i.e. w · w = ||w||^2.
@TheWhyNotSeries · 3 years ago
@ritvikmath Right, thank you!!
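For anyone else stuck on this step, a quick numeric check of the identity (the vector here is made up purely for illustration):

    import numpy as np

    w = np.array([3.0, 4.0])
    print(np.dot(w, w))                          # 25.0
    print(np.linalg.norm(w) ** 2)                # 25.0, so w.w = ||w||^2
    k = 1 / np.linalg.norm(w)
    print(k * np.dot(w, w) / np.linalg.norm(w))  # 1.0, i.e. k(w.w/||w||) = 1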
@badermuteb4552 · 3 years ago
Thank you so much. This is what I have been looking for for a long time. Would you please do the math behind other ML and DL algorithms?
@ht2239 · 3 years ago
You explained this topic really well and helped me a lot! Great work!
@emid6811 · 2 years ago
Such a clear explanation! Thank you!!!
@chimetone · 2 months ago
Best high-level explanation of SVMs out there, huge thanks
@ritvikmath · 2 months ago
Glad it was helpful!
@maheshsonawane8737 · 8 months ago
🌟Magnificent🌟 I actually understood this loss function by watching just once. Very nice explanation of the math. I saw a lot of other lectures, but you can't understand the math without graphical visualization.
@techienomadiso8970 · 1 year ago
This is seriously good stuff. I have not seen a better SVM explanation.
@ifyifemanima3972 · 1 year ago
Thank you for this video. Thanks for simplifying SVM.
@more-uv4nl · 1 month ago
This guy explained what my professors couldn't explain in 2 hours 😂😂😂
@user-ik5vu8rf9d · 2 months ago
Thanks man, great explanation. I was trying to understand the math for 2 days, finally got it.
@ritvikmath · 2 months ago
Glad it helped!
@dcodsp_ · 8 months ago
Thanks for such a brilliant explanation, really appreciate your work!!
@WassupCarlton · 1 month ago
This is giving "Jacked Kal Penn clearly explains spicy math" and I am HERE for it.
@sejmou · 8 months ago
In case you're also having trouble figuring out how we arrive at k = 1/||w|| from k * (w · w / ||w||) = 1: remember that the dot product of any vector with itself is equal to its squared magnitude, so w · w can also be written as ||w||^2. Then ||w||^2 / ||w|| simplifies to just ||w||. Finally, bring ||w|| to the other side by dividing the whole equation by ||w||, and you're done :) If you also have trouble understanding why the dot product of any vector with itself equals its squared magnitude, it helps to know that the magnitude of a vector is the square root of the sum of squares of its components, and that sqrt(x) * sqrt(x) = x. I hope that somehow makes sense if you're struggling; it surely took me a while to get it lol
@FootballIsLife00 · 5 months ago
I almost forgot this rule, thank you brother for saving my day.
@SreehariNarasipur · 1 year ago
Excellent explanation, Ritvik.
@himanshu1056 · 2 years ago
Best video on large margin classifiers 👍
@ziaurrahmanutube · 3 years ago
Amazing explanation, from the theoretical to the mathematical. Please tell me how you do it, so I can teach myself to understand and then explain these concepts (or others) the way you do. What resources do you use?
@BlueDopamine · 1 year ago
I am very happy that I found your YT channel. Awesome videos! I was unable to understand SVM until now!!!!
@learn5081 · 3 years ago
Very helpful! I always wanted to learn the math behind the model! Thanks!
@maurosobreira8695 · 2 years ago
Amazing teaching skills - thanks a lot!
@TheOilDoctor · 7 months ago
Great, concise explanation!
@mensahjacob3453 · 2 years ago
Thank you, Sir. You really simplified the concept. I have subscribed already and am waiting patiently for more videos 😊
@godse54 · 3 years ago
Please also make one for SVM regression... you are amazing.
@NiladriBhattacharjya · 1 year ago
Amazing explanation!
@AnDr3s0 · 3 years ago
Nice explanation and really easy to follow!
@Jayanth_mohan · 2 years ago
This really helped me learn the math of SVM, thanks!!
@asharnk · 9 months ago
What an amazing video, bro. Keep going.
@jingzhouzhao8609 · 20 days ago
Thank you for your genius explanation. At 5:11, before getting the value of k, the equation k * (w · w) / ||w|| = 1 contains w · w; why doesn't the final expression for k contain w?
@zarbose5247 · 1 year ago
Incredible video
@trishulcurtis1810 · 2 years ago
Great explanation!
@yashshah4172 · 3 years ago
Hey Ritvik, nice video! Can you please cover the kernelization part too?
@harshalingutla7318 · 2 years ago
Brilliant explanation!
@akashnayak6144 · 2 years ago
Loved it!
@Snaqex · 4 months ago
You're so unbelievably good at explaining :)
@superbatman1462 · 3 years ago
Easily explained 👍. Can you also explain how SVM works for regression problems?
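For the regression questions here and earlier in the comments: the video doesn't cover it, but as a rough sketch, support vector regression inverts the idea and fits a tube of width epsilon around the data, penalizing only points that fall outside the tube. scikit-learn's SVR is one standard implementation (the toy data here is made up):

    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(100, 1))
    y = np.sin(X).ravel() + 0.1 * rng.normal(size=100)

    # points inside the epsilon-tube incur no loss; points outside become support vectors
    reg = SVR(kernel="rbf", epsilon=0.1).fit(X, y)
    print(reg.predict([[0.5]]))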
@pedrocolangelo5844 · 10 months ago
Once again, ritvikmath being a lifesaver for me. If I understand the underlying math behind these concepts, it is because of him.
@borisshpilyuck3560 · 8 days ago
Great video! Why can we assume that the right-hand side of w·x - b for those three lines is 1, 0, and -1?
@SESHUNITR · 1 year ago
Very informative and intuitive.
@amanbagrecha · 3 years ago
What about the points within the margin? Are they support vectors as well?
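A short answer to the question above: in the hard-margin setup of this video, no training point is allowed strictly inside the margin, so the case doesn't arise. In the soft-margin extension, points inside the margin (and misclassified points) pick up nonzero slack and nonzero dual weight, so yes - there they are support vectors as well, alongside the points sitting exactly on the margin lines.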
@fengjeremy7878 · 2 years ago
Hi Ritvik! I wonder, what is the geometric intuition of the vector w? We want to minimize ||w||, but what does w look like on the graph?
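On the geometric intuition asked about above: w is the normal vector of the decision boundary - it points perpendicular to the line w·x - b = 0, toward the +1 side - and its length sets the margin through k = 1/||w||. A tiny orthogonality check (vectors made up for illustration):

    import numpy as np

    w = np.array([3.0, 4.0])
    d = np.array([4.0, -3.0])  # a direction along the line w.x = b
    print(np.dot(w, d))        # 0.0: w is perpendicular to the boundary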
@walfar5726 · 1 year ago
Very well explained, thank you!
@sorrefly · 2 years ago
I'm not sure, but I think you forgot to say that in order to have the margin lines at ±1 you need to scale w and b by a multiplicative constant. Otherwise I can't explain how we could have a value of exactly 1 away from the middle line. The rest of the video is awesome, thank you very much :)
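This point is right and worth spelling out: (w, b) and (c·w, c·b) give the same decision boundary for every c > 0, so the scale can always be chosen so that the closest points satisfy w·x - b = ±1. A sketch with made-up numbers:

    import numpy as np

    w, b = np.array([1.0, 2.0]), 0.5  # hypothetical classifier
    c = 3.0                           # any positive rescaling
    x = np.array([0.9, 1.1])          # arbitrary test point

    # predictions depend only on the sign, which rescaling preserves
    print(np.sign(np.dot(w, x) - b), np.sign(np.dot(c * w, x) - c * b))  # 1.0 1.0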
@bhuvaneshkumarsrivastava906 · 3 years ago
Eagerly waiting for your video on SVM Soft margin :D
@germinchan · 1 year ago
This is very clearly defined, thank you. But could someone explain to me what w is? How can I visualize it and calculate it?
@rndtnt · 1 year ago
Hi, how exactly did you choose 1 and -1 as the values of w·x - b where x is a support vector? w·x - b = 0 for x on the separating line makes sense, however. Could it have other values?
@sukritgarg3175 · 1 month ago
Holy shit what a banger of a video this is
@Reojoker · 3 years ago
Are SVMs only useful for binary classification, or can they be extended to multi-class predictions?
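They can be extended. The usual route is a reduction to several binary problems: one-vs-rest trains one SVM per class, one-vs-one trains one per pair of classes. A sketch with scikit-learn, whose SVC handles multi-class via one-vs-one internally (this example is not from the video):

    from sklearn.datasets import load_iris
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)     # 3 classes
    clf = SVC(kernel="linear").fit(X, y)  # one-vs-one under the hood
    print(clf.predict(X[:5]))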
@wildbear7877 · 10 months ago
You explained this topic perfectly! Amazing!
@ritvikmath · 10 months ago
Glad you think so!
@kanishksoman7830 · 11 months ago
Hi Ritvik, you are a great teacher of stats, calculus, and ML/DL! I have one question regarding the equations. Why is the decision boundary equation w·x - b = 0? Shouldn't it be w·x + b = 0? I know the derivation and the procedure for finding the maximal margin aren't affected, but I don't understand the -b. Please let me know if the sign is inconsequential, and if it is, why. Thanks!
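On the sign question here (and in a couple of comments below): the two conventions are equivalent. Since b is a learned parameter, w·x - b = 0 and w·x + b' = 0 describe exactly the same family of hyperplanes with b' = -b; the minus sign just makes b read as an offset subtracted from w·x. The sign is inconsequential to the derivation.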
@fengjeremy7878 · 2 years ago
Thank you! I am wondering why we use "+1 and -1" instead of "+1 and 0" to classify these two areas?
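One reason for the symmetric choice: with labels y_i in {-1, +1}, the two margin constraints w·x_i - b >= +1 (positive class) and w·x_i - b <= -1 (negative class) collapse into the single condition y_i(w·x_i - b) >= 1, which is the compact form used when setting up the optimization.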
@user-wr4yl7tx3w · 2 years ago
Wow, that was so well explained.
@suckockshititties2599 · 10 months ago
You are an amazing elucidator👍
@stephanecurrie1304 · 2 years ago
That was crystal clear!
@robfurlong8868 · 5 months ago
@ritvikmath Thanks for this great explanation. I have noticed other material online gives the equation of the hyperplane as w·x + b = 0 rather than w·x - b = 0. Can you confirm which is accurate?
@joyc5784 · 2 years ago
Other references use a plus sign, w·x + b = 0. Why was this changed to a minus sign in your example, w·x - b = 0 (or w·x - b >= 1)? Hope you can answer. Thanks!
@zeinramadan · 3 years ago
Great video as always. Thank you!
@ritvikmath · 3 years ago
Glad you enjoyed it!
@TheCsePower · 2 years ago
You should mention that your w is a normal vector of the hyperplane with arbitrary length (it is not the same size as the margin).
@kmishy · 1 year ago
How did you choose the equations of the two blue parallel lines? I mean, how did you get 1 for the upper line and -1 for the bottom line?
@user-bp5go3ds5t · 10 months ago
phenomenal
@ramankutty1245 · 3 years ago
Great explanation
@blackforest449 · 2 years ago
So good .. ThnQ
@salzshady8794 · 3 years ago
Could you do the math behind each machine learning algorithm? Also, will you be doing neural networks in the future?
@marthalanaveen · 3 years ago
Along with the assumptions of supervised and unsupervised ML algorithms that deal specifically with structured data.
@ritvikmath · 3 years ago
Yup neural nets are coming up
@jjabrahamzjjabrhamaz1568 · 3 years ago
@ritvikmath CNNs and super-resolution, PLEASE PLEASE PLEASE!
@mykhailoseniutovych6099 · 8 days ago
Great video, with an easy-to-follow explanation. However, you formulated the optimization problem that needs to be solved by the end of the video. The most interesting question now is how to actually solve this optimization problem. Can you give some directions on how it is solved in practice?
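For the question above: the problem at the end of the video - minimize ||w||^2 subject to y_i(w·x_i - b) >= 1 - is a convex quadratic program. In practice it is usually attacked through its Lagrangian dual and solved with specialized methods such as SMO (this is what libsvm does). As a rough sketch, scikit-learn's SVC with a very large C approximates the hard margin on separable toy data (the data here is made up):

    import numpy as np
    from sklearn.svm import SVC

    X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
    y = np.array([1, 1, -1, -1])

    clf = SVC(kernel="linear", C=1e6).fit(X, y)  # huge C ~ hard margin
    print(clf.coef_, clf.intercept_)  # learned w and intercept (sklearn uses w.x + b)
    print(clf.support_vectors_)       # the points that pin down the margin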
@madshyom6257 · 1 year ago
Bro, you're a superhero
@Pazurrr1501 · 2 years ago
BRILLIANT!
@almonddonut1818 · 1 year ago
Thank you so much!
@xviktorxx · 3 years ago
Great video, great underappreciated channel! Thank you and keep up the good work!
@TheCsePower · 2 years ago
Great video! I found your notation for x to be quite confusing. I think the small x should be x11, x12, x13, ..., x1p. Say GPA is xi1 and MCAT is xi2; then the student data for these two features would be: student 1 (x11, x12), student 2 (x21, x22), student 3 (x31, x32).
@maged4087 · 2 years ago
Does b affect finding the slope?
@user-hu5qf6lg8u · 4 months ago
You are my savior.
@acidaly · 1 year ago
The equations for points on the margins are w·x - b = 1 and w·x - b = -1. That means we have fixed our margin to "2" (from -1 to +1). But our problem is to maximize the margin, so shouldn't we keep it a variable, like w·x - b = +r and w·x - b = -r, where maximizing r is our goal?
@davud7525 · 11 months ago
Have you figured it out?
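The resolution to this thread: fixing the right-hand side at ±1 costs nothing, because it only fixes the scale of (w, b). If the closest points satisfy w·x - b = ±r for some r > 0, divide through by r to get (w/r)·x - (b/r) = ±1, then relabel w/r and b/r as w and b. Maximizing the margin is then equivalent to fixing the right-hand side at ±1 and minimizing ||w||, which is exactly the problem stated in the video.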
@Max-my6rk · 3 years ago
Smart! This is the easiest way to come up with the margin when given theta (or weight)... gosh..