SVM Dual : Data Science Concepts
48,259 views
ritvikmath
1 day ago
Comments: 94
@robertcanberkozturk7725 (1 year ago)
The only man who actually explains the concepts on the internet. God bless your soul Ritvik
@ritvikmath (1 year ago)
🎉
@mental4508 (5 months ago)
Man I swear you are one of the very few that actually understands what it takes to make these concepts understandable.
@jjabrahamzjjabrhamaz1568 (3 years ago)
Ritvik, you my man are a godsend. Thank you for sharing your extremely technical expertise on YouTube for free. You teach this material better than any platform, university or online.
@teegnas (3 years ago)
Important video for those who want to understand SVM properly ... thanks for uploading!
@ritvikmath (3 years ago)
Glad it was helpful!
@SS-xh4wu (2 years ago)
Thank you so much for your videos. As a stats major, I still learned so much from your channel. In many cases I learned the math in school, but no one ever talked about the intuition behind it. Really appreciate your work here.
@alaham2590 (5 months ago)
You give exactly the amount of detail needed to grasp these concepts! Very nice!! Thanks!
@SHREEANSHMOHANTY (9 months ago)
I have watched 6 times now, still can't wrap my head around it. Enjoy your views!!!
@Pleaseletmenamemyselfmeme (9 months ago)
I soooo agree😂😂
@SHREEANSHMOHANTY (7 months ago)
@Pleaseletmenamemyselfmeme we dumb dumbs
@pranjalpatil9659 (2 years ago)
Although this doesn't include all the math in depth, it's enough for most of us to understand the SVM formulation very well.
@atrus3823 (1 year ago)
Along with StatQuest, these are by far the best ML videos on YouTube. Thank you!
@ritvikmath (1 year ago)
Wow, thanks!
@atrus3823 (1 year ago)
@ritvikmath No problem! I took SVMs in school and have read about them many times, and I have only ever seen one of two levels of explanation: 1) just stating the results without question, or 2) diving deeply into measure theory and other crazy math that, though it sounds interesting, I don't really have time for. This is the first source I've found that explains the gist of how the kernel trick works without diving super deeply into the math.
@mohamedgaal5340 (1 year ago)
Thank you so much for explaining why we need the dual formulation. This video made me understand some of the concepts I had jotted down from other videos without understanding them.
@ritvikmath (1 year ago)
Great to hear!
@CodeEmporium (3 years ago)
I love the way you throw the marker and hope to catch it. And if it's too far, you don't snap and point at the camera. Ah. And Nice videos. It comforts my soul
@ritvikmath (3 years ago)
hahaha, thanks for noticing something I wasn't even aware of :)
@Trubripes (4 months ago)
Amazing that SVM is derived from KKT conditions. Never noticed that until watching this.
@adam_long_unique_name (1 year ago)
Thanks for these great videos. I would suggest to reorder the playlists to single topics - for example SVM, and add in the description links to the playlist or all other videos on the topic
@amrdel2730 (6 months ago)
Thanks for the explanation of crucial points in this topic , thank you for the effort
@ritvikmath (6 months ago)
Glad it was helpful!
@yitang7577 (2 years ago)
Thanks for the wonderful video!!! It's a real benefit for me, not having known the Lagrangian and the dual problem before.
@kally3432 (3 years ago)
Great explanation, I enjoyed it a lot. Thanks
@ritvikmath (3 years ago)
Glad it was helpful!
@user-xi5by4gr7k (3 years ago)
Fantastic video. I have been binging your content. Any chance you will make a series on stochastic calculus?
@ritvikmath (3 years ago)
thanks for the suggestion!
@muhammadaliyu3076 (3 years ago)
I think you are one of the great teachers on YouTube. But I think the only reason I understood this video is that I already have knowledge of optimization, linear algebra, and multivariable calculus, and because I already understood the dual SVM problem. To be honest, some of your videos are totally not for beginners. I think you should try proving the math behind the algorithms you explain from first principles. That's better for beginners.
@ritvikmath (3 years ago)
thanks for the feedback! It's important for me to strike a balance between making the videos accessible to everyone and covering complex topics.
@jjabrahamzjjabrhamaz1568 (3 years ago)
@ritvikmath I think CrimaCode might not be fair here, because if you even come to this video for SVMs you will have at least some knowledge of linear algebra and multivariable calculus. Both are basics taught at most high-school levels. I think Ritvik is doing a great job and the video is great for beginners.
@kroth5810 (3 years ago)
I found the video very helpful despite not being well versed in SVMs before watching it. People learn in different ways :)
@quant-prep2843 (3 years ago)
@kroth5810 okay dicc
@thirumurthym7980 (3 years ago)
@ritvikmath Agree. Maybe you can cover the basics of those basics separately for pure beginners. But all your videos are awesome.
@marisa4942 (2 years ago)
Thank you so much!! Very clear and intuitive explanation!!
@annakotova3508 (2 years ago)
Very-very clear explanation! Thanks!
@lilianaaa98 (4 months ago)
The math this video covers is kind of sophisticated :(
@johnspivack6520 (9 months ago)
Wonderful video thank you
@sejmou (1 year ago)
Thanks for another wonderful video! However, there's one thing I really want to understand: in the formulation of the hard-margin SVM problem (the 'SVM Math' video) you stated that we want to minimize ||w||, not 1/2 ||w||^2. Where does this difference come from, and why are the two approaches equivalent? Can anybody shed some light on this?
@k_1_1_2_3_5 (8 months ago)
I am in the same boat.. really would like to understand that
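For what it's worth, the two objectives have the same minimizer because squaring is strictly increasing on nonnegative numbers, and the 1/2 and the square are there so the gradient comes out clean; a sketch of the equivalence:

```latex
\arg\min_{w,\,b}\; \|w\|
\;=\;
\arg\min_{w,\,b}\; \tfrac{1}{2}\|w\|^2
\quad\text{(same constraints)},
\qquad
\nabla_w\, \tfrac{1}{2}\|w\|^2 = w.
```

Since $x \mapsto \tfrac{1}{2}x^2$ is strictly increasing for $x \ge 0$, any $w$ that minimizes the norm also minimizes half its square, and the squared form is differentiable everywhere with the simplest possible gradient.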
@shashankbarole (3 years ago)
Please make more videos on SVM!! Thanks a lot
@ritvikmath (3 years ago)
more coming up :)
@anthonyburn6505 (6 months ago)
Excellent videos
@ritvikmath (6 months ago)
Glad you like them!
@e555t66 (1 year ago)
Really well explained. If you want the theoretical concepts, one could try the MIT MicroMasters. It's rigorous and demands 10 to 15 hours a week.
@arminhejazian5306 (1 year ago)
great explanation
@ritvikmath (1 year ago)
Thanks!
@TheKingLikesArt (2 years ago)
You are a legend my man
@lumos309 (3 years ago)
Hello, regarding the efficiency of the two forms of the problem: what about the inner product gives the dual form a complexity of N^2 rather than NP? It seems the inner-product operation has complexity O(P), since it depends on the number of terms in the input vectors x(i) and x(j); similarly, w^T x(i) in the primal form is also O(P). So this O(P) term shows up in the same place in both forms, which would mean both depend on P. Is there some other place where only the primal form's complexity scales with P? Or is complexity not even the right way to analyze this?
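One way to see the cost structure this question is getting at (a sketch with made-up sizes, not from the video): the dual's only dependence on the data is through inner products x_i . x_j, so all dependence on P can be paid once, up front, in an O(N^2 P) Gram-matrix computation. Every later evaluation of the dual objective touches only those N^2 cached entries and never P again, whereas the primal keeps working with the P-dimensional w at every step.

```python
import numpy as np

# Hypothetical sizes: N points in P dimensions.
N, P = 200, 50
rng = np.random.default_rng(0)
X = rng.normal(size=(N, P))

# All pairwise inner products, computed once: O(N^2 P), shape (N, N).
gram = X @ X.T

# From here on, evaluating the dual objective for any alpha never touches P:
y = rng.choice([-1.0, 1.0], size=N)
alpha = rng.random(N)
dual_value = alpha.sum() - 0.5 * (alpha * y) @ gram @ (alpha * y)
```

This separation is also what makes the kernel trick possible: replacing `gram` with any kernel matrix k(x_i, x_j) changes nothing downstream, and w in the lifted space is never formed explicitly.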
@bryankelly8820 (2 years ago)
Thank you for posting this informative video. You mention that alphas only need to be calculated for support vectors. That does simplify things considerably; however, how can one in practice determine which vectors are support vectors without doing the minimisation?
@nimeesha0550 (3 years ago)
Keep going buddy!! Amazing work. Really helpful :) Thank You.
@winstonong9593 (11 days ago)
How are the support vectors determined in the first place, such that only pairs of support vectors need to be considered for the final minimisation problem?
@abhishekchandrashukla3814 (2 years ago)
apostle of machine learning !!
@SwapravaNath (6 months ago)
11:30 But while solving the optimization we don't know which points are the support vectors, so we need to solve the convex program with all the cross terms; we'll just find the multipliers to be zero for most of them.
@emptygirl296 (2 years ago)
You are a legend really
@liq3395 (3 years ago)
Thanks for the great teaching!!! Just one question: why is it Max{Σα_i - 1/2(ΣΣ...)} after substituting w, and not Max{Σα_i + 1/2(ΣΣ...)}? I tried several times and still got "+", not "-". Could you illustrate this calculation?
@mateuszserocki4026 (2 years ago)
Hi man, I was stuck on the same problem, but we did it wrong! Finally resolved it: the first element, w^T w, is reduced. In the second element, sum(alpha_i * y_i * w^T x_i) + ..., you have to expand that w^T; then you have a double sum that is not reduced to zero, and here comes the minus. Hope it helps you.
@adefajemisin (1 year ago)
When you substitute, the first two terms are the same, except the first one is multiplied by 1/2: 0.5x - x = -0.5x.
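The cancellation described in these replies can be written out; a sketch in the video's notation, substituting the stationarity condition $w = \sum_i \alpha_i y_i x_i$ into the two terms of the Lagrangian:

```latex
\begin{aligned}
\tfrac{1}{2}\,w^\top w
  &= \tfrac{1}{2}\sum_i\sum_j \alpha_i \alpha_j y_i y_j\, x_i^\top x_j,\\[2pt]
\sum_i \alpha_i y_i\, w^\top x_i
  &= \sum_i\sum_j \alpha_i \alpha_j y_i y_j\, x_i^\top x_j,\\[2pt]
\tfrac{1}{2}\,w^\top w - \sum_i \alpha_i y_i\, w^\top x_i
  &= -\tfrac{1}{2}\sum_i\sum_j \alpha_i \alpha_j y_i y_j\, x_i^\top x_j.
\end{aligned}
```

Both terms expand to the same double sum; the half copy minus the full copy leaves minus one half, which is where the "-" in Max{Σα_i - 1/2(ΣΣ...)} comes from.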
@kanishkgarg423 (1 year ago)
thanks, this was very helpful 😀😀
@johnstephen8041 (6 months ago)
Bro, please discuss the VC dimension.
@xinking2644 (2 years ago)
Good video, clearly explained!!!!
@alexz3346 (2 years ago)
Cool jacket!
@nuamaaniqbal6373 (2 years ago)
You are the boss!
@LingkarPeduli (1 year ago)
Can you make a video explaining the twin support vector machine? Thanks in advance.
@TheJuankpunk (8 months ago)
Hi @ritvikmath, thank you for these videos. In what type of degree program are these subjects usually covered? I would like to enroll in one.
@shaktijain8560 (2 years ago)
Thanks for the video 😄😄
@prajwalsatannavar4576 (1 year ago)
Thank u...
@ritvikmath (1 year ago)
Welcome 😊
@nieeel9112 (2 years ago)
super clear! really helpful:)
@ziadtarek633 (3 years ago)
you're brilliant
@TheInevitable360 (1 year ago)
Thank you very much for this great video. I have a question: you said that for non-support vectors the alpha values are zero, since they make no contribution to the final solution. But here we considered the hard-margin version of the SVM. What about the soft margin? In the soft margin, all the points contribute to the solution and we can't ignore non-support vectors. Are their alphas still 0 or not?
@sagarpadhiyar3666 (2 years ago)
Hello Ritvik, superb video on the dual SVM, nicely explained. But I have one doubt about the final equation of the dual form: where is that x_j term coming from? What is that term? If x_i is our training data, then what is x_j?
@SwapravaNath (6 months ago)
A different training instance: the dual sums over all pairs of training points.
@zhangeluo3947 (1 year ago)
Hello sir, I have 2 doubts: 1. Once you have found the optimal values for alpha, how do you determine the optimal value for b (the bias)? 2. You said that for non-SVs the alpha value is just 0, but how do you determine those non-SVs in a real data set? I know the non-SVs lie outside the margin lines at -1 and 1, but in practice: given the alpha values for each data point, you can get the optimal w and b, and hence the decision boundary; do you then use that boundary to identify the support vectors and non-SVs? Is that true? Thank you!
@GanadoFO (2 years ago)
Once we get to the point where we are trying to minimize the equation on the right of the screen, I don't understand how you actually do the minimization. Say we have 3 support vectors, so we have a function of three variables: alpha_1, alpha_2, alpha_3. How do you minimize a multivariable function? What does that even mean? I've only ever done minimization in one variable.
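Optimizing over several alphas at once is an ordinary constrained multivariable optimization, and off-the-shelf solvers handle it directly. A minimal sketch (the toy dataset and the choice of scipy's SLSQP solver are mine, not from the video): we minimize the negated dual objective jointly over all alphas, subject to alpha_i >= 0 and sum_i alpha_i y_i = 0, then read off w, the support vectors, and b.

```python
import numpy as np
from scipy.optimize import minimize

# Tiny toy dataset (hypothetical): two points per class, separable along (1, 1).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
N = len(y)

# Precompute G_ij = y_i y_j x_i . x_j once; the dual needs nothing else.
Z = y[:, None] * X
G = Z @ Z.T

def neg_dual(alpha):
    # scipy minimizes, so negate the dual objective we want to maximize.
    return 0.5 * alpha @ G @ alpha - alpha.sum()

constraints = {"type": "eq", "fun": lambda a: a @ y}  # sum_i alpha_i y_i = 0
bounds = [(0.0, None)] * N                            # alpha_i >= 0 (hard margin)
res = minimize(neg_dual, np.zeros(N), method="SLSQP",
               bounds=bounds, constraints=constraints)

alpha = res.x
w = (alpha * y) @ X               # stationarity: w = sum_i alpha_i y_i x_i
sv = alpha > 1e-4                 # support vectors are where alpha > 0
b = float(np.mean(y[sv] - X[sv] @ w))  # y_s (w . x_s + b) = 1 at each SV
```

For this toy set the solver should recover w close to (0.25, 0.25) and b close to 0, with alpha nonzero only at the two margin points (2, 2) and (-2, -2), which also answers the nearby questions: the support vectors are simply the points whose alphas come out nonzero, and b is recovered from any of them afterwards.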
@RahulDable (3 years ago)
thanks
@tryfonmichalopoulos5656 (2 years ago)
Using stationarity we can get rid of w, but how did we get rid of the b that does not appear in the dual formulation of the problem at the top right?
@gautamchopra9939 (2 years ago)
I think this is because dL/db = 0 gives Σα_i·y_i = 0. When we plug this in, the b·Σα_i·y_i term becomes 0. We don't have an explicit statement that b = 0.
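Written out (a sketch in the video's notation): stationarity in b kills the only term where b appears, without claiming b itself is zero.

```latex
\frac{\partial L}{\partial b}
  = -\sum_i \alpha_i y_i = 0
\quad\Longrightarrow\quad
-\,b\sum_i \alpha_i y_i = 0,
```

so the b-part of $-\sum_i \alpha_i\!\left[y_i(w^\top x_i + b) - 1\right]$ vanishes from the dual objective; b is recovered afterwards from any support vector via $y_s(w^\top x_s + b) = 1$.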
@AmithAdiraju1994 (2 years ago)
_/\_ for this. I do have one question though, about non-support vectors requiring the alpha_i's to be zero. Intuitively, that means only the support-vector data points (which are very few) contribute to the model's optimal weights. Wouldn't that be bad for the weights, with only a few contributions? I'm wondering whether the model would generalize poorly with contributions from only a few data points.
@prysrek8858 (3 years ago)
You're great
@mwave3388 (2 years ago)
Thank you. But from scratch, it is too difficult.
@ayushtiwari6932 (3 years ago)
Is the term ||w|| squared only for mathematical convenience?
@SwapravaNath (6 months ago)
Yes, and it makes the objective function differentiable as well.
@ryanleyland5565 (1 year ago)
Kumar?
@EW-mb1ih (2 years ago)
At 6:30, why do we take the alpha that MAXIMIZES the solution to the inner minimization?
@hari8568 (3 years ago)
Hey, when you code this out, how exactly would you choose the alphas? Like, what are the upper bounds on alpha?
@PF-vn4qz (3 years ago)
How did you get rid of b?