you have been teaching me the fundamentals of SVMs better than my expensive professor at my university. thank you, man.
@mitsuinormal 11 months ago
same
@KoriKosmos 9 months ago
+1
@zilaleizaldin1834 2 months ago
same 😂😂😂😂😂
@Gibson-xn8xk 2 years ago
I started learning SVM looking for material that would provide an intuitive understanding of how this model works. By now I have covered all the mathematics behind it in depth, and I have spent almost a month on it. It sounds like an eternity, but I can't feel confident until I've considered everything in detail. In my opinion, basic intuition is the most important thing in exploring a model, and you did this extremely well. Thank you for your time and work. For those who are new to this channel, I highly recommend you subscribe. This guy makes awesome content!
@matattz 1 year ago
Hey, I love that everything we learn in the video is already written on the board. It's so clean and compact, yet holds so much information. Just great, man.
@ritvikmath 1 year ago
Thanks so much!
@flvstrahl 1 year ago
By far the best explanation of kernels that I've seen/read. Fantastic job!
@samruddhideshmukh5928 3 years ago
Amazing explanation!! Finally kernels are way clearer to me than they have ever been in the past.
@norebar5848 2 years ago
You are blowing my mind sir, thank you for this amazing explanation! No one else has been able to teach the subject of SVM this well.
@tgross2 11 months ago
I was stuck on kernels for 3 days, looking at numerous lectures online. You just made it clear. Thank you so much!
@DavidLolxe 2 years ago
As someone who's searched everywhere for an explanation about this topic, this is the only good one out there. Thanks so much!
@guygirineza4001 3 years ago
Might be one of the best videos I have seen on SVM. Crazy
@xKikero 1 year ago
This is the best video I've seen on this topic. Thank you, sir.
@DevanshKhandekar 3 years ago
Great, man. After months of stumbling over convex optimization theory, KKT, and whatnot, this video made everything clear. Highly appreciated.👏👏
@moravskyvrabec 1 year ago
Dude, like the other commenters say, you are so good at just laying stuff out in plain English. Just for this and the prior video I'm going to hit subscribe... you deserve it!
@ritvikmath 1 year ago
Wow, thanks!
@1MrAND 7 months ago
Dude, you are a legend. Finally I understood the power of Kernel functions. Thanks!
@bztomato3131 6 months ago
When someone has tried hard to learn something, they can explain it much better than others. Thanks a lot.
@gufo__4922 2 years ago
I found you by chance and it was a damn miracle; I will constantly check for new videos.
@obakasan31 1 year ago
This is the clearest explanation of this topic I've seen so far. Thank you
@AndBar283 3 years ago
A huge, big thank you for your hard work and for spreading the knowledge. Nice, brave explanation.
@ritvikmath 3 years ago
My pleasure!
@JohnsonZhongJunoStellar 3 months ago
The best explanation video about SVM, better than my professor at my university. Thank you!
@alimurtaza4904 1 year ago
This explanation cleared up everything for me! Amazing work, I can’t thank you enough!
@johnstephen8041 10 months ago
Bro - thanks so much!! The way you teach and the depth of your understanding are crazy!
@ritvikmath 10 months ago
Happy to help!
@amaramar4969 10 months ago
Amazing, amazing, you are my true guru while I prepare for the university exam. You are far, far above my college professors, whom I barely understand. Hope you get your true due somehow. Subscribed already. 🙏
@SiddhantSethi02 1 year ago
Hey man, just wanted to say I admire your beautiful work in making key complex fundamentals such as this easy. :D
@anurondas3853 2 years ago
Much better than other YouTubers explaining the same concept.
@qiguosun129 2 years ago
You summed up all the needed knowledge about SVM, and the discussion in this episode is more philosophical. Thank you very much for the course.
@mehdi_mbh 1 year ago
You are amazing! Thank you so much for explaining the math and the intuition behind all of this. Fantastic teaching skills.
@danielbriones6171 2 years ago
Been struggling to grasp this even after watching a bunch of YouTube videos. Finally understand! Must be the magic of the whiteboard!
@adelazhou5900 10 months ago
The two paths diagram explains everything so clearly! Thank you!!
@ritvikmath 10 months ago
You're very welcome!
@axadify 3 years ago
That's the best video I have seen on kernels on YT! Great content.
@yt-1161 3 years ago
Your data science concepts video series is one of a kind
@uoohknk6881 2 years ago
You spittin knowledge, GD! This needs to go viral
@Stardust_Byproduct 1 year ago
I'd love to see a video on Gaussian Process Regression, or just Gaussian Processes in general! Thanks for this video - very helpful
@BiKey91 1 year ago
Dude, I hit like before even watching the vids because I know I won't be disappointed.
@Palapi_H 1 year ago
Can't thank you enough for explaining it so simply.
@DeltaPi314 3 years ago
Marketer studying Data Science here. Amazing content!
@ritvikmath 3 years ago
Glad you enjoy it!
@CodeEmporium 3 years ago
Good stuff
@ritvikmath 3 years ago
Thanks for the visit!
@Ranjithbhat444 2 years ago
Can’t get any better explanation than this 👌🏼
@aalailayahya 3 years ago
Absolutely great!
@twincivet9668 2 years ago
Note: to make the inner product after the transformation equivalent to (1 + x_i * x_j)^2, the transformation needs some constants. Specifically, it should be [x1, x2] --> [1, sqrt(2)*x1, sqrt(2)*x2, x1^2, x2^2, sqrt(2)*x1*x2].
@durgeshmishra-fn6kx 1 year ago
Instead, ignore the coefficients: the expansion will have a term like 2*xi^(1)*xj^(1), so only consider xi^(1)*xj^(1) and drop the 2; the terms in the expansion will then match.
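For anyone who wants to verify the two comments above, here is a quick numeric check as a minimal Python sketch (numpy assumed; the name phi is just illustrative):

import numpy as np

def phi(x):
    # explicit degree-2 feature map, including the sqrt(2) constants
    x1, x2 = x
    return np.array([1, np.sqrt(2)*x1, np.sqrt(2)*x2, x1**2, x2**2, np.sqrt(2)*x1*x2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

print(phi(x) @ phi(y))   # inner product in the transformed 6-D space: 144.0
print((1 + x @ y) ** 2)  # polynomial kernel in the original 2-D space: 144.0

With the sqrt(2) constants in the map the two numbers agree exactly; without them they agree only up to the factors of 2 mentioned above.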
@alessandro5847 3 years ago
Such a great explanation. First time I get it after many attempts
@zzzzzzzmr9759 1 year ago
Very clear and well-organized explanation. Thank you!
@ritvikmath 1 year ago
Glad it was helpful!
@pranavjain9799 3 years ago
This is an incredible explanation. It helped me a lot. Thank you so much.
@Anbuteam7 9 months ago
That's a great video. Thank you for making this.
@softerseltzer 3 years ago
Your videos are of exquisite quality.
@liat978 1 year ago
This is the first time I get it! Thank you.
@thecamelbackfiles3685 2 years ago
Smart AND fit - these videos are like candy for my eyes and brain 🧠 😂
@process6996 3 years ago
Awesome explanation. Thank you!
@ritvikmath 3 years ago
Glad it was helpful!
@morisakomasaru8020 3 years ago
I finally understood what a kernel does! Thanks!
@martian.07_ 1 year ago
Very underrated video
@Daily_language 9 months ago
Clearly explained! Thank you!
@zilaleizaldin1834 2 months ago
waaaaaaaw!!! You are really amazing!!!! I hope to work with you and see how it is 😊
@hazema.6150 2 years ago
Masha'Allah man, like really Masha'Allah. This is just beautiful and truly a piece of gold. Thank you for this
@abdelrahmantaha9785 1 year ago
very well explained, thank you!
@kottybeats 2 months ago
amazing video, thanks a lot!
@shaktijain8560 2 years ago
Simply amazing 🤩
@eyuelmelese944 2 years ago
This is amazing
@javiergonzalezarmas8250 2 years ago
Beautiful
@ritvikmath 2 years ago
Thank you! Cheers!
@eacd2743 1 year ago
Great video man thanks a lot!
@jasonwang9990 3 years ago
Amazing explanation!
@alexzhai5944 7 days ago
Quick question. The point of inner products is to see how similar two data points are, right? And the point of "transforming" the original points to a higher dimension is to see relationships between the original data points that aren't clear in 2D? Like, for example, how similar the dot products of x*y (where x and y are two features) are for data points 1 and 2, and then this similarity would be used to separate the variables? Thank you.
@jalaltajdini7959 2 years ago
Thanks, this was just what I wanted 😙
@loveen3186 2 years ago
amazing teacher
@ritvikmath 2 years ago
Glad you think so!
@zwitter689 1 year ago
You have done a very good job here - thank you! How about a list of the YouTube videos you have done? (I just subscribed)
@JOHNREINKER 9 months ago
this video is goated
@mahdimoosavi2109 2 years ago
dude I love you
@geogeo14000 2 years ago
Very insightful, thanks a lot.
@zahratebiyaniyan1592 1 year ago
You are GREAT!
@manishbolbanda9872 3 years ago
We get inner products of high-dimensional data without even converting the data into the high dimension; that's the conclusion I drew. Correct me if I'm wrong.
@ritvikmath 3 years ago
Yup, that's exactly the main point!
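To make that conclusion concrete, here is a minimal numpy sketch (shapes and names are illustrative): the whole Gram matrix of implicit 6-D inner products is computed straight from the original 2-D data, and no 6-D vectors are ever built:

import numpy as np

X = np.random.rand(100, 2)  # 100 points, still in the original 2-D space
K = (1 + X @ X.T) ** 2      # 100 x 100 Gram matrix of implicit 6-D inner products
print(K.shape)              # (100, 100)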
@asdadasasdsaasd 6 months ago
Nice explanation
@harshitlamba155 3 years ago
Hi Ritvik, this is an excellent explanation of the kernel trick concept. I have a doubt though. When we apply the 2-degree polynomial trick to the dot product of the two vectors, we apply the (a+b+c)**2 formula. Doing this introduces a factor of 2 on a few terms. Is it ignored since it just scales the dot product?
@durgeshmishra-fn6kx 1 year ago
Ignore the coefficients: for example, the expansion will have a term 2*xi^(1)*xj^(1), so only consider xi^(1)*xj^(1) and drop the 2; the terms in the expansion will then match.
@e555t66 1 year ago
Really well explained. If you want the theoretical concepts, you could try doing the MIT MicroMasters. It's rigorous and demands 10 to 15 hours a week.
@GAZ___ 8 months ago
This is a good explanation, but I'm a bit confused about the terms in the bottom right corner. Did we reach those by squaring the parentheses and then expanding? That's going to result in a sum of terms, so what did we do next: take each term independently and treat it as a coordinate?
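In case it helps anyone with the same question: yes, you square the parenthesized sum and then read off each term of the expansion. A small sympy sketch of that expansion (the symbol names are placeholders):

from sympy import symbols, expand

xi1, xi2, xj1, xj2 = symbols('xi1 xi2 xj1 xj2')
print(expand((1 + xi1*xj1 + xi2*xj2)**2))
# six terms (print order may differ): 1, 2*xi1*xj1, 2*xi2*xj2,
# xi1**2*xj1**2, xi2**2*xj2**2, 2*xi1*xi2*xj1*xj2

Each term factors into (a coordinate of the transformed x_i) times (the matching coordinate of the transformed x_j), which is exactly where the sqrt(2) constants in the 6-D transformation come from.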
@nimeesha0550 3 years ago
Great job! Thank you so much!!
@oscargonzalez-barrios9502 3 years ago
Wow, thank you so much!
@dungtranmanh7820 2 years ago
Thank you very much ❤, you save us a lot of time and effort, hope I can work with you someday
@lechx32 1 year ago
Thank you. I just imagined what a hard time I would have if I tried to grind through all of this math on my own. It is not a good idea for a beginner)
@mainakmukherjee3444 1 year ago
Why do we calculate the inner products? I understand the data points need to be transformed into higher dimensions so that they become linearly separable. But why do we need the 6-dimensional space for that? Say we have a 2D space (the original feature space); we could transform it into a 3D space to get the job done.
@moatzmaloo 7 months ago
That's correct. Applying a quadratic polynomial kernel, for example, will convert it to 3 dimensions, but RBF can convert it to infinite dimensions.
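A minimal sketch of that contrast, assuming numpy (the gamma value is an arbitrary illustrative choice): the RBF kernel value is an inner product in an implicit infinite-dimensional feature space, yet it only ever touches the original coordinates:

import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # exp(-gamma * ||x - y||^2), computed entirely in the original space
    return np.exp(-gamma * np.sum((x - y) ** 2))

x, y = np.array([1.0, 2.0]), np.array([2.0, 0.5])
print(rbf_kernel(x, y))  # a scalar similarity between the two points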
@maged4087 2 years ago
I love you, man. I am a VT student. I wish I had known this a month ago :(
@sumeyrakoc8926 1 month ago
You are a hero.
@arvinds7182 2 years ago
quality👏
@MauroAndretta 6 months ago
What is not clear to me: is the output of the kernel function a scalar?
@kevinmeyer3863 3 years ago
Hi Ritvik, in the end you have to sum the element-wise products of the two 6-tuples to get the equivalent of the kernel output, right? (in order to get a proper scalar from the scalar product)
@victorsun9802 3 years ago
Amazing explanation! Thanks for making this series of videos on SVM. One question: can the kernel trick also be applied to other models like logistic regression? I saw some online posts saying kernels can be applied to logistic regression, but it seems very unpopular. I wonder if that's because logistic regression and other models can't really be written in terms of dot products, which makes the computation expensive, or if there are other reasons? Thanks!
@durgeshmishra-fn6kx 1 year ago
A little late, but still: it can be applied to other ML algorithms as well, for example kernelized linear regression and so on, to include higher-dimensional polynomial features instead of just linear attributes.
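One hedged way to see that reply in action (a scikit-learn sketch; the dataset, gamma, and n_components are illustrative choices, not anything from the video): approximate an RBF kernel feature map, then fit an ordinary logistic regression on top of it:

from sklearn.datasets import make_circles
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# concentric circles: not linearly separable in the original 2-D space
X, y = make_circles(noise=0.1, factor=0.3, random_state=0)

model = make_pipeline(
    Nystroem(kernel='rbf', gamma=2.0, n_components=50, random_state=0),
    LogisticRegression(),
)
print(model.fit(X, y).score(X, y))  # should land well above a plain linear fit's ~0.5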
@samuelrojas3766 7 months ago
I am still confused about how you developed the kernels in the first place. I know what they do but don't know how to obtain them without using the transformed space.
@ireoluwaTH 1 year ago
Your videos rank pretty high on the 'binge-ability' matrix...
@ritvikmath 1 year ago
Thanks!
@damialesh2109 2 years ago
If we plugged the kernel function output (the similarity of our points in the higher-dimensional space) into the primal version of the cost function, i.e., used the similarity instead of the inputs themselves, would it be equivalent to solving the dual? Just a lot less efficient?
@PF-vn4qz 3 years ago
Thank you!
@manishbolbanda9872 3 years ago
What do you mean by inner products of the original data?
@walidghazouani9427 2 years ago
What is x_j exactly? Am I understanding it right if I consider it to be the triangle data point, with x_i being the x data points...? So x_j is like the feature variables within our data...?
@Kirill-xp9jq 3 years ago
What is the purpose of finding the relationship between two separate vectors? Why can't you just take the polynomial of a vector with respect to itself (xi_1^T xi_1+c)^2? Wouldn't your number of terms just blow up when you have to find K(xa,xb) for every a and b in X?
@Fat_Cat_Fly 1 year ago
magic
@thirdreplicator 3 years ago
Ritvik for president!
@ritvikmath 3 years ago
haha!
@hussameldinrabah5018 2 years ago
Why do we add the 1 term to the dot product in the kernel?
@richardbloemenkamp8532 2 years ago
He did not derive the kernel. He showed that if you use (1 + <x_i, x_j>)^2 as a kernel, then if you work it out, you get exactly the same terms as when you explicitly compute <phi(x_i), phi(x_j)> (except for a few factors of 2). If you took the kernel (<x_i, x_j>)^2, you would not get the same terms. Probably some clever person invented the kernel (1 + <x_i, x_j>)^2, but it is not explained here how he/she found it. Note there are also other kernel functions that work well for SVM, but with different basis functions.
@murilopalomosebilla2999 3 years ago
Thanks!
@iidtxbc 3 years ago
What does the 1 mean in the transformed vector?
@ritvikmath 3 years ago
1 is just for the "intercept". It's like the "b" term in the linear equation "y=mx+b"
@mattkunq 2 years ago
Can someone elaborate on how a kernel exactly does that? At the end of the day, don't we still need the higher-dimension data? I'm confused.
@revycayolivia 3 years ago
Sorry, may I ask: what if we have 4 or 5 classes? How do we describe or use it then?
@ccuuttww 3 years ago
The phi is often impossible to compute directly. If you don't mind, I can give you a simple kernel PCA example to help viewers, because this concept is hard to understand if you are new to these topics.
@ritvikmath 3 years ago
Sure! Any resources are always welcome.
@KernaaliKehveli 3 years ago
Hey, I know your videos follow the current theme, but it would be great to have a projection matrix/subspace video at some point in the future! Keep up the great content.