SVM Kernels : Data Science Concepts

79,799 views

ritvikmath

A day ago

Comments: 127
@cassie8324 2 years ago
You have been teaching me the fundamentals of SVMs better than my expensive professor at my university. Thank you, man.
@mitsuinormal 11 months ago
same
@KoriKosmos 9 months ago
+1
@zilaleizaldin1834 2 months ago
same 😂😂😂😂😂
@Gibson-xn8xk 2 years ago
I started learning SVMs looking for some material that would provide an intuitive understanding of how this model works. By now I have already covered all the mathematics behind it in depth, and I have spent almost a month on it. It sounds like an eternity, but I can't feel confident until I have considered everything in detail. In my opinion, basic intuition is the most important thing when exploring a model, and you did this extremely well. Thank you for your time and work. For those who are new to this channel, I highly recommend you subscribe. This guy makes awesome content!
@matattz 1 year ago
Hey, I love that everything we learn in the video is already written on the board. It's so clean and compact, yet so much information. Just great, man.
@ritvikmath 1 year ago
Thanks so much!
@flvstrahl 1 year ago
By far the best explanation of kernels that I've seen/read. Fantastic job!
@samruddhideshmukh5928 3 years ago
Amazing explanation!! Finally kernels are much clearer to me than they have been in the past.
@norebar5848 2 years ago
You are blowing my mind sir, thank you for this amazing explanation! No one else has been able to teach the subject of SVM this well.
@tgross2 11 months ago
Was stuck for 3 days on kernels looking at numerous lectures online. You just made it clear. Thank you so much!
@DavidLolxe 2 years ago
As someone who's searched everywhere for an explanation about this topic, this is the only good one out there. Thanks so much!
@guygirineza4001 3 years ago
Might be one of the best videos I have seen on SVM. Crazy
@xKikero 1 year ago
This is the best video I've seen on this topic. Thank you, sir.
@DevanshKhandekar 3 years ago
Great, man. After months of stumbling over convex optimization theory, KKT, and whatnot, this video made everything clear. Highly appreciated. 👏👏
@moravskyvrabec 1 year ago
Dude, like the other commenters say, you are so good at just laying stuff out in plain English. Just for this and the prior video I'm going to hit subscribe...you deserve it!
@ritvikmath 1 year ago
Wow, thanks!
@1MrAND 7 months ago
Dude, you are a legend. Finally I understood the power of Kernel functions. Thanks!
@bztomato3131 6 months ago
When someone has tried hard to understand something, they can explain it much better than others. Thanks a lot.
@gufo__4922 2 years ago
I found you by chance and it was a damn miracle; I will constantly check for new videos.
@obakasan31 1 year ago
This is the clearest explanation of this topic I've seen so far. Thank you
@AndBar283 3 years ago
A huge, big thank you for your hard work and for spreading the knowledge. Nice, brave explanation.
@ritvikmath 3 years ago
My pleasure!
@JohnsonZhongJunoStellar 3 months ago
The best explanation video about SVMs, better than my professor at my university. Thank you!
@alimurtaza4904 1 year ago
This explanation cleared up everything for me! Amazing work, I can’t thank you enough!
@johnstephen8041 10 months ago
Bro, thanks so much!! The way you teach and your understanding are crazy!
@ritvikmath 10 months ago
Happy to help!
@amaramar4969 10 months ago
Amazing, amazing, you are my true guru while I prepare for my university exam. You are far, far above my college professors, whom I barely understand. Hope you get your true due somehow. Subscribed already. 🙏
@SiddhantSethi02 1 year ago
Hey man, just wanted to commend you for your beautiful work on making some of the key complex fundamentals, such as this one, easy. :D
@anurondas3853 2 years ago
Much better than other youtubers explaining the same concept.
@qiguosun129 2 years ago
You summed up all the knowledge needed about SVMs, and the discussion in this episode is more philosophical. Thank you very much for the course.
@mehdi_mbh 1 year ago
You are amazing! Thank you so much for explaining the math and the intuition behind all of this. Fantastic teaching skills.
@danielbriones6171 2 years ago
Been struggling to grasp this even after watching a bunch of YouTube videos. Finally understand! Must be the magic of the whiteboard!
@adelazhou5900 10 months ago
The two paths diagram explains everything so clearly! Thank you!!
@ritvikmath 10 months ago
You're very welcome!
@axadify 3 years ago
That's the best video I have seen on kernels on YT! Great content.
@yt-1161 3 years ago
Your data science concepts video series is one of a kind
@uoohknk6881 2 years ago
You spittin knowledge, GD! This needs to go viral
@Stardust_Byproduct 1 year ago
I'd love to see a video on Gaussian Process Regression, or just Gaussian Processes in general! Thanks for this video - very helpful
@BiKey91 1 year ago
dude I like before even watching the vids because I know I won't be disappointed
@Palapi_H 1 year ago
Can't thank you enough for explaining it so simply.
@DeltaPi314 3 years ago
Marketer studying Data Science here. Amazing content!
@ritvikmath 3 years ago
Glad you enjoy it!
@CodeEmporium 3 years ago
Good stuff
@ritvikmath 3 years ago
Thanks for the visit!
@Ranjithbhat444 2 years ago
Can’t get any better explanation than this 👌🏼
@aalailayahya 3 years ago
Absolutely great!
@twincivet9668 2 years ago
Note: to get the inner product after the transformation to be equivalent to (1 + x_i * x_j)^2, the transformation needs to include some constants. Specifically, the transformation should be [x1, x2] --> [1, sqrt(2)*x1, sqrt(2)*x2, x1^2, x2^2, sqrt(2)*x1*x2].
@durgeshmishra-fn6kx 1 year ago
Alternatively, ignore the coefficients: the expansion will have a term like 2 xi^(1) xj^(1), so only consider xi^(1) xj^(1) and drop the 2; then the terms in the expansion will match.
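To make the match concrete, here is a quick numerical check (a minimal sketch assuming NumPy; the example points xi and xj are made up, not from the video). It confirms that the degree-2 polynomial kernel evaluated in the original 2-D space equals the inner product of the explicit 6-D maps once the sqrt(2) constants are included.

```python
import numpy as np

def phi(x):
    # explicit 6-D feature map for a 2-D input, including the sqrt(2) constants
    x1, x2 = x
    return np.array([1.0,
                     np.sqrt(2) * x1,
                     np.sqrt(2) * x2,
                     x1 ** 2,
                     x2 ** 2,
                     np.sqrt(2) * x1 * x2])

def poly_kernel(a, b):
    # degree-2 polynomial kernel computed directly in the original 2-D space
    return (1.0 + a @ b) ** 2

xi = np.array([1.5, -0.3])  # arbitrary example points
xj = np.array([0.7, 2.0])

print(poly_kernel(xi, xj))  # 2.1025, never leaves 2-D
print(phi(xi) @ phi(xj))    # 2.1025, the same number via the explicit 6-D map
```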
@alessandro5847 3 years ago
Such a great explanation. First time I get it after many attempts
@zzzzzzzmr9759 1 year ago
Very clear and well-organized explanation. Thank you!
@ritvikmath 1 year ago
Glad it was helpful!
@pranavjain9799 3 years ago
This is an incredible explanation. It helped me a lot. Thank you so much.
@Anbuteam7 9 months ago
That's a great video. Thank you for making this.
@softerseltzer 3 years ago
Your videos are of exquisite quality.
@liat978 1 year ago
This is the first time I get it! Thank you.
@thecamelbackfiles3685 2 years ago
Smart AND fit - these videos are like candy for my eyes and brain 🧠 😂
@process6996 3 years ago
Awesome explanation. Thank you!
@ritvikmath 3 years ago
Glad it was helpful!
@morisakomasaru8020 3 years ago
I finally understood what a kernel does! Thanks!
@martian.07_ 1 year ago
Very underrated video
@Daily_language 9 months ago
Clearly explained! Thank you!
@zilaleizaldin1834 2 months ago
waaaaaaaw!!! You are really amazing!!!! I hope to work with you and see how it is 😊
@hazema.6150 2 years ago
Masha'Allah man, like really Masha'Allah. This is just beautiful and truly a piece of gold. Thank you for this
@abdelrahmantaha9785 1 year ago
very well explained, thank you!
@kottybeats 2 months ago
amazing video, thanks a lot!
@shaktijain8560 2 years ago
Simply amazing 🤩
@eyuelmelese944 2 years ago
This is amazing
@javiergonzalezarmas8250 2 years ago
Beautiful
@ritvikmath 2 years ago
Thank you! Cheers!
@eacd2743 1 year ago
Great video man thanks a lot!
@jasonwang9990 3 years ago
Amazing explanation!
@alexzhai5944 7 days ago
Quick question. The point of inner products is to see how similar two data points are, right? And the point of "transforming" the original points to a higher dimension is to see relationships between the original data points that aren't clear in 2D? Like, for example, how similar the dot products of xy (where x and y are two features) are for data points 1 and 2, and then this similarity would be used to separate the variables? Thank you
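On the first part of that question, a tiny sketch of the intuition (assuming NumPy; the vectors a, b, and c are made up for illustration): the inner product acts as a rough, scale-dependent similarity score, large and positive for vectors pointing the same way and negative for vectors pointing in opposite directions.

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([0.9, 0.1])   # nearly the same direction as a
c = np.array([-1.0, 0.0])  # the opposite direction

print(a @ b)  # 0.9  -> large positive: "similar"
print(a @ c)  # -1.0 -> negative: "dissimilar"
```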
@jalaltajdini7959 2 years ago
Thanks, this was just what I wanted 😙
@loveen3186 2 years ago
amazing teacher
@ritvikmath 2 years ago
Glad you think so!
@zwitter689 1 year ago
You have done a very good job here. Thank you! How about a list of the YouTube videos you have done? (I just subscribed)
@JOHNREINKER 9 months ago
this video is goated
@mahdimoosavi2109 2 years ago
dude I love you
@geogeo14000 2 years ago
Very insightful thanks a lot
@zahratebiyaniyan1592 1 year ago
You are GREAT!
@manishbolbanda9872 3 years ago
We get inner products of the high-dimensional data without even converting the data into the high dimension; that's the conclusion I drew. Correct me if I am wrong.
@ritvikmath 3 years ago
Yup, that's exactly the main point!
@asdadasasdsaasd 6 months ago
Nice explanation
@harshitlamba155 3 years ago
Hi Ritvik, this is an excellent explanation of the kernel trick concept. I have a doubt though. When we apply the degree-2 polynomial trick to the dot product of the two vectors, we apply the (a+b+c)^2 formula. Doing this introduces a factor of 2 for a few terms. Is it ignored since it will just scale the dot product?
@durgeshmishra-fn6kx 1 year ago
Ignore the coefficients: the expansion will have a term like 2 xi^(1) xj^(1), so only consider xi^(1) xj^(1) and drop the 2; then the terms in the expansion will match.
@e555t66 1 year ago
Really well explained. If you want the theoretical concepts, you could try doing the MIT MicroMasters. It's rigorous and demands 10 to 15 hours a week.
@GAZ___ 8 months ago
This is a good explanation, but I'm a bit confused about the terms in the bottom right corner. Did we reach them by squaring the parentheses and then expanding? That's going to result in a sum of terms, so what did we do next: take each term independently and treat it as a component?
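For reference, one way to see where those terms come from, written out for the two-feature case the video uses: expanding the square of the parenthesis gives

$(1 + x_i^\top x_j)^2 = 1 + 2\,x_i^{(1)}x_j^{(1)} + 2\,x_i^{(2)}x_j^{(2)} + \big(x_i^{(1)}x_j^{(1)}\big)^2 + \big(x_i^{(2)}x_j^{(2)}\big)^2 + 2\,x_i^{(1)}x_i^{(2)}x_j^{(1)}x_j^{(2)}$

and each of those six terms is the product of matching components of the transformed vectors, so the whole sum equals $\phi(x_i)^\top \phi(x_j)$ with $\phi(x) = \big(1,\ \sqrt{2}\,x^{(1)},\ \sqrt{2}\,x^{(2)},\ (x^{(1)})^2,\ (x^{(2)})^2,\ \sqrt{2}\,x^{(1)}x^{(2)}\big)$.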
@nimeesha0550 3 years ago
Great Job! Thank you soo much!!
@oscargonzalez-barrios9502 3 years ago
Wow, thank you so much!
@dungtranmanh7820 2 years ago
Thank you very much ❤, you save us a lot of time and effort, hope I can work with you someday
@lechx32 1 year ago
Thank you. I just imagined what a hard time I would have if I tried to grind through all of this math on my own. It is not a good idea for a beginner)
@mainakmukherjee3444 1 year ago
Why do we calculate the inner products? I understand the data points need to be transformed into higher dimensions so that they can be linearly separable. But why do we compute the 6-dimensional space for that? Say we have a 2D space (the original feature space); we could transform it to a 3D space to get things done.
@moatzmaloo 7 months ago
That's correct. Applying a polynomial kernel (quadratic, for example) will convert it to 3 dimensions, but RBF can convert it to infinite dimensions.
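As a concrete illustration of the practical difference, a short sketch (assuming scikit-learn is installed; the dataset and hyperparameters are made up for the example): an RBF-kernel SVM separates two concentric rings that no linear boundary can split.

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# two concentric rings: not linearly separable in the original 2-D space
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear_svm = SVC(kernel="linear").fit(X, y)
rbf_svm = SVC(kernel="rbf", gamma=1.0).fit(X, y)

print("linear training accuracy:", linear_svm.score(X, y))  # roughly chance level
print("RBF training accuracy:   ", rbf_svm.score(X, y))     # close to 1.0
```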
@maged4087 2 years ago
I love you, man. I am a VT student. I wish that I knew this a month ago :(
@sumeyrakoc8926 1 month ago
You are a hero.
@arvinds7182 2 years ago
quality👏
@MauroAndretta 6 months ago
What is not clear to me: is the output of the kernel function a scalar?
@kevinmeyer3863 3 years ago
Hi Ritvik, in the end you have to sum the values in the 6-tuple to get the equivalent of the kernel output, right? (In order to get a proper scalar from the scalar product.)
@victorsun9802 3 years ago
Amazing explanation! Thanks for making this series of videos on SVMs. One question: can the kernel / kernel trick also be applied to other models like logistic regression? I saw some online posts saying kernels can be applied to logistic regression, but it seems very unpopular. I wonder if it's because logistic regression and other models can't really get the dot product term, which makes the computation expensive, or if there are other reasons? Thanks!
@durgeshmishra-fn6kx 1 year ago
A little late, but still: it can be applied to any ML algorithm, for example linear regression (kernelized) and so on, to include higher-dimensional polynomial features instead of only linear attributes.
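A minimal sketch of that idea for regression (assuming scikit-learn; the toy data and hyperparameters are only illustrative): kernel ridge regression fits a smooth nonlinear curve while the underlying algebra only ever touches kernel similarities, mirroring how the SVM dual only ever sees inner products.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X).ravel() + 0.1 * rng.normal(size=100)

# ridge regression "kernelized": the fit works with kernel values k(x_i, x_j)
model = KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-2).fit(X, y)
print(model.predict(np.array([[1.0], [2.5]])))  # roughly sin(1.0) and sin(2.5)
```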
@samuelrojas3766 7 months ago
I am still confused about how you developed the kernels in the first place. I know what they do but don't know how to obtain them without using the transformed space.
@ireoluwaTH 1 year ago
Your videos rank pretty high on the 'binge-ability' matrix...
@ritvikmath 1 year ago
Thanks!
@damialesh2109 2 years ago
If we plugged the kernel function output (the similarity of our points in the higher-dimensional space) into the primal version of the cost function, i.e. used the similarity instead of the inputs themselves, would it be equivalent to solving the dual function? Just a lot more inefficient?
@PF-vn4qz 3 years ago
Thank you!
@manishbolbanda9872 3 years ago
What do you mean by inner products of the original data?
@walidghazouani9427 2 years ago
What is xj exactly? Am I understanding it right if I consider it as the triangle data point and xi as the x data points...? So xj is like the feature variables within our data...?
@Kirill-xp9jq 3 years ago
What is the purpose of finding the relationship between two separate vectors? Why can't you just take the polynomial of a vector with respect to itself (xi_1^T xi_1+c)^2? Wouldn't your number of terms just blow up when you have to find K(xa,xb) for every a and b in X?
@Fat_Cat_Fly 1 year ago
magic
@thirdreplicator 3 years ago
Ritvik for president!
@ritvikmath 3 years ago
haha!
@hussameldinrabah5018 2 years ago
Why do we add the 1 term to the dot product in the kernel?
@richardbloemenkamp8532 2 years ago
He did not derive the kernel. He showed that if you use (1 + <xi, xj>)^2 as a kernel and work it out, you get exactly the same terms as when you explicitly compute the inner product of the transformed vectors (except for a few factors of 2). If you took the kernel (<xi, xj>)^2, you would not get the same terms. Probably some clever person invented the kernel (1 + <xi, xj>)^2, but it is not explained here how they found it. Note there are also other kernel functions that work well for SVMs, but with different basis functions.
@murilopalomosebilla2999 3 years ago
Thanks!
@iidtxbc 3 years ago
What does the 1 mean in the transformed matrix?
@ritvikmath 3 years ago
1 is just for the "intercept". It's like the "b" term in the linear equation "y=mx+b"
@mattkunq 2 years ago
Can someone elaborate on how a kernel exactly does that? At the end of the day, we still need the higher-dimension data, no? I'm confused.
@revycayolivia 3 years ago
Sorry, may I ask: what if we have 4 or 5 classes? How do we describe or use it?
@ccuuttww 3 years ago
The phi is always impossible to compute directly. If you don't mind, I can give you a simple kernel PCA example to help viewers, because this concept is hard to understand if you are new to these topics.
@ritvikmath 3 years ago
Sure! Any resources are always welcome.
@KernaaliKehveli 3 years ago
Hey, I know your videos follow the current theme, but it would be great to have a projection matrix/subspace video at some point in the future! Keep up the great content.
@skelgamingyt 1 year ago
Are you from India, bro?
@nuclearcornflakes3542 1 year ago
let him cook
EM Algorithm : Data Science Concepts
24:08
ritvikmath
79K views
SVM Dual : Data Science Concepts
15:32
ritvikmath
54K views
Support Vector Machines : Data Science Concepts
8:07
ritvikmath
77K views
The Kernel Trick - THE MATH YOU SHOULD KNOW!
7:30
CodeEmporium
183K views
How I'd learn ML in 2025 (if I could start over)
16:24
Boris Meinardus
206K views
SVM10 The Kernel Trick (Part1: Basis Expansion)
16:06
Zardoua Yassir
23K views
16. Learning: Support Vector Machines
49:34
MIT OpenCourseWare
2M views
Backpropagation : Data Science Concepts
19:29
ritvikmath
42K views
Support Vector Machines: All you need to know!
14:58
Intuitive Machine Learning
174K views
Lecture 14 - Support Vector Machines
1:14:16
caltech
315K views
The Kernel Trick in Support Vector Machine (SVM)
3:18
Visually Explained
302K views