Lecture 15 - Kernel Methods

223,205 views

caltech

1 day ago

Comments: 83
@guptaachin • 6 years ago
This makes me add him to the category of artists like Prof. Gilbert Strang. I wonder how you develop such a skill for explaining the most intricate concepts so lucidly?
@clarkupdike6518 • 2 years ago
Obviously one would have to know the material inside and out but also at both a theoretical and applied level. I'm guessing he has honed his technique through years of experience with students and is able to dial into that sweet spot of just enough, but not too much, detail--while providing lots of context and interlinking of related concepts. He only goes theoretical, and judiciously at that, when it adds insight... instead of grandstanding to show off. He is truly a master at bringing students along for the ride on complex subjects.
@amoughnieh • 3 months ago
This man answered a lot of lingering questions I had, even after reading multiple articles and papers and watching experts on YouTube.
@sakules • 8 years ago
"If I went to enter the Z space, you would have never heard from me again" haha, so great
@chilling00000 • 7 years ago
Lectures 14, 15, and 16 are the best SVM videos on YouTube.
@saiprasad8311 • 10 years ago
A very valuable addition to all ML textbooks, in which one can easily drown in the mathematics involved. He is superb at eliciting the meaning of the mathematics without going into its complexities. Thanks for this course.
@livnatje • 11 years ago
An amazing lecturer. His talks are perfectly clear, insightful and interesting. Thanks for putting this online!
@alvincay100 • 7 years ago
Just to reiterate what other commenters are saying... simply excellent. I had found multiple sources and could not wrap my head around the kernel trick until I found these lectures. Abu-Mostafa separates the important concepts from the mathematical details so that you can understand the ideas at hand. It is easy to fill in the details later once you understand the concept.
@Kristofcoddens • 5 years ago
By far the best explanation of kernels in SVMs I have found online.
@vasjaforutube1 • 3 years ago
Professor Abu-Mostafa is such a cheerful person. His explanation is very clear, but I still have to pause the video every once in a while just to have a laugh.
@TheCpHaddock • 7 years ago
Sir you are one of the best professors ever! And not just in machine learning!
@ahlammallak8853 • 9 years ago
You are just amazing ^^ Many thanks, professor. I hope you can add more videos on further techniques such as PCA, ICA, and deep learning.
@gip8507 • 10 years ago
These lectures are really great. Exceptionally clear and fun to watch. Thank you so much for this
@zuodongzhou3334 • 9 years ago
Excellent lecture! Best explanation I have ever seen.
@etiennedoumazane7556 • 3 years ago
I think you just gave me a bit of intuition of what that mysterious kernel trick is... thanks!
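For readers hunting for that intuition in code, here is a minimal sketch (my own illustration, not code from the lecture): for the second-order polynomial kernel, (1 + x·x')² equals the inner product of explicit quadratic feature vectors, so the Z-space never has to be constructed. The feature map `phi` below is one standard choice of coordinates.

```python
import numpy as np

def phi(x):
    """Explicit 2nd-order feature map for 2-D input x = (x1, x2).

    The sqrt(2) scalings are chosen so that phi(x) . phi(y) == (1 + x . y)**2.
    """
    x1, x2 = x
    return np.array([1.0,
                     np.sqrt(2) * x1, np.sqrt(2) * x2,
                     x1 * x1, x2 * x2,
                     np.sqrt(2) * x1 * x2])

def K(x, y):
    """Polynomial kernel: the Z-space inner product, computed entirely in X-space."""
    return (1.0 + np.dot(x, y)) ** 2

x, y = np.array([0.3, -1.2]), np.array([2.0, 0.5])
print(np.isclose(np.dot(phi(x), phi(y)), K(x, y)))  # True
```

The same identity is what lets the SVM dual run on kernel evaluations alone: every appearance of z·z' is replaced by K(x, x').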
@jakebruce1617 • 11 years ago
This has been extremely helpful. Thanks for posting!
@brainstormingsharing1309 • 3 years ago
Absolutely well done and definitely keep it up!!! 👍👍👍👍👍
@ShaymaaKhalifa • 7 years ago
This professor is brilliant!
@acgalt • 3 years ago
Excellent lecture. Congratulations!
@JulienAmelot • 12 years ago
Conclusion : What happens in the Z space stays in the Z space :P
@markh1462 • 5 years ago
Lol
@kevinlin4157 • 7 years ago
Thank you, Professor Yaser. This clearly explains how the kernel reduces computation.
@fengzhengkite • 9 years ago
You are excellent
@5up5up • 6 years ago
I'm the happiest guy in the world, I finally understood the freaking kernel trick.. thank you! Thank you very much, sir!
@denoleyn • 8 years ago
Thank you for this great lecture. Everything explained very, very clearly.
@dergarten776 • 5 years ago
Excellent demonstration of kernel methods!
@DAsiaView_ • 4 years ago
Awesome lecture, had my interest the entire time!
@nicktgr152 • 10 years ago
Fantastic presentation. Thank you very much.
@fierydino9402 • 4 years ago
Thank you for the precious lectures!!
@AndyLee-xq8wq • 1 year ago
Great explanation!
@pt77780 • 9 years ago
"... terms will be dropping like flies" lol
@go2chayan • 7 years ago
I burst out laughing when he described the positive-semidefinite condition in terms of a "sleeping vector" and a "standing vector" at 45:00.
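The "sleeping vector, matrix, standing vector" picture is the quadratic form cᵀKc, which Mercer's condition requires to be nonnegative for any choice of points and coefficients. A quick numerical sanity check (my own illustration, using the Gaussian/RBF kernel discussed in the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))  # 20 arbitrary points in 3-D

# Gaussian (RBF) kernel matrix: K[i, j] = exp(-||x_i - x_j||^2)
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-sq_dists)

# "Sleeping vector" c^T, matrix K, "standing vector" c: never negative
for _ in range(100):
    c = rng.standard_normal(20)
    assert c @ K @ c >= -1e-9  # tolerance for floating-point roundoff

print(np.linalg.eigvalsh(K).min() >= -1e-10)  # True: all eigenvalues >= 0
```

Equivalently, positive semi-definiteness means every eigenvalue of the kernel matrix is nonnegative, which is what the last line checks.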
@vman049 • 11 years ago
18:17 blew my mind. LOL'ed at 26:30. All the elements of a great lecture! Excellent!
@jandal487 • 6 years ago
Excellent course on Introduction to ML. Thank you professor :)
@Darshanhegde • 12 years ago
And again at 41:42, Prof. Yaser says the third way to check that a kernel is valid is "who cares" about Mercer's theorem :)
@dipanjans • 8 years ago
Thanks a lot Prof. Yaser.
@BreatheThePureSerene • 8 years ago
Brilliant teacher
@sarnathk1946 • 4 months ago
You are Pedhanna (Big brother) from now on! Thank you!
@emademad4 • 5 years ago
A question: he said an objective function based on the number of misclassifications is NP-hard. Why? And if that is so, then in soft-margin SVM the amount of violation needs to be minimized, and to do that you need to check every sample for whether it violates the margin, which is the same operation he called NP-hard. If anyone knows where I'm wrong, I'd be glad to hear it.
@aztmln • 8 years ago
Very useful lecture. Thanks, Prof! Would love to hear more.
@chaitanyatanwar8151 • 10 years ago
Thanks, superb lecture!
@fatimatayeb677 • 5 years ago
Great, dude. Keep it up.
@michaelmellinger2324 • 2 years ago
@45:30 Establish that the Z-space exists even if we don’t know what it is
@3198136 • 11 years ago
Thank you so much, it's much better than the class I attended for Pattern Recognition!
@mohamedsalem9806 • 2 years ago
This is brilliant!
@Darshanhegde • 12 years ago
It's hilarious :) Prof. Yaser at 31:30 says: "If I had gone to the Z space (which is infinite here), you would have never heard from me again" :D
@DiegoAToala • 2 years ago
Great lecture! Thank you.
@achronicstudent • 16 days ago
I am a rookie MSc student, and this is my first time learning these.. uh.. whatever these are... Everyone in the comments is saying "Woah, now I understand, great explanation" etc., and I am just looking at the screen feeling dumb.
@alisiena7009 • 8 years ago
I have a problem: my dataset is very small, with values between [-1, 0], and my approximation target is between [0, 1], but the training performance never reaches the best solution. How can I solve this problem?
@hson198 • 7 years ago
Can you explain, at 1:03:18 (slide 19), why 0
@NicolaPiovesan • 11 years ago
24:00 "So by doing this operation you have done an inner product in an infinite-dimensional space. Congratulations!" - LOL :D
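The infinite-dimensional claim is just the Taylor series of the exponential: for scalar inputs, exp(xx') = Σₖ (xᵏ/√k!)(x'ᵏ/√k!), an inner product of infinitely many features. A truncated numerical check (my own sketch; the full Gaussian kernel adds normalizing factors exp(-x²)·exp(-x'²), but the argument is the same):

```python
import math
import numpy as np

def truncated_feature(x, n_terms=30):
    """First n_terms coordinates of the infinite feature map for exp(x * y)."""
    return np.array([x ** k / math.sqrt(math.factorial(k)) for k in range(n_terms)])

x, y = 0.7, -0.4
lhs = np.dot(truncated_feature(x), truncated_feature(y))  # finite slice of Z-space
rhs = math.exp(x * y)                                     # kernel, computed in X-space
print(abs(lhs - rhs) < 1e-12)  # True: the series converges to the kernel value
```

Thirty terms already agree with the kernel to machine precision, which is why evaluating the kernel directly is the only sensible way to "visit" this Z-space.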
@111rave • 6 years ago
you are a really good lecturer!!! "Okay" :D
@kennethnavarro3496 • 2 years ago
I am pretty sure the equation for b at minute 36:59 is wrong. When I solved it I got almost the same thing, except instead of y_m I got 1/y_m in the same spot.
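The two answers agree, in fact. Solving the standard support-vector condition from the lecture for a support vector x_m (a sketch of the algebra, not a transcript of the slide):

```latex
y_m\!\left(\sum_{n} \alpha_n y_n K(\mathbf{x}_n, \mathbf{x}_m) + b\right) = 1
\quad\Longrightarrow\quad
b = \frac{1}{y_m} - \sum_{n} \alpha_n y_n K(\mathbf{x}_n, \mathbf{x}_m)
  = y_m - \sum_{n} \alpha_n y_n K(\mathbf{x}_n, \mathbf{x}_m)
```

Since the labels are binary, y_m ∈ {-1, +1}, so 1/y_m = y_m and both forms are the same number.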
@ajayram198 • 6 years ago
In slide 1 at 5:04 he talks about using SVM with a non-linear transform. Could someone here explain the difference between h and H? (Complex h but simple H)
@sddyl • 12 years ago
Fabulous! Great Intuition!
@Nestorghh • 12 years ago
Great class! Thanks a lot!
@nishanthkanala • 8 years ago
Just Brilliant!!!
@明焕李 • 6 years ago
Really well explained!
@michaelmellinger2324 • 2 years ago
@39:50 The whole idea of the kernel is that you don’t visit the Z-Space
@apeman5291 • 11 years ago
56:51 - I don't know, that still looks pretty complicated. 59:01 - Okay, that was pretty neat. 59:29 - Jaw hit the floor.
@siddharthsvnit • 6 years ago
1:02:30 Can't the slack still be zero? Since 0*0 = 0, the condition is still satisfied.
@markh1462 • 5 years ago
No, because you're also maximizing beta. So the only reason we would ever let beta be zero is when the slack is nonzero.
@mementomori6734 • 5 years ago
Mark H, I don't understand.
@gcgrabodan • 8 years ago
If you take the derivative of the Lagrangian of the soft-margin SVM with respect to w, why does xi (the error) drop out? It should depend on w, doesn't it? I.e., different margins will have different errors. So it seems to me like a super complicated problem... Thanks for the help ;)
@Bing.W • 7 years ago
Different margins do have different errors, but the xi are not functions of w; they are separate optimization variables. That's why xi does not depend on w: for the same hyperplane (fixed w), you can define different allowed errors (xi).
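A sketch of the algebra behind this exchange, using the standard soft-margin Lagrangian (the same one the lecture minimizes): each ξₙ appears only linearly and is an optimization variable independent of w, so it contributes nothing when differentiating with respect to w.

```latex
\mathcal{L}(\mathbf{w}, b, \xi, \alpha, \beta)
  = \tfrac{1}{2}\,\mathbf{w}^\top\mathbf{w} + C\sum_{n} \xi_n
    - \sum_{n} \alpha_n \bigl( y_n(\mathbf{w}^\top \mathbf{z}_n + b) - 1 + \xi_n \bigr)
    - \sum_{n} \beta_n \xi_n

\nabla_{\mathbf{w}} \mathcal{L} = \mathbf{w} - \sum_{n} \alpha_n y_n \mathbf{z}_n = 0,
\qquad
\frac{\partial \mathcal{L}}{\partial \xi_n} = C - \alpha_n - \beta_n = 0
```

The ξ terms have their own stationarity condition, C - αₙ - βₙ = 0, and substituting it back is exactly what makes both ξ and β disappear from the dual.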
@0xaugustus • 7 years ago
Absolute genius !
@reinerwilhelms-tricarico344 • 1 year ago
I like this a lot for his great clarity. Except this: when you get to "Then call your quadratic programming code to hand over the alphas," you may end up with a big can of worms, because nobody seems to know how to call any of the damn quadratic programming software that is available. There seem to be hundreds of codes around, usually with miserable documentation. You may be left with rolling your own. 😁
@PradiptoDas-SUNYBuffalo • 11 years ago
56:53 - did not see that coming - why was he so proud of that equation? 59:46 - memorial service for beta! Classic!
@jiewang7713 • 10 years ago
Excellent "OK"s
@jackeown • 5 years ago
The previous lecture is very helpful for understanding this: kzbin.info/www/bejne/m3nWdqWiha-Ki7c
@behrozkhan2000 • 12 years ago
Ok!
@roknyakhavein5833 • 4 years ago
We R in Z space.
@diegoiruretagoyenaoliveri6050 • 6 years ago
OKAY
@MohamedAtia • 11 years ago
ok?
@brainstormingsharing1309 • 3 years ago
👍👍👍👍👍
@nooneknown • 5 years ago
31:20
@petar29able • 3 years ago
I'm too stupid for this, why am I here anyway?
@tianchi1989 • 10 years ago
This is almost the best explanation of kernels I have found. But the tone he uses makes me really sleepy. :(
@Nestorghh • 11 years ago
haha. great
@robbertkarry4392 • 8 years ago
like russian
@nha1481 • 7 years ago
Who cares?
@Waynema8 • 11 years ago
Great lecture !
Related videos:
Lecture 14 - Support Vector Machines · 1:14:16 · caltech · 312K views
Lecture 16 - Radial Basis Functions · 1:22:08 · caltech · 168K views
Statistical Machine Learning Part 19 - The reproducing kernel Hilbert space · 51:13 · Tübingen Machine Learning · 25K views
What are Kernel Methods? (Machine Learning, Support-Vector Machines) · 10:34 · Super Data Science: ML & AI Podcast with Jon Krohn · 947 views
16. Learning: Support Vector Machines · 49:34 · MIT OpenCourseWare · 2M views
SVM Kernels : Data Science Concepts · 12:02 · ritvikmath · 74K views
Lecture 12 - Regularization · 1:15:14 · caltech · 134K views
Part 25 - Support Vector Machines, the Kernel trick · 28:11 · Pedram Jahangiry · 3.8K views
Kernel Functions · 30:00 · IIT Madras - B.S. Degree Programme · 12K views