Machine Learning Lecture 37 "Neural Networks / Deep Learning" -Cornell CS4780 SP17

14,144 views

Kilian Weinberger

Lecture Notes:
www.cs.cornell.edu/courses/cs4...

Comments: 84
@kirtanpatel797 · 4 years ago
Just completed all 37 lectures :) This is the only course that forced me to come back and complete the entire series. It's only because of you, great Sir! Thank you so much for sharing these!
@Biesterable · 5 years ago
This was wonderful!! It's strange that more people aren't watching this. Thank you so much for sharing!
@amuro9616 · 5 years ago
Exactly. One of the most approachable and intuitive lecture series on ML there is.
@cricketjanoon · 3 years ago
I started learning ML in 2017 when I was an undergrad, and now I am a graduate student. I took many courses and read many books, but these lectures cleared up many tiny details and concepts I had been missing. Spent my COVID-19 summer watching the whole series. Thanks, Kilian!
@linxingyao9311 · 3 years ago
All I can say is that this is the Holy Grail of machine learning lectures. Thank you, Professor Kilian.
@xuanwu8045 · 5 years ago
This is a wonderful machine learning course. I have watched several machine learning / deep learning courses on YouTube, and this one is my favorite. In my opinion, a good teacher generally has one of two traits: 1. Makes the learning process easier for students by giving illuminating lectures. 2. Motivates students to learn from the heart by displaying his/her own passion for the subject. Professor Kilian has both traits, which makes watching this course a real joy. Thank you, Kilian!
@jachawkvr · 4 years ago
This class was amazing, and I learnt so many useful concepts. What I loved most was Dr. Weinberger's engaging and intuitive delivery, which made the complex concepts easy to grasp. He is also funny as hell, which made the classes a lot of fun. A big thank you from my side to Dr. Weinberger for sharing these wonderful lectures as well as the assignments.
@satviktripathi6601 · 3 years ago
It took me two months to complete this course, and my knowledge has improved drastically! Thank you so much!
@kilianweinberger698 · 3 years ago
Great job!
@satviktripathi6601 · 3 years ago
@kilianweinberger698 Sir, I can't believe you replied! I am a high school senior and have applied to Cornell. I really hope to meet you one day!
@benxneo · 8 months ago
@satviktripathi6601 Did you get into Cornell?
@jy9p4 · 4 years ago
This was hands down the best lecture series I have seen in my life. I watched at least one video a day over the past three weeks, wrote notes along the way, and even tried the homework problems. Wow, what a ride. Thanks, Professor Weinberger!
@tubakias1 · 4 years ago
Where can we find the homeworks? Thanks
@jy9p4 · 4 years ago
@tubakias1 Here's the link! www.dropbox.com/s/tbxnjzk5w67u0sp/Homeworks.zip?dl=0
@dantemlima · 5 years ago
Thank you, Professor Kilian! What a great teacher you are! I learned a lot and laughed a lot. Awesome!
@zelazo81 · 4 years ago
It took me 4 months, but I've finally completed watching your series of lectures! You made it extremely informative, intuitive, and fun, and you have a great teaching style :) Thank you!
@halfmoonliu · 4 years ago
Dear Prof. Weinberger, it's a privilege to be able to listen to the whole series, from the very beginning to the very end. It really helped me get through some parts I was not very sure about. Thank you very much!
@nicksaraev · 2 years ago
Thank you for the delightful class, Kilian! With ML making significant strides over the last few months, I was looking for a course that thoroughly and sufficiently explained the foundations behind it. This was it. Dutifully recommended you to all of my friends who are interested in the subject.
@autrepseudo1980 · 4 years ago
Just finished the series. It was great; I'm kinda sad now! Thanks, Professor Weinberger. I wish I had had you as a prof in college!
@yaolinxing1968 · 5 years ago
Very illuminating lectures. This series deserves to be as popular as Andrew Ng's classic one. Thank you, Professor Kilian.
@karansawhney2906 · 4 years ago
Hello Dr. Weinberger. Your videos are hands down the best I've ever seen at building intuition and explaining the concepts in the simplest way possible. This has helped me immensely in my studies. Thank you so much!!
@zeroes_ones · 6 months ago
Thank you, Kilian; your lectures have brought a completely new perception/understanding (which was missing earlier) of how machine learning algorithms work. They also made me appreciate machine learning even more. "Thank you" is too small a phrase. May you always be blessed with good health and happiness.
@gowtham6071 · 1 year ago
I just love this course; everything is both intuitive and mathematically deep. I loved it so much that I finished everything in 21 days.
@chaowang3093 · 3 years ago
Today I am going to complete all the lectures!!! This is a legendary course that should have as many views as Dr. Gilbert Strang's linear algebra lectures. Thank you so much, Dr. Kilian!!!
@kilianweinberger698 · 3 years ago
Well done!!
@sharique7214 · 4 years ago
This is such a wonderful course. I have come across many machine learning courses, blogs, and videos, but this was the best of them all. I sort of binge-watched it during quarantine, replaying the lectures to note down the many things you explained. Thanks a lot, Professor Kilian!
@yogeshdhingra4070 · 4 years ago
I hope you are safe and sound!! Just wanted to say thank you for the amazing lecture series. I have tears in my eyes... Professor Kilian, you're the best!! I hope you add more videos on machine learning and deep learning in the future.
@saitrinathdubba · 5 years ago
Thanks a lot for the brilliant lectures, Prof. Kilian. They were awesome fun and extremely insightful!!
@Jeirown · 3 years ago
I came here only to learn about Gaussian processes. I ended up watching ~10 hours, as if this were a TV series. I even watched lectures on things I already knew well, just to get your perspective. Best course, really. Thank you!
@davejung8732 · 4 years ago
Just loved the whole lecture series :) It's so hard to find a series of lectures on YouTube that motivates you to go back and go through the whole thing, but with yours I succeeded in watching every one of them and doing the homework too :)) Thank you for the resource, and I love your sense of humor LOLLL
@manogyakaushik8924 · 3 years ago
Completed all the lectures and absolutely loved them! Professor, you are really inspiring. Thank you so much for sharing these here.
@sashwotsedhai2836 · 9 months ago
Thank you, Professor Kilian! Thank you for these amazing lectures. Finally finished the whole series, and I feel like this is just the beginning.
@icewave3031 · 7 months ago
I lost it at the cinnamon roll part. Thanks for posting these! They have been very helpful for studying.
@jordankuzmanovik5297 · 3 years ago
I just wanna say thank you very much. You are really the best teacher for this stuff; I can't thank you enough. And please make new courses, even if they are not free. I think a lot of people would be happy to pay for your courses.
@TrentTube · 4 years ago
I've completed your lecture series! Thank you for your generous contribution to my understanding of machine learning!
@andresguzman5665 · 3 years ago
Amazing and inspiring course. Thank you so much, Professor Kilian. Your ML course was the first that I watched in full. All 37 lectures helped me so much, and when I read new ML material I very often recall content from your course (most often the Gaussian distribution, bagging, and boosting :)). Thank you so much!
@rahulchowdhury9739 · 2 years ago
Thank you so much, Professor, for sharing your perspectives and knowledge with the world.
@dnalor8753 · 2 years ago
Your humor made these lectures very enjoyable.
@Ankansworld · 3 years ago
Onto the last one now! But yeah, it feels sad as this course comes to an end. Quite interesting, informative, and highly engaging :) All thanks to our amazing professor! Please share a few more of your Cornell course lectures! We'd love to level up ourselves...
@108PRS · 3 years ago
An outstanding class! Filled with technical rigor and humor.
@ugurkap · 4 years ago
Thanks for sharing this; I believe it is one of the best courses out here.
@RHardy25 · 3 years ago
This was an amazing course, thank you Prof. Kilian!
@michaelmellinger2324 · 2 years ago
2:58 Current research on deep learning
5:10 We lose information when using a regular fully connected network on images; images are translationally invariant
9:30 Convolutional layer explanation
13:30 We are restricting the network to only learn functions that are translation invariant
16:50 Research on ConvNets - Nvidia presentation
21:40 Residual networks (skip connections). Stochastic depth
26:55 Robustness: no single layer is too important
28:25 Dense connectivity - DenseNet
30:30 Image manifold - images lie on a sub-manifold - adding/removing beards on faces
43:25 Dropout is used less these days; batch normalization is more common
44:20 Demo - Machine Learning for Data Science - learning to discover structure in data - manifolds
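A minimal sketch of the weight-sharing idea behind the convolutional layer discussed at 9:30; the image and filter below are made up for illustration, and the loop-based implementation favors clarity over speed:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """One convolutional feature map from a single shared filter.

    Weight sharing is what builds translation structure into the layer:
    the same kernel scores every patch, so a pattern gets the same
    response no matter where in the image it appears.
    """
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

image = np.random.rand(8, 8)              # toy 8x8 "image"
kernel = np.array([[1.0, -1.0],           # toy 2x2 edge-like filter
                   [1.0, -1.0]])
print(conv2d_valid(image, kernel).shape)  # (7, 7)
```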
@StarzzLAB · 3 years ago
Thank you! I am sure that this course will blow up someday!
@divykala169 · 3 years ago
What an amazing journey, thank you professor!
@HhhHhh-et5yk · 3 years ago
Adding lectures on unsupervised learning would have taken this lecture series to another level! ☄♥️
@daniilzaytsev2040 · 11 days ago
Legendary course!
@thinkingaloud1833 · 3 years ago
Thank you, Professor Kilian! The lecture is really great.
@danallford7144 · 2 years ago
Thank you so much for putting these lectures online. I have enjoyed them all massively. I came across them while reading about decision trees, and over the last 2 weeks I have sat in my office every night and made my way through the whole course. Overall I learnt a lot, and I now feel I have a much better understanding of ML to ground the rest of my learning (after I go and spend some time making up for my absence to my wife and kid :D). It would be great if you had a link to some site where I could buy you a drink; I feel like I'm in debt :)
@madhurgarg4114 · 2 years ago
Finally completed! Thank you very much, Prof!
@kilianweinberger698 · 1 year ago
Great job!
@gregmakov2680 · 2 years ago
Yeah, exactly. A NN learns non-linear relationships naturally, and thus it can learn the manifold easily.
@amarshahchowdry9727 · 3 years ago
I honestly can't thank you enough for this series. Thank you so much, Kilian. Just wanted to confirm: this translational invariance is due to the combination of conv layers with a pooling layer, right? Conv layers by themselves are translationally equivariant. With a pooling layer after them, we can achieve translational invariance for a certain section of the image (if the object is moved to an opposite corner, the final representation fed to the FC layers will be different, right??): slight changes in position lead to a slight shift in the output of the conv layer, but maxing or averaging over the region gives us the same output, at least for small shifts. Hence we won't require a lot of data (faces in every position) to generalize. Am I right here???
@arshtangri5210 · 3 years ago
I also used to believe the same, but there is some recent research that says otherwise.
@kilianweinberger698 · 3 years ago
If you have many layers, then the receptive field (i.e., the pixels it is influenced by) of each neuron in the last layer is huge, and translation invariance becomes less of an issue. So yes, you are right, but creating many layers really helps in that respect.
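A minimal NumPy/SciPy sketch of the distinction discussed in this thread, with a made-up pattern and filter: shifting the input shifts the convolution's response map by the same amount (equivariance), while pooling over the map discards the position (invariance).

```python
import numpy as np
from scipy.signal import correlate2d  # assumes SciPy is available

rng = np.random.default_rng(0)
kernel = rng.standard_normal((3, 3))

# The same 3x3 pattern placed at two positions in an otherwise blank image.
img_a = np.zeros((12, 12)); img_a[2:5, 2:5] = 1.0
img_b = np.zeros((12, 12)); img_b[6:9, 6:9] = 1.0  # shifted by 4 pixels

feat_a = correlate2d(img_a, kernel, mode="valid")
feat_b = correlate2d(img_b, kernel, mode="valid")

# Equivariance: the conv response moves along with the input pattern ...
print(np.allclose(feat_a[0:5, 0:5], feat_b[4:9, 4:9]))  # True

# ... invariance: max pooling over the whole map forgets the position.
print(np.isclose(feat_a.max(), feat_b.max()))           # True
```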
@jeet3797 · 4 years ago
Couldn't resist commenting: my first YouTube comment ever. A BIG THANK YOU!
@sunilkumarmeena2450 · 1 year ago
Kilian, thank you. ❤️
@louis6720 · 4 years ago
You are a god, my man.
@gregmakov2680 · 2 years ago
Yeah, great experiment!!
@susansun4130 · 3 years ago
Thank you so much for explaining everything so clearly. So, exactly how many electrons are there in the universe? XD
@kilianweinberger698 · 3 years ago
a lot ...
@Shkencetari · 5 years ago
Thank you very much. These lectures were great. Could you please publish the lectures for other classes as well, like the one you mentioned called "Machine Learning for Data Science"?
@ugurkap · 4 years ago
Other classes were not taught by him. I am not aware of any lecture recordings, but you might find some of the assignments and slides here: www.cs.cornell.edu/courses/cs4786/2019sp/index.htm
@user-kf9tp2qv9j · 2 years ago
@ugurkap Hi Kaplan, do the classes you mentioned above have videos online?
@alexstar8512 · 3 years ago
Hi! Thank you for the wonderful course! Are past exams available? I would like to test my knowledge now that I have completed the course.
@clubmaster2012 · 4 years ago
Is it fair to say that the idea of stochastic depth is similar to the randomization of dimensions we do before each greedy search in a random forest? Great lectures btw!
@kilianweinberger698 · 4 years ago
Not entirely. Stochastic depth is more a form of regularization as it forces the layers in a neural network to be similar.
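For readers curious what stochastic depth looks like mechanically, here is a minimal sketch of one residual block under stochastic depth (following Huang et al., 2016; the survival probability and residual function below are made up for illustration):

```python
import numpy as np

def stochastic_depth_block(x, residual_fn, survival_prob, training):
    """A residual block whose residual branch is randomly dropped.

    During training the whole branch is skipped with probability
    1 - survival_prob, so x just flows through the skip connection;
    at test time the branch is kept but scaled by survival_prob.
    """
    if training:
        if np.random.rand() < survival_prob:
            return x + residual_fn(x)
        return x  # the entire layer is dropped for this pass
    return x + survival_prob * residual_fn(x)

x = np.ones(4)
out = stochastic_depth_block(x, lambda v: 0.1 * v,
                             survival_prob=0.8, training=True)
print(out)
```

Because any block may be absent on a given pass, neighboring layers cannot rely on each other's presence, which is one way to see the regularization effect described above.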
@galhulli5081 · 3 years ago
Hi Professor Kilian, once again thank you very much for the great material. I have a quick question regarding NNs in general; I apologize in advance if I missed this part in one of the lectures (or comments). Is feature selection necessary before any NN (or deep learning) algorithm? One would think that since the network is built to learn the representation as well as the weights, this should be handled automatically...
@kilianweinberger698 · 3 years ago
If you have enough data (and you normalize your features) the neural net can learn if some features are irrelevant. However, you can make its life easier (and get away with less training data) if you identify useless features before you do learning. Put it this way: Anything that the network doesn't have to learn itself makes its life easier.
@galhulli5081 · 3 years ago
Thank you very much for the help! Cheers, Gal
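A minimal sketch of the two suggestions above, standardizing features and pre-screening obviously useless ones before training; the data and the variance threshold are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 8))  # toy design matrix
X[:, 3] = 5.0                      # a constant, hence useless, feature

std = X.std(axis=0)
keep = std > 1e-12                 # pre-screen: drop near-constant features
X_kept = X[:, keep]

# Standardize the remaining features to zero mean / unit variance,
# so the network does not have to learn the scaling itself.
X_std = (X_kept - X_kept.mean(axis=0)) / X_kept.std(axis=0)
print(X_std.shape)                 # (200, 7)
```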
@user-kf9tp2qv9j · 2 years ago
The water bucket in PCA is really impressive 🤣
@gregmakov2680 · 2 years ago
Hahha, gooood experience :D:D:D We can only unfold it when we know its structure beforehand.
@71sephiroth · 4 years ago
I am trying to play with this idea, but at 35:29 I don't understand how this image is represented; what is the coordinate system? Is it that the axes represent weights and biases, and for each one you have an entry such as w1*x1, etc.? At 36:46, why is it meaningful to use gradient descent to reconstruct this image? If we have w1*x1, do you take the gradient with respect to x1?
@dude8309 · 4 years ago
Is the last layer of a deep network still considered a linear classifier even if it has a non-linear activation function? If not, does that assumption still hold?
@kilianweinberger698 · 4 years ago
Yes. Assuming you fix the previous layers, and treat them as feature extractors, then the last (linear) layer is essentially very similar to e.g. logistic regression. Note that logistic regression also has a (non-linear) sigmoid as output s(w'x). The key is that the function s() here acts as a thresholding / scaling function, that essentially makes sure we have output probabilities. Because it is strictly monotonic, it preserves the linearity of the decision boundary. If s() was a sin() function instead of a sigmoid, the classifier would not be linear. Hope this helps.
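A minimal numeric check of the monotonicity argument above, with a made-up weight vector and features: because the sigmoid is strictly monotonic, thresholding the output probability at 0.5 picks out exactly the same half-space as thresholding the linear score at 0, so the decision boundary w'x = 0 remains a hyperplane.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
w = rng.standard_normal(5)             # toy last-layer weights
X = rng.standard_normal((1000, 5))     # toy last-hidden-layer features

scores = X @ w
# Identical class assignments either way:
print(np.array_equal(sigmoid(scores) > 0.5, scores > 0))  # True
```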
@kc1299 · 3 years ago
DenseNet!
@shrishtrivedi2652 · 2 years ago
3:30
@gregmakov2680 · 2 years ago
Hahhah, exactly: PCA is good enough to handle many situations.
@gregmakov2680 · 2 years ago
Yeah, the surreal fact about researchers and scientists :D:D
@itachi4alltime · 2 years ago
Damn, I am sad
@user-kf9tp2qv9j · 2 years ago
I feel a little sad too in the end.