Tutorial 27- Ridge and Lasso Regression Indepth Intuition- Data Science

  325,744 views

Krish Naik


Comments: 405
@hipraneth 4 years ago
Lucid explanation, free of cost. Your passion to make the concept crystal clear is very much evident in your eyes... Hats off!!!
@shubhamkohli2535 4 years ago
The only person providing this level of knowledge free of cost. Really appreciate it.
@iamfavoured9142 2 years ago
100 years of blessings for you. You just gained a subscriber!
@YouKnowMe-123y 1 year ago
You are helping so many ML enthusiasts free of cost... Thank you.
@yadikishameer9587 3 years ago
I had never watched your videos, but after this one I regret ignoring your channel. You are a worthy teacher and data scientist.
@aelitata9662 4 years ago
I'm in a crisis trying to learn this topic and all I know is y = mx + c. I think this is the clearest explanation I've watched on YouTube. Thank you sooooo much, and I love your enthusiasm when you explain the confusing parts.
@AnkJyotishAaman 4 years ago
This guy is legit!! Hats off for the explanation!! Loved it sir, thanks.
@tenvillagesahead4192 3 years ago
Brilliant. I searched all over the net but couldn't find such an easy yet detailed explanation of regularization. Thank you very much! Very much considering joining the membership.
@harshstrum 4 years ago
Thank you bhaiya. It feels like every morning I watch your videos, my career slope increases. Thank you for this explanation.
@auroshisray9140 4 years ago
Hats off... grateful for valuable content at zero cost.
@koderr100 2 years ago
Now I finally get the key L1 vs L2 difference. Thanks a lot!
@HammadMalik 4 years ago
Thanks Krish for explaining the intuition behind Ridge and Lasso regression. Very helpful.
@ganeshrao405 3 years ago
Thank you so much Krish; linear regression + Ridge + Lasso concepts are all clear now thanks to your videos.
@dollysiharath4205 1 year ago
You're the best trainer!! Thank you!
@ChandanBehera-jp2me 3 years ago
I found your free videos better than some paid tutorials... thanks for your work.
@TheR4Z0R996 4 years ago
Keep up the good work, blessings from Italy my friend :)
@marijatosic217 4 years ago
Great video! I appreciate how hard he works to help us really understand the material!
@adijambhulkar1742 2 years ago
Hats off... what a way to explain, man... cleared all doubts.
@datafuturelab_ssb4433 2 years ago
Best explanation of Lasso and Ridge regression ever on YouTube... Thanks Krish... You nailed it...
@vishalaaa1 4 years ago
Naik is excellent. He is solving everyone's problems.
@vladimirkirichenko1972 1 year ago
This man has a gift.
@mithunmiranda 1 year ago
I wish I could like his videos multiple times. You are a great teacher, kind sir.
@rishu4225 5 months ago
Thanks, the enthusiasm with which you teach carries over to us as well. 🥰
@mumtahinhabib4314 4 years ago
This is where I found the best explanation of ridge regression after searching through a lot of videos and documentation. Thank you sir.
@ajithsdevadiga1603 9 months ago
Thank you so much for this wonderful explanation; I truly appreciate your efforts in helping the data science community.
@cyborg69420 1 year ago
Just wanted to say that I absolutely loved the video.
@gerardogutierrez4911 4 years ago
If you pause the video and just watch his facial and body movements, he looks like he's trying his best to convince you to stay with him during a break-up. Then you turn on the audio and it's like he's yelling at you to get you to understand something. Clearly, this man is passionate about teaching Ridge regression and knows a lot. I think it's easier to follow when he checks up on you by saying "you need to understand this", repeats words, and uses his voice to emphasize concepts. I wish he could explain other things to me besides data science.
@TheMrIndiankid 4 years ago
He will explain the meaning of life to you too.
@MrBemnet1 4 years ago
My next project is counting head shakes in a YouTube video.
@tanmay2771999 3 years ago
@@MrBemnet1 Ngl, that actually sounds interesting.
@juozapasjurksa1400 3 years ago
Your explanations are sooo clear!
@MuhammadAhmad-bx2rw 3 years ago
Extraordinarily talented, sir.
@indrasenareddyadulla8490 4 years ago
Sir, you mentioned in your lecture that this concept is complicated, but I never felt it was. You explained it excellently. 👌👌👌
@abdulnafihkt4245 2 years ago
Best best best bestttttt class... hats off maaan.
@fratcetinkaya8538 3 years ago
This is where I finally understood that damn issue. I appreciate it so much, thanks my dear friend :)
@aish_waryaaa 2 years ago
Krish sir, you are literally saving my master's: up-to-date explanations and real effort to help us understand. Thank you so much sir. 😇🥰
@sincerelysilvia 2 years ago
This is the clearest and best explanation of this topic on YouTube. I can't express how thankful I am for this video for finally making me understand the concept.
@143balug 4 years ago
Hi Krish, you are building our confidence in data science with these clear explanations.
@nehasrivastava8927 4 years ago
Best tutorials for machine learning with in-depth intuition... I think there is no tutorial on YouTube like this... Thank you sir.
@loganwalker454 3 years ago
Regularization was a very abstruse and knotty topic. However, after watching this video, it is a piece of cake. Thank you, Krish.
@BipinYadav-wn1pm 1 year ago
After going through tons of videos, I finally found the best one, thanks!!
@adinathshelke5827 8 months ago
Perfect explanation. I was wandering around for a whole day, and at the end of the day I found this one.
@Zizou_2014 4 years ago
Brilliantly done! Thanks Krish.
@anirbandey8999 3 months ago
Very good video to understand the intuition behind L1 and L2.
@rahul281981 3 years ago
Very nicely explained; thank God I found your posts on YouTube while searching for this stuff 👍
@rayennenounou7065 3 years ago
I have a master's thesis (mémoire, Master 2) on lasso regression and I need more information about lasso regression, but in French. Can you help me?
@moe45673 1 year ago
Thank you! I thought this was a great explanation (as someone who has listened to a bunch of different ones trying to nail down my understanding of this).
@316geek 3 years ago
You make it look so easy, kudos to you Krish!!!
@aravindvasudev7921 1 year ago
Thank you. Now I have a clear idea of both these regression techniques.
@MohsinKhan-rv7jj 2 years ago
This kind of explanation is truly inspirational. I am truly overfitted with knowledge after seeing your video. ❤
@abhishekkumar465 2 years ago
Reduce your learning rate, that may help you, as per Ridge regression :P
@JEEVANKUMAR-hf4ex 3 years ago
Good explanation without touching any complex maths derivations.
@Sumta555 4 years ago
18:35 How do the features get removed when |slope| is very, very small? Hats off for the fantastic clarity on this topic.
@dhirajkumarsahu999 4 years ago
The regression works in such a way that the coefficients of the less important inputs keep decreasing with every iteration. In the case of ridge regression, the unimportant coefficients decay asymptotically: they come very close to zero but never become exactly zero (look at an exponential-decay curve for reference). This is not the case with lasso regression, where they can reach exactly zero. Hope this helps.
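A minimal scikit-learn sketch of this difference (my own illustration on synthetic data, not from the video; the alpha values are arbitrary): with comparable penalty strengths, Ridge leaves the useless coefficients small but non-zero, while Lasso drives them exactly to zero.

import numpy as np
from sklearn.linear_model import Ridge, Lasso

# Synthetic data: only the first 3 of 10 features actually matter
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 3 * X[:, 0] - 2 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=200)

ridge = Ridge(alpha=10.0).fit(X, y)  # L2 penalty: shrinks coefficients toward zero
lasso = Lasso(alpha=0.1).fit(X, y)   # L1 penalty: can set coefficients exactly to zero

print(np.round(ridge.coef_, 3))  # irrelevant features: small but typically non-zero
print(np.round(lasso.coef_, 3))  # irrelevant features: exactly 0.0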
@JoseAntonio-gu2fx 4 years ago
Thank you very much for sharing. The effort to clarify the concepts, which is the starting point for solving anything, is much appreciated. Greetings from Spain!
@sridhar7488 3 years ago
Yes, he is a great guy... I also love watching his videos!
@ArunKumar-yb2jn 2 years ago
At 8:15 you say "a steep slope will always lead to an overfitting case, why? I will just tell you now..." But I couldn't find where you explained this later on.
@somnathpatnaik2277 3 years ago
I have tried courses from 4 very reputed organizations, all claiming faculty from IIT and other high-profile names. My feedback: being from IIT doesn't mean you are a good teacher; teaching needs passion like yours. When I watch your lectures I enjoy learning. Thank you.
@TheOntheskies 3 years ago
Thank you for the crystal clear explanation. Now I will remember Ridge and Lasso.
@ZubairAzamRawalakot 1 year ago
Very informative lecture. You explained it in great detail. Thanks.
@yitbarekmirete6098 2 years ago
You are awesome, better than our professors at explaining such complex topics.
@BoyClassicall 4 years ago
Concept well explained. I've watched a lot of videos on Ridge regression, but this one best shows, mathematically, the effect of lambda on the slope.
@Amir-English 7 months ago
You made it so simple! Thank you.
@heplaysguitar1090 3 years ago
Just one word: fantastic.
@gandhalijoshi9242 3 years ago
Very nice explanation. I have started watching your videos and your teaching style is very nice. A very nice YouTube channel for understanding data science. Hats off!!
@kanhataak1269 4 years ago
After watching this lecture it's not complicated... good teaching, sir.
@sahilzele2142 4 years ago
So the basic idea is:
1) A steeper slope leads to overfitting @8:16 (what he basically means is that the overfitted line we have has a steeper slope - which, on the contrary, does not by itself justify his statement).
2) Adding lambda*(slope)^2 increases the value of the cost function for the overfitted line, which leads to a reduction of the slopes, or 'thetas', or m's (these are all the same thing) @10:03.
3) Now that the cost of the overfitted line is no longer minimal, another best-fit line is selected by reducing the slopes ('thetas'/m's), which again shows up in the added lambda*(slope)^2 term, just with a smaller slope this time @13:45.
4) Doing this overcomes overfitting, as the new best-fit line has less variance (it generalizes better to the test data) and slightly more bias than our previous line @14:10; the bias was 0 for the overfitted line, so it will be a bit more for the new line.
5) Lambda can also be called a scaling factor or inflation rate used to control the regularization.
As for the question "what happens if we have an overfitted line with a less steep slope?", I think we'll find the best-fit line with an even less steep slope (maybe close to slope ~ 0 but != 0) @16:30, and tadaa!!!! we have reduced overfitting successfully! Please correct me if anything's wrong.
@faizanzahid490 4 years ago
I have the same queries, bro.
@supervickeyy1521 4 years ago
For the 1st point: what if the test data has the same slope value as the train data? In that case there won't be overfitting, correct?
@angshumansarma2836 4 years ago
Just remember the 4th point: the main goal of regularization is to generalize better to the test dataset, while accepting some error on the training dataset.
@chetankumarnaik9293 4 years ago
First of all, no meaningful linear regression can be built with just two data points. He is not considering degrees of freedom.
@Kmrabhinav569 4 years ago
The basic idea is to use lambda (also known as the regularization parameter) to control the penalty term lambda*(slope). Here "slope" means the various coefficients m(i): if y = m1x1 + m2x2 and so on, we have many values of m(i). We adjust lambda so that the unimportant extra m(i) stop mattering and can effectively be removed, i.e. the extra features are dropped from the model. One of the major causes of overfitting is the addition of extra features, so by getting rid of these features we can curb the problem of overfitting. Hope this helps.
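To make the cost-function idea in this thread concrete, here is a tiny numerical sketch (my own illustration with made-up numbers, not from the video): the penalized cost is the sum of squared residuals plus lambda times the squared slope, so the steep line that passes exactly through the two training points can still lose to a flatter line.

import numpy as np

# Two training points, as in the whiteboard example
x = np.array([1.0, 2.0])
y = np.array([1.0, 3.0])
lam = 1.0  # regularization strength (lambda), picked arbitrarily for illustration

def ridge_cost(m, c):
    rss = np.sum((y - (m * x + c)) ** 2)  # sum of squared residuals
    return rss + lam * m ** 2             # plus the lambda * slope^2 penalty

print(ridge_cost(2.0, -1.0))  # steep line through both points: rss = 0, cost = 4.0
print(ridge_cost(1.0, 0.5))   # flatter line: rss = 0.5, cost = 1.5 -> lower total cost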
@SahanPradeepthaThilakaratne 5 months ago
Your explanations are superb!
@robertasampong 1 year ago
Absolutely excellent explanation!
@askpioneer 2 years ago
Well explained, Krish. Thank you for creating this. Great work.
@belllamoisiere8877 2 years ago
Hello from México. Thank you for your tutorials; they are as if one of my classmates were explaining the concepts to me in simple words. A suggestion: please include a short tutorial on ablation of deep learning models.
@aseemjain007 4 months ago
Brilliantly explained!! Thank you!!
@ibrahimibrahim6735 3 years ago
Thanks, Krish. I want to correct one thing here: the motivation behind the penalty is not to change the slope; it is to reduce the model's complexity. For example, consider the following two models:
f1: x + y + z + 2x^2 + 5y^2 + z^2 = 10
f2: 2x^2 + 5y^2 + z^2 = 15
f1 is more complicated than f2, and a complicated model has a higher chance of overfitting. By increasing lambda (the complexity factor), we are more likely to end up with a simpler model. Another example:
f1: x + 2y + 10z + 5h + 30g = 100
f2: 10z + 30g = 120
f2 is simpler than f1. If both models have the same performance on the training data, we would rather use f2, because a simpler model has less chance of overfitting.
@prashanths4455 4 years ago
Krish, an excellent explanation. Thank you so much for this wonderful in-depth intuition.
@dianafarhat9479 8 months ago
Amazing explanation, thank you!
@anshulmangal2755 4 years ago
Sir, a great channel on YouTube for machine learning.
@mohammedfaisal6714 4 years ago
Thanks a lot for your support.
@MsGeetha123 3 years ago
Excellent video!!! Thanks for a very good explanation.
@maheshurkude4007 4 years ago
Thanks for explaining, buddy!
@abhishekchatterjee9503 4 years ago
You did a great job sir... It helped me a lot in understanding this concept. In 20 minutes I understood the basics of this concept. Thank you 💯💯
@sandipansarkar9211 4 years ago
Great explanation Krish. I think I am understanding a little bit about L1 and L2 regression. Thanks.
@Bedivine777angelprayer 1 year ago
Thanks. Are there articles I can refer to, or any blogs you recommend? Thanks again, great content.
@kanuparthisailikhith 4 years ago
The best tutorial I have seen to date on this topic. Thanks so much for the clarity.
@shaurabhsinha4121 3 years ago
Krish, but the equation of a line with the best generalized fit, e.g. y = Mx + c, can still have a high M, since the actual data points can be crowded close to the y-axis. So a steep slope alone can't be the criterion.
@yamika. 2 years ago
Thank you for this! Finally understood the topic.
@Mars7822 2 years ago
Only one word = brilliance.
@sameerpandey5561 3 years ago
Superbly explained.
@gunjanagrawal8626 2 years ago
Very well explained!🙌
@bhuvaraga 2 years ago
Loved your energy, sir, and your conviction to explain and make it clear to your students. I know it is hard to look at the camera and talk - you nailed it. This video really helped me understand the overall concept. My two cents: 1) keep the camera focused on the whiteboard; I think it is autofocusing between you and the whiteboard, and maybe that is why the brightness keeps changing as well.
@R_SinghRajput 2 years ago
Beautifully explained, sir ☺️
@thulasirao9139 4 years ago
You are doing an awesome job. Thank you so much.
@veradesyatnikova2931 2 years ago
Thank you for the clear and intuitive explanation! It will surely come in handy for my exam.
@fatriantobong 1 year ago
I think you need to emphasize the contrast between the low error on the training data and the high variance on the test data. The problem with overfitting is that the excellent fit on the training data comes at the expense of high variance on the test data. When the model is exposed to new, unseen data (the test data), it struggles to generalize because it has essentially memorized the noise and intricacies of the training data. This results in a significant difference between the model's predictions and the true values on the test data, indicating high variance on the test data.
@RinkiKumari-us4ej 4 years ago
Best explanation, sir.
@saitcanbaskol9897 2 years ago
Amazing explanations.
@nehabalani7290 3 years ago
Too good, and nicely short, for people already clear on basic modeling concepts.
@bhooshan25 3 years ago
Understood the whole video very well.
@tsrnihar 2 years ago
Small correction - for Lasso regression, the penalty is the sum of the absolute values of the coefficients multiplied by the regularization parameter. You wrote it as the absolute value of the sum of the coefficients multiplied by the regularization parameter. It is lambda*(|m1| + |m2| + ...) and not lambda*|m1 + m2 + ...|.
@unpluggedsaurav3186 1 year ago
True.
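For reference, the two penalized cost functions in their usual textbook form (standard notation with slopes m_j, not a transcription of the whiteboard):

J_{\text{ridge}} = \sum_{i=1}^{n} \bigl(y_i - \hat{y}_i\bigr)^2 + \lambda \sum_{j=1}^{p} m_j^{2},
\qquad
J_{\text{lasso}} = \sum_{i=1}^{n} \bigl(y_i - \hat{y}_i\bigr)^2 + \lambda \sum_{j=1}^{p} \lvert m_j \rvert .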
@binnypatel7061 4 years ago
Awesome job... keep up the good work!
@kanavsharma9562 3 years ago
I have watched more than 8 videos and 2-3 articles but didn't get how the lambda value affects the slope; your video explains it best. Thanks.
@dineshpramanik2571 4 years ago
Excellent explanation, sir... thanks.
@ahmedaj2000 3 years ago
THANK YOU SO MUCH!!!!!!! Great explanation!
@_cestd9727 3 years ago
Super clear, thanks for the video!
@kiran082 4 years ago
Thank you Krish, very detailed explanation.
@Markmania510 1 year ago
What a great video, thank you so much for posting. Can anyone tell me what happens if you're overfitting with a smaller slope? I.e. you need to increase the slope to generalise the model more. Is this when you need to make the lambda parameter negative?
@taruchitgoyal3735 9 months ago
Hello Sir, as I understand it, we apply regularization [Ridge and Lasso] to reduce features or reduce the contribution of features. As you shared, we take the sum of squared residuals and then add the regularization penalty to it. Thus the cost equals the sum of squared residuals plus the regularization penalty, and we get a single numerical result from the computation. At that point, how do we perform the feature selection or control the contribution of each input variable in the feature space? Thank you.
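One way to see the feature-selection side in code (a hedged sketch with scikit-learn on synthetic data; the alpha value is arbitrary, not from the video): after fitting, each learned coefficient is that feature's contribution, and with Lasso the coefficients driven exactly to zero correspond to the features that get dropped.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso

# Synthetic problem: 10 features, only 4 of them actually informative
X, y = make_regression(n_samples=300, n_features=10, n_informative=4,
                       noise=5.0, random_state=42)

lasso = Lasso(alpha=1.0).fit(X, y)

kept = np.flatnonzero(lasso.coef_)  # indices of features Lasso kept (non-zero coefficients)
print("kept feature indices:", kept)
print("their coefficients:", np.round(lasso.coef_[kept], 2))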
@MrLoker121 3 years ago
Good video for beginners; a couple of pointers though:
1. The lasso penalty should be |m1| + |m2| + |m3| + ... and not |m1 + m2 + m3 + m4 + ...|.
2. The explanation of why coefficients under L1 regularization can go exactly to zero, but not under L2, is missing. It could be expanded on theoretically.
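A short sketch of the argument missing in point 2 (the standard one-coefficient derivation, not from the video): for a single coefficient with least-squares estimate \hat{m}, minimizing the penalized cost gives

m^{*}_{\text{ridge}} = \arg\min_{m}\,(m - \hat{m})^2 + \lambda m^2 = \frac{\hat{m}}{1+\lambda},
\qquad
m^{*}_{\text{lasso}} = \arg\min_{m}\,(m - \hat{m})^2 + \lambda\lvert m\rvert = \operatorname{sign}(\hat{m})\,\max\!\left(\lvert\hat{m}\rvert - \tfrac{\lambda}{2},\,0\right).

Ridge only rescales \hat{m} toward zero, so it never reaches exactly zero, while lasso subtracts a fixed amount and clips at zero, which is why it can drop features entirely.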
Tutorial 28- Ridge and Lasso Regression using Python and Sklearn
9:51
Regularization Part 1: Ridge (L2) Regression
20:27
StatQuest with Josh Starmer
1.1M views
Regularization in a Neural Network | Dealing with overfitting
11:40
Ridge And Lasso Regression Indepth Maths Intuition In hindi
17:47
Krish Naik Hindi
73K views
Intuitive Explanation of Ridge / Lasso Regression
18:50
PawarBI
12K views
Ridge vs Lasso Regression, Visualized!!!
9:06
StatQuest with Josh Starmer
258K views
Regularization - Explained!
12:44
CodeEmporium
16K views