Tutorial 28-MultiCollinearity In Linear Regression- Part 2

102,572 views

Krish Naik

A day ago

In regression, "multicollinearity" refers to predictors that are correlated with other predictors. Multicollinearity occurs when your model includes multiple factors that are correlated not just with your response variable, but also with each other. In other words, it results when you have factors that are somewhat redundant.
github link: github.com/kri...
Please join my channel as a member to get additional benefits like Data Science materials, live streams for members, and more
/ @krishnaik06
#Regularization
Please subscribe to my other channel too
/ @krishnaikhindi
Connect with me here:
Twitter: / krishnaik06
Facebook: / krishnaik06
Instagram: / krishnaik06

Comments: 93
@karangupta6402 · 3 years ago
Thanks, Krish. One way you suggested is to drop the feature. One more thing we can do is combine the features into what is called an interaction term (e.g. multiply Age and Years of Experience, then run your OLS model). This also seems to control collinearity to some extent.
@gargisingh9279 · 3 years ago
@KrishNaik Sir!! What about another technique called VIF (Variance Inflation Factor)? That technique is also widely used for multicollinearity.
@bhagwatchate7511 · 3 years ago
Hello Krish, thanks for the amazing explanation. I have one question: is it necessary to check the summary and then conclude about multicollinearity, or can we proceed with just the corr() function and conclude about multicollinearity? Thanks in advance :) #multicollinearity #KrishNaik #Regression
@StyleTrick · 4 years ago
Krish, could you explain why AGE has the high P value and not Years of Experience? If you remove Years of Experience from the table, then Age has a P value of 0, which makes sense since in the data, as Age increases, Salary increases accordingly.
@Katarahul · 4 years ago
Plotting a correlation matrix and analyzing it is a faster option, right?
@Roblox5091 · 3 years ago
Yes
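A correlation matrix is indeed a quick first check, though it only catches pairwise correlation (VIF also catches a feature predicted by a combination of others). A minimal sketch with made-up column names and data:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
age = rng.uniform(22, 60, 150)
df = pd.DataFrame({
    "Age": age,
    "YearsExperience": age - 22 + rng.normal(0, 1, 150),  # tracks Age closely
    "TV": rng.uniform(0, 100, 150),                       # unrelated feature
})

corr = df.corr()
print(corr.round(2))

# Flag pairs with |r| above a chosen cutoff (0.9 here, an arbitrary choice)
high = [(a, b) for a in corr for b in corr
        if a < b and abs(corr.loc[a, b]) > 0.9]
print(high)
```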
@shashankkhare1023 · 4 years ago
Hey Krish, do we really need to look at the std. error? I think looking at the p-value is enough, since the p-value is calculated from the t-statistic, which is calculated from the coefficient and std. error. Correct me if I am wrong, thanks.
@unezkazi4349 · 3 years ago
So that means we should now have 0 expenditure on newspaper? And after removing the newspaper feature, if we do OLS again, can there be any other feature with p-value less than 0.5 in the new OLS model?
@ashwininaidu2302 · 4 years ago
Hi Krish, nice explanation of the concept. If you could make a video on the assumptions of linear regression (theory and practical), that would be of great help for us.
@prijwalrajavat5811 · 4 years ago
I was searching for the theory behind backward elimination; everywhere only the procedure was given, but the why was explained by you. Thank you, sir.
@peeyushyadav4991 · 3 years ago
Wait, if the newspaper column/feature is not correlated, then how do we come to the conclusion that it can be dropped because its P-value is high? Are we making the conclusions based on the t-test values here as well?
@ruturajjadhav8905 · 3 years ago
1. Linearity 2. Homoscedasticity 3. Multivariate normality 4. Independence of errors 5. Lack of multicollinearity. Please do a video on these; I am not getting these concepts.
@rankerhill2335 · 4 years ago
Unless you calculate the Variance Inflation Factor, you cannot be certain that multicollinearity is not present. Ideally you should calculate the partial correlation coefficient, compute 1/(1 - partial correlation coefficient), and see if this is greater than 2.
@alexandremaillot803 · 4 years ago
Please, after making a model with Python (training, testing and saving it), how can I put new data through it for predictions? Thanks for the content.
@amansrivastava3081 · 3 years ago
For example, if this is how you made the regressor: reg = LinearRegression() and reg.fit(x_train, y_train), then you can predict with reg.predict([[100]]) and you will get the predicted value. If you trained on multiple features, pass one value per feature, e.g. reg.predict([[100, 200]]), matching the dataframe you trained on.
@muhammadiqbalbazmi9275 · 3 years ago
I think instead of removing a feature we can combine both correlated features (in the context you talked about): Age + YearsOfExperience => seniority level. Problem solved. (Domain knowledge is required for the feature engineering.)
@akramhossain9576 · 3 years ago
I was looking for the multicollinearity problem in interaction terms. How to solve it? One suggestion is centering the variable, but I'm still confused: when one of my interaction variables is dichotomous and the coding = 0 is important for me, then if I use centering, the 0 coding will not be there anymore. So how to solve this? Can you please clarify this for me? Thanks.
@adityakumar-sp4ki · 3 years ago
while Importing this library -> import statsmodels.api as sm, I'm getting this error -> module 'pandas' has no attribute 'Panel'
@DataInsights2001 · 3 years ago
Nice presentation! Dropping a feature is one solution, but sometimes you can't drop it and you may need that feature in the analysis. What would you do then? Combine features, or use division or multiplication? Or factor analysis? In cases like Marketing Mix Modeling, similar issues arise often. Please try videos on Marketing Mix Modeling, optimization, and forecasting: estimating ROI, IROI, response curves...
@ibrahimmondal9104 · 2 years ago
Now the concept of Multicollinearity is totally clear....thanks bhaiya🙂👍✌️
@VishalPatel-cd1hq · 4 years ago
Hi Sir, can we directly find the covariance matrix of the feature data and compute its determinant? If it is zero, then we can directly say that our covariance matrix is singular, and if it is singular then it has dependent variables, so there is multicollinearity in our data.
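That is essentially right: a (near-)zero determinant signals (near-)linear dependence among the features. A sketch using the correlation matrix, a scale-free variant of the same idea (the data and column names are made up):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
age = rng.uniform(22, 60, 300)
df = pd.DataFrame({
    "Age": age,
    "YearsExperience": age - 22 + rng.normal(0, 0.5, 300),  # nearly dependent on Age
    "TV": rng.uniform(0, 100, 300),
})

# Determinant of the correlation matrix: near 0 -> near-singular -> multicollinearity
det = np.linalg.det(df.corr().values)
print(det)
```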
@K-mk6pc · 2 years ago
Can anyone explain why we are adding constant 1 in the predictor variables?
@maskman9630 · 2 years ago
can we use the same process for logistic regression brother....?
@ajaykushwaha4233 · 2 years ago
Will this approach work on data with huge features?
@chitranshaagarwal4676 · a year ago
Please make the topics in order; I'm getting confused.
@sahilgaikwad98 · 4 years ago
Thank you for such a wonderful explanation, sir.
@sachinborgave8094 · 4 years ago
Thanks Krish, Please upload further Deep Learning videos.
@amitkhandelwal8030 · 3 years ago
Sir, why can't we choose TV as B0?
@madhavilathamandaleeka5953 · 3 years ago
What is the difference between the OLS model and linear regression with MSE? Do both give the same results? Please clarify 🙏🙏
@dikshagupta3276 · 2 years ago
Please add the link for the collinearity video.
@jatin7836 · 3 years ago
An important question, if anyone knows, please answer: why do we give OLS the X value (which is the constant) and not the x values (which are our independent features)? If we are not giving the independent values to OLS, then how is it showing the table in the output (with X and y)?
@shubhamkundu2228 · 3 years ago
So to sum up: to identify multicollinearity, we use the OLS (ordinary least squares) method and check the model summary for the std error (values should be low, not high), the R-squared value (should tend towards 1), and the p-value (should be below the significance level).
@ayushasati2110 · 3 years ago
Your voice is not very clear.
@avinashmishra6783 · 3 years ago
Why does multicollinearity arise? What happens that reduces the adjusted R-squared?
@adhvaithstudio6412 · 4 years ago
Can you please explain why the p-value is biased towards the Age variable and not Years of Experience.
@nagarjunakanneganti5953 · 3 years ago
If I remove years of experience instead of age just by the correlation matrix, it will work, right? I don't have to check the p-values, since constructing an LR model with just age could change the p-value? Any thoughts @channel
@shivanshsingh5555 · 4 years ago
Why do you keep saying 0.5 again and again, sir? This is not clear. Please send the link for the reference, because I am watching this playlist step by step but I'm still not getting this 0.5 criterion.
@shubhamchoudhary5461 · 3 years ago
Is it like hypothesis testing??
@abhishekverma549 · 4 years ago
Sir, what is P>|t|?
@jatashukla6891 · 4 years ago
Hi Krish, I bought your Rs 300 package on YouTube just to get in contact with you. I need your time to answer some of my doubts regarding switching my career into data science. Please let me know a way to connect with you directly.
@adityapathania3618 · 3 years ago
Brother, please share the dataset too.
@sejalchandra2114 · 4 years ago
Hi sir, I have a doubt. Could we use this if we have one hot encoded features in the dataset?
@saidurgakameshkota1246 · 4 years ago
Did you get the answer?
@likithliki1160 · 3 years ago
Amazing explanation
@Datacrunch777 · 3 years ago
Sir, kindly upload a video on ridge regression covering the lambda value and how to simulate it with respect to the basic estimators or formulas.
@sumeersaifi6354 · 3 years ago
At the end, you said a 1-unit decrease in newspaper expenditure will result in a one-unit increase in sales. Can you please explain it? Because according to me it will result in a change in profit, but not in sales.
@saumyagupta2606 · 2 years ago
Is it good practice to check for multicollinearity for other ML models apart from linear ones? Is it necessary for other models as well?
@pushkarasharma3746 · 4 years ago
Everything is fine, Krish sir, but please try not to say "particular" in every sentence; it is very annoying.
@bonnyphilip8022 · 2 years ago
You are simply awesome... Nothing else to say.. Thank you for your contribution and service...
@adhvaithstudio6412 · 4 years ago
Why are you ignoring age only? Why can't we ignore years of experience?
@slowhanduchiha · 4 years ago
In the 2nd example, wasn't scaling important?
@RitikSingh-ub7kc · 4 years ago
Can we just use principal component analysis to see if all variance is covered by reducing the number of features here?
@ganeshdevare7360 · 2 years ago
Thank you very much sir for explanation at 00:00
@abhishekchanda4002 · 4 years ago
Hi Krish, I have a doubt. If I ignore multicollinearity in the second example, how will it affect the prediction?
@guneet556 · 4 years ago
Sorry, but I have to ask again: can someone correct me on my previous comment?
@JP-fi1bz · 4 years ago
Isn't the P value used for the null hypothesis?
@raedalshehri7969 · 2 years ago
Thank you so much
@syedhamzajamil4490 · 4 years ago
Sir, can I use PCA to deal with multicollinearity?
@deepaktripathi892 · 4 years ago
As years of experience and age are related, how do we decide which one to drop?
@РайымбекТолеугазиев · 4 years ago
Cool video, but what about heteroskedasticity?
@nithinmamidala · 4 years ago
The correlation part 1 video is missing; please upload it and rearrange the sequence.
@akashaia · 4 years ago
Thanks sir, your explanation is really good.
@Mohit-im1rp · 3 years ago
what is the p value in above explanation?
@guneet556 · 4 years ago
Hi Krish, kindly please answer this question of mine: for under- and oversampling data, the techniques you mentioned are applied to numeric datatypes. For categorical datatypes, do we first have to encode them as 0/1 and then apply SMOTETomek and NearMiss? Is that the proper way to deal with it? And if we use a tree approach, will it straight away deal with the imbalance and encoding itself? Please Krish, reply to this question, it will be a big help!
@krishnaik06 · 4 years ago
Yes, the tree approach will solve that problem. But understand that tree techniques usually require a lot of time.
@guneet556 · 4 years ago
@@krishnaik06 Sir, thanks a lot. I'm just a beginner and follow each of your videos thoroughly. Sir, can you make a video on how to build a resume for people looking for a transition into DS?
@kakranChitra · 3 years ago
Thanks, good elaboration!!
@namratarajput7092 · 4 years ago
Hi, I am getting an error after running this line: import statsmodels.api as sm -> ImportError: cannot import name 'factorial'. I am a beginner, so please help.
@tanweerkhan3020 · 4 years ago
You need to downgrade your scipy or install statsmodels from master. You can check this on Stack Overflow.
@SatnamSingh-cm2vt · 4 years ago
What does the p-value signify?
@0SIGMA · 3 years ago
WAH! beautiful sir..
@harshstrum · 4 years ago
Hi bhaiya, I didn't get why you chose to drop the age feature over years of experience.
@krishnaik06 · 4 years ago
Because age and experience are highly correlated, and we could see that the p-value of age was greater than 0.05.
@SayantanSenBony · 4 years ago
Hi Krish, I have one question which I face on a daily basis in real time: how to detect heteroscedasticity, and what is the method to rectify it in Python?
@simonelgarrad · 3 years ago
There are many tests available. You can research Bartlett's test; it assumes that the samples come from populations having the same variances. There is another called the Goldfeld-Quandt test. To rectify it in Python: what I believe is that when working with MLR, if the normality assumption is satisfied, then you shouldn't get heteroscedasticity. So if we have variables that aren't following a normal distribution, they can be transformed (refer to Krish's video for that too), and then I don't think heteroscedasticity should be present.
@dhirajbaruah9888 · 4 years ago
What does the p value mean??
@raghavchhabra4783 · 4 years ago
I watched it, but I didn't understand how to do it if there are 150+ columns!!
@sahilgaikwad98 · 4 years ago
Use iloc to get the independent variables and store them in X, store the target in Y, and use the OLS model.
@abdullahshafi8865 · 4 years ago
Intuition King is back.
@nidhipandey8004 · 3 years ago
What is the difference in OLS and gradient descent?
@simonelgarrad · 3 years ago
From what I have read and understood so far, OLS is a good method for simple linear regression, while gradient descent is the better method when working with many independent variables. You can understand more on this by watching Josh Starmer's gradient descent video.
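The two approaches minimize the same squared-error loss, one in closed form and one iteratively, so they land on (almost) the same coefficients; a toy comparison (the learning rate and iteration count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.uniform(0, 1, 200)
y = 3.0 + 4.0 * x + rng.normal(0, 0.1, 200)
X = np.column_stack([np.ones_like(x), x])   # intercept column + feature

# OLS: one-shot least-squares solution of the normal equations
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Gradient descent: iterative minimisation of the same MSE loss
beta_gd = np.zeros(2)
lr = 0.5
for _ in range(5000):
    grad = 2 / len(y) * X.T @ (X @ beta_gd - y)   # gradient of the MSE
    beta_gd -= lr * grad

print(beta_ols, beta_gd)   # both converge to (almost) the same coefficients
```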
@shreyasb.s3819 · 4 years ago
Well explained. Thanks a lot.
@sushilchauhan2586 · 4 years ago
What if our features number in the 1000s? Please reply, anyone who knows the answer.
@krishnaik06 · 4 years ago
For that we will apply dimensionality reduction
@sushilchauhan2586 · 4 years ago
@@krishnaik06 I never thought that BHAI himself would reply
@sushilchauhan2586 · 4 years ago
@@krishnaik06 Sorry, we can't apply dimensionality reduction, as it only deals with high variance and has no relation to our class output... we would use randomized lasso regression or randomized logistic regression. Thank you Krish bhai.
@jyothinkjayan6508 · 4 years ago
When will the next deep learning batch begin?
@Vignesh0206 · 4 years ago
May I know, is there a part one of this video?
@galymzhankenesbekov2924 · 4 years ago
Very good
@abhi9raj776 · 4 years ago
Thanks a lot, sir!
@a_wise_person · 4 years ago
How is the Dell Inspiron i5 8th gen 2TB HDD for data science and programming work?
@srujohn652 · 4 years ago
It's pretty much enough, though you can use Google Colab for your ML projects.