Your videos are always on point. You make data science a lot simpler... thanks a lot for explaining in detail.
@RichardOnData 3 years ago
My pleasure! That's the stated goal of my channel!
@arcadevampire 2 years ago
It would be great if you could do a deep dive into generalized models, lasso, ridge, and elastic nets. Your explanations are always very straightforward. Cheers
@RichardOnData 2 years ago
This is coming up in the pipeline soon. Thanks!
@kaushikroychowdhury7787 3 years ago
Please make more videos of this type, on why and when to use different models. Appreciate your work, thank you!
@RichardOnData 3 years ago
I will try to do exactly that! "When should you use PCA" is right around the corner!
@gregmaland5318 3 years ago
Wow! This was way over my head. Yet, I still think I got something out of it.
@jorislimonier 3 years ago
Currently writing my thesis on high-dimensional regression models. Such an interesting topic 👌🏻👌🏻 Great video!
@RichardOnData 3 years ago
Awesome! Thank you; yes, isn't it an exciting topic?
@jorislimonier 3 years ago
@@RichardOnData It is. Specifically the LASSO, and determining which parameters to throw away... pretty cool!
@chacmool2581 3 years ago
I quite often see regression used without checks for assumptions and model fit. I see people using SLR without understanding it, in fact confusing linearity with collinearity, when those two things are separate and distinct. I do think BLR is a bit trickier than SLR for two reasons. One, the coefficients that come out of the glm() function in R are on the log scale, so you need to exponentiate them. Two, the response is the log odds, or just the odds after exponentiating the coefficients. The other tricky part of logistic regression is the assumption of linearity between continuous variables and the logit of the response, which can be checked with the Box-Tidwell test. For OLR, there's the proportional odds assumption, which can be checked with a Brant test.
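To illustrate the exponentiation point in the comment above, here is a minimal Python sketch. The coefficient values are made up for illustration, not output from any real model; in R you would get them from coef(glm(..., family = binomial)).

```python
import math

# Hypothetical log-scale coefficients, as glm() in R would return them
log_odds_coefs = {"intercept": -1.2, "age": 0.05, "treatment": 0.9}

# Exponentiate to get odds ratios: a one-unit increase in a predictor
# multiplies the odds of "success" by this factor.
odds_ratios = {name: math.exp(b) for name, b in log_odds_coefs.items()}

print(round(odds_ratios["treatment"], 3))  # exp(0.9) ≈ 2.46
```

An odds ratio above 1 means the predictor increases the odds of the event; below 1 means it decreases them.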
@dmitrytkachuk2304 3 years ago
Thanks for the video. Richard, can you explain more about classification methods, for example when we should use logistic regression, SVM, or other methods? In your opinion, is logistic regression still relevant in modern data science?
@bassthunder8111 3 years ago
Great video! In another video you could tackle an adjacent problem: "interpretable" ML methods like partial dependence profiles, variable importance measures, and instance-based methods.
@RichardOnData 3 years ago
Great suggestion! I think that would help a lot of people.
@vishalthatsme 3 years ago
Using L1 for feature selection - I've seen it mentioned in various places but never explained clearly, in case you're looking for future topic ideas 😉. Also, detecting/dealing with multicollinearity - tricky and a little confusing... Also, GLMs... I could go on and on...
@RichardOnData 3 years ago
Those are three excellent video ideas. I'll roll them all into the video pipeline!
@vishalthatsme 3 years ago
@@RichardOnData keep up the great work 👍🏽
@shyamgurunath5876 3 years ago
Good tutorial, Richard. Can you do a video on linear regression assumptions? And can I use an ensemble of linear and ridge regression to predict the response variable?
@RichardOnData 3 years ago
You certainly can ensemble that way, though I've never done it myself nor heard of it being done. Now, combining the Lasso and Ridge regression penalties is an approach in and of itself, known as the Elastic Net. I use that one all the time. Great video ideas!
@206Seattle 3 years ago
Thank you Richard!
@unmanbarman8619 3 years ago
Hi, can you please do a video on how much and what to learn in Python for data science/data analysis, the same as you did for SQL?
@Trazynn 3 years ago
At university they only taught me the formulas with barely any context. And even those weren't complete. I had to learn everything else from YouTube.
@RichardOnData 3 years ago
Yeah... I get the feeling that's a common experience for far too many. I hope this video was helpful.
@jasonloghry A year ago
I really enjoyed this video, so very helpful! Would you have any interest in making a video about the basics of interactions?
@moisesdiaz9852 3 years ago
Great explanation as always
@prod.kashkari3075 3 years ago
Ugh, I hate when they label logistic regression as a classification algorithm in machine learning. It really isn't, right?
@RichardOnData 3 years ago
Logistic regression can be trained in ML style using stochastic gradient descent. While its fitted values are the log odds of the event "success" (a quantity on a negative-to-positive-infinity scale), these can be converted to probabilities. Each probability can then be used for classification purposes (i.e. if Observation 1 has a >0.5 probability of being in Class A, classify it as Class A). Ergo, it's messy to define a "regression" method as a "classification algorithm", but it can indeed serve as one.
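To make that log odds → probability → class pipeline concrete, here is a minimal Python sketch. The log-odds value and the 0.5 threshold are illustrative assumptions, matching the example in the reply above:

```python
import math

def log_odds_to_prob(log_odds):
    # Inverse logit (sigmoid): maps log odds on (-inf, inf) to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-log_odds))

def classify(log_odds, threshold=0.5):
    # The "classification" step bolted on top of the regression output.
    return "Class A" if log_odds_to_prob(log_odds) > threshold else "Class B"

p = log_odds_to_prob(1.1)  # ≈ 0.750
print(classify(1.1))       # "Class A", since 0.750 > 0.5
```

The regression part ends at the probability; everything after the threshold is a decision rule, which is why the "is it regression or classification?" debate exists at all.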