Thank you for taking the time to do these videos, they have been a huge help with my studies
@perfectmoments3876 · 1 year ago
Hi, how do you come up with the critical values of 0.1 and 10, respectively? Is there a source for that?
@MichelleCox-l4w · 20 days ago
This is a great video, thank you!
@Dataverse3211 · 2 years ago
Thank you for this wonderful lecture 👍🤝
@datatab · 2 years ago
Thanks for your nice feedback!
@paparokauli · 1 year ago
God bless you woman!
@jasbirmanhas3355 · 3 years ago
Can you make a video on the coefficient of determination?
@datatab · 3 years ago
Thanks for your message! Yes sure! I can try!
@kennyjohndelacruz3435 · 3 years ago
Can you make a video about the two-tailed test next, please?
@datatab · 3 years ago
A two-tailed t-test, or two-tailed tests in general?
@juanwang3705 · 2 years ago
Hi, this video is really helpful. You mentioned that in the next video you would explain how to test the multicollinearity of dummy variables, but I can't find that video. Could you send me a link?
@datatab · 2 years ago
Oh sorry, we only have a video on dummy variables! But for dummy variables it is the same, so you test it in the same way: kzbin.info/www/bejne/mJ_Nga1-hpVnd8U
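To illustrate the idea, here is a minimal Python sketch (my own illustration, not from the video; the data file and the column names smoker_status and age are hypothetical placeholders): dummy-code the categorical predictor and run the same VIF check on the resulting dummy columns.

```python
# Minimal sketch (assumption, not from the video): dummy-code a categorical
# predictor and check multicollinearity exactly as for metric predictors.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("data.csv")  # assumed dataset with hypothetical columns
dummies = pd.get_dummies(df["smoker_status"], drop_first=True, dtype=float)
X = sm.add_constant(pd.concat([df[["age"]], dummies], axis=1))

# One VIF per column; the constant is skipped because its VIF is not meaningful
for i, col in enumerate(X.columns):
    if col != "const":
        print(col, variance_inflation_factor(X.values, i))
```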
@juanwang3705 · 2 years ago
@@datatab ok! Thanks.
@RichardGoza · 1 year ago
I am under the Linear Regression tab and I do not see the subtab "Check condition". All I see under Linear Regression is "Test assumptions", "Effect size" and "Summary in words". Please help!!! I have a subscription on your website.
@allexclusive007 · 1 month ago
The same problem here. Why doesn't she clarify it?
@konstantinosbanos · 2 months ago
Independent and correlated? How is this possible?
@wayneisk320 · 2 months ago
The variables are independent measurements, like weight and height, or weight and body fat percentage. But these values correlate with each other to a degree. As the correlation increases, the values of these variables become more interchangeable.
@konstantinosbanos · 2 months ago
@@wayneisk320 It's not possible for two variables to be independent and correlated at the same time. In the example you mentioned, the variables are not independent.
@wayneisk320 · 2 months ago
@@konstantinosbanos Technically, yes. In practice you do not know the level of independence until you measure the variables. Commonly, all variables in a regression are considered "independent" until testing. However, the partial correlations can carry new information, and such variables will still be used as independent variables. Particularly in large datasets with 1 million+ samples, you will find that most features have only weak correlations. An excellent example of this is gene SNPs and medication interactions. There will always be correlations when working with this type of data, caused by external, co-related factors that are not measured. Independence is accepted as part of feature selection.
@wayneisk320 · 2 months ago
@@konstantinosbanos There is a famous quote from a statistician that "no variable is independent." This was in reference to the fact that, with enough features, all variables will share mutual information. Feature selection and feature abstraction/crafting create independence, but only after statistical analysis. Also, mutual information can, with some data, be "removed" in preprocessing. Not technically "independent", but independent enough.
@jasbirmanhas3355 · 3 years ago
Kindly send a video on correlation
@datatab · 3 years ago
Hello Jasbir, just search on YouTube for "correlation DATAtab"; we have great videos about correlation!
@hasanatayoub6512 · 1 year ago
Can you teach multicollinearity in this series? 😢 What is the nature of multicollinearity? Is multicollinearity really a problem? What are its practical consequences? How does one detect it? What are the remedial measures?
@mikefieselman · 1 year ago
The nature of multicollinearity is the similarity of impact that one independent variable has with another independent variable.

It really is a problem, because we're looking for the model that fits best, and why have a model that shows 4 independent variables' influence on the dependent variable when 3 give the same result? Ask yourself: is 3 better than 4? Yes, it is.

The practical consequence is that you are not getting the best model to explain the impact of the independent variables on the dependent variable.

You detect it with the VIF formula, which is VIF = 1 / (1 − R²), where R² comes from regressing that independent variable on the other independent variables (with just two predictors, that is simply their correlation squared); you can also get R² by running a regression analysis in Excel.

Remedial measure: you remove one of the independent variables that shows multicollinearity, based either on a correlation matrix, a VIF diagnosis, or on looking at the p-values regardless of multicollinearity.

You're welcome :)
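In case it helps anyone, here is a minimal Python sketch of that VIF check (my own illustration, not from the video); the file name and the predictor columns weight, height and body_fat are hypothetical placeholders.

```python
# Minimal sketch: compute one VIF per predictor with statsmodels.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("data.csv")                                # assumed file with the predictors
X = sm.add_constant(df[["weight", "height", "body_fat"]])   # add the intercept column

# VIF_j = 1 / (1 - R²_j), where R²_j comes from regressing predictor j on all the others
vif = pd.Series(
    [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
    index=X.columns,
)
# Rule of thumb: VIF > 10 (tolerance < 0.1) signals problematic multicollinearity
print(vif.drop("const"))
```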
@mikefieselman · 1 year ago
You can also detect multicollinearity by fitting a regression model with only the independent variables: take one of the independent variables as y and the rest as x.
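If you want to see that auxiliary regression spelled out, here is a minimal sketch (again my own illustration, not from the video), with made-up example data for three predictors.

```python
# Sketch of the auxiliary-regression check: regress one predictor (y) on the
# remaining predictors (x) and convert its R² into a VIF. Names are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

def aux_vif(predictors: np.ndarray, j: int) -> float:
    """VIF of column j, obtained by regressing it on all other columns."""
    y = predictors[:, j]
    X = np.delete(predictors, j, axis=1)
    r2 = LinearRegression().fit(X, y).score(X, y)   # R² of the auxiliary regression
    return 1.0 / (1.0 - r2)

# Made-up example with three predictors, two of them deliberately correlated
rng = np.random.default_rng(0)
weight = rng.normal(70, 10, 200)
height = 0.9 * weight + rng.normal(0, 5, 200)       # correlated with weight
body_fat = rng.normal(25, 4, 200)
Z = np.column_stack([weight, height, body_fat])

for j, name in enumerate(["weight", "height", "body_fat"]):
    print(name, round(aux_vif(Z, j), 2))            # large values flag multicollinearity
```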