Cluster analysis · 43:38 · 7 years ago
Writing functions in R · 22:32 · 7 years ago
Introduction to R and RStudio part 2 · 1:27:24
Conditional statements and loops in R · 39:01
Principal components analysis in R · 26:49
Logistic regression in R · 1:06:49 · 7 years ago
Introduction to R and RStudio · 1:31:21 · 7 years ago
Introduction to ggplot in R · 1:17:25 · 7 years ago
Comments
@ejiet-igolatemmanuel7376 · 3 months ago
Excellent study material, but it becomes blurred from 6:45 to about 16:05. How can I watch the entire video clearly? Thank you for such a summary.
@izzuddinabdullah880 · 5 months ago
I have a question: what if I want to perform PCA on data that have not just different scales but also different units, such as environmental parameters like temperature, humidity, and light intensity? Will scaling the data solve this? Thank you.
@hefinrhys8572 · 5 months ago
Hi, yes, this is a common situation. Scaling our variables means we can use them to find meaningful principal components, irrespective of their different measurement scales. Try running PCA on your data set with and without scaling the variables; you'll likely see a big difference. Scaling is valid (and important) for variables with different measurement scales.
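A minimal sketch of that with-and-without comparison, using the built-in iris data as a stand-in for mixed-unit measurements (not the commenter's own data set):

```r
# PCA with and without scaling; iris stands in for mixed-unit data
pca_unscaled <- prcomp(iris[, 1:4], scale. = FALSE)
pca_scaled   <- prcomp(iris[, 1:4], scale. = TRUE)

# Without scaling, the variable with the largest raw variance dominates
# PC1; with scaling, each variable contributes on an equal footing
summary(pca_unscaled)$importance["Proportion of Variance", "PC1"]
summary(pca_scaled)$importance["Proportion of Variance", "PC1"]
```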
@uma9183 · 6 months ago
use remote sensing data
@uma9183 · 6 months ago
Sir, please make videos regularly.
@uselessminority6071 · 7 months ago
Great tutorial, although some functions are no longer up to date (e.g. plot for a boxplot is now boxplot, etc.).
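For reference, the base-R call the commenter alludes to looks something like this (iris used purely as example data):

```r
# Grouped boxplot with the dedicated boxplot() function
boxplot(Sepal.Length ~ Species, data = iris,
        main = "Sepal length by species")
```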
@TEBAKBENDA · 11 months ago
attend to learn
@kasia9904 · 1 year ago
When I generate the PCA with the code explained at 20:46, my legend appears as a gradient rather than as separate values (as in your three different species appearing in red, blue, and green). How can I change this?
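A gradient legend in ggplot usually means the grouping column is numeric rather than a factor; a hedged sketch of the likely fix, using iris rather than the commenter's data:

```r
library(ggplot2)

# If the grouping column is numeric, ggplot draws a continuous colour
# scale; wrapping it in factor() gives one discrete colour per group
pca <- prcomp(iris[, 1:4], scale. = TRUE)
pca_df <- data.frame(pca$x, Species = iris$Species)

ggplot(pca_df, aes(PC1, PC2, colour = factor(Species))) +
  geom_point() +
  labs(colour = "Species")
```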
@tonyrobinson9046 · 1 year ago
Outstanding. Thank you.
@nrlzt9443 · 1 year ago
Really love your explanation! Thank you so much for your video; it was really helpful and I could understand it! Keep it up! Looking forward to many more upcoming videos.
@user-kb6ui2sh5v · 1 year ago
Really useful video, thank you. I've just started my MSc project using PCA, so thank you for this. I will be following subsequent videos.
@kevinroberts5703 · 1 year ago
Thank you so much for this video. Incredibly helpful.
@djangoworldwide7925 · 1 year ago
Great tutorial, but it leaves me with the question: what do I do with it? Is this just the beginning of a k-means classification that gives me an idea of the proper k?
@djangoworldwide7925 · 1 year ago
Lol, you just replied at 26:00... Thank you so much!
@djangoworldwide7925 · 1 year ago
Fantastic. Subscribed!
@noahmutunga1708 · 1 year ago
Many thanks for creating this. I found it very useful.
@mohsenvigeh · 1 year ago
Great! There are many practical points.
@christianberntsen3856 · 1 year ago
10:21 - When using "prcomp", the calculation is done by a singular value decomposition. So, these are not actually eigenvectors, right?
@hefinrhys8572 · 1 year ago
SVD still finds eigenvectors as it's a generalization of eigen-decomposition. This might be useful: web.mit.edu/be.400/www/SVD/Singular_Value_Decomposition.htm
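The equivalence in this reply can be checked directly: the rotation prcomp returns via SVD spans the same axes as the eigenvectors of the covariance matrix (sketch on iris; signs of columns are arbitrary):

```r
# Centre the data, then compare prcomp's SVD-based rotation with an
# explicit eigen-decomposition of the covariance matrix
x <- scale(iris[, 1:4], center = TRUE, scale = FALSE)

pca <- prcomp(x)
ev  <- eigen(cov(x))$vectors

# Identical up to the sign of each column
all.equal(abs(unname(pca$rotation)), abs(ev))
```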
@christianberntsen3856 · 1 year ago
@@hefinrhys8572 Thank you for answering! I will look into it.
@mohammadtuhinali1430 · 1 year ago
Many thanks for your efforts to make this complex topic much easier for us. Could you enlighten me on how to understand group similarity and dissimilarity using PCA?
@alessandrorosati969 · 1 year ago
How is it possible to generate outliers uniformly in the p-parallelotope defined by the coordinate-wise maxima and minima of the ‘regular’ observations in R?
@sajidfaisal3283 · 1 year ago
You are really making amazing videos, and providing the class code is totally amazing. Why are you not continuing your videos?
@lisakaly6371 · 1 year ago
In fact, I found out how to overcome the multicollinearity by using the eigenvalues of PC1 and PC2! I love PCA!
@lisakaly6371 · 1 year ago
Thank you for this great video. Can you show how to detect or treat multicollinearity with PCA? I have a data set with 40 variables with high intercorrelation because of cross-reactivity. VIF and the correlation matrix don't work, probably because of multiple comparisons... :(((
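A minimal sketch of the idea discussed in this thread: replace highly correlated predictors with their (uncorrelated) principal component scores before fitting a model. Data here are simulated, not the commenter's:

```r
# Simulate two nearly collinear predictors and a response
set.seed(42)
x1 <- rnorm(100)
x2 <- x1 + rnorm(100, sd = 0.05)   # nearly collinear with x1
y  <- 2 * x1 + rnorm(100)

# PC scores are uncorrelated by construction
pca    <- prcomp(cbind(x1, x2), scale. = TRUE)
scores <- as.data.frame(pca$x)
cor(scores$PC1, scores$PC2)        # essentially zero

# Fit on the scores instead of the raw, collinear predictors
fit <- lm(y ~ PC1 + PC2, data = scores)
summary(fit)
```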
@chris-qm2tq · 2 years ago
Excellent walkthrough. Thank you!
@mikeybratkovic · 2 years ago
Wow! Absolutely great video, well explained, plus really helpful tips and tricks! Thank you for that!
@vagabond197979 · 2 years ago
Added to my stats/math playlist! Very useful.
@roshansharma3407 · 2 years ago
Please explain scatterplots using DAPC as well.
@parth1211 · 2 years ago
Ty for the quality content, brother. I'm a beginner and it's been very helpful. Can you please provide more videos 🙂 Thank you.
@parth1211 · 2 years ago
Why are you not uploading more videos, brother?
@firdaussaleh5519 · 2 years ago
How do I apply a conditional VLOOKUP in R?
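One way to answer this: a VLOOKUP-style lookup in base R is a left join with merge() (the tables here are made up for illustration):

```r
# Lookup table and the rows to enrich
prices <- data.frame(item  = c("apple", "pear", "plum"),
                     price = c(1.2, 0.8, 1.5))
orders <- data.frame(item = c("pear", "apple", "apple", "kiwi"))

# all.x = TRUE keeps every order row, filling NA where no match exists
merge(orders, prices, by = "item", all.x = TRUE)
```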
@HarmonicaTool · 2 years ago
A 5-year-old video that's still one of the best I've found on the topic on YT. Thumbs up.
@Margoth195 · 2 years ago
Sir, you are a saint! Thank you thank you thank you!!! Not only did you make this easy but you gave me peace of mind. If we ever meet in person, I hope you will give me the honor of buying you a drink.
@ailen5197 · 2 years ago
Great video! How can I get the ggplot2 package in the new R version?
@hefinrhys8572 · 2 years ago
Hiya, you just need to run install.packages("ggplot2") in R :) Thanks!
@aminsajid123 · 2 years ago
Amazing video! Thanks for explaining everything very simply. Could you please do a video on PLS-DA?
@Moe5Tavern · 2 years ago
I wish you had made a career out of YouTube tutorials; this is the best R tutorial I've found. You seem to understand very well what beginners need to know, and you convey it beautifully. Thank you so much!
@brunocamargodossantos5049 · 2 years ago
Thanks for the video, it helped me a lot!! Your explanation is very didactic!
@suavejoshokoye9456 · 2 years ago
Your videos are Gems!!
@federicogarland272 · 2 years ago
Thank you so much, I've watched almost all your videos and am now able to use R for useful data processing tasks in my work field which is agriculture. You're great at explaining, cheers!
@mario17-t34 · 2 years ago
Thanks much Hefin!!!
@minhtrungdang1853 · 2 years ago
Thank you for the video. I have a question regarding exporting or saving a graph as JPEG or PNG. How can we keep the font size exactly the same after saving the picture? I did set fig.height and fig.width in ```{r}, but the graph title is missing some characters at the end. Thank you!
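One common fix, sketched here with ggplot2's ggsave() (figure contents are made up): fixing the device size in inches plus dpi keeps fonts proportionate, and widening the device gives long titles room.

```r
library(ggplot2)

p <- ggplot(iris, aes(Sepal.Length, Sepal.Width)) +
  geom_point() +
  ggtitle("A long title that should not be clipped")

# width/height are in inches; dpi controls pixel density, so the
# on-page font size stays consistent across exports
out <- file.path(tempdir(), "plot.png")
ggsave(out, p, width = 8, height = 5, dpi = 300)
```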
@rafaborkowski580 · 2 years ago
How can I load my data into RStudio to work with it?
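A minimal sketch for this: read.csv() loads a CSV into a data frame (the file below is written to a temporary path only so the example is self-contained; in practice you'd point at your own file, or use RStudio's Import Dataset button, which generates similar code):

```r
# Create a demo CSV, then read it back in
tmp <- tempfile(fileext = ".csv")
write.csv(iris, tmp, row.names = FALSE)

my_data <- read.csv(tmp)   # in practice: read.csv("path/to/your_file.csv")
head(my_data)
```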
@avnistar2703 · 2 years ago
Can you run PCA on factor variables coded as 0 vs 1, with 1 meaning the presence of something?
@hefinrhys8572 · 2 years ago
There are some answers here that might help: stats.stackexchange.com/questions/5774/can-principal-component-analysis-be-applied-to-datasets-containing-a-mix-of-cont But I would ask what your goal is with this. Are you looking to uncover some underlying latent variables in your data? In which case factor analysis may be the way to go. If it's just to reduce dimensionality to uncover clusters/patterns in the data, then PCA might work, but it will treat those 0/1 variables as continuous, which might not yield the results you're hoping for.
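As the reply notes, PCA will run mechanically on 0/1 columns because it treats them as numeric; a small sketch on simulated binary data (interpretation of the resulting components is the harder question):

```r
# Simulate a 50 x 4 matrix of 0/1 presence indicators
set.seed(1)
bin <- matrix(rbinom(200, 1, 0.5), ncol = 4)

# prcomp accepts it, treating each column as continuous
pca <- prcomp(bin, scale. = TRUE)
summary(pca)
```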
@federicogarland272 · 2 years ago
thank you very much
@glawtonmoore · 2 years ago
GRE = Graduate Record Examination, a standards-based test evaluating preparedness for grad school.
@Axle_Tavish · 2 years ago
Explained everything one might need. If only every tutorial on YouTube were like this one!
@stefankazakov5302 · 2 years ago
Nicely explained! Thank you!
@fasihakhan5319 · 2 years ago
Giving commands is tough....
@fsxaviator · 2 years ago
Where did you define PC1 and PC2 (where you use them in the ggplot)? I'm getting "Error: object 'PC1' not found"
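The usual cause of this error: PC1 and PC2 are columns of the score matrix pca$x, and ggplot can only see them once they are bound into the plotting data frame. A sketch using iris (not necessarily the exact code from the video):

```r
library(ggplot2)

pca <- prcomp(iris[, 1:4], scale. = TRUE)

# Bind the scores into a data frame so PC1/PC2 exist as columns
pca_df <- data.frame(pca$x, Species = iris$Species)

ggplot(pca_df, aes(x = PC1, y = PC2, colour = Species)) +
  geom_point()
```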
@yibletdagnachew1755 · 2 years ago
Wow, Thank you!!! Please some more
@lindseykoper761 · 2 years ago
Thank you so much for your videos!! Your videos are the best I have seen hands down :) All of your explanations and step by step through R are what I needed to work on my research. One area I am having trouble with (since I am not a statistician) is making sure I run my data through all the necessary statistical tests before running the PCA. My data is similar to the iris dataset (skull measurements categorized by family and subfamily levels) but I am seeing different sources run different tests before the PCA (ANOVA vs non-parametric tests). If anything, would you be able to recommend some good sources for me to refer to? Thank you! I really appreciate it!
@jakubkaczynski4747 · 2 years ago
superb!
@brazilfootball · 2 years ago
Dumb question, but when we interpret our results for the different variables in our logistic regression, is it always "in the direction of whatever our '1' value is"? As in, a positive odds ratio says we are X times as likely to see (fill in the blank for the definition of value 1: "get admitted to college", "catch a fish while fishing", "win the lottery")? This is not the odds of getting a "0", correct?
@hefinrhys8572 · 2 years ago
Yes you are correct. The direction of the odds ratios depend on which class you assign the value 1 to. An odds ratio of 2 means with a 1 unit increase in the predictor, the positive outcome is twice as likely as the negative outcome. An odds ratio of 0.5 means with a 1 unit increase in the predictor, the negative outcome is twice as likely as the positive outcome (after accounting for the effects of all other predictors).
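A compact illustration of this with base R's glm() (mtcars stands in for the thread's examples; am = 1 is the positive class here):

```r
# Logistic regression of transmission type (0/1) on fuel economy
fit <- glm(am ~ mpg, data = mtcars, family = binomial)

# Odds ratios come from exponentiating the coefficients; a value
# above 1 means higher mpg raises the odds of the "1" outcome
exp(coef(fit))
```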
@brazilfootball · 2 years ago
@@hefinrhys8572 Awesome, thank you, and thanks for replying to a question on a 3-year-old video when most people wouldn't! 😅