This is very helpful, but is it only on my computer that the writing in the video is impossible to read?
@ejiet-igolatemmanuel7376 4 months ago
Excellent study material, but it became blurred from 6:45 to about 16:05. How can I study the entire video clearly? Thank you for such a summary.
@izzuddinabdullah880 6 months ago
I have a question: what if I want to perform PCA on data that has not just different scales but also different units, such as environmental parameters like temperature, humidity, light intensity, etc.? Will scaling the data solve this? Thank you.
@hefinrhys8572 6 months ago
Hi, yes, this is a common situation. Scaling our variables means we can use them to find meaningful principal components, irrespective of their different measurement scales. Try running PCA on your data set with and without scaling the variables; you'll likely see a big difference. Scaling is valid (and important) for variables with different measurement scales.
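A minimal R sketch of the with/without-scaling comparison suggested above, assuming a numeric data frame of environmental measurements called env_data (a hypothetical name):

# Compare PCA with and without scaling; unscaled, variables measured in large
# units dominate the components, while scaling gives each variable unit variance.
pca_unscaled <- prcomp(env_data, scale. = FALSE)
pca_scaled   <- prcomp(env_data, scale. = TRUE)
summary(pca_unscaled)
summary(pca_scaled)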
@uma9183 7 months ago
use remote sensing data
@uma9183 7 months ago
Sir, please make videos regularly.
@uselessminority6071 8 months ago
Great tutorial, although some functions are no longer up to date (e.g. the plot() call used for a boxplot is now boxplot(), etc.).
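For reference, a grouped boxplot in current base R might look like this (using the built-in iris data, not necessarily the exact call from the video):

# One boxplot per species for a single numeric variable.
boxplot(Sepal.Length ~ Species, data = iris,
        ylab = "Sepal length", main = "Sepal length by species")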
@TEBAKBENDA 1 year ago
attend to learn
@kasia9904 1 year ago
When I generate the PCA plot with the code explained at 20:46, my legend appears as a gradient rather than as separate values (i.e. your three different species appearing in red, blue and green). How can I change this?
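A gradient legend usually means the grouping variable is numeric rather than a factor; a minimal sketch of the usual fix, assuming a data frame pca_df holding the PC scores plus a Species column (hypothetical names):

library(ggplot2)
# Wrapping the grouping variable in factor() gives discrete colours
# instead of a continuous colour gradient.
ggplot(pca_df, aes(x = PC1, y = PC2, colour = factor(Species))) +
  geom_point()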
@tonyrobinson9046 1 year ago
Outstanding. Thank you.
@nrlzt9443 1 year ago
Really love your explanation! Thank you so much for your video; it's really helpful and I can understand it! Keep it up! Looking forward to your many more upcoming videos.
@user-kb6ui2sh5v 1 year ago
Really useful video, thank you. I've just started my MSc project using PCA, so thank you for this. I will be following subsequent videos.
@kevinroberts5703 1 year ago
thank you so much for this video. incredibly helpful.
@djangoworldwide7925 1 year ago
Great tutorial, but it leaves me with the question: what do I do with it? Is this just the beginning of a k-means classification that gives me an idea of the proper k?
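One common follow-up, sketched here as an assumption rather than as the video's own workflow, is to run k-means on the PC scores:

# PCA first, then k-means on the first two components (iris as an example).
pca    <- prcomp(iris[, 1:4], scale. = TRUE)
scores <- pca$x[, 1:2]
km     <- kmeans(scores, centers = 3, nstart = 25)
table(km$cluster, iris$Species)   # compare clusters with the known species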
@djangoworldwide7925 1 year ago
Lol, you answered this at 26:00... Thank you so much!
@djangoworldwide7925 1 year ago
Fantastic. Subscribed!
@noahmutunga1708 1 year ago
Many thanks for creating this. I found it very useful.
@mohsenvigeh 1 year ago
Great! There are many practical points.
@christianberntsen3856 1 year ago
10:21 - When using "prcomp", the calculation is done by a singular value decomposition. So, these are not actually eigenvectors, right?
@hefinrhys8572 1 year ago
SVD still finds eigenvectors, as it's a generalization of eigen-decomposition: for PCA, the right singular vectors of the centred data matrix are the eigenvectors of its covariance matrix. This might be useful: web.mit.edu/be.400/www/SVD/Singular_Value_Decomposition.htm
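A small sanity check along these lines (not from the video) showing that prcomp(), which uses SVD, returns the same axes as an explicit eigen-decomposition of the covariance matrix:

# Centred data: prcomp()'s variances equal the covariance eigenvalues,
# and its loadings equal the eigenvectors up to sign.
X   <- scale(iris[, 1:4], center = TRUE, scale = FALSE)
pca <- prcomp(X)
eig <- eigen(cov(X))
all.equal(pca$sdev^2, eig$values)
all.equal(abs(pca$rotation), abs(eig$vectors), check.attributes = FALSE)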
@christianberntsen3856 1 year ago
@hefinrhys8572 Thank you for answering! I will look into it.
@mohammadtuhinali1430 2 years ago
Many thanks for your efforts to make this complex issue much easier for us. Could you enlighten me on understanding group similarity and dissimilarity using PCA?
@alessandrorosati969 2 years ago
How is it possible to generate outliers uniformly in the p-parallelotope defined by the coordinate-wise maxima and minima of the ‘regular’ observations in R?
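One way to do this, sketched under the assumption that the 'regular' observations are the numeric iris columns, is to draw each coordinate from a uniform distribution between that variable's minimum and maximum:

# Coordinate-wise min/max of the regular observations, then uniform draws
# inside the resulting p-dimensional box.
X     <- iris[, 1:4]
n_out <- 50
rng   <- apply(X, 2, range)   # 2 x p matrix: row 1 = minima, row 2 = maxima
outliers <- sapply(seq_len(ncol(X)),
                   function(j) runif(n_out, rng[1, j], rng[2, j]))
colnames(outliers) <- colnames(X)
head(outliers)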
@sajidfaisal3283 2 years ago
You are really making amazing videos and providing the class code; that's totally amazing. Why are you not continuing your videos?
@lisakaly6371 2 years ago
In fact I found out how to overcome the multicollinearity, by using the eigenvalues of PC1 and PC2! I love PCA!
@lisakaly6371 2 years ago
Thank you for this great video. Can you show how to detect or treat multicollinearity with PCA? I have a data set with 40 variables with high intercorrelation because of cross-reactivity. VIF and the correlation matrix don't work, probably because of multiple comparisons... :(((
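A hedged sketch of one way PCA is often used against multicollinearity, assuming a data frame dat with an outcome y and many correlated predictors (hypothetical names): regress on the first few PC scores, which are uncorrelated by construction.

pcs    <- prcomp(dat[, setdiff(names(dat), "y")], scale. = TRUE)
pcr_df <- cbind(y = dat$y, as.data.frame(pcs$x[, 1:2]))  # scores of PC1 and PC2
fit    <- lm(y ~ PC1 + PC2, data = pcr_df)               # principal component regression
summary(fit)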
@chris-qm2tq 2 years ago
Excellent walkthrough. Thank you!
@mikeybratkovic 2 years ago
Wow! Absolutely great video, well explained + really helpful tips and tricks! Thank you for that!
@vagabond197979 2 years ago
Added to my stats/math playlist! Very useful.
@roshansharma3407 2 years ago
Please also explain scatterplots using DAPC.
@parth1211 2 years ago
Ty for the quality content, brother. I am a beginner and that's been very helpful. Can you please provide more videos? 🙂 Thank you.
@parth1211 2 years ago
Why are you not uploading more videos, brother?
@firdaussaleh5519 2 years ago
How do I apply a conditional VLOOKUP in R?
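Not covered in the video, but a VLOOKUP-style conditional lookup can be sketched in base R with merge() (all names below are made up for illustration):

# Left join `dat` against a lookup table, then keep rows meeting a condition.
lookup <- data.frame(id = c("A", "B", "C"), label = c("low", "mid", "high"))
dat    <- data.frame(id = c("B", "A", "C", "B"), value = c(10, 20, 30, 40))
merged <- merge(dat, lookup, by = "id", all.x = TRUE)
merged[merged$value > 15, ]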
@HarmonicaTool 2 years ago
A 5-year-old video and still one of the best I've found on the topic on YT. Thumbs up.
@Margoth195 2 years ago
Sir, you are a saint! Thank you thank you thank you!!! Not only did you make this easy but you gave me peace of mind. If we ever meet in person, I hope you will give me the honor of buying you a drink.
@ailen5197 2 years ago
Great video! How can I get the ggplot2 package in the new R version?
@hefinrhys8572 2 years ago
Hiya, you just need to run install.packages("ggplot2") in R :) Thanks!
@aminsajid123 2 years ago
Amazing video! Thanks for explaining everything very simply. Could you please do a video on PLS-DA?
@Moe5Tavern 2 years ago
I wish you had made a career out of YouTube tutorials, this is the best R tutorial I've found. You seem to understand very well what is important to know for beginners and convey it beautifully. Thank you so much!
@brunocamargodossantos5049 2 years ago
Thanks for the video, it helped me a lot!! Your explanation is very didactic!
@suavejoshokoye9456 2 years ago
Your videos are Gems!!
@federicogarland272 2 years ago
Thank you so much, I've watched almost all your videos and am now able to use R for useful data processing tasks in my work field which is agriculture. You're great at explaining, cheers!
@mario17-t34 2 years ago
Thanks much Hefin!!!
@minhtrungdang1853 2 years ago
Thank you for the video. I have a question regarding the option of exporting or saving a graph as JPEG or PNG. How can we keep the font size exactly the same after saving the picture? I did set fig.height and fig.width in ```{r}, but the title of the graph is missing some characters at the end. Thank you!
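One hedged approach (assuming the plot is made with ggplot2) is to save with ggsave() and fix the width, height and resolution, so the text size in the file stays predictable:

library(ggplot2)
# Explicit width/height/dpi keep the saved file's text size consistent.
p <- ggplot(iris, aes(Sepal.Length, Sepal.Width, colour = Species)) +
  geom_point() +
  ggtitle("Iris measurements")
ggsave("pca_plot.png", plot = p, width = 7, height = 5, units = "in", dpi = 300)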
@rafaborkowski580 2 years ago
How can I import my data into RStudio to work with it?
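A minimal sketch for getting a CSV file into R (the file path is hypothetical):

# Read a CSV into a data frame and check the column types before running PCA.
my_data <- read.csv("C:/data/my_measurements.csv", header = TRUE)
str(my_data)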
@avnistar2703 2 years ago
Can you run PCA on factor variables coded as 0 vs 1, with 1 meaning the presence of something?
@hefinrhys8572 2 years ago
There are some answers here that might help: stats.stackexchange.com/questions/5774/can-principal-component-analysis-be-applied-to-datasets-containing-a-mix-of-cont But I would ask what your goal is with this. Are you looking to uncover some underlying latent variables in your data? In which case factor analysis may be the way to go. If it's just to reduce dimensionality to uncover clusters/patterns in the data, then PCA might work, but it will treat those 0/1 variables as continuous, which might not yield the results you're hoping for.
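A minimal sketch of the "treat 0/1 as continuous" option mentioned above, assuming a data frame dat of 0/1 factor columns (a hypothetical name):

# Convert the 0/1 factors to numeric, then run PCA as usual; PCA will treat
# them as continuous variables.
dat_num <- data.frame(lapply(dat, function(x) as.numeric(as.character(x))))
pca     <- prcomp(dat_num, scale. = TRUE)
summary(pca)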
@federicogarland272 2 years ago
thank you very much
@glawtonmoore 2 years ago
GRE = Graduate Record Examination, a standards-based test evaluating preparedness for grad school.
@Axle_Tavish 2 years ago
Explained everything one might need. If only every tutorial on YouTube were like this one!
@stefankazakov5302 2 years ago
Nicely explained! Thank you!
@fasihakhan5319 2 years ago
Giving commands is tough...
@fsxaviator 2 years ago
Where did you define PC1 and PC2 (where you use them in the ggplot)? I'm getting "Error: object 'PC1' not found"
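That error usually means PC1/PC2 are not columns of the data frame passed to ggplot(); a hedged sketch of one way to set this up (object names are assumptions, not necessarily the video's):

library(ggplot2)
# Bind the PCA scores onto the original data so PC1/PC2 exist as columns.
pca     <- prcomp(iris[, 1:4], scale. = TRUE)
plot_df <- cbind(iris, pca$x)
ggplot(plot_df, aes(PC1, PC2, colour = Species)) + geom_point()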
@yibletdagnachew1755 2 years ago
Wow, thank you!!! Please make some more.
@lindseykoper761 2 years ago
Thank you so much for your videos!! Your videos are the best I have seen hands down :) All of your explanations and step by step through R are what I needed to work on my research. One area I am having trouble with (since I am not a statistician) is making sure I run my data through all the necessary statistical tests before running the PCA. My data is similar to the iris dataset (skull measurements categorized by family and subfamily levels) but I am seeing different sources run different tests before the PCA (ANOVA vs non-parametric tests). If anything, would you be able to recommend some good sources for me to refer to? Thank you! I really appreciate it!