NOTE: The StatQuest LDA Study Guide is available at statquest.gumroad.com. Support StatQuest by buying my book, The StatQuest Illustrated Guide to Machine Learning, or a Study Guide or Merch!!! statquest.org/statquest-store/
@realcirno1750 (4 years ago)
woohoo
@WIFI-nf4tg (3 years ago)
Can you please do something on canonical analysis?
@statquest (3 years ago)
@@WIFI-nf4tg I'll keep that in mind.
@falaksingla6242 (2 years ago)
Hi Josh, love your content. It has helped me learn a lot and grow. You are doing awesome work; please continue to do so. I wanted to support you, but unfortunately your PayPal link seems to be broken. Please update it.
@aayushtheapple (2 years ago)
The website shows "Error establishing a database connection"!
@yuniprastika7022 (3 years ago)
The funny thing is, so many materials from this channel are for university students (like me), but he keeps treating us like kindergarten children. Haha, feels like I'll never grow up watching your videos, sir! QUADRO BAAM SIR, THIS WORLD HAS GOTTEN TOO SERIOUS, THANK YOU FOR BRINGING BACK THE JOY
@statquest (3 years ago)
Thank you very much! :)
@daisy-fb5jc (2 years ago)
I am a kindergarten kid in this subject : (
@andrejalabama1204 (a year ago)
@@daisy-fb5jc Same here; I need someone to explain it like I'm a little kid.
@SinoLegionaire (9 months ago)
Remember: us adults are just big children.
@bokai5829 (5 years ago)
Every time I hear the intro music, I know my assignment is due in 2 days.
@statquest (5 years ago)
LOL! :)
@bokai5829 (5 years ago)
@@statquest Thank you very much!
@KonesThe (4 years ago)
hahahah I'm in the same boat right now
@HenriqueEC1 (4 years ago)
Good to know I'm not alone.
@PJokerLP (a year ago)
10.5 hours until my machine learning exam. Thank you so much; I feel way better prepared than if I had watched all of my class material.
@beccalynch4407 (2 years ago)
Just spent hours so confused watching my lectures, where the professor used only linear algebra and not a single picture. Watched this video and understood it right away. Thank you so much for what you do!
@statquest (2 years ago)
Glad it helped!
@robinduan1985 (6 years ago)
This is amazing! A 15-minute video does way better than my lecturer in a 2-hour class.
@elise3455 (3 years ago)
While these 15-minute videos are excellent for gaining intuition, you still often need those 2-hour classes to get familiar with the mathematical rigor.
@NoahElRhandour (2 years ago)
@@elise3455 No you don't. The math follows quickly and easily once you understand what it is about.
@GaganSingh-zz9el (a year ago)
@@NoahElRhandour yeah brother
@Jacob-t1j (a year ago)
@@elise3455 No you don't. Math becomes super easy once you understand what you're doing.
@seifeldineslam (4 years ago)
This was honestly helpful. I am an aspiring behavioral geneticist (aspiring because I am still a biotechnology undergraduate) with really disrupted fundamentals of math, especially statistics. Your existence as a YouTube channel is a treasured discovery for me!
@statquest (4 years ago)
Thanks! :)
@Muzik2hruRain (4 years ago)
You, sir, are a life saver. Now, for every complicated machine learning topic, I look for your explanation, or at least wonder how you would have approached it. Thank you, really.
@statquest (4 years ago)
Awesome! Thank you! :)
@haydo8373 (7 years ago)
Hey, what is the intro track called? I couldn't find it on Spotify. . . :D
@hiteshjoshi3061 (26 days ago)
It's their own.
@hiteshjoshi3061 (26 days ago)
Listen carefully: the channel name is in it, and it's cool 😂😂👌
@PV10008 (5 years ago)
I really like the systematic way you approach each topic and anticipate all the questions a student might have.
@sridharyamijala4739 (a year ago)
Another excellent video, just as great as the one on PCA. I read a professor's view on most of the models and algorithms in ML, where he recommended understanding the concepts well, so that we know where to apply them, and not worrying too much about the actual computation at that stage. What is great about your videos is that you explain the concepts very well.
@statquest (a year ago)
Thank you very much! :)
@rachelstarmer9835 (8 years ago)
Awesome! Even I get it and love it! I'm going to share one of your StatQuest posts as an example of why simple explanations in everyday language are far superior to using academic jargon in complex ways to argue a point. It's also a great example of how to develop an argument. You've created something here that's useful beyond statistics! Three cheers for the liberal arts education!!!! Three cheers for StatQuest!!
@kshitijkumar6610 (4 years ago)
Are you somehow related to Joshua? :-P
@georgeshibley9529 (4 years ago)
@@rachelstarmer5073 ha
@merida3975 (2 years ago)
The song at the beginning made my day, even though I picked the wrong Linear Discriminant Analysis tutorial for my data science course. Just awesome. Love it a lot. We need more and more funny teachers like you.
@statquest (2 years ago)
Thanks!
@RaghuMittal (7 years ago)
Great video! I initially couldn't understand LDA looking at the math equations elsewhere, but when I came across this video, I was able to understand LDA very well. Thanks for the effort.
@mahdimantash313 (2 years ago)
I really can't thank you enough for that... you did in 16 minutes what I couldn't do in 4 hours. Keep up the good work!! And thank you again!!
@statquest (2 years ago)
Thanks!
@phoenixflames6019 (a month ago)
10/10 intro song, 10/10 explanation. Using PCA, I can reduce these two ratings to just one: 10/10 is enough to rate the whole video. Using LDA, the YouTube chapters feature maximizes the separation between these 2 major components (intro and explanation) of the video.
@statquest (a month ago)
BAM!!! :)
@lifeislarge (2 years ago)
Never thought anyone could explain things this easily. I appreciate the effort. Thank you.
@statquest (2 years ago)
Thank you! :)
@neelkhare6934 (4 years ago)
Wow, that is one of the best explanations of LDA. It helped me get an intuitive idea of LDA and what it actually does in classification. Thank you!
@statquest (4 years ago)
Hooray! Thank you! :)
@neelkhare6934 (4 years ago)
Can you make a video on quadratic discriminant analysis?
@statquest (4 years ago)
@@Sachin-vr4ms Which part? Can you specify minutes and seconds in the video?
@statquest (4 years ago)
@@Sachin-vr4ms I'm sorry that it is confusing, but let me try to explain: At 9:46, imagine rotating the black line a bunch of times, a few degrees at a time, and using the equation shown at 8:55 to calculate a value at each step. The rotation that gives us the largest value (i.e., there is a relatively large distance between the means and a relatively small amount of scatter in both clusters) is the rotation that we select. If we have 3 categories, then we rotate an "x/y-axis" a bunch of times, a few degrees each time, calculate the distances from the means to the central point and the scatter for each category, and then calculate the ratio of the squared distances to the scatter. Again, the rotation with the largest value is the one that we will use. Does that help?
@statquest (4 years ago)
@@Sachin-vr4ms I'm glad it was helpful, and I'll try to include more "how to do this in R and Python" videos.
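A quick sketch of the rotation search Josh describes above, for two classes in 2D: try many angles for the projection line and keep the one that maximizes (distance between projected means)² / (sum of the within-class scatters). The clusters below are made-up toy data, not anything from the video.

```python
import numpy as np

rng = np.random.default_rng(0)
class1 = rng.normal(loc=[0, 0], scale=0.5, size=(20, 2))   # cluster 1
class2 = rng.normal(loc=[3, 2], scale=0.5, size=(20, 2))   # cluster 2

def separation(theta, a, b):
    """Fisher-style score for projecting both clusters onto a line at angle theta."""
    w = np.array([np.cos(theta), np.sin(theta)])   # unit vector along the line
    pa, pb = a @ w, b @ w                          # 1-D projections
    d2 = (pa.mean() - pb.mean()) ** 2              # squared distance between the means
    scatter = pa.var() + pb.var()                  # within-class scatter
    return d2 / scatter

# Rotate "a few degrees at a time" and keep the best angle.
angles = np.deg2rad(np.arange(0, 180, 1))
scores = [separation(t, class1, class2) for t in angles]
best = angles[int(np.argmax(scores))]
print(f"best angle: {np.rad2deg(best):.0f} degrees")
```

A real implementation solves for the best direction in closed form, but the brute-force version matches the rotate-and-measure intuition in the reply above.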
@neillunavat (4 years ago)
I am so glad this channel has grown to around 316k subscribers. Very well explained. The best of the best.
@statquest (4 years ago)
Wow, thank you!
@Azureandfabricmastery (4 years ago)
Hi Josh, helpful for understanding the differences between PCA and LDA and how LDA actually works internally. You're indeed making life easier with visual demonstrations for students like me :) God bless, and thank you!
@statquest (4 years ago)
Glad it was helpful!
@leeamraa (a month ago)
Among the best 15 minutes you can spend on YouTube! Thank you.
@statquest (a month ago)
Wow, thanks!
@adejumobiidris2892 (2 years ago)
Thank you so much for quickly resolving the confusion that had taken control of my head for 72 hours.
@statquest (2 years ago)
Happy to help!
@laurading7012 (4 years ago)
I just graduated from high school, but your videos helped me understand many research papers. Thank you very much!!!!!
@statquest (4 years ago)
BAM and congratulations!!! :)
@138shreyshekhar2 (4 years ago)
@@statquest DOUBLE BAM !!
@nishisaxena4831 (4 years ago)
Much better than my university lecture, which I listened to twice but couldn't understand... this was awesome, thanks!
@statquest (4 years ago)
Hooray! I'm glad the video was helpful. :)
@rohil1993 (4 years ago)
This explains the beauty of LDA so well! Thank you so much!
@statquest (4 years ago)
Awesome! Thank you very much! :)
@sassmos008 (7 years ago)
Wow... my professor has been trying to teach me the concepts for weeks, and now I finally understand. Thank you so much. I will recommend this to my mates.
@peterantley (7 years ago)
You are my hero. I am a senior hoping to get into data science, and your videos are great and very helpful. Keep up the good work.
@chemicalbiomedengine (5 years ago)
Always excited when I look for a topic and it's available on StatQuest.
@statquest (5 years ago)
Awesome! :)
@scottsun9413 (a year ago)
Really great videos; they saved me in my data science classes. I'm applying for a graduate program at UNC; I hope I can have the opportunity to meet the content creators sometime in the future.
@statquest (a year ago)
Best of luck!
@balexander28 (7 years ago)
All of your StatQuest videos are awesome! Thanks for using your time to help others! Much appreciated!
@alis5893 (3 years ago)
Josh, you are an amazing teacher. I have learned so much from you. A big thank you from the bottom of my heart. God bless you.
@statquest (3 years ago)
My pleasure!
@elizabeths3989 (3 years ago)
You are about to be the reason I pass my qualifying exam in bioinformatics 🙏🙏
@statquest (3 years ago)
Good luck!!! BAM! :)
@saiakhil1997 (4 years ago)
I really liked how you compared the processes of PCA and LDA analysis. I got to know a different way to view LDA thanks to this video.
@statquest (4 years ago)
Bam!
@alexhoneycutt8352 (4 years ago)
This helped me understand LDA before my midterm! I could not wrap my head around how the functions worked and what they did, but I got an "ah-hah!" moment at 6:49, and I totally understand it now. Thank you for explaining this!
@statquest (4 years ago)
Hooray! Good luck with your midterm. :)
@hlatse98 (7 years ago)
Brilliant video! Very helpful. Thank you.
@Illinoise888 (4 years ago)
Thanks for the video! I have an exam next week, and even though it's open book, I still didn't feel comfortable going into it. This video definitely helped!
@statquest (4 years ago)
Good luck, and let me know how it goes. :)
@jahanvi9429 (2 years ago)
The song in the introduction is always awesome. Thanks, lol! And a very useful video.
@statquest (2 years ago)
Thanks!
@hanaibrahim1563 (2 years ago)
Amazing. Thank you for this excellent video. It explained everything super clearly to me in a super concise manner, without all the academic jargon getting in the way.
@statquest (2 years ago)
Glad it was helpful!
@oklu_ (3 years ago)
Thank you for your kind, slow, and detailed explanation 😭
@statquest (3 years ago)
You're welcome 😊!
@rmiliming (2 years ago)
Very clearly explained, and the video is very enjoyable to watch too! StatQuest has all that is needed to learn machine learning algorithms and stats well.
@statquest (2 years ago)
Thank you!
@SeqBioMusic (7 years ago)
Awesome! It would be good to give some differences between PCA and LDA. For example, PCA studies X on its own, while LDA studies the X->Y relationship.
@DaveGogerly (2 years ago)
I love your stuff; you have the knack for explaining things better than most!
@statquest (2 years ago)
Thank you!
@meng-laiyin2198 (2 years ago)
@@statquest Thank you so much for this video. I tried to understand LDA by reading lots of materials (books, papers, etc.), but none of them explain things as clearly as you do. Really appreciate it!
@statquest (2 years ago)
@@meng-laiyin2198 Thanks! :)
@datoubi (4 years ago)
I recommended all your videos to my fellow students in the data analysis course.
@statquest (4 years ago)
Thank you very much! :)
@jialingzhang1341 (3 years ago)
Thanks for this brilliant video! One thing I think is worth mentioning or emphasizing is that LDA is supervised and PCA is unsupervised.
@statquest (3 years ago)
Noted
@Dr.CandanEsin (4 years ago)
A lot of time and effort was spent, but it was worth it. The best explanation I have watched after six weeks of searching. Cordially, thank you.
@statquest (4 years ago)
Thanks! :)
@rodrigolivianu9531 (4 years ago)
Great video! I just wanted to point out that LDA is a classifier, which involves a few more steps than the procedure described here, such as the assumption that the data is Gaussian. The procedure described here is only the feature extraction/dimensionality reduction phase of LDA.
@statquest (4 years ago)
You are correct! I made this video before I was aware that people had adapted LDA for classification. Technically, we are describing "Fisher's Linear Discriminant". That said, using LDA for classification is robust to violations of the Gaussian assumptions. For more details, see: sebastianraschka.com/Articles/2014_python_lda.html
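The two roles discussed in this exchange (Fisher-style dimensionality reduction and classification) both live in one scikit-learn class. A minimal sketch, assuming scikit-learn is available; the 3-class toy data is invented for illustration, so there are at most 3 - 1 = 2 discriminant axes:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# 90 samples, 4 features, 3 well-separated classes (toy data).
X = np.vstack([rng.normal(m, 0.6, size=(30, 4)) for m in ([0] * 4, [2] * 4, [4] * 4)])
y = np.repeat([0, 1, 2], 30)

lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)
X_2d = lda.transform(X)    # dimensionality reduction: down to (n_classes - 1) = 2 axes
print(X_2d.shape)          # (90, 2)
print(lda.predict(X[:3]))  # classification: predicted labels for the first three rows
```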
@rodrigolivianu9531 (4 years ago)
@StatQuest with Josh Starmer That said, I must admit I am having a really hard time understanding how the Fisherian and Bayesian approaches lead to the same conclusion via completely different routes. If you have any source on that, it would be of enormous help for my sanity haha
@aneeshmenon12 (8 years ago)
Wow... too good... dear Starmer... nothing to say... you are incredible... I am eagerly waiting for your next video...
@arungandhi5612 (3 years ago)
You are very cool, bro. I aced my work at my research institute because of you!
@statquest (3 years ago)
That's awesome!!! So glad to hear the videos are helpful. :)
@ChaminduWeerasinghe (3 years ago)
Best explanation I've ever seen on ML. This is the first time I've watched an ML YouTube video without rewinding :| .. Keep it up, bro..
@statquest (3 years ago)
Wow, thanks!
@wtfJonKnowNothing (9 months ago)
These days I'm more inclined to listen to (and love) the song than the lecture :)
@statquest (9 months ago)
:)
@amrit20061994 (3 years ago)
"But what if we used data from 10k genes?" "Suddenly, being able to create 2 axes that maximize the separation of three categories is 'super cool'." Well played, StatQuest, well played!
@statquest (3 years ago)
Thanks!
@worldofbrahmatej2023 (5 years ago)
Excellent! You are a better teacher than many overrated professors out there :)
@statquest (5 years ago)
Thank you! :)
@daisy-fb5jc (2 years ago)
I wish I could show this video to my professor and teach her how to give understandable lectures. Just a wish.
@statquest (2 years ago)
:)
@seant7907 (4 years ago)
Subscribed just because the way you described this topic is so simple and understandable. Nice job!
@statquest (4 years ago)
Thank you very much! :)
@whasuklee (5 years ago)
Came for my midterm tomorrow, stayed for the intro track.
@onurdemir353 (4 years ago)
You are awesome. Eventually, I was able to reach the point of understanding machine learning stuff, thanks to you.
@statquest (4 years ago)
Awesome! :)
@LifeofTF (4 years ago)
Loved the explanation. Your channel has been a truly invaluable source for studying ML. I was wondering whether you could make a video on the differences/similarities, along with use cases, for KNN/LDA/PCA.
@statquest (4 years ago)
I'll keep that in mind.
@Anmolmovies (6 years ago)
Absolutely brilliant. Kudos to you for making it seem so simple. Thanks!
@alphabetadministrator (6 months ago)
Hello Josh. As always, thank you for your super intuitive videos. I won't survive college without you. I do have an unanswered conundrum about this video, however. For linear discriminant analysis, shouldn't there be at least as many predictors as the number of clusters? Here's why. Say p = 1 and I have 2 clusters. In this case, there is nothing I can do to further optimize the class separation. The points as they are on the line already maximize the Fisher criterion (between-class scatter / within-class scatter). While I do not have a second predictor axis to begin with, even if I were to apply a linear transformation on the line to find a new line to re-project the data on, it would only make the means closer together. Extending this reasoning to the 2D case where you used gene X and gene Y as predictors and 3 classes: if the 3 classes exist on a 2D plane, there is nothing we can do to further optimize the separation of the means of the 3 classes, because re-projecting the points on a new tilted 2D plane will most likely reduce the distances between the means. Now, if each scatter lay perfectly vertically, such that the classes are separated distinctly as gene Y goes up, then we could re-project the points onto a new line (parallel to the invisible vertical class-separation line) to further minimize each class's scatter, but this kind of case is very rare. Given my reasoning, my intuition is that an implicit assumption of LDA is that there need to be at least as many predictors as there are classes to separate. Is my intuition valid?
@statquest (6 months ago)
I believe your question might be answered in this video on PCA tips: kzbin.info/www/bejne/pYPZmKRva5uskMk
@RaviShankar-jm1qw (4 years ago)
Simply superb! Awesome, Josh!!!!
@statquest (4 years ago)
Thank you very much! :)
@hamzaghandi4807 (2 years ago)
Besides the wonderful explanation, your music is very good!
@statquest (2 years ago)
Many thanks!
@paulhamacher773 (4 years ago)
This channel is pure gold!
@statquest (4 years ago)
Thank you! :)
@gptty (4 years ago)
I get it! You, sir, are the best lecturer in statistics.
@statquest (4 years ago)
Thanks!
@saharafox2360 (2 years ago)
That helped me a lot! Thank you sooo much! Now I'm ready for my exam tomorrow :)
@statquest (2 years ago)
Best of luck!
@nuttapatchaovanapricha (10 months ago)
Very useful and intuitive, and a sick intro track as usual! xD
@statquest (10 months ago)
I think this might be my favorite intro.
@muskanjhunjhunwalla8505 (6 years ago)
It was a very helpful video; I understood it on the first attempt. Thanks a lot for this video, sir.
@statquest (6 years ago)
Hooray!!! I'm glad the video was so helpful! :)
@sanketbadhe3572 (5 years ago)
I just watched all your videos for the intro tracks :P ...... awesome tracks and nicely explained videos
@statquest (5 years ago)
Awesome! :)
@cnbmonster1042 (3 years ago)
Amazing! I subscribed after watching your video only twice!
@statquest (3 years ago)
Wow, thanks!
@nintishia (a year ago)
Once again, a fantastic job. Thanks, StatQuest.
@statquest (a year ago)
Thanks again!
@wei-tingko7871 (2 years ago)
I really like your channel; the explanations of concepts are clear and precise!!
@statquest (2 years ago)
Thank you!
@karannchew2534 (3 years ago)
Like PCA, LDA "compresses" the data into lower dimensions. But unlike PCA, it does so while keeping/maximizing the separability of the data according to the given classification, as much as possible. The data must already be classified to use LDA. Find the line (the new lower dimension) such that the difference between the means of the two classes of data is maximized while, at the same time, the variance among the data within each class is minimized.
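For two classes, the two goals in the summary above (means far apart, within-class variance small) have a well-known closed-form solution: the best axis is proportional to Sw⁻¹(mean₂ - mean₁), where Sw is the pooled within-class scatter matrix. A sketch with invented toy numbers:

```python
import numpy as np

rng = np.random.default_rng(2)
c1 = rng.normal([0, 0], [1.0, 0.3], size=(50, 2))   # class 1: noisy in x, tight in y
c2 = rng.normal([2, 1], [1.0, 0.3], size=(50, 2))   # class 2

# Pooled within-class scatter matrix.
Sw = np.cov(c1.T) * (len(c1) - 1) + np.cov(c2.T) * (len(c2) - 1)
# Fisher's direction: solve Sw w = (mean2 - mean1).
w = np.linalg.solve(Sw, c2.mean(axis=0) - c1.mean(axis=0))
w /= np.linalg.norm(w)

# Projecting onto w pushes the class means apart relative to the scatter.
gap = abs(c2.mean(axis=0) @ w - c1.mean(axis=0) @ w)
print("unit direction:", w, "projected gap between means:", gap)
```

Note how the direction leans toward the low-variance y axis: LDA discounts directions where the classes are noisy, which is exactly the "minimize within-class variance" half of the summary.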
@statquest (3 years ago)
:)
@weixu553 (7 years ago)
Great video. You make all the academic terms very understandable. Cheers from China!
@gaboceron100 (3 years ago)
Very illustrative; thanks for the video!
@statquest (3 years ago)
Thanks!
@Pmarmagne (3 years ago)
Another clearly explained video by StatQuest!
@statquest (3 years ago)
BAM! :)
@sureshmakwana8709 (a year ago)
The best explanation on the whole internet 💯
@statquest (a year ago)
Thank you! :)
@duchuyho7027 (7 years ago)
Great video, Joshua! Looking forward to learning more from you! Cheers from Japan!
@tomaszberent801 (5 years ago)
The video shows how LDA reduces dimensions, and we can clearly see a newly constructed axis (as with PCA) which, in LDA analysis, maximizes the separation. That was very clear! How does this line relate to the line that actually separates the two categories on the original XY plane that you refer to at 2:48 in your video? After all, it is that line (do we call it a discriminant function?) which is usually used to show the separation. The latter is intuitively understood as a separation border; the former explains how we reduced dimensions. What is the link between the two?
@statquest (5 years ago)
That's a good question. There are a few options for coming up with a threshold that allows you to classify new observations into one of the categories in your training dataset. The simplest is to transform the new observation using the transformation that the training dataset created, and then measure the Euclidean distance between the new observation and the center of each classification. The classification that is closest to the new observation is used to classify the new observation.
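The nearest-center rule Josh describes can be sketched in a few lines: project a new point with the learned discriminant axes, then assign it to the class whose transformed center is closest. The projection matrix `W` and the tiny dataset here are placeholders for illustration (an identity "projection" on 1D data), not anything fitted to the video's example:

```python
import numpy as np

def nearest_center(x_new, W, X_train, y_train):
    """Project x_new with W and return the label of the closest class center."""
    z = x_new @ W
    labels = np.unique(y_train)
    centers = np.array([(X_train[y_train == k] @ W).mean(axis=0) for k in labels])
    return labels[np.argmin(np.linalg.norm(centers - z, axis=1))]

# Tiny worked example: two classes on a line, identity "projection".
X = np.array([[0.0], [1.0], [9.0], [10.0]])
y = np.array([0, 0, 1, 1])
W = np.eye(1)
print(nearest_center(np.array([8.5]), W, X, y))   # 8.5 is closer to class 1's center (9.5)
```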
@ThangPham-dx9ic (3 years ago)
Super interesting video! Your videos are much better than those boring equations in the books. BTW, may I ask a question? By "LDA for 3 categories", do you mean that the number of LDs = the number of categories - 1?
@statquest (3 years ago)
Yes.
@xujerry7891 (7 years ago)
Hi, Joshua. Thank you for your videos; they're really helpful. I have a question: when you use LDA to categorize n categories, does it mean that you need (n - 1) axes to separate the points? In that case, how can I visualize them?
@chieftainsupreme9387 (a year ago)
You're an excellent teacher. Thank you so much.
@statquest (a year ago)
Thank you! :)
@eniisy (2 years ago)
This lesson is just so beautiful, dude!!!!
@statquest (2 years ago)
Thank you!
@tomaszberent801 (5 years ago)
Josh, BRILLIANT!!!! Some questions, though. 1) In what sense does LDA generate the same separation as linear regression (however understood), as is often claimed? Doesn't LDA use its own "optimization" criterion, somewhat different from, e.g., OLS? 2) Can we do ROC analysis after identifying a "separation line" between two categories using LDA? I am a bit confused here. As far as logistic regression is concerned, it is all clear: we can do ROC analysis since different cutoffs are available to us after we estimate the curve. But with LDA, it looks to me like the "optimization" criterion defines (kind of) one cutoff point, and we should not mess with it anymore, i.e., we should not use other cutoffs on the estimated line. If I am correct, why are there so many research papers that estimate z-scores (using LDA) and subsequently change the cutoff point to calculate specificity and sensitivity (or any other classification or misclassification metric), or ROC/AUC itself? Hopefully my inquiry is clear. I wish I were as clear as you are in those funny videos.
@statquest (5 years ago)
LDA is totally different from linear regression. Way, way different. If you have questions about OLS, watch my videos: kzbin.info/www/bejne/hpKpgZWYa5t3rrM As for ROC: for LDA, if you are using Bayes' theorem for classification, then you can change the threshold you use for the posterior distribution. If you are not using Bayes' theorem, and instead using K-means clustering on top of LDA, you can alter the values of "k".
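The ROC point in this exchange can be sketched concretely: an LDA classifier (via Bayes' theorem) gives each sample a posterior probability, and sweeping the cutoff on that probability is exactly what an ROC/AUC computation does. Toy two-class data, assuming scikit-learn is available:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
# Two overlapping classes (toy data), 100 samples each.
X = np.vstack([rng.normal(0, 1, size=(100, 2)), rng.normal(1.5, 1, size=(100, 2))])
y = np.repeat([0, 1], 100)

lda = LinearDiscriminantAnalysis().fit(X, y)
scores = lda.predict_proba(X)[:, 1]   # posterior P(class 1 | x) for each sample
# AUC summarizes performance over *all* possible cutoffs on the posterior,
# not just the default 0.5 threshold that predict() uses.
print("AUC:", roc_auc_score(y, scores))
```

This is why the papers the question mentions can legitimately vary the cutoff: the Fisher criterion fixes the projection axis, but the decision threshold along that axis is still free to move.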
@livetolearn477 (6 years ago)
Really great explanation. Thanks.
@vishwanathg8083 (6 years ago)
Thank you, the explanation of LDA & PCA is very clear.
@happylearning-gp (2 years ago)
Excellent tutorial, thank you very much.
@statquest (2 years ago)
Glad you liked it!
@mrunalwaghmare (5 months ago)
Love your vids; they are simple to understand.
@statquest (5 months ago)
Glad you like them!
@mastermike890 (7 years ago)
This is an AWESOME vid. Thanks for making this idea so simple.
@mohammadadnan8248 (6 years ago)
Tomorrow is my exam; this might be helpful. Thanks a lot from India.
@jennysspiceoflife8581 (3 years ago)
Thank you for the illustration; it's very clear!
@statquest (3 years ago)
Glad it was helpful!
@EbraM96 (6 years ago)
That musical intro is much fun :D
@JalerSekarMaji (6 years ago)
Wow! At first it was "wt.f is StatQuest", then at the end of the video: STATQUEST! And I checked the description. It's a great website! Thanks
@sakhawat3003 (4 years ago)
Hey man! That was a nice, clear-cut explanation. I have been doing machine learning using LDA, but I never knew what LDA actually does; I only had a vague idea. By the way, you wrote "seperatibility" instead of "separability" at 5:26...
@statquest (4 years ago)
That's embarrassing. One day, when StatQuest is making the big bucks, I will hire an editor, and my poor spelling will no longer be a source of great shame.
@michaelgeorgoulopoulos8678 (3 years ago)
Fav StatQuest intro!
@statquest (3 years ago)
I think this is my favorite as well. It's a classic!
@mrdoerp (6 years ago)
These videos are incredible; I would pay for them if I had money.
@statquest (6 years ago)
Sometimes it's the thought that counts! I'm glad you enjoy the videos. :)
@amanzholdaribay9871 (4 years ago)
Thanks, as always! The best explanations of complicated things!
@statquest (4 years ago)
Thanks!
@namanjha4964 (11 months ago)
Thank you very, very much. You bring joy to me.
@statquest (11 months ago)
Thank you!
@MartinUToob (5 years ago)
When's the StatQuest album coming out? (Here come the Grammys!) 🎸👑 Actually, the only reason I watch your videos is for the music. 😍🎶🎵
@greina6945 (7 years ago)
Very nice explanation. The only issue I have is that the first and second axes for both PCA and LDA are not Gene 1 and Gene 2. They are instead linear combinations of Gene 1 and Gene 2. So in a 10,000-gene space, you will get some combination of the 10,000 genes that clearly separates the two groups. For example, LD1 could be one third of gene 12, plus one third of gene 45, plus one sixth of gene 456, plus one sixth of gene 1,234.
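The point above can be made concrete with scikit-learn, where the per-feature weights of each discriminant axis are stored in `scalings_`: column j holds the coefficients of LD(j+1) as a linear combination of the original features. Toy data with 5 "genes" and 2 classes (so just one axis):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
# 80 samples, 5 "genes", 2 classes (toy data).
X = np.vstack([rng.normal(m, 1.0, size=(40, 5)) for m in (0, 3)])
y = np.repeat([0, 1], 40)

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.scalings_.shape)   # (5, 1): one weight per gene, one axis for 2 classes

# LD1 for each sample is just the weighted sum of its (centered) gene values,
# which is exactly what transform() computes with the default 'svd' solver.
ld1 = (X - lda.xbar_) @ lda.scalings_
```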
@BillHaug (2 years ago)
I would agree that "awesome song..." is an appropriate label.
@statquest (2 years ago)
bam!
@BillHaug (2 years ago)
@@statquest I would even say double bam... btw, "we're going to do a lot of maths step by step" = triple bam
@statquest (2 years ago)
@@BillHaug Awesome!!! I love that you're a connoisseur of StatQuest themes!!!
@ramarajugadiraju6886 (5 years ago)
Very nice video. Admiring your passion and contribution!
@statquest (5 years ago)
Thank you! :)
@gauranggarg549 (3 years ago)
Can't understand a topic, and then you find a StatQuest video on it. TRIPLE BAM!!