PCA explained with intuition, a little math and code

6,131 views

AI Coffee Break with Letitia

Comments: 23
@exoticcoder5365 · 1 year ago
I really found that little coffee bean animation helped me to concentrate on the video! Thank you for this strategy!
@matt.jordan · 3 years ago
Such a great explanation awesome work!!
@AICoffeeBreak · 3 years ago
So glad you liked it!
@omarlopezrincon · 2 years ago
Ha ha ha, I was hoping to learn how to calculate the eigenvector. I love this channel.
@quote.d · 2 years ago
Thanks! Great explanation and visuals, and I especially enjoyed the whispered parts. I'm sure I'm not the only one who gets this emotional reaction to information that is being whispered instead of plainly said. And emotions are very important for remembering things. Please consider making a full-whispered redub of your videos!
@AICoffeeBreak · 2 years ago
ASMR with Machine Learning content. 😅
@EGlobalKnowledge · 2 years ago
A very good explanation with details of how it works
@vincetechclass3390 · 1 year ago
Nice presentation. Please, what tool did you use for the presentation?
@AICoffeeBreak · 1 year ago
Thanks! I used good old PowerPoint. 😅
@chenzakaim3 · 2 years ago
You are really awesome! Thanks a lot.
@AICoffeeBreak · 2 years ago
🙂 Thanks for watching and leaving this awesome comment!
@kellymarchisio377 · 2 years ago
First off - loving the videos! Thanks for the fun and clear explanations. Quick clarification, though: The matrix V at 5:22 is drawn as a D' x D, no? Are we meant to actually have z_i = x_i V^T (V-transpose)?
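A minimal NumPy sketch of the shapes in question, assuming V is stored row-wise as a D' x D matrix (the convention scikit-learn uses for pca.components_); under that convention the projection is indeed z_i = x_i V^T:

import numpy as np
from sklearn.decomposition import PCA

N, D, D_reduced = 100, 5, 2          # samples, original dimension, reduced dimension
X = np.random.randn(N, D)            # toy data, one row per sample x_i

pca = PCA(n_components=D_reduced).fit(X)
V = pca.components_                  # shape (D', D): one principal direction per row

# With V as D' x D, a row vector x_i is projected by multiplying with V^T,
# so the whole data set maps to Z of shape (N, D').
Z = (X - pca.mean_) @ V.T
print(Z.shape)                       # (100, 2)

# Matches scikit-learn's own transform, which also centers by pca.mean_.
assert np.allclose(Z, pca.transform(X))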
@KevinTurner-aka-keturn · 2 years ago
I tried doing some dimensionality reduction using yellowbrick and sklearn on what I _thought_ was a very modestly-sized data set, and I was surprised by how long it took! I guess it was probably the Manifold Learning methods that took longer than PCA, but I don't recall PCA being exactly quick either. Is that expected? Are there techniques for subsampling data to get some faster approximation?
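One common speed-up, sketched on the assumption that scikit-learn's PCA is part of the bottleneck: use the randomized SVD solver and/or fit on a random subsample, then transform the full data set. The data shape below is a made-up placeholder:

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.standard_normal((50_000, 200))    # stand-in for the real data set

# Randomized SVD is much cheaper than a full SVD when n_components is small.
pca = PCA(n_components=10, svd_solver="randomized", random_state=0)

# Fit on a random subsample only, then project every row.
subset = rng.choice(len(X), size=5_000, replace=False)
pca.fit(X[subset])
Z = pca.transform(X)                      # shape (50_000, 10)

That said, the manifold methods (t-SNE, UMAP and friends) usually dominate the runtime; reducing to a few dozen components with PCA first is itself a common cheap preprocessing step before running them.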
@dontaskme1625 · 3 years ago
Wouldn't a coffee bean be afraid of a coffee break, because that's the point in time when it would be most likely to be ground up?
@AICoffeeBreak · 3 years ago
Shhh!!! Don't tell Ms. Coffee Bean that! 🤫😱
@user-or7ji5hv8y · 4 years ago
I replaced my morning coffee with AI coffee. :)
@AICoffeeBreak · 4 years ago
Haha, this is funny and wholesome! :)
@elinetshaaf75 · 3 years ago
@C me too, lol!
@ricardoabraham4016 · 2 years ago
thank u this is so cute
@rumanubhardwaj6559 · 4 years ago
Funny? Ya(y.)
@AICoffeeBreak · 4 years ago
🤣