Comments
@wyskass861
@wyskass861 5 hours ago
I wish more people understood probability and statistics better. So much misinformation about the world is spread by media outlets that misinterpret studies. I also hate how often we talk about averages without an accompanying variance; without it, people draw lots of wrong conclusions from averages alone.
@samsheppard8265
@samsheppard8265 6 hours ago
Vampire
@azuarc
@azuarc 7 hours ago
I already understood probability. This video does nothing to explain what likelihood is or why it's different. The creator only acknowledges *that* it is different and how to compute it, but why do I even care?
@statquest
@statquest 7 hours ago
In common conversation we use probability and likelihood interchangeably. However, statisticians make a clear distinction that is important to understand if you want to follow their logic. The good news is that they are both super simple. The bad news is that they are easy to mix up. This StatQuest gives you visual images that make them both easy to remember so you'll always keep them straight. NOTE: This video was originally made as a follow-up to an overview of Maximum Likelihood kzbin.info/www/bejne/jpbTiaeibr5-rcU . That video provides context that gives this video more meaning. So if you are wondering what a likelihood of 0.21 represents, please watch that video.
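The distinction described in the reply above can be sketched numerically. Here is a minimal Python sketch (not from the video; the density and CDF functions below are just the standard normal-distribution formulas): a probability fixes the distribution and asks about the data, while a likelihood fixes the data and asks about the parameter.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Density of a normal curve; likelihoods read values off this curve.
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def normal_cdf(x, mu=0.0, sigma=1.0):
    # P(X <= x): a probability is an area under a FIXED curve.
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Probability: the curve is fixed (mu = 0), and we ask about the data.
p = normal_cdf(1)               # P(X <= 1) is about 0.841

# Likelihood: the data point is fixed (x = 1), and we vary the parameter.
like_mu0 = normal_pdf(1, mu=0)  # L(mu = 0 | x = 1) is about 0.242
like_mu1 = normal_pdf(1, mu=1)  # L(mu = 1 | x = 1) is about 0.399
```

Sliding mu around while holding x = 1 fixed traces out the likelihood function; maximizing it over mu is exactly what Maximum Likelihood does.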
@kmcng
@kmcng 11 hours ago
Amazing explanation!
@statquest
@statquest 7 hours ago
Thank you!
@ShubhamPlays
@ShubhamPlays 13 hours ago
Thanks for your dedication, for explaining with such good visuals, and, yes, for not forgetting your music. Good job!
@statquest
@statquest 7 hours ago
Thanks!
@cmexgd7096
@cmexgd7096 14 hours ago
Thank you Josh, you are one of the people who make the internet the best thing humanity has ever achieved. You are the definition of its bright side. I wouldn't be so happy and focused studying all this stuff 6 hours a day if it weren't for you. Thank you again and TRIPLE BAM!!!
@statquest
@statquest 7 hours ago
Thank you very much! :)
@bikashkumarpaul5922
@bikashkumarpaul5922 15 hours ago
It’s amazing how effective learning can be when the content is this well-presented and thoughtfully made!
@statquest
@statquest 7 hours ago
Thanks!
@dshefman
@dshefman 15 hours ago
Are you sure encoder-only transformers are the same as embedding models? I think they have different architectures.
@statquest
@statquest 7 hours ago
There are lots of ways to create embeddings - and this video describes those ways. However, BERT is probably the most commonly used way to make embeddings with an LLM.
@acleahcim
@acleahcim 16 hours ago
OMG, this video explained it much better than the class I'm taking did.
@statquest
@statquest 7 hours ago
Thanks!
@collinsk.langat6230
@collinsk.langat6230 16 hours ago
amazing
@statquest
@statquest 7 hours ago
Thanks!
@citruskeys
@citruskeys 18 hours ago
What a wonderful world, to open a video on an R function only to hear a lovely ukulele theme song.
@statquest
@statquest 7 hours ago
bam! :)
@AdwaitMalghade-pb2ed
@AdwaitMalghade-pb2ed 22 hours ago
GOAT
@statquest
@statquest 7 hours ago
:)
@uniquekatnoria5380
@uniquekatnoria5380 1 day ago
I am new to statistics and have trouble understanding the formal terms stated in books. The content from this channel really makes it easy to get intuition and understand the underlying principles. Great work!!
@statquest
@statquest 7 hours ago
Thank you!
@MauricioGaelPerezArroyo
@MauricioGaelPerezArroyo 1 day ago
is he on drugs?
@statquest
@statquest 7 hours ago
:) nope!
@jasrajanand5759
@jasrajanand5759 1 day ago
small bam
@statquest
@statquest 7 hours ago
:)
@taofeecohadesanu
@taofeecohadesanu 1 day ago
An almost 8-year-old gem, and I find it comprehensive enough for my level. Thank youuuuu.
@statquest
@statquest 7 hours ago
Glad to hear it's still helpful! :)
@Otonium
@Otonium 1 day ago
I want a StatSquatch plushie
@statquest
@statquest 1 day ago
That would be awesome!
@brpawankumariyengar4227
@brpawankumariyengar4227 1 day ago
Thank you so very much Sir ❤ … I very much appreciate these videos…. Thanks a ton 😊
@statquest
@statquest 1 day ago
TRIPLE BAM!!! Thank you so much for supporting StatQuest! :)
@RichGwilliam
@RichGwilliam 1 day ago
You know what I love? When lecturers do falsetto freeform scat jazz when I'm trying to follow their maths.
@statquest
@statquest 1 day ago
:)
@JustinShaedo
@JustinShaedo 1 day ago
The likelihood that this video labeled the Y axis is zero. The probability that this video labeled the Y axis is also zero.
@amirhasansafizade2128
@amirhasansafizade2128 2 days ago
Great as always. Thanks Josh
@statquest
@statquest 1 day ago
Thanks again!
@jasrajanand5759
@jasrajanand5759 2 days ago
amazing
@statquest
@statquest 2 days ago
Thanks!
@football-soccer9853
@football-soccer9853 2 days ago
Wonderful explanation. Thank you.
@statquest
@statquest 2 days ago
Thanks!
@annieghostofyourapple639
@annieghostofyourapple639 2 days ago
Thank you for this video! Very helpful, one of the best on this topic!
@statquest
@statquest 2 days ago
Thank you!
@starnemi6825
@starnemi6825 2 days ago
I totally didn't fall asleep in class, miss the entire lesson, and completely lose any sense of direction in my class, wdym 😁😁😁
@statquest
@statquest 2 days ago
:)
@mohdyusuf9694
@mohdyusuf9694 2 days ago
6:04 Where is this value for the squared residuals coming from?
@statquest
@statquest 2 days ago
The residuals are the vertical distances from the line to the data points. We then square those distances.
@mohdyusuf9694
@mohdyusuf9694 1 day ago
@@statquest got that!
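The reply in this thread can be made concrete with a few lines of Python (the points and candidate line below are made up for illustration; only the residual arithmetic is the point):

```python
# Each residual is the vertical distance from a data point to the line.
data = [(1, 2.0), (2, 2.9), (3, 4.2)]  # hypothetical (x, y) points
slope, intercept = 1.0, 1.0            # some candidate line: y = x + 1

residuals = [y - (slope * x + intercept) for x, y in data]
squared_residuals = [r ** 2 for r in residuals]
ssr = sum(squared_residuals)  # sum of squared residuals, about 0.05 here
```

Lowering this sum by adjusting the slope and intercept is what least-squares fitting does.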
@stuartkramer1082
@stuartkramer1082 2 days ago
By any chance do you have the actual small dataset handy you could put in a comment? I'd like to use it to replicate your results to test my understanding. Thanks for all the great videos!
@statquest
@statquest 2 days ago
Because the process of bootstrapping requires random sampling, you'd get something similar, but not the same, as what I have. That said, I believe the "shifted" values are -4, -3.5, -2.2, 1.2, 1.5, 1.6, 2.5, 3
@stuartkramer1082
@stuartkramer1082 2 days ago
@@statquest Many thanks! When I try to reproduce your results, I get close on the p-value of 63%, but I'm way off from you on the split. I get about even percentages in the tails, like 32% / 29%. I'll keep working on it ;-)
@statquest
@statquest 2 days ago
@@stuartkramer1082 Here's my R code:
raw.data <- c(-3.5, -3, -1.7, 1.7, 2, 2.1, 3, 3.5)
mean(raw.data)
raw.data - mean(raw.data)
shifted.data <- c(-4, -3.5, -2.2, 1.2, 1.5, 1.6, 2.5, 3)
mean(shifted.data)
data.1 <- c(-4, -4, -2.2, -2.2, 1.2, 1.5, 2.5, 3)
mean(data.1)
set.seed(42)
the.means <- c()
for(i in 1:1000) {
  boot.data <- sample(raw.data, size=length(raw.data), replace=TRUE)
  the.means <- c(the.means, mean(boot.data))
}
hist(the.means, breaks=20, xlim=c(-4,4))
sum(the.means < -0.5)
sum(the.means < -0.5)/1000
sum(the.means > 0.5)
sum(the.means > 0.5)/1000
sum((the.means > -0.5) & (the.means < 0.5))/1000
the.means[the.means > 2.4]
@stuartkramer1082
@stuartkramer1082 2 days ago
@@statquest Well, now I'm confused, ha ha ha. When I plug data.1 (-4, -4, -2.2, -2.2, 1.2, 1.5, 2.5, 3) into my Excel variation of your model, I get the result you show in the video. So that's great. And I see how raw.data turns into shifted.data, consistent with your explanation in the video of how to derive the null-hypothesis distribution from the initial sample. My confusion is figuring out how data.1 (whose average is about -0.5) is derived from shifted.data, because data.1 seems to be the basis of the null-hypothesis graph starting at around 3:30 in the video.
@statquest
@statquest 2 days ago
@@stuartkramer1082 Ooops! It looks like I messed up when calculating the numbers. I should have used "shifted.data" in the loop instead of "raw.data". That said, when corrected, the p-value is still >0.05 so we don't need to change the result.
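For anyone replicating this exchange outside of R, the corrected loop (bootstrapping from shifted.data rather than raw.data) looks roughly like this in Python. This is a sketch using the shifted values from the thread; the ±0.5 cutoff stands in for the observed mean of raw.data (about 0.51), and exact counts vary with the random seed:

```python
import random

random.seed(42)
shifted = [-4, -3.5, -2.2, 1.2, 1.5, 1.6, 2.5, 3]  # null-hypothesis data from the thread

means = []
for _ in range(1000):
    # Bootstrap: resample the shifted data with replacement, record the mean.
    boot = [random.choice(shifted) for _ in shifted]
    means.append(sum(boot) / len(boot))

# Two-sided p-value: fraction of bootstrap means at least as extreme as +/- 0.5.
p_value = sum(1 for m in means if abs(m) >= 0.5) / len(means)
```

Consistent with the thread, the resulting p-value comes out well above 0.05, so the conclusion does not change.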
@hammadrehman5753
@hammadrehman5753 2 days ago
I was very confused about how to proceed to the third stump (that is, how to obtain its bootstrapped dataset). ChatGPT cleared it up for me as follows: First, calculate the total error and amount of say from the bootstrapped dataset. Then apply the stump to the main dataset to obtain the number of correct and incorrect classifications. Then apply the amount of say to the weights of the samples in the main dataset. Please correct me if any of this is incorrect.
@statquest
@statquest 2 days ago
You just add all of the samples back to the pool and bootstrap from the original dataset (with updated weights).
@hammadrehman5753
@hammadrehman5753 1 day ago
@statquest That's what I'm confused about: how do you update the weights in the main dataset? There will be some data samples that are not in the current bootstrapped dataset; how do you deal with their weights?
@statquest
@statquest 1 day ago
@@hammadrehman5753 You just use whatever their old weights were and then do the normalization step.
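The recipe in this thread (carry each sample's old weight forward, then normalize) can be sketched as follows. The toy labels and stump predictions are hypothetical, and the 0.5 * log((1 - error) / error) amount-of-say formula is the standard AdaBoost one, not something taken from the thread itself:

```python
import math

labels      = [1, 1, -1, -1, 1]   # hypothetical binary labels
predictions = [1, -1, -1, 1, 1]   # hypothetical stump output on the FULL dataset
weights     = [0.2] * 5           # current sample weights (sum to 1)

# Amount of say from the stump's weighted error rate.
error = sum(w for w, y, p in zip(weights, labels, predictions) if y != p)
say = 0.5 * math.log((1 - error) / error)

# Increase the weights of misclassified samples, decrease the rest.
# Samples absent from the bootstrap simply keep their old weight as input here.
new_w = [w * math.exp(say if y != p else -say)
         for w, y, p in zip(weights, labels, predictions)]

# Normalize so the weights sum to 1 before drawing the next bootstrap.
total = sum(new_w)
new_w = [w / total for w in new_w]
```

The normalization step is what makes reusing stale weights safe: whatever values carry over, everything is rescaled into a valid distribution for the next round of sampling.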
@lailatp7101
@lailatp7101 2 days ago
This is my first review on YouTube. I just have to say thank you! This is explained so well that it even looks simple. And it is funny too (I smile at each "BAM!"). I am finishing my studies in Data Science and ML, and this is by far the best source of information I could find. I have both of your books, and after watching the videos I only need a short look at the books to remember what neural networks or other concepts and approaches are all about. Thank you very much for saving me so much time! Please keep teaching. And keep it funny :)
@statquest
@statquest 2 days ago
Thank you very much!
@jasrajanand5759
@jasrajanand5759 2 days ago
beautiful
@statquest
@statquest 2 days ago
Thank you! Cheers!
@Sarah-cm6nv
@Sarah-cm6nv 2 days ago
Thank you so much, sir, you explained it really clearly! No one explains it like you do!
@statquest
@statquest 2 days ago
Thank you!
@unclvinny
@unclvinny 2 days ago
I felt that DOUBLE BAM in my soul.
@statquest
@statquest 2 days ago
You made me laugh out loud.