Great explanation. Clear, structured and explained in simple understandable terms. Thanks for taking the time to put this together.
@argentina2152 7 years ago
Thank you so much. Now it all makes sense. I had difficulty grasping the idea of MLE, but with your explanation I feel confident going back to the lectures and being able to follow them.
@FloppyDobbys 7 years ago
Clearest explanation I have seen on YouTube thus far
@CC-op3ez 2 days ago
Thank you so much for your video, especially for the warnings about model selection and AIC. Could you please explain more (or point me to some documents or references) about "do not combine model selection with hypothesis testing. The p value significance will be inflated because you are implicitly testing multiple hypotheses with model selection"?
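A minimal simulation sketch of the effect being asked about (not from the video; every name and number below is illustrative, and it assumes numpy and scipy are available): on pure-noise data, pick the single "best" of ten candidate predictors and then test it at the nominal 5% level as if it had been pre-specified. The rejection rate comes out far above 5%, which is exactly the inflation the warning refers to; useful search terms are "post-selection inference" and "multiple comparisons".

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, n_candidates, n_sims, alpha = 50, 10, 2000, 0.05
false_positives = 0

for _ in range(n_sims):
    # Null world: y is pure noise, unrelated to every candidate predictor.
    X = rng.normal(size=(n, n_candidates))
    y = rng.normal(size=n)

    # "Model selection": keep the single predictor most correlated with y.
    corrs = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_candidates)]
    best = int(np.argmax(corrs))

    # Hypothesis test on the *selected* predictor, as if it were pre-specified.
    result = stats.linregress(X[:, best], y)
    false_positives += result.pvalue < alpha

# A single pre-specified predictor would reject about 5% of the time; selecting
# the best of 10 implicitly takes the smallest of 10 p-values, so the rate is
# close to 1 - 0.95**10, i.e. roughly 40%.
print(f"Type I error rate after selection: {false_positives / n_sims:.3f}")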
@shane1146 8 years ago
Matthew, you are awesome. I wish you did a video on Bayesian statistics too. A Bayesian/MCMC one, please?
@stevegyro1 6 years ago
Matthew you are outstanding as a teacher. Thank you for the many insights and teaching. -Steve G.
@Maha_s1999 6 years ago
Fantastic! Thank you so much for this super clear exposition.
@franciscomendoza3778 2 years ago
Really nice presentation
@مصطفىعبدالجبارجداح 3 years ago
May God bless you and reward you with the best of rewards.
@sukursukur3617 2 years ago
Yeah, that's a clear explanation.
@saurabh75prakash 6 years ago
Nicely explained, thanks!
@elliott8175 2 years ago
Thank you so much!! Such clear explanations!!
@RPDBY 6 years ago
AIC: the lower the better; LL: the higher the better. But both measure the same concept, so using both is redundant and one will suffice (as one will always go down when the other goes up, judging by the formula). Did I get it right?
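Close, but not quite, and the formula shows why (this is the standard Akaike definition; the video may write it slightly differently):

    AIC = 2k - 2 \ln \hat{L}

where k is the number of estimated parameters and \hat{L} is the maximized likelihood. When the models being compared have the same k, ranking by AIC and ranking by log-likelihood really are the same thing, so quoting both adds nothing. Across models with different numbers of parameters, though, the 2k penalty matters: a model with a higher log-likelihood can still have a worse (higher) AIC, which is the whole point of using AIC for model selection.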
@profmo 8 years ago
Thank you for taking the time to make this video.
@joseluisredondogarcia5244 8 years ago
The last slide is gold
@joelmhaske8185 4 years ago
Seriously good!
@PedroRibeiro-zs5go 7 years ago
Great video man! Helped me a lot, all the best :D
@tag_of_frank 6 years ago
Isn't p(x given theta) = p(theta given x) * p(x) / p(theta)? If that is the case, is the "likelihood function" as you defined it really equal to the probability of x given theta? If so, why, since it seems to be missing those two extra terms?
@cube2fox 6 years ago
Fahraynk, I have the same question; please tell me when you find an answer.
@MM-du7je 6 years ago
That is true in a Bayesian setting, where the parameters of a given model are treated as random. The point of maximum likelihood estimation, and of Frequentist inference more generally, is that we treat the parameters as fixed quantities, so that we can estimate them. I wish the author had used the more common notation for the density in MLE, f(x ; theta) instead of f(x given theta); it's confusing when that distinction isn't made explicit. By the way, you are correct that Bayes' theorem works for densities too: p(theta) is the prior density of the parameter, p(theta given x) is the density of the parameter conditioned on the data (the thing we want!), and p(x) is the normalizing constant. Bayesian inference is basically the science of picking a prior on objective or subjective mathematical grounds.
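In symbols (a sketch in generic notation; the video's own symbols may differ), the two objects being discussed are

    p(\theta \mid x) = \frac{p(x \mid \theta)\, p(\theta)}{p(x)}        (Bayes' theorem for densities)

    L(\theta; x) := f(x; \theta),    \hat{\theta}_{MLE} = \arg\max_{\theta} L(\theta; x)        (likelihood and MLE)

The likelihood is the same expression as the density of the data, just read as a function of \theta with the observed x held fixed. No prior p(\theta) and no marginal p(x) enter the maximization, which is why those two "extra terms" never show up in MLE; they only appear once you treat \theta itself as random and ask for the posterior.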
@roma9026 7 years ago
Thank you very much for your introduction!
@golamwahid8630 5 years ago
Thank you very much!
@taded7169 8 years ago
Really, it is very interesting!! Thank you!!
@ndiegow1 8 years ago
Amazing, I finally understood MLE
@StephenRoseDuo 7 years ago
How does restricted maximum likelihood estimation change the description here?
@hounamao7140 8 years ago
you're amazing sir
@RafaelLima-ox9ul 8 years ago
cool! Good job!
@ivarockazi 7 years ago
Nice explanation... but I ended up cleaning my laptop screen after 4:49.
@clairekunesh4637 8 years ago
Very helpful
@mnkmnkification 7 years ago
AWESOME!
@sandrahuhu7429 7 years ago
this is great!
@rawiyahalraddadi7064 6 years ago
Thank you!
@azadalmasov5849 7 years ago
It is interesting to me why they just do not divide the AIC equation by 2.
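Dividing by 2 would change nothing substantive: AIC/2 = k - \ln \hat{L} orders models exactly the same way, since halving is a monotone transformation. The factor of 2 is a convention; one common explanation is that it keeps AIC on the same -2 \ln \hat{L} (deviance) scale as likelihood-ratio statistics, so AIC differences can be read alongside those familiar quantities.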
@aravindhan9368 7 years ago
Thank you, sir. It's very helpful. Can you please show the mathematical working of this in figures?