There is a mistake at 6:38: the left part should be multiplied by N.
@albertogutierrez346 · 4 years ago
I can't find the words to thank you Ben, you have no idea how many times your videos have saved me throughout graduate school. The simplicity and detail of how you explain is priceless! I owe you big time!! Best wishes from Mexico.
@SatyaRanjanSahoowmu · 9 years ago
I think the last equation should be logP(N*Xbar) instead of just logP(Xbar)
@Partho2525 · 10 years ago
You are really, really good... I never understood this concept this clearly.
@SpartacanUsuals · 10 years ago
Hi Prektesh, glad to hear it was useful! Best, Ben
@sami-samim · 8 years ago
I love the way you recap previous lectures.
@internetscheisse · 8 years ago
Thanks a lot for this helpful video! But I still have a question: in the last line, shouldn't it be N*mean(x) instead of just mean(x)?
@SpartacanUsuals · 8 years ago
Hi, thanks for your message - yep you are right - there should be an 'N' before the last mean(x). My apologies for the confusion! Best, Ben
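For readers following this correction, here is a sketch of the final line with the N in place, writing x̄ for the sample mean of the N Bernoulli observations:

```latex
\log L(p) = \sum_{i=1}^{N}\left[ x_i \log p + (1 - x_i)\log(1 - p) \right]
          = N\bar{x}\,\log p + N(1 - \bar{x})\,\log(1 - p),

\frac{d}{dp}\log L(p) = \frac{N\bar{x}}{p} - \frac{N(1 - \bar{x})}{1 - p} = 0
\quad\Longrightarrow\quad \hat{p} = \bar{x}.
```

Note that the missing N changes the value of the log-likelihood but not the location of its maximum, which is why the final answer p̂ = x̄ in the video is unaffected by the typo.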
@YawarSohail · 7 years ago
Oh, I was wondering what happened to the N. Glad he asked it.
@abokbeer · 8 years ago
You are my statistics hero. I shared this with all my colleagues.
@horsepowerfactorytx · 11 years ago
I want to thank you for such a great explanation. My professor does not teach half as well as you. Cheers.
@jacquelineoverton8223 · 8 years ago
Beautiful explanation of log likelihood! Thank you. This is tremendously helpful!
@zeeshan008x52 · 5 years ago
Rockstar of statistics. Your teaching style is very good, sir.
@notyouraveragecat97 · 1 year ago
This explanation is beyond amazing. Thank you so much.
@elenae1366 · 5 years ago
Great explanation of ML! This whole playlist is very helpful for understanding the intuition behind these concepts, especially since you show the mathematical derivations clearly. Thank you for posting!
@tsasa192 · 10 years ago
I watched many videos and was so confused. Got it now. Yours is the best!
@N.E.Olsson · 8 years ago
Thanks Ben! The lecturers tend to make it more complicated than it is when they skip this kind of thorough explanation. Big thanks, it is very clear now! // Niklas
@Simbwhaha · 6 years ago
In my eyes, you obtained “hero status” !
@charlesledesma305 · 4 years ago
Excellent explanation!
@ArtisticContingent · 8 years ago
Audio needs boosting but this was a very helpful video :) thanks for making it!
@kikikikikiki7859 · 6 years ago
This is really helpful and it really makes sense to me!
@akjagsdfiausd · 10 years ago
In the end shouldn't it be log(p)Nx(bar) + log(1-p)(N - Nx(bar)) rather than log(p)x(bar) + log(1-p)(N - Nx(bar))? Great video, nonetheless!
@zangasenor · 10 years ago
The uploader corrected the mistake through annotations at 6:35
@kamikaze9271 · 7 years ago
Yeah... since the background is black, it's kind of hard to see the annotation.
@indragesink · 9 years ago
You can also rewrite L as p^(sum from i=1 to n of xi) * (1-p)^(n - sum from i=1 to n of xi), which seems a lot easier, but I guess you'll still need the product rule because of the p and (1-p) factors in the base.
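A quick numerical check of the rewrite @indragesink describes: the per-observation product of Bernoulli pmf terms and the compact exponent form give identical likelihoods. (The function names and toy sample below are my own, not from the video.)

```python
import math

def likelihood_product(xs, p):
    """Likelihood as a product of Bernoulli pmf terms, one per observation."""
    L = 1.0
    for x in xs:
        L *= p**x * (1 - p)**(1 - x)
    return L

def likelihood_compact(xs, p):
    """Same likelihood in the compact form p^(sum x) * (1-p)^(n - sum x)."""
    s = sum(xs)
    return p**s * (1 - p)**(len(xs) - s)

xs = [1, 0, 1, 1, 0]   # toy sample: 3 "successes" out of 5
p = 0.4
print(math.isclose(likelihood_product(xs, p), likelihood_compact(xs, p)))  # True
```

Both forms are the same function of p, so either can be fed into the log-and-differentiate recipe from the video.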
@SC-hu2rt · 7 years ago
Thanks a lot for your explanation! Really helpful and encouraging
@chriskiguongo138 · 8 years ago
Saved me on the night before my statistics exam.
@zarakkhan72 · 10 years ago
A very nice explanation. Understood it completely except for the final line. The same point raised by "dlambriex" in the comments is my question as well.
@SpartacanUsuals · 10 years ago
Thanks for your message. Yes, you were correct. That was a mistake. I have now added a note to the video. Best, Ben
@sardaou3944 · 3 years ago
Is it possible, sir, to explain in more detail why differentiating the log of the likelihood leads to the same conclusion as differentiating the likelihood itself? Thanks in advance.
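To expand on this question: because log is strictly increasing, L(p1) > L(p2) exactly when log L(p1) > log L(p2), so the two functions peak at the same p. A small grid-search sketch illustrates this (the numbers are illustrative, not from the video):

```python
import math

def L(p, n=10, k=7):
    # Bernoulli likelihood for k successes in n trials (one fixed ordering)
    return p**k * (1 - p)**(n - k)

grid = [i / 1000 for i in range(1, 1000)]   # p in (0, 1)
argmax_L = max(grid, key=lambda p: L(p))
argmax_logL = max(grid, key=lambda p: math.log(L(p)))
print(argmax_L, argmax_logL)   # both 0.7, which is k/n
```

The log is used purely for convenience: it turns products into sums, which are far easier to differentiate, without moving the maximizer.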
@alandubackupchannel5201 · 8 years ago
Have you considered the case where X1=M, X2=F, but your sample is X1=F, X2=M? The samples are the same, but the likelihood function L doesn't take that into account. Sorry if I'm wrong, I have just begun learning this. So, the likelihood estimator is just the chance that you get a particular sample (where the ordering *does* matter)?
@ScottEdwards2000 · 3 years ago
I was thinking the same thing! Yeah, it seems like this is calculating the probability of getting this sample in this exact order of M's and F's. Is that right?
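On the ordering question: the likelihood in the video is indeed the probability of one particular ordered sequence. Counting unordered samples multiplies it by the binomial coefficient C(n, k), which does not depend on p, so the maximizing p is the same either way. A sketch (the toy numbers are my own):

```python
from math import comb

def seq_likelihood(p, n, k):
    """Probability of one particular ordered sequence with k successes in n trials."""
    return p**k * (1 - p)**(n - k)

def binom_likelihood(p, n, k):
    """Probability of k successes in any order: the binomial pmf."""
    return comb(n, k) * seq_likelihood(p, n, k)

n, k = 10, 4
grid = [i / 1000 for i in range(1, 1000)]
# comb(n, k) is constant in p, so both versions peak at p-hat = k/n = 0.4
print(max(grid, key=lambda p: seq_likelihood(p, n, k)))
print(max(grid, key=lambda p: binom_likelihood(p, n, k)))
```

This is why textbooks often drop constant factors from the likelihood: they shift the curve up or down but never move its peak.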
@rosamaytapalacios · 9 years ago
Thanks for the explanation. I have a doubt: in which cases is the negative log used?
@dheerajsharma5279 · 7 years ago
Hi Ben, amazing video. Can I please ask if that's N*X(bar) at the end?
@carolinaxlopez7495 · 10 years ago
Thank you so much for your great videos! I really loved them all! A very clear and simple way to grasp complicated stuff :)
@abhilamsal7185 · 7 years ago
Setting the first derivative to zero might give a maximum or a minimum. So, in my opinion, further steps are required, like showing that the second derivative is less than zero.
@comvnche · 3 years ago
or even a local max or min
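The concern in this thread can be settled directly for the Bernoulli case: the second derivative of the log-likelihood is negative everywhere on (0, 1), so log L is concave and the stationary point is the global maximum:

```latex
\frac{d^2}{dp^2}\log L(p)
  = -\frac{N\bar{x}}{p^2} - \frac{N(1 - \bar{x})}{(1 - p)^2} < 0
  \qquad \text{for all } p \in (0, 1).
```

(If x̄ is exactly 0 or 1, the likelihood is monotone in p and the maximum sits at the boundary, but the estimate p̂ = x̄ still holds.)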
@danielbhc · 6 years ago
Thanks a lot! I'm just a little confused. p hat would be an estimator of how many males are in the population. But for a more general MLE understanding, is p hat just the parameter that makes the sample data fit the PDF f(x|p), where p is the unknown parameter? So, for example, could you do MLE for a Poisson distribution where the unknown parameter p would be the mean lambda? Could a Poisson MLE still be described by f(x|p)? Thanks!
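Yes, the same recipe applies to any parametric family: write down f(x|theta), take logs, sum over the observations, and maximize. For the Poisson case the MLE of lambda turns out to be the sample mean, which a quick grid search confirms (the toy data below are my own, not from the video):

```python
import math

def poisson_loglik(lam, xs):
    """Poisson log-likelihood: sum of log( lam^x * exp(-lam) / x! ) over the data."""
    return sum(x * math.log(lam) - lam - math.lgamma(x + 1) for x in xs)

xs = [2, 3, 1, 4, 0, 2]                    # toy count data, sample mean = 2.0
grid = [i / 100 for i in range(1, 1001)]   # lambda in (0, 10]
lam_hat = max(grid, key=lambda lam: poisson_loglik(lam, xs))
print(lam_hat)   # 2.0, the sample mean
```

Analytically: d/dlam of the log-likelihood is (sum of x)/lam - n, which is zero at lam = x̄, matching the grid search.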
@lmgpc · 6 years ago
You are great! Really helpful.
@ashankavindu2409 · 3 years ago
Thank you very much sir ♥
@dlambriex · 11 years ago
First of all, thank you for your videos! They have helped a lot with studying for my econometrics courses! But I have a question regarding the last line in your last simplification. You state that the summation of Xi is equal to N * Xaverage (in yellow on the right). Yet when you erase the sum of Xi in the formula, you only write Xaverage and not N*Xaverage, while for the second summation you do simplify that sum of (1-Xi) = N(1-Xaverage). Could you please confirm that this should in fact be N * Xaverage? Thanks ever so much in advance, and keep up the good work; it is really a lifesaver for me and my fellow students in the Netherlands!
@zarakkhan72 · 10 years ago
Yup, I also didn't get that part.
@SpartacanUsuals · 10 years ago
Hi, Thanks for your messages, and kind words. Yes, it should be N times Xaverage; my apologies for the mistake. I shall add a note to the video now. Best, Ben
@guilhermearaujo5868 · 4 years ago
But how can I be sure that I am maximizing the function? The turning point could be a maximum or a minimum, no? Does the likelihood function not have a minimum (I imagine it does, and it's 0), or is its minimum not a turning point?
@regularviewer1682 · 6 years ago
How do you get N(1-xbar) at the very end? I can't figure it out for the life of me, even though I know it's relatively simple!
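For anyone else stuck on the same step: the N(1 − x̄) term is just the second summation rewritten using the fact that the xᵢ sum to N x̄:

```latex
\sum_{i=1}^{N}(1 - x_i) = N - \sum_{i=1}^{N} x_i = N - N\bar{x} = N(1 - \bar{x}).
```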
@igorrizhyi5572 · 6 years ago
I didn't get why we can so easily wrap this all in a log without worrying that it will be a completely different expression. I see how it makes our calculations much easier, but do you have some sources where I can get material about this log wrapping? Thanks.
@igorrizhyi5572 · 6 years ago
OK, I think I got it. If somebody is stuck on the same problem of understanding this material, I can recommend this URL: en.wikipedia.org/wiki/Monotonic_function
@jeongsungmin2023 · 1 year ago
In 6:50 shouldn’t sigma Xi equal N*(X bar)?
@adelinesou8899 · 5 years ago
This is so helpful thank you!
@leojboby · 7 years ago
If log is always increasing, when is the derivative 0?
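A short answer to this: log is increasing as a function of its argument L, but log L(p) is being differentiated with respect to p, not with respect to L. By the chain rule,

```latex
\frac{d}{dp}\log L(p) = \frac{L'(p)}{L(p)},
```

and since L(p) > 0, this is zero exactly where L'(p) = 0, i.e. at the turning points of the likelihood itself.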
@chandnibhudia624 · 10 years ago
you are amazing!!
@SpartacanUsuals · 10 years ago
Hi, thanks for your message and kind words. Glad to hear it was helpful! Best, Ben
@khalidmzd · 10 years ago
Excellent
@SpartacanUsuals · 10 years ago
Hi, thanks for your message, and kind words. Glad to hear that it was useful. Best, Ben
@anthonymatovu6169 · 7 years ago
Thank you very much.
@endeshawhabte6667 · 9 years ago
I MUST THANK YOU!!
@LovelyWorldFressia · 4 years ago
Thank you so much! I was able to get it fast with this video. (Tried other methods for a day and didn't get it >.
@bend.4506 · 6 years ago
thank you soooo much!!!!
@jansiranibdumtcs · 5 years ago
The (unobserved) parameters are Θ = (µ1, …, µn, σ²) and the parameter space is Ѳ = Rⁿ × R₊. (a) Derive the maximum likelihood estimator (MLE) for Θ. Please answer this for me.
@user-ck1hh9lh5s · 5 years ago
Why did the xi become an xbar?
@abbe23456789 · 5 years ago
It is an error, it is supposed to be N*x_bar
@lehmann_clips · 10 years ago
awesome
@SpartacanUsuals · 10 years ago
Hi, many thanks for your message and kind words. All the best with your studies, Ben