The use of maximum likelihood estimation to estimate the mean of a normally distributed random variable.
Comments: 41
@aidosmaulsharif9570 · 3 years ago
Man, this algebraic explanation of the squared deviations as a quadratic equation was a completely mind-blowing thing: instead of just taking the derivative and setting it equal to zero, you simply solved the quadratic equation. That was a great eureka moment for me. Thank you so much, man!!! Subbed
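For readers who want that algebra spelled out, here is a sketch of the quadratic argument the comment refers to (the symbols are assumed, not necessarily the ones used in the video). The sum of squared deviations is a parabola in mu with a positive leading coefficient, so its minimum sits at the vertex:

$$\sum_{i=1}^{n}(x_i-\mu)^2 = n\mu^2 - 2\Big(\sum_{i=1}^{n} x_i\Big)\mu + \sum_{i=1}^{n} x_i^2,$$

$$\hat{\mu} = -\frac{B}{2A} = \frac{2\sum_{i=1}^{n} x_i}{2n} = \bar{x},$$

with $A = n$ and $B = -2\sum_i x_i$, so the value that maximizes the likelihood is the sample mean.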
@BambooForestRecords · 12 years ago
I was so confused by my stupid notes. Thank you for explaining it so clearly. You are boss.
@christopherpflaum8989 · 11 years ago
I have been struggling to understand this since graduate school - now I do. Thank you.
@TheRealRslive · 10 years ago
You sound like you just ran 100 miles.
@souslicer · 9 years ago
What about the variance, how do I estimate that?
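The video covers only the mean, but below is a minimal sketch of the analogous MLE for the variance under the same normal model; the function name and the simulated data are illustrative, not from the video. Note that the MLE divides by n, not n - 1, so it is slightly biased downward in small samples.

```python
import numpy as np

def normal_mle(x):
    """Maximum likelihood estimates of the mean and variance of a normal sample."""
    x = np.asarray(x, dtype=float)
    mu_hat = x.mean()                      # MLE of the mean: the sample average
    var_hat = np.mean((x - mu_hat) ** 2)   # MLE of the variance: average squared deviation (divides by n)
    return mu_hat, var_hat

# Example: estimate both parameters from simulated data.
rng = np.random.default_rng(0)
sample = rng.normal(loc=5.0, scale=2.0, size=1000)
print(normal_mle(sample))  # roughly (5.0, 4.0)
```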
@TheYeo1 · 13 years ago
Great tutorial, very clearly explained. Thanks.
@dchaitow · 13 years ago
That's where RSQ comes from! Thanks for the video.
@andrevauvelle4413 · 8 years ago
An example including n samples with different variances would also be great!
@muhammadhammadkarim8865 · 9 years ago
To make things a bit simpler: where it is mentioned that the minimum of the parabola is at -B/(2A), it would have been easier to say that the derivative is 0 at the minimum, so taking the derivative with respect to 'x' would yield the same thing. Good video nonetheless.
@Qbabxtra · 8 years ago
Intense breathing
@schaeferconstantin · 12 years ago
Very, very good lecture!
@f1mercury · 11 years ago
Excellent Demonstration. Thank you very much.
@tangled55 · 13 years ago
This is an EXCELLENT video on maximum likelihood estimation. Are you an actual professor?
@GAWRRELL · 10 years ago
Can you please make an example of this using real world data?
@study24816 · 11 years ago
Thanks for your tutorial! But may I ask a question? You brought it all together as (1/sqrt(2*pi*sigma))^n, meaning all observations have the same sigma? How come?
@TheYeo1 · 13 years ago
@MathHolt Hi... I have some doubts. My question may be silly, but why do we need to take the log? Can't we just differentiate and then set it to zero to get the maximum? Thanks.
@MrArunavadatta · 9 years ago
well explained
@gautama4590 · 5 years ago
Great explanation... at the end we could have simply found the derivative of the sum with respect to mu and equated it to zero to find that the MLE of mu is x-bar.
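A short worked version of that shortcut (my own notation, assuming a known sigma): differentiate the log-likelihood with respect to mu and set it to zero.

$$\ell(\mu) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2,
\qquad
\frac{\partial \ell}{\partial \mu} = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu) = 0
\;\Longrightarrow\;
\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x}.$$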
@krish240574mumbai · 8 years ago
BEAUTIFUL
@mrms154 · 7 years ago
Breathing unbelievably
@babyroo555 · 12 years ago
Genius!! Thank you so much, you have no idea how much you've helped me understand this!! If I knew you I'd hug you :p
@ukno918 · 10 years ago
I've seen least squares with linear algebra using Ax = b and finding coefficients for the polynomial that'll regress through the data. Is this somehow related?
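It is related: with normally distributed errors of equal variance, maximizing the likelihood over the coefficients is equivalent to minimizing the squared residuals, so the least-squares solution of Ax = b is also the MLE. Below is a small numerical check of that claim (my own sketch; the polynomial, noise level, and variable names are arbitrary assumptions, not from the video).

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
t = rng.uniform(-2, 2, size=200)
A = np.vander(t, 3)                      # design matrix for a quadratic polynomial
true_coef = np.array([0.5, -1.0, 2.0])
b = A @ true_coef + rng.normal(scale=0.3, size=t.size)   # noisy observations

# Least-squares solution of A x = b.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)

# MLE with normal errors: minimizing the negative log-likelihood in the
# coefficients reduces to minimizing the sum of squared residuals.
x_mle = minimize(lambda c: np.sum((b - A @ c) ** 2), x0=np.zeros(3)).x

print(x_ls)    # close to the true coefficients
print(x_mle)   # matches the least-squares coefficients to several decimals
```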
@mkrasmus · 9 years ago
Great
@satbirsaini8875 · 11 years ago
It has to do with the monotonic nature of the log: since the log only changes the y values, the x value where the max occurs remains the same. Or at least, that's how my prof put it.
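A quick numerical illustration of that point (a sketch of my own, not from the video): because the log is strictly increasing, the mu that maximizes the likelihood also maximizes the log-likelihood.

```python
import numpy as np
from scipy.stats import norm

x = np.array([2.1, 1.8, 2.5, 2.0, 1.6])   # small sample; sigma assumed known and equal to 1
mus = np.linspace(0, 4, 4001)              # candidate values of mu on a fine grid

likelihood = np.array([norm.pdf(x, loc=m, scale=1.0).prod() for m in mus])
log_likelihood = np.array([norm.logpdf(x, loc=m, scale=1.0).sum() for m in mus])

# Both curves peak at the same mu, which is the sample mean.
print(mus[likelihood.argmax()], mus[log_likelihood.argmax()], x.mean())  # 2.0 2.0 2.0
```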
@manmalhotra · 13 years ago
Good tutorial, but if you had just taken the log and then differentiated to find the maximum (of the whole negative term), it would have been a lot easier.
@lilmoesk899 · 7 years ago
One question: at 5:04 or so he says we can add the powers, but then writes a minus sign for all the powers of e. What am I missing?
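For anyone stuck on the same step, a hedged reconstruction of the standard algebra at that point (not a transcript of the video): each factor's exponent already carries its own minus sign, so when the exponents are added, a single minus sign remains in front of the sum.

$$\prod_{i=1}^{n} e^{-\frac{(x_i-\mu)^2}{2\sigma^2}}
= e^{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2}.$$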
@study24816 · 11 years ago
I see! So you focus on mu rather than sigma, and assume all the sigmas are constant and the same. Am I right?
@MrArunavadatta · 10 years ago
nice
@fakeplayer · 12 years ago
This heavy breathing........omg....
@EvolutionCode · 8 years ago
AWESOME! :D
@Rorkazak · 6 years ago
Did you just run a marathon?
@jmccoynpg · 10 years ago
\o/ MATH HOLT \o/
@annalam8624 · 7 years ago
thank you!!!
@shrimatkapoor2200 · 3 years ago
Why are you out of breath? Seems like a pretty strenuous calculation lol
@JH-tc5wz · 7 years ago
Why not just take dL/d(mu) = 0?
@pokirihoney · 11 years ago
I wish you were my teacher lol
@mahesh-mh9hg · 5 years ago
marry me
@visweshwaranra · 6 years ago
Good video, but his breathing is so irritating.