I am so grateful for your videos; this is the first time I have fully comprehended this subject. Thank you for sharing!
@jbboyne 3 months ago
Bayesian stats made me multiply multivariate probability density functions by other PDFs, and I'm still mad about it.
@saamyarana2096 3 months ago
thank you so much for explaining with such clarity
@lilbee2233 4 months ago
I’m so confused 😢
@jnvststudy233 5 months ago
Thank you so much for these amazing video lectures 🙏 Love and respect from India. Keep uploading more videos on mathematical statistics.
@kalyanroy8712 7 months ago
Thanks ma'am for this explanation. It was clear and concise.
@Sohorioots 8 months ago
I would not have passed advanced statistics without you - thank you so much for these videos!
@m_c_8656 8 months ago
nice!
@duane3000 10 months ago
Is it not (1+theta)^-1?
@fjumi3652 10 months ago
why is it mu - ybar in the likelihood and not ybar - mu?
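Assuming the term in question sits squared inside the exponent of the Normal likelihood (as in the usual derivation), the order of subtraction makes no difference:

```latex
% Because the difference is squared, the sign is irrelevant:
(\mu - \bar{y})^2 = \bigl(-(\bar{y} - \mu)\bigr)^2 = (\bar{y} - \mu)^2,
\qquad\text{so}\qquad
\exp\!\left(-\frac{n(\mu - \bar{y})^2}{2\sigma^2}\right)
= \exp\!\left(-\frac{n(\bar{y} - \mu)^2}{2\sigma^2}\right).
```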
@chloechong940 10 months ago
This is simple, straightforward, and easy to understand; it walked me straight through the steps of deriving the exponential family form of a Normal distribution. Thank you so much!
@NiklasLandsberg-i1u 11 months ago
Super helpful!
@jexray-rt3eb 11 months ago
Thank you very much mchinkuuu
@andrestorres7343 11 months ago
Could you explain the logic of going from "this is proportional to" to "this is equal to"?
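One way to read that step, assuming the video is matching a proportional expression to a recognizable density: a posterior is pinned down by its shape in the parameter up to a normalizing constant, so once the kernel of a known distribution is recognized, equality follows by normalizing.

```latex
% If the posterior is proportional to a recognizable kernel g(\theta),
% the missing constant is fixed by requiring the density to integrate to 1:
p(\theta \mid y) \propto g(\theta)
\;\Longrightarrow\;
p(\theta \mid y) = \frac{g(\theta)}{\int g(\theta')\, \mathrm{d}\theta'}.
```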
@radiancewithjasmin 1 year ago
Thank you so much for these videos! I feel like I can actually understand this now haha
@adabwana1253 1 year ago
Thank you. I'm still wondering where you got phi at 4:14; it felt arbitrary.
@oliviacummings2112 1 year ago
Thank you for this video!! Super helpful and well explained!
@MiyadhIslamNup 1 year ago
❤❤
@muhammadsulaman5426 1 year ago
Thank you ❤❤
@oscarmark4430 1 year ago
Very good 👏
@dinaashraf9199 1 year ago
Thanks a lot!
@timurtenishev6243 1 year ago
thank you, great explanation
@playlists_bmac 1 year ago
perfect.
@sergiomwendwa809 1 year ago
Thank you so much madam professor.
@harrietfosuahquansah4180 1 year ago
Great video 😊 Please: when the expression n log(1-p) was written in terms of theta, it turned negative, i.e. -n log(1+e^theta), and the b(theta) function is subtracted from x·theta, so by substitution the negatives should turn positive... Kindly clarify that portion for me. Thank you.
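A sketch of the sign bookkeeping, assuming the video takes the canonical parameter to be the log-odds theta = log(p/(1-p)) and writes the log-density in the form x·theta - b(theta) + c:

```latex
% With \theta = \log\frac{p}{1-p}, we have 1-p = \frac{1}{1+e^{\theta}}, so
n\log(1-p) = -\,n\log\bigl(1+e^{\theta}\bigr).
% Matching x\theta - b(\theta) + c to the log-density therefore gives
b(\theta) = n\log\bigl(1+e^{\theta}\bigr),
% i.e. the negative term in the density is exactly -b(\theta):
% the minus sign from the substitution is absorbed into the definition
% of b(\theta) rather than cancelling against it.
```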
@sfundoy5dube59 1 year ago
We thank you so much, ma'am 🙏🙏🙏 Well explained; we appreciate you.
@spencerwadsworth1638 1 year ago
So mu ~ N(m, s2) and yi ~ N(mu, sigma2)?
@meenakshigautam4249 1 year ago
Please calculate the Bayes factor for this.
@nondogopal1251 1 year ago
Thanks
@nondogopal1251 2 years ago
Wow thanks
@dr.m.b.c. 2 years ago
Thanks a lot. 🙂
@tomatocultivator9539 2 years ago
Thank you for saving my head... I was constantly trying to understand the steps... My textbook jumps directly to the mean and variance without any steps in the middle...
@sushifishwins 2 years ago
So much easier to understand now! My lecturer had confused me.
@shabnamkohistani20 2 years ago
5. Today, Sasha checked their weight several times with different scales, observing (in kilograms): 92, 82, 83, 86, 86, 90, 83, 84, 89, 85. Assume that the data are normal with variance σ² = 9 and a prior distribution for the true weight µ ∼ N(80, 100). (a) What is the posterior distribution? (b) Compute the 95% credible interval for µ a priori and a posteriori. (c) Compare both intervals with the frequentist 95% confidence interval. Can you conclude that I was optimistic?
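A minimal sketch of the conjugate normal-normal update for this exercise, assuming the stated prior µ ~ N(80, 100), known data variance σ² = 9, and that the listed weights are the full sample (the variable names and scipy usage are just for illustration):

```python
# Conjugate normal-normal update with known data variance.
import numpy as np
from scipy import stats

y = np.array([92, 82, 83, 86, 86, 90, 83, 84, 89, 85], dtype=float)
n, ybar = len(y), y.mean()
sigma2 = 9.0            # known data variance
m0, s2_0 = 80.0, 100.0  # prior mean and variance for mu

# Posterior precision = prior precision + data precision.
post_prec = 1.0 / s2_0 + n / sigma2
post_var = 1.0 / post_prec
post_mean = post_var * (m0 / s2_0 + n * ybar / sigma2)

z = stats.norm.ppf(0.975)  # ~1.96
prior_ci = (m0 - z * np.sqrt(s2_0), m0 + z * np.sqrt(s2_0))
post_ci = (post_mean - z * np.sqrt(post_var), post_mean + z * np.sqrt(post_var))
freq_ci = (ybar - z * np.sqrt(sigma2 / n), ybar + z * np.sqrt(sigma2 / n))

print(f"(a) posterior: N({post_mean:.2f}, {post_var:.2f})")
print(f"(b) prior 95% credible interval:     {prior_ci}")
print(f"    posterior 95% credible interval: {post_ci}")
print(f"(c) frequentist 95% CI for the mean: {freq_ci}")
```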
@leiyeeduck 2 years ago
Hey, it should be n*y_bar^2/sigma^2 in the constant term. I know this is not important haha but just a reminder lol
@LSMARTChannel 2 years ago
I really like your video. Thanks. God bless you. Do you have Instagram, maybe?
@d1a2n3i5e8l 2 years ago
Thank you
@joacorapela 2 years ago
This result seems to be false. Counterexample: n=2, y_1=0, y_2=0, \mu=0, \sigma^2=1. Then f(y_1, y_2 | \mu, \sigma^2) = N(0|0, 1) * N(0|0, 1) = 0.3989 * 0.3989 = 0.1592, which is different from N(\mu | \bar{y}, \sigma^2/2) = N(0|0, 0.5) = 0.5642. Can anybody guess what is wrong in the proof?
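One possible resolution, assuming the statement in the video is meant as proportionality in µ rather than pointwise equality: viewed as a function of µ, the joint likelihood equals N(µ | ȳ, σ²/n) times a constant that does not involve µ, which is why the two numbers above differ. A quick check with the counterexample's own data:

```python
# Check that likelihood(mu) / N(mu | ybar, sigma^2/n) is the same constant
# for every mu, i.e. the relationship is proportionality, not equality.
import numpy as np
from scipy import stats

y = np.array([0.0, 0.0])  # the counterexample's data
sigma2 = 1.0
n, ybar = len(y), y.mean()

for mu in [-1.0, 0.0, 0.5, 2.0]:
    likelihood = np.prod(stats.norm.pdf(y, loc=mu, scale=np.sqrt(sigma2)))
    kernel = stats.norm.pdf(mu, loc=ybar, scale=np.sqrt(sigma2 / n))
    print(mu, likelihood / kernel)  # same ratio (~0.282) for every mu
```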
@hongkyulee9724 2 years ago
Thank you for the very intuitive explanation.
@sumankafle4421 2 years ago
Do we need to introduce the Y variable at all? Can we conclude it from the X?
@vandarkholme442 2 years ago
Why didn't my lecturer mention this? He just stated what the minimal sufficient statistic is, which was so confusing at the time.
@Toady.7871 2 years ago
Your videos are a blessing. So clear and reduced to the relevant information. Thank you very much!
@rafaelbechaves 2 years ago
You are amazing at teaching! Thank you so much!!
@vitaminprotein8630 2 years ago
The Indian teaching method is easier than this
@setyol11 2 years ago
Ma'am, can you derive the maximum likelihood estimator for the 3-parameter inverse Gaussian, please?
@willhitchcock5139 2 years ago
If mu_l = y_bar, then mu_l^2 does not equal (1/n) sum(y_i^2), and it doesn't match up with what you had before.
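For what it's worth, the two quantities do differ in general; assuming the step in question involves the average of the squared observations, they are related by the identity below.

```latex
% Mean of squares versus square of the mean:
\frac{1}{n}\sum_{i=1}^{n} y_i^2
= \bar{y}^{\,2} + \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - \bar{y}\bigr)^2,
% so they coincide only when the sample variance is zero.
% Example: y = (0, 2) gives \bar{y}^{\,2} = 1 but \tfrac{1}{n}\sum y_i^2 = 2.
```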
@MOHSINALI-bk2qo 2 years ago
Can you please tell me how this e^theta/(1+e^theta) comes about? Please help with this, or give a reference.
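Assuming the video defines the natural parameter as the log-odds, theta = log(p/(1-p)), the expression comes from inverting that relation:

```latex
% Solve \theta = \log\frac{p}{1-p} for p:
e^{\theta} = \frac{p}{1-p}
\;\Longrightarrow\;
e^{\theta}(1-p) = p
\;\Longrightarrow\;
e^{\theta} = p\bigl(1+e^{\theta}\bigr)
\;\Longrightarrow\;
p = \frac{e^{\theta}}{1+e^{\theta}}.
```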
@Siroitin 2 years ago
multiply, expand, subtract and divide
@yuqianchen7820 2 years ago
Really helpful, going through all the steps so clearly!! The whole series on the exponential family is awesome; it's been a lot of help for my exams.
@musefakedir2712 2 years ago
Oh, amazing! Please finish it. Thank you very much!