undergraduate machine learning 9: Hidden Markov models - HMM

167,716 views

Nando de Freitas

1 day ago

Comments: 68
@jmcarter9t 7 years ago
Nando's lectures are certainly some of the best on the web. Mastery is step one. Communicating that mastery is a real gift!
@htetnaing007 2 years ago
Don't stop sharing this knowledge; it is vital to the progress of humankind!
@blaznsmasher 6 years ago
Amazing! I've watched a couple of videos on HMMs before this, and this is by far the clearest and easiest to understand.
@qorod123 7 years ago
Having teachers like Prof. Nando makes me fall in love with science. Thank you so much, Professor.
@jmrjmr8254 10 years ago
Great! Now this paper I'm reading finally starts to make sense! Most helpful video on this topic!
@azkasalsabila5328 7 years ago
The best lectures I have ever watched on YouTube!!! Great professor. The explanation is easy to follow. Thank you.
@mehr1methanol 9 years ago
Very, very helpful!! Unfortunately, by the time I joined UBC you had already left for Oxford. But I'm so glad you have the lectures here.
@OmarCostillaReyes 11 years ago
Great presentation, Nando. You achieved teaching excellence in this lecture: you made your presentation interesting, funny, and knowledgeable. Congratulations!
@wackyfeet 7 years ago
I am so glad someone told him to fix the colour on the screen. I was losing it hoping he would fix the issue.
@23karthikb 7 years ago
Fantastic explanation Nando - great lecture! Thank you!
@tzu-minghuang7100 8 years ago
Great video for understanding HMMs, worth every minute.
@ThuleTheKing 11 years ago
It is by far the best presentation of HMMs out there. However, I miss an example going from x0 to x2, so that both cases (the initial step and the t-1 step) are illustrated. It would also be nice if you uploaded the assignments and answers.
@nargeschinichian6286 7 years ago
He is an amazing teacher! I suggest you watch his class!
@ilnurgazizov2959 4 years ago
Excellent! A great and clear explanation of HMM!
@fisherh9111 5 years ago
This guy has a great sense of humour.
@noorsyathirahmohdisa2720 7 years ago
Best explanation of all. Thank you, you helped my research on speech recognition.
@user-eh5wo8re3d 8 years ago
Really nice lecture. Very engaging and informative as well.
@MrA0989741818 9 years ago
Very good lecture!!! Thanks so much for saving me a large amount of time!
@hsinyang1796 4 years ago
I'm a student at UBC; we need you back teaching this course!
@asharigamage7486 6 years ago
Very clear explanation, and the best examples are from GoT. Thank you so much :-D
@_dhruvawasthi 3 years ago
At 43:22, why is it outside the Markov blanket?
@DrINTJ 9 years ago
Most lectures seem to go on at length and repeat the obvious parts ad infinitum, then jump over the important bits.
@pandalover555 7 years ago
"You were so happy to see those dragons" LOL this guy is hilarious
@riyasamanta3236 6 years ago
Simple, easy to understand, real-world problems. Thank you for this video. Can you upload more about the applications and extensions of HMMs?
@PoyanNabati 10 years ago
This is fantastic, thank you Nando!
@ar_rahman_90 7 years ago
Thank you so much for these lectures!
@qureshlokhandwala3866 10 years ago
Amazing, intuitive explanation. Thanks!
@doggartwuttus1082 11 years ago
Thank you! This was really engaging and helpful.
@acltm 10 years ago
Hi, could you also explain how to estimate the transition matrix?
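In case it helps later readers: when the state sequences are fully observed, the maximum-likelihood estimate of the transition matrix is just the normalized transition counts; when the states are hidden, you need Baum-Welch (EM), covered in the follow-up video listed below. A minimal counting sketch in Python (the variable names are illustrative):

    import numpy as np

    def estimate_transition_matrix(state_seqs, n_states):
        """MLE of A[i, j] = P(x_t = j | x_{t-1} = i) from observed state sequences."""
        counts = np.zeros((n_states, n_states))
        for seq in state_seqs:
            for prev, curr in zip(seq[:-1], seq[1:]):
                counts[prev, curr] += 1
        row_sums = counts.sum(axis=1, keepdims=True)
        row_sums[row_sums == 0] = 1  # avoid division by zero for unseen states
        return counts / row_sums

    # Example with states {0: happy, 1: sad}:
    A = estimate_transition_matrix([[0, 0, 1, 1, 0], [1, 0, 0]], n_states=2)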
@arnabpaul6996 5 years ago
Me before this video: X(0): Sad, Y(0): Crying. Me after this video: X(1): Happy, Y(1): Watching GoT. Me after watching GoT: X(2): Sad, Y(2): Crying, because the last season sucks.
@ivansorokin8054 10 years ago
If node x0 does not have a y0 in the graph, then when predicting P(x1|y0) we either assume P(x0|y0) = P(x0) or we don't count the transition from x0 to x1. I think the graph should have a node y0, so we can use Bayes' rule to compute P(x0|y0) as at the beginning of the lecture. Anyway, thanks for sharing the lectures.
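For reference, the predict-update recursion this thread is discussing, as a minimal NumPy sketch; the conventions A[i, j] = P(x_t = j | x_{t-1} = i) and B[i, k] = P(y = k | x = i) are assumed, and the prior over x0 is used unconditioned since there is no y0 node:

    import numpy as np

    def forward_filter(prior, A, B, observations):
        """Compute p(x_t | y_1..t) by alternating predict and update steps."""
        belief = prior  # p(x_0), used directly because no y_0 is observed
        for y in observations:
            predicted = belief @ A          # predict: sum_i p(x_{t-1} = i) A[i, j]
            unnorm = predicted * B[:, y]    # update: weight by likelihood p(y_t | x_t)
            belief = unnorm / unnorm.sum()  # normalize to get the posterior
        return belief

    # Example with states {0: happy, 1: sad} and illustrative numbers:
    prior = np.array([0.5, 0.5])
    A = np.array([[0.8, 0.2], [0.3, 0.7]])
    B = np.array([[0.7, 0.3], [0.2, 0.8]])  # columns: {0: watching GoT, 1: crying}
    print(forward_filter(prior, A, B, [1, 1, 0]))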
@amizan8653 11 years ago
Thank you very much for posting this!
@phillipblunt 6 years ago
Really fantastic lecture, thanks a lot!
@saijaswanth5085 2 years ago
Can I get the reference book for the above lecture?
@yuvrajsingh-wn3up 10 years ago
If the events W, S, C, and F are not mutually exclusive, what changes do we need to make to the HMM presented here?
@IndrianAmalia 9 years ago
Thanks for the amazing lecture! It helps me a lot :)
@ghufranghuzlan4404 7 years ago
OMG, the best explanation ever. Very helpful, thank you so much.
@Michael-kt3tf 4 years ago
Just wondering, since we only care about the posterior: why does the forward algorithm compute the joint distribution? What is the point of that?
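One way to see it: the joint alpha_t(i) = p(x_t = i, y_1..t) satisfies a clean recursion, and the posterior is recovered from it by a single renormalization, so nothing is lost by working with the joint. A small illustration (the numbers are made up):

    import numpy as np

    # Suppose the forward recursion produced the joint alpha_t[i] = p(x_t = i, y_1..t):
    alpha_t = np.array([0.03, 0.01])

    # The posterior is just the joint renormalized over states:
    posterior = alpha_t / alpha_t.sum()  # p(x_t | y_1..t)

    # The normalizer is not wasted work: at the final step it equals
    # p(y_1..T), the likelihood of the whole observation sequence,
    # which is exactly what you need for model comparison or learning.
    likelihood = alpha_t.sum()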
@AnekwongYoddumnern 8 years ago
Dear sir, if I use a PIR sensor with a Markov chain, how many states should I set?
@noorsyathirahmohdisa2720 7 years ago
Where can I find his next video on HMMs?
@ryuzakace 4 years ago
Were you able to find it? It seems the HMM part is missing from this lecture; there is a slide for it, though, which is not covered here.
@allisonzhang6527 9 years ago
Awesome! Thanks, Nando!
@youssefdirani 4 years ago
I didn't know about the smoothing assignment... What was it?
@aashishraina2831 8 years ago
I loved the material. Thanks a lot.
@shashanksagarjha2807 5 years ago
Can someone please let me know: can HMMs be used for anomaly detection? If yes, do they work better than techniques such as SMOTEENN and weighted classes?
@yuezhao8657 5 years ago
I don't feel that either HMMs or SMOTE are major anomaly detection techniques. The more common approaches are LOF, isolation forest, OCSVM, ABOD, LOCI, and so on.
@shashanksagarjha2807 5 years ago
@yuezhao8657 As far as I know, LOF and isolation forest work better for unsupervised learning, but techniques such as weighted classes or SMOTEENN work better when we have labels. How much accuracy can we get with an HMM?
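For what it's worth, when an HMM is used for anomaly detection it is usually as a density model: fit it to normal sequences, then flag a new sequence whose length-normalized log-likelihood is unusually low. A minimal sketch building on the forward recursion, with pi, A, B, and the threshold all assumed given rather than prescribed:

    import numpy as np

    def sequence_log_likelihood(prior, A, B, observations):
        """log p(y_1..T) accumulated from the scaled forward recursion."""
        belief, log_lik = prior, 0.0
        for y in observations:
            unnorm = (belief @ A) * B[:, y]
            norm = unnorm.sum()
            log_lik += np.log(norm)  # adds log p(y_t | y_1..t-1)
            belief = unnorm / norm   # rescale to avoid underflow
        return log_lik

    def is_anomalous(prior, A, B, seq, threshold):
        # Flag sequences whose average per-step log-likelihood is below threshold.
        return sequence_log_likelihood(prior, A, B, seq) / len(seq) < threshold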
@ddarhe 7 years ago
At the beginning of the lecture, shouldn't the columns of the table add up to 1 instead of the rows? P(y|x) + P(y|~x) = 1, right?
@gggrow 7 years ago
No... P(y|x) + P(~y|x) = 1. Whereas P(y|x) + P(y|~x) means "the probability of y given x plus the probability of y given not x". That could equal more than 1 if y is likely in both cases, or less than 1 if y is unlikely in both cases.
@gggrow 7 years ago
So... P(sad|crying) + P(sad|not crying) doesn't have to equal one, because maybe I'm not likely to be sad either way, but P(sad|crying) + P(not sad|crying) = 1 because that exhausts the list of possible states; I have to be either sad or not!
@charliean9237 7 years ago
That's what I thought too. Summing rows to 1 means this puppy always does one of the 4 things, and the puppy never eats. However, summing columns to 1 means the puppy is either happy or sad, which makes more sense.
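To make the convention concrete: with rows indexed by the hidden state and columns by the observation, as in the lecture's table, each row of P(y|x) is a distribution over observations and must sum to 1, while the columns need not sum to anything in particular. A quick check with illustrative numbers:

    import numpy as np

    # Emission table B[i, k] = P(y = k | x = i): rows = hidden states (happy, sad),
    # columns = observations (e.g. watching GoT, sleeping, crying, browsing).
    B = np.array([[0.4, 0.3, 0.1, 0.2],   # P(y | happy)
                  [0.1, 0.2, 0.5, 0.2]])  # P(y | sad)

    assert np.allclose(B.sum(axis=1), 1.0)  # rows sum to 1: all y for a fixed x
    print(B.sum(axis=0))                    # columns: [0.5, 0.5, 0.6, 0.4], unconstrained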
@upinsanity 8 years ago
Absolute masterpiece!
@gopalnarayanan4217 8 years ago
Very good explanation.
@shineminem 11 years ago
OMG this is so helpful!
@ralphsaymoremakuyana7126 9 years ago
Great, well explained!!
@kunwaravikalnathmathur2003 6 years ago
This video has Bayes' theorem applied in full form.
@RelatedGiraffe 10 years ago
6:37 We are all gonna be there one day? Speak for yourself! :P
@bhagzz 7 years ago
Really good one :)
@pebre79 11 years ago
Wow, thanks for posting!
@engomasri 8 years ago
Thanks so much!
@roseb2105 7 years ago
I don't understand the equation.
@abcborgess 8 years ago
Brilliant.
@tina3829 7 years ago
Super!
@WilliamStevenDonald 8 years ago
Very good.
@samidelhi6150 5 years ago
@YuePeng Guo Why?
@dr.loucifhemzaemmysnmoussa7686 7 years ago
Great!
@yatmoparni993 6 years ago
Dr. Loucif Hemza Emmys n Moussa
@nickizcool20 3 years ago
HMM 🤔
Hidden Markov Models 12: the Baum-Welch algorithm
27:02
Hidden Markov Models
30:18
Bert Huang
88K views
Hidden Markov Model : Data Science Concepts
13:52
ritvikmath
135K views
A friendly introduction to Bayes Theorem and Hidden Markov Models
32:46
Serrano.Academy
483K views
Machine learning - Importance sampling and MCMC I
1:16:18
Nando de Freitas
86K views
CS480/680 Lecture 17: Hidden Markov Models
1:01:31
Pascal Poupart
16K views
Hidden Markov Model | Clearly Explained
16:37
LiquidBrain Bioinformatics
17K views
CS885 Lecture 11a: Hidden Markov Models
29:29
Pascal Poupart
6K views
10. Markov and Hidden Markov Models of Genomic and Protein Features
1:18:26
MIT OpenCourseWare
39K views
Hidden Markov Model Clearly Explained! Part - 5
9:32
Normalized Nerd
519K views