Hidden Markov Model : Data Science Concepts

112,103 views

ritvikmath


All about the Hidden Markov Model in data science / machine learning

Comments: 187
@totomo1976 1 year ago
Thank you so much for your clear explanation!!! Look forward to learning more machine-learning related math.
@rssamarth099 7 months ago
This helped me at the best time possible!! I didn't know jack about the math a while ago, but now I have a general grasp of the concept and was able to chart down my own problem as you were explaining the example. Thank you so much!!
@stevengreidinger8295 3 years ago
You gave the clearest explanation of this important topic I've ever seen! Thank you!
@chadwinters4285 2 years ago
I have to say you have an underrated way of providing intuition and making difficult-to-understand concepts really easy.
@13_yashbhanushali40 1 year ago
Unbelievable explanation!! I have referred to more than 10 videos explaining the basic workflow of this model, but I'm sure this is the easiest explanation one can ever find on YouTube. The practical, example-driven way of explaining it was much needed, and you did exactly that. Thanks a ton, man!
@user-xj1pi5ec6x 4 months ago
True experts always make it easy.
@ashortstorey-hy9ns 2 years ago
You're really good at explaining these topics. Thanks for sharing!
@coupmd 2 years ago
Wonderful explanation. I hand calculated a couple of sequences and then coded up a brute force solution for this small problem. This helped a lot! Really appreciate the video!
@pinkymotta4527 2 years ago
Crystal-clear explanation. I didn't have to pause the video or go back at any point. Would definitely recommend it to my students.
@beyerch 3 years ago
Really great explanation of this in an easy-to-understand format. Slightly criminal to not at least walk through the math on the problem, though.
@froh_do4431 3 years ago
really good work on the simple explanation of a rather complicated topic 👌🏼💪🏼 thank you very much
@mohammadmoslemuddin7274 3 years ago
Glad I found your videos. Whenever I need some explanation for hard things in Machine Learning, I come to your channel. And you always explain things so simply. Great work man. Keep it up.
@ritvikmath 3 years ago
Glad to help!
@linguipster1744 3 years ago
oooh I get it now! Thank you so much :-) you have an excellent way of explaining things and I didn’t feel like there was 1 word too much (or too little)!
@spp626 1 year ago
Such a great explanation! Thank you sir.
@VascoDaGamaOtRupcha 9 months ago
You explain very well!
@louisc2016 2 years ago
I really like the way you explain something, and it helps me a lot! Thx bro!!!!
@beckyb8929 2 years ago
beautiful! Thank you for making this understandable
@hichamsabah31 3 years ago
Very insightful. Keep up the good work.
@zishiwu7757 3 years ago
Thank you for explaining how HMM model works. You are a grade saver and explained this more clearly than a professor.
@ritvikmath 3 years ago
Glad it was helpful!
@srijanshovit844 6 months ago
Awesome explanation, I understood it in one go!!
@songweimai6411 1 year ago
Really appreciate your work. Much better than the professor in my class who has a pppppphhhhdddd degree.
@paulbrown5839 3 years ago
To get to the probabilities in the top right of the board, you keep applying P(A,B) = P(A|B)·P(B), e.g. with A = C3 and B = (C2, C1, M3, M2, M1). Keep applying P(A,B) = P(A|B)·P(B) and you will end up with the same probabilities as shown in the top right of the whiteboard. Great video!
@ritvikmath 3 years ago
Thanks for that!
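A worked version of that chain-rule factorization, written out in plain-text notation under the video's first-order Markov and emission assumptions (the specific numbers are the ones that appear in toyomicho's code further down this thread):

P(C1, C2, C3, M1, M2, M3) = P(M1) · P(M2|M1) · P(M3|M2) · P(C1|M1) · P(C2|M2) · P(C3|M3)

For the hidden sequence (s, s, h) with observed clothes (green, blue, red), this gives 0.6 · 0.5 · 0.5 · 0.3 · 0.5 · 0.8 = 0.018, the maximizing value.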
@ummerabab8297 1 year ago
Sorry, but I still don't get the calculation at the end. The whole video was explained flawlessly, but the calculation was left out. I don't understand it. If you could please help further. Thank you.
@toyomicho 1 year ago
@ummerabab8297 Here is some code in Python showing the calculation. In the output, you'll see that the hidden sequence s->s->h has the highest probability (0.018).

##### code ####################
def get_most_likely():
    starting_probs = {'h': .4, 's': .6}
    transition_probs = {'hh': .7, 'hs': .3, 'sh': .5, 'ss': .5}
    emission_probs = {'hr': .8, 'hg': .1, 'hb': .1, 'sr': .2, 'sg': .3, 'sb': .5}
    mood = {1: 'h', 0: 's'}  # for generating all 8 possible choices using bitmasking
    observed_clothes = 'gbr'

    def calc_prob(hidden_states: str) -> float:
        res = starting_probs[hidden_states[:1]]                        # P(m1)
        res *= transition_probs[hidden_states[:2]]                     # P(m2|m1)
        res *= transition_probs[hidden_states[1:3]]                    # P(m3|m2)
        res *= emission_probs[hidden_states[0] + observed_clothes[0]]  # P(c1|m1)
        res *= emission_probs[hidden_states[1] + observed_clothes[1]]  # P(c2|m2)
        res *= emission_probs[hidden_states[2] + observed_clothes[2]]  # P(c3|m3)
        return res

    # Use bitmasking to generate all possible combinations of hidden states 's' and 'h'
    for i in range(8):
        hidden_states = []
        binary = i
        for _ in range(3):
            hidden_states.append(mood[binary & 1])
            binary //= 2
        hidden_states = "".join(hidden_states)
        print(hidden_states, round(calc_prob(hidden_states), 5))

get_most_likely()

##### Output ######
sss 0.0045
hss 0.0006
shs 0.00054
hhs 0.000168
ssh 0.018
hsh 0.0024
shh 0.00504
hhh 0.001568
@AakashOnKeys 1 month ago
@toyomicho I had the same doubt. Thanks for the code! It would be better if the author pinned this.
@mirasan2007 3 years ago
Dear ritvik, I watch your videos and I like the way you explain. Regarding this HMM, the stationary vector π is [0.625, 0.375] for the states [happy, sad] respectively. You can check that this is the correct stationary vector by multiplying it with the transpose of the transition probability matrix; the result is the same stationary vector:

import numpy as np

B = np.array([[0.7, 0.3],
              [0.5, 0.5]])
pi_B = np.array([0.625, 0.375])

np.matmul(B.T, pi_B)
# array([0.625, 0.375])
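As a small extension of the check above (a sketch of my own, assuming only numpy), the stationary vector can also be computed directly rather than merely verified, as the eigenvector of B.T with eigenvalue 1, normalized to sum to 1:

import numpy as np

B = np.array([[0.7, 0.3],
              [0.5, 0.5]])

# Left eigenvector of B for eigenvalue 1 = eigenvector of B.T with eigenvalue 1
eigvals, eigvecs = np.linalg.eig(B.T)
stationary = eigvecs[:, np.isclose(eigvals, 1)].flatten().real
stationary = stationary / stationary.sum()  # normalize so the probabilities sum to 1

print(stationary)  # [0.625 0.375]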
@jinbowang8814 1 year ago
Really nice explanation! easy and understandable.
@skyt-csgo376 2 years ago
You're such a great teacher!
@silverstar6905 4 years ago
Very nice explanation. Looking forward to seeing something about quantile regression.
@mengxiaoh9048 1 year ago
Thanks for the video! I've watched two other videos, but this one is the easiest for understanding HMMs, and I also like that you added the real-life NLP application example at the end.
@ritvikmath 1 year ago
Glad it was helpful!
@awalehmohamed6958 2 years ago
Instant subscription, you deserve millions of followers
@alecvan7143 1 year ago
Very insightful, thank you!
@Molaga 3 years ago
A great video. I am glad I discovered your channel today.
@ritvikmath 3 years ago
Welcome aboard!
@juanjopiconcossio3146 1 year ago
Great great explanation. Thank you!!
@Dima-rj7bv 3 years ago
I really enjoyed this explanation. Very nice, very straightforward, and consistent. It helped me to understand the concept very fast.
@ritvikmath 3 years ago
Glad it was helpful!
@Justin-General 2 years ago
Thank you, please keep making content Mr. Ritvik.
@laurelpegnose7911 2 years ago
Great video to get an intuition for HMMs. Two minor notes:
1. There might be an ambiguity between the state sad (S) and the start symbol (S), which might have been resolved by renaming one or the other.
2. About the example configuration of hidden states which maximizes P: I think this should be written as a tuple (s, s, h) rather than a set {s, s, h}, since the order is relevant.
Keep up the good work! :-)
@1243576891 3 years ago
This explanation is concise and clear. Thanks a lot!
@ritvikmath 3 years ago
Of course!
@qiushiyann 4 years ago
Thank you for this explanation!
@mia23 3 years ago
Thank you. That was a very impressive and clear explanation!
@ritvikmath 3 years ago
Glad it was helpful!
@clauzone03 3 years ago
You are great! Subscribed with notification after only the first 5 minutes listening to you! :-)
@ritvikmath 3 years ago
Aw thank you !!
@srinivasuluyerra7849 2 years ago
Great video, nicely explained
@ananya___1625 1 year ago
As usual, an awesome explanation... After referring to tons of videos, I understood it clearly only after this video... Thank you for your efforts and time.
@ritvikmath 1 year ago
You are most welcome
@user-or7ji5hv8y 3 years ago
This is a really great explanation.
@nathanielfernandes8916 1 year ago
I have 2 questions:
1. The Markov assumption seems VERY strong. How can we guarantee the current state only depends on the previous state? (e.g., a person might pick an outfit for the day of the week rather than based on yesterday)
2. How do we collect the transition/emission probabilities if the state is hidden?
@jirasakburanathawornsom1911 2 years ago
I'm continually amazed by how clearly and simply you can teach; you are indeed an amazing teacher.
@ahokai 2 years ago
I don't know why I paid for my course and then came here to learn. Great explanation, thank you!
@SPeeDKiLL45 2 years ago
Great Video Bro ! Thanks
@mihirbhatia9658 3 years ago
I wish you went through Bayes nets before coming to HMMs. That would make the conditional probabilities so much easier to understand for HMMs. Great explanation though !! :)
@jijie133 1 year ago
Great video!
@mansikumari4954 8 months ago
This is great!!!!!
@deter3 3 years ago
amazing explanation !!!
@minapagliaro7607 4 months ago
Great explanation ❤️
@wendyqi4727 1 year ago
I love your videos so much! Could you please make one video about POMDP?
@Aoi_Hikari 1 month ago
I had to rewind the video a few times, but eventually I understood it. Thanks!
@NickVinckier 3 years ago
This was great. Thank you!
@ritvikmath 3 years ago
Glad you enjoyed it!
@kiran10110 3 years ago
Damn - what a perfect explanation! Thanks so much! 🙌
@ritvikmath 3 years ago
Of course!
@arungorur3305 3 years ago
Ritvik, great videos.. I have learnt a lot.. thx. A quick Q re: HMM: how does one create the transition matrix for hidden states when in fact you don't know the states? Thx!
@slanglabadang 3 months ago
I feel like this is a great model to use to understand how time exists inside our minds
@Sasha-ub7pz 2 years ago
Thanks, amazing explanation. I was looking for such a video, but unfortunately other authors have bad audio.
@PF-vn4qz 1 year ago
Thank you!
@mousatat7392 1 year ago
Amazing, keep it up. Very cool explanation.
@ritvikmath 1 year ago
Thanks!
@otixavi8882 2 years ago
Great video! However, I was wondering: if the hidden-state transition probabilities are unknown, is there a way to compute/calculate them based on the observations?
@Aquaeflavie81 2 years ago
Great !!
@b7Z8Sjd 3 years ago
Thank you for this video
@kristiapamungkas697 3 years ago
You are a great teacher!
@ritvikmath 3 years ago
Thank you! 😃
@GarageGotting 3 years ago
Fantastic explanation. Thanks a lot
@ritvikmath 3 years ago
Most welcome!
@chia-chiyu7288 3 years ago
Very helpful!! Thanks!
@ritvikmath 3 years ago
Glad it was helpful!
@kalpanasharma5672 3 years ago
AMAZING.
@ResilientFighter 3 years ago
Ritvik, it might be helpful if you add some practice problems in the description
@ingoverhulst 4 years ago
Great work! I really enjoy your content.
@shahabansari5201 3 years ago
Very good explanation of HMM!
@ritvikmath 3 years ago
Glad it was helpful!
@seansanyal1895 4 years ago
hey Ritvik, nice quarantine haircut! thanks for the video, great explanation as always. stay safe
@ritvikmath 4 years ago
thank you! please stay safe also
@gopinsk 2 years ago
I agree teaching is an art, and you have mastered it. The applications to real-world scenarios are really helpful. I feel so confident after watching your videos. Question: how did we get the probabilities to start with? Are they arbitrary, or is there a scientific method for arriving at those numbers?
@OskarBienko 1 year ago
I'm curious too. Did you figure it out?
@user-or7ji5hv8y 2 years ago
Cool. Have you done a video on how to get those probabilities from observed data? Is it using MCMC?
@caspahlidiema4027 3 years ago
The best ever explanation on HMM
@ritvikmath 3 years ago
thanks!
@souravdey1227 2 years ago
Really crisp explanation. I just have a query. When you say that the mood on a given day "only" depends on the mood the previous day, this statement seems to come with a caveat, because if it "only" depended on the previous day's mood, the Markov chain would be trivial. I think what you mean is that the dependence is a conditional probability on the previous day's mood: given today's mood, there is a "this percent" chance that tomorrow's mood will be this and a "that percent" chance that tomorrow's mood will be that, with "this percent" and "that percent" summing to 1. The word "only" somehow conveyed a probability of one. I hope I was able to explain that clearly.
@zacharyzheng3610 1 year ago
Brilliant explanation
@ritvikmath 1 year ago
Thanks!
@jaivratsingh9966 2 years ago
Nice!
@kanhabansal524 1 year ago
Best explanation on the internet!
@ritvikmath 1 year ago
Thanks!
@StreetArtist360 2 years ago
Thanks.
@user-or7ji5hv8y 4 years ago
Great video
@ritvikmath 4 years ago
thanks !
@hex9219 29 days ago
awesome
@5602KK 3 years ago
Incredible. All of the other videos I have watched have me feeling quite overwhelmed.
@ritvikmath 3 years ago
glad to help!
@mango-strawberry 2 months ago
brilliant explanation
@ritvikmath 2 months ago
Glad you think so!
@dhirgajbhiye06 2 years ago
Cool bro!
@shaoxiongsun4682 1 year ago
Thanks a lot for sharing. It is very clearly explained. Just wondering why the objective we want to optimize is not the conditional probability P(M=m | C = c).
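One way to see why both objectives lead to the same answer: P(M = m | C = c) = P(M = m, C = c) / P(C = c), and the denominator P(C = c) does not depend on m, so the mood sequence m that maximizes the joint probability is exactly the one that maximizes the conditional probability.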
@user-ri7uz9il1v 2 years ago
Thanks a LOT
@yasminemohamed5157 2 years ago
You're awesome
@montheral5073 4 years ago
thank you..
@gnkk6002 3 years ago
Wonderful explanation 👌
@ritvikmath 3 years ago
Thank you 🙂
@ls09405 7 months ago
Great video. But how did you calculate that {S,S,H} is the maximum?
@yuliiashaparenko6623 3 years ago
bravo!
@MegaJohnwesly 1 year ago
Oh man, thanks a lot :). I tried to understand here and there by reading, but I didn't get it. This video is gold.
@ritvikmath 1 year ago
Glad it helped!
@barhum5765 1 year ago
God bless your soul man
@SuperMtheory 4 years ago
Great video. Perhaps a follow-up could be the actual calculation of {S, S, H}.
@ritvikmath 4 years ago
thanks for the suggestion!
@shubhamjha5738 3 years ago
Nice one
@ritvikmath 3 years ago
Thanks 🔥
@anand_dudi 2 years ago
thanks
@Infaviored 1 year ago
If there is a concept I did not understand from my lectures, and I see there is a video on this channel, I know I will understand it afterwards.
@ritvikmath 1 year ago
thanks!
@Infaviored 1 year ago
@ritvikmath No, thank you! Ever thought of teaching at a university?
@nicolas12189 2 years ago
Hey in future videos could you provide an unobstructed view of the board, either at the beginning or end of the video, just for a few seconds? Sometimes it’s helpful to screenshot your notes
@PeteThomason 2 years ago
Thank you, that was a very clear introduction. The key thing I don't get is where the transition and emission probabilities come from. In a real-world problem, how do you get at those?
@jordanblatter1595 2 years ago
In the case of the NLP example with part of speech tagging, the model would need data consisting of sentences that are assigned tags by humans. The problem is that there isn't much of that data lying around.
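To make that concrete, here is a minimal sketch (my own illustration, not from the video) of how transition and emission probabilities could be estimated by simple counting from a small hand-tagged corpus. The tiny tagged_sentences data below is hypothetical, and there is no smoothing or handling of start probabilities:

from collections import Counter

# Hypothetical tagged sentences: lists of (word, part-of-speech tag) pairs
tagged_sentences = [
    [("the", "DET"), ("dog", "NOUN"), ("barks", "VERB")],
    [("a", "DET"), ("cat", "NOUN"), ("sleeps", "VERB")],
]

transition_counts = Counter()  # (previous tag, current tag) pairs
emission_counts = Counter()    # (tag, word) pairs
tag_counts = Counter()         # how often each tag appears

for sentence in tagged_sentences:
    for i, (word, tag) in enumerate(sentence):
        tag_counts[tag] += 1
        emission_counts[(tag, word)] += 1
        if i > 0:
            prev_tag = sentence[i - 1][1]
            transition_counts[(prev_tag, tag)] += 1

# Maximum-likelihood estimates: P(tag_t | tag_{t-1}) and P(word_t | tag_t)
transition_probs = {pair: count / tag_counts[pair[0]] for pair, count in transition_counts.items()}
emission_probs = {pair: count / tag_counts[pair[0]] for pair, count in emission_counts.items()}

print(transition_probs)  # e.g. {('DET', 'NOUN'): 1.0, ('NOUN', 'VERB'): 1.0}
print(emission_probs)    # e.g. {('DET', 'the'): 0.5, ('NOUN', 'dog'): 0.5, ...}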
@claytonwohl7092 3 years ago
At 2:13, the lecturer says, "it's not random" whether the professor wears a red/green/blue shirt. Not true. It is random. It's random but dependent on the happy/sad state of the professor. Sorry to nitpick. I definitely enjoyed this video :)
@ritvikmath 3 years ago
Fair point !! Thanks :)
@froh_do4431 3 years ago
Is it possible to describe in a few words how we can calculate/compute the transition and emission probabilities?
@anna-mm4nk 1 year ago
Appreciate that the professor was a 'she'; it took me by surprise and made me smile :) Also, great explanation. It made me remember that learning is actually fun when you understand what the fuck is going on.