The Forward-Backward algorithm for a hidden Markov model (HMM). How the Forward algorithm and Backward algorithm work together. Discussion of applications (inference, parameter estimation, sampling from the posterior, etc.).
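The recursions described in the video can be sketched in a few lines of numpy. This is a minimal toy implementation under my own naming conventions (not the video's notation): `A` is the transition matrix, `B` the emission matrix, `pi` the initial state distribution, and the observations are discrete symbols.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Compute the posterior p(z_k | x_1:n) for a discrete HMM.
    pi:  (S,)   initial state distribution
    A:   (S, S) transition matrix, A[i, j] = p(z_{k+1}=j | z_k=i)
    B:   (S, M) emission matrix,   B[i, m] = p(x_k=m | z_k=i)
    obs: list of observed symbols (ints in 0..M-1)
    """
    n, S = len(obs), len(pi)
    alpha = np.zeros((n, S))  # alpha[k, i] = p(z_k=i, x_1:k)
    beta = np.zeros((n, S))   # beta[k, i]  = p(x_{k+1:n} | z_k=i)

    # Forward pass
    alpha[0] = pi * B[:, obs[0]]
    for k in range(1, n):
        alpha[k] = (alpha[k - 1] @ A) * B[:, obs[k]]

    # Backward pass
    beta[-1] = 1.0
    for k in range(n - 2, -1, -1):
        beta[k] = A @ (B[:, obs[k + 1]] * beta[k + 1])

    gamma = alpha * beta  # p(z_k, x_1:n), for every k and state
    return gamma / gamma.sum(axis=1, keepdims=True)  # p(z_k | x_1:n)
```

For long sequences a real implementation would work in log space or rescale `alpha`/`beta` at each step to avoid underflow; this sketch omits that for clarity.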
Comments: 44
@dc33333 5 years ago
Great lecture, thank you. I can't understand the negative comments. This is free, people!!! And he knows what he is doing. Try to find that somewhere else.
@AdrianVrabie 8 years ago
That is really a nice video to watch for Xmas! :) Thank you!
@ShaileeMehta 10 years ago
Excellent Teaching !!!
@daniilsmirnov3153 a year ago
Easy to follow, thank you
@saimadhavp 5 years ago
The Forward-Backward algorithm is used to compute P(x | lambda). How does P(Z_k, x) = P(x | lambda)? Can you help connect the dots? lambda = (initial state probabilities, transition probabilities, emission probabilities)
@s4ms4r4 12 years ago
Great stuff, thank you.
@DDranks 11 years ago
Sorry, I didn't quite catch why you are able to say at 6:58 that p(z_k, x) = p(x_k+1:n | z_k, x_1:k) * p(z_k, x_1:k) without any explanation. I watched the two introductory videos about HMMs and the videos about conditional independence and D-separation, but is there some other video I should've watched as preparation?
@NBAChickenLover 5 years ago
Answer for future viewers, because I'm not sure if you'd want the answer after six years. It follows from the definition of conditional probability: P(A | B, C) = P(A, B, C) / P(B, C), so P(A, B, C) = P(A | B, C) * P(B, C). Now, P(Z_k, X) = P(Z_k, X_1:k, X_k+1:n) [just splitting X into two parts; it's the same joint distribution]. Then, using the same relation above [P(A, B, C) = P(A | B, C) * P(B, C)], we get P(Z_k, X) = P(X_k+1:n | Z_k, X_1:k) * P(Z_k, X_1:k).
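The chain-rule step in the answer above can be checked numerically on an arbitrary joint distribution; the identity P(A, B, C) = P(A | B, C) * P(B, C) holds for any joint table, HMM or not. A toy check (all numbers random, nothing here comes from the video):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random joint distribution P(A, B, C) over three binary variables
p = rng.random((2, 2, 2))
p /= p.sum()

p_bc = p.sum(axis=0)     # P(B, C): marginalize out A
p_a_given_bc = p / p_bc  # P(A | B, C): broadcasting divides each slice by P(B, C)

# Chain rule: P(A, B, C) = P(A | B, C) * P(B, C)
assert np.allclose(p_a_given_bc * p_bc, p)
```

The same manipulation with A = X_{k+1:n} and (B, C) = (Z_k, X_{1:k}) gives exactly the decomposition used at 6:58.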
@yanghaofan6799 4 years ago
There really isn't a reason to do this. Merely a mathematical trick, like 4 = 3 + 1 and 4 = 2 + 2. Maybe it would be easier to prove it is true. I hope this helps.
@PallabDe 3 years ago
@@NBAChickenLover Thank you! Someone definitely needed this after 7 years. :)
@alecvan7143 5 years ago
Great vid!
@derrylab 4 years ago
Every year, desperate students come here
@nesalipk 12 years ago
Very helpful videos, thanks. I have a question: how can we use HMMs in hypothesis testing?
@MrSengkruy 9 years ago
Can someone explain to me in more detail this equation: p(z_k, x) = p(x_k+1:n | z_k, x_1:k) * p(z_k, x_1:k)? I just can't understand it.
@parunach1 9 years ago
+MrSengkruy Simple: break up P(Z_k, X_1:n) = P(Z_k, X_1:k, X_k+1:n), then convert to a conditional distribution based on one set of variables; this becomes P(X_k+1:n | Z_k, X_1:k) * P(Z_k, X_1:k). This is by Bayes' rule: likelihood * prior to get the posterior.
@AdrianVrabie 8 years ago
+MrSengkruy Well... from a mathematical POV I don't think you can really write that; it is correct given (assuming) an HMM chain. Generally speaking, if you have 2 joint events, then P(A&B) = P(A|B)*P(B) = P(B|A)*P(A). If you have 3 events, combine two together as one event, P(A&B&C) = P((A&B)&C), and apply the rule recursively.
@roozbehsanaei9672 8 years ago
+MrSengkruy By simply breaking it up, we have P(Z_k, X_1:n) = P(Z_k, X_1:k, X_k+1:n). By the definition of conditional probability, we have P(A, B) = P(A|B)*P(B). Substituting A = X_k+1:n and B = (Z_k, X_1:k), we have P(Z_k, X_1:k, X_k+1:n) = P(X_k+1:n | Z_k, X_1:k) * P(Z_k, X_1:k), so P(Z_k, X_1:n) = P(X_k+1:n | Z_k, X_1:k) * P(Z_k, X_1:k). Then, based on d-separation in the HMM, we have P(Z_k, X_1:n) = P(X_k+1:n | Z_k) * P(Z_k, X_1:k).
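The end result of this derivation, p(z_k, x_1:n) = alpha_k(z_k) * beta_k(z_k), can be sanity-checked by brute-force enumeration over all hidden paths of a toy HMM. All numbers below are made up for illustration; the variable names are mine, not the video's:

```python
import numpy as np
from itertools import product

# Toy HMM with 2 states and 2 output symbols (made-up numbers)
pi = np.array([0.6, 0.4])                 # initial distribution
A = np.array([[0.7, 0.3], [0.4, 0.6]])    # transitions
B = np.array([[0.9, 0.1], [0.2, 0.8]])    # emissions
obs = [0, 1, 1]
n, S = len(obs), 2

# Forward and backward recursions
alpha = np.zeros((n, S))
beta = np.ones((n, S))
alpha[0] = pi * B[:, obs[0]]
for k in range(1, n):
    alpha[k] = (alpha[k - 1] @ A) * B[:, obs[k]]
for k in range(n - 2, -1, -1):
    beta[k] = A @ (B[:, obs[k + 1]] * beta[k + 1])

def joint(k, s):
    """p(z_k = s, x_1:n) by summing over all S^n hidden paths."""
    total = 0.0
    for z in product(range(S), repeat=n):
        if z[k] != s:
            continue
        p = pi[z[0]] * B[z[0], obs[0]]
        for t in range(1, n):
            p *= A[z[t - 1], z[t]] * B[z[t], obs[t]]
        total += p
    return total

brute = np.array([[joint(k, s) for s in range(S)] for k in range(n)])
assert np.allclose(alpha * beta, brute)   # alpha_k * beta_k = p(z_k, x_1:n)
```

Note that the brute force costs O(S^n) while the recursions cost O(n S^2), which is the whole point of the algorithm.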
@georgek1434 4 years ago
Hey, that's a great video!!! Can you please send me your references (books, papers, etc.) so I can study this subject more? Thank you in advance!
@pelemanov 13 years ago
@raven112345 You need them to be able to do the inference. When you want to use this in practice, you will typically build your own HMM and define these yourself. Then you will use this algorithm to infer what you don't know.
@raven112345 13 years ago
Why do we need to assume the emission probabilities, transition probabilities, and initialization probabilities?
@twister9458 4 years ago
The forward-backward algorithm aims at learning the A and B matrices and doesn't assume we have them; I don't understand your first assumptions.
@Nair_ 4 years ago
His assumption is about the distributions they belong to, not the actual values.
@hebst9624 4 years ago
Can anyone explain the part at 6:46 where he mentions that p(z_k | x) is proportional to p(z_k, x)? By the rules of conditional probability, shouldn't it be equal to p(z_k, x) / p(x)?
@52VaultBoy 4 years ago
@ This is an interesting trick. Since x is observed and fixed, p(x) is just a constant, so p(z_k | x) = p(z_k, x) / p(x) is proportional to p(z_k, x) as a function of z_k; you recover the exact conditional by normalizing. Really COOL. Thank you.
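The proportionality discussed in this thread is just normalization: p(x) is the same constant for every value of z_k, so dividing the joint values by their sum over states recovers the conditional exactly. A tiny example with made-up numbers:

```python
import numpy as np

# Suppose we have computed the joint p(z_k = s, x) for each of two states s
joint = np.array([0.012, 0.003])  # made-up values

# p(x) = sum over states of p(z_k = s, x), so conditioning is normalization:
p_x = joint.sum()
posterior = joint / p_x           # p(z_k = s | x)

assert np.allclose(posterior, [0.8, 0.2])
assert np.isclose(posterior.sum(), 1.0)
```

This is why the video can work with p(z_k, x) throughout and only divide by p(x) at the very end.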
@satyamlal4461 5 years ago
Could you please tell me what's the difference between P(Z|X) and P(Z,X)?
@seekeroftruth01 5 years ago
P(Z|X): a conditional probability ===> the probability that Z occurs knowing that X occurred. P(Z,X): an intersection probability ==> the probability that Z occurs and X occurs.
@stoneshou 4 years ago
@@seekeroftruth01 sometimes also known as the joint probability
@hanchen2355 7 years ago
Good lecture. Btw, your voice reminds me of "the duck song".
@MrLufemami 13 years ago
Could anyone tell me how to distinguish different numerical sequences just by looking at a number that corresponds to a specific sequence, as in a classifier of sequences? I have tried entropy, but it gives a lot of false positives in the identification of the sequences.
@anaselghafari4146 10 years ago
I don't mean to be rude, but here is a tip: have a script prepared instead of winging it, because all the times when your train of thought breaks and you explain some convention or say "I am going to change the color" can be distracting. So, instead of interrupting the explanation to introduce a notational convention, introduce whatever notational convention you want before you start explaining the point where the convention is needed.
@delldell5392 8 years ago
+anas elghafari I agree. The frequent interruptions and corrections (beginning to describe X and then realizing he wanted to describe Z all along, so he switches to Z, as seen at 14.5) are not helpful in understanding material where the mind has to keep track of multiple variables, including their meaning and the meaning of the subscripts. Because the speaker corrects himself so frequently, it gives the appearance that he is not thorough with the material. Getting tiny details such as the symbols and subscripts completely accurate signals a strong grasp of the topic, i.e., OBSERVING the accuracy of such tiny details implies that the HIDDEN STATEs of mind are those which have strongly grasped the material.
@delldell5392 8 years ago
+Dell Dell Piggybacking on my own comment: it seems that the speaker needs to speak certain statements in order to explain the material, which strongly suggests a tendency towards remembering the material rather than understanding it, i.e., it looks like the speaker has crammed the HMMs.
@akino.3192 6 years ago
It's been over a year now so I presume you've mastered the material at this point. Where are your own tutorials? If the observation is that you haven't prepared any better ones, then the hidden state is likely to point at destructive, nit-picking critics? I dunno. I'm still learning this stuff but Mathematical monk sure is getting me on the right track.
@edwardbanner1117 6 years ago
It can take a long time to write a script rather than just winging it, long enough that this video might not even exist. Personally I am grateful for having a "winged" video rather than no video at all.
@Santa1936 6 years ago
I disagree; the off-the-cuff nature of these videos makes them feel more natural and more interesting to listen to.
@anondoggo 2 years ago
It would really help if z and x were defined.
@pelemanov 13 years ago
That should be "infer", not "inference" the second time.
@avengedknot5319 6 years ago
Cut. Paste. Boom.
@보쿠링 6 years ago
lol
@weihonglow3994 4 years ago
boom!
@nikiceg 12 years ago
great... but too slow
@delldell5392 8 years ago
I could do without filler statements like "I guess I must say X1", etc. Why are you guessing the material? Should we also learn how to 'guess' things, especially in a subject like mathematics, which can be extremely formal and straightforward? I suggest that you read formal papers, beginning with Euclid's theorem on the GCD. See how much guesswork was involved in describing that theorem. Then move on to more complicated theorems and proofs and see how much guesswork they had in their proofs and descriptions.
@Marik0 6 years ago
Wow, the level of nitpicking is amazing. Apparently you have never taken a lesson from a real person; you have probably been taught only by robots. "I guess" is used here as a figure of speech, because after a minute he proves why X1 is not needed. You are probably also unfamiliar with the fact that mathematicalmonk is a professor of biostatistics at a prestigious university. Maybe it was pure luck, who knows :) My (unsolicited) advice is: be constructive about free material that people devoted time to create and share, not a smart ass.