3 - The Flow of Causation and Association in Graphs (Week 3)

17,399 views

Brady Neal - Causal Inference

Comments: 31
@hongkyulee9724 · 1 year ago
Your lecture videos, books and slides are all very helpful to me. I hope you always enjoy your research. And I want to be a person who can contribute to the world like you.
@iffatarasanzida7704 · 2 months ago
Very well described. Thank You!
@diegoliberatosouza1820 · 2 years ago
Thanks, Brady! Your explanations are really clear, congrats!
@zeeeeeeeeeavs · 4 years ago
Great tutorial! Very clearly explained
@gwillis3323 · 3 years ago
Hi Brady, why does the Local Markov Assumption on its own permit P(x,y) = P(x)P(y) in the case at 11:50? I would have thought that the Local Markov Assumption in that case only says that Y is independent of its non-descendants given its parents. As Y has only one parent X, this means that P(Y|X,A) = P(Y|X), where A is any variable that is not a descendant of Y. Maybe it's just semantics, but I would take it as self-evident that the Local Markov Assumption could be more explicitly expressed as "Given its parents in the DAG, a node X is independent of any variable A that is neither a descendant nor a parent," which surely would cover the case at 11:50?
@Kane9530 · 1 year ago
Hi, is there any update on this question?
@luoluoye9620 · 11 months ago
@@Kane9530 I don't understand this either.
@CathyZhang · 9 months ago
I think even though the Local Markov Assumption says that node X is independent of its non-descendants given its parents, it does NOT say whether or not there is a dependency between node X and its parent. That is why we need the second assumption.
@plmedici · 6 months ago
I like your explanation, @CathyZhang. However, he just showed us on the previous slide that the local Markov assumption is equivalent to the Bayesian network factorization, meaning that making the local Markov assumption implies that the joint distribution could only be written in the first way (Pr(x,y) = Pr(x)Pr(y|x)). I'm still not following what Brady is saying here, I'm afraid.
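To make the resolution of this thread concrete: the factorization Pr(x,y) = Pr(x)Pr(y|x) is satisfied by every joint distribution over (X, Y), including ones where X and Y happen to be independent, so the local Markov assumption alone cannot exclude P(x,y) = P(x)P(y). Here is a minimal numeric sketch of that point (plain Python; the probability tables are made up purely for illustration):

```python
# Sketch: the factorization P(x, y) = P(x) * P(y | x) for the graph
# X -> Y holds for ANY joint distribution -- including one where Y
# ignores X. The local Markov assumption alone therefore cannot rule
# out independence along an edge; that is what minimality adds.
import itertools

p_x = {0: 0.3, 1: 0.7}

# Case 1: Y genuinely depends on X.
p_y_given_x_dep = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}

# Case 2: Y ignores X entirely (Y is independent of X).
p_y_given_x_ind = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.5, 1: 0.5}}

def joint(p_y_given_x):
    """Build P(x, y) via the factorization P(x) * P(y | x)."""
    return {(x, y): p_x[x] * p_y_given_x[x][y]
            for x, y in itertools.product([0, 1], repeat=2)}

for name, cpt in [("dependent", p_y_given_x_dep),
                  ("independent", p_y_given_x_ind)]:
    p_xy = joint(cpt)
    # Check whether P(x, y) == P(x) * P(y) for all (x, y).
    p_y = {y: sum(p_xy[(x, y)] for x in [0, 1]) for y in [0, 1]}
    factorizes = all(abs(p_xy[(x, y)] - p_x[x] * p_y[y]) < 1e-12
                     for x, y in itertools.product([0, 1], repeat=2))
    print(f"{name}: P(x,y) = P(x)P(y) everywhere? {factorizes}")
# Prints False for the dependent table and True for the independent
# one -- both tables satisfy P(x)P(y|x), which is the point at 11:50.
```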
@filippobuonco95 · 1 year ago
Amazing lecture, thanks for this incredible gift, Brady!
@Alexfnplays · 2 years ago
Super clear and great tutorial. Really enjoyed it.
@peasant12345 · 6 months ago
very interesting examples
@yusichou7365 · 4 years ago
Nice course. Looking forward to next week's video!
@johol355 · 3 years ago
Thanks for the awesome series! Please add answers for the questions; practicing without any kind of feedback mechanism is close to pointless.
@TheProblembaer2 · 5 months ago
How would the immorality change if we were to condition on X2? Or is the argument at 29:33 about conditioning on X2? I thought that if we were to condition on X2, X1 and X3 would become dependent?
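For anyone puzzling over this: a quick simulation (assuming numpy; the variable names mirror the immorality X1 -> X2 <- X3 from the lecture, with made-up Gaussian variables) shows exactly that behaviour. X1 and X3 start out independent, and conditioning on the collider X2 makes them dependent:

```python
# Collider sketch: X1 and X3 are independent causes of X2.
# Conditioning on X2 induces dependence between X1 and X3.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x1 = rng.normal(size=n)
x3 = rng.normal(size=n)
x2 = x1 + x3 + 0.1 * rng.normal(size=n)  # collider: X1 -> X2 <- X3

# Marginally, X1 and X3 are (nearly) uncorrelated.
print("corr(X1, X3):", np.corrcoef(x1, x3)[0, 1])  # ~ 0

# Condition on X2 by restricting to a thin slice of its values.
mask = np.abs(x2) < 0.1
print("corr(X1, X3 | X2 ~ 0):", np.corrcoef(x1[mask], x3[mask])[0, 1])
# Strongly negative: given the sum is ~0, a large X1 forces a small X3.
```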
@cmaspi9761 · 1 year ago
Around 41:30 you said conditioning on X1 blocks the path; can you explain that a bit? I was able to establish that they are independent using the Markov assumption and Bayes' rule, but I didn't quite understand how you directly said they are independent.
@dcar6217 · 8 months ago
Along the path T -> X1 -> X2, X1 is the middle node of a chain, so conditioning on it blocks the path.
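A small sketch of that chain-blocking intuition (assuming numpy; binary variables with made-up conditional probabilities, not the lecture's actual model):

```python
# Chain sketch: T -> X1 -> X2. Once you know X1, T carries no further
# information about X2, so conditioning on X1 blocks the path.
import numpy as np

rng = np.random.default_rng(0)
n = 500_000
t = rng.binomial(1, 0.5, size=n)
x1 = rng.binomial(1, np.where(t == 1, 0.8, 0.2))   # T -> X1
x2 = rng.binomial(1, np.where(x1 == 1, 0.7, 0.3))  # X1 -> X2

# Unconditionally, T and X2 are associated (the chain is open).
print("P(X2=1|T=1) - P(X2=1|T=0):",
      x2[t == 1].mean() - x2[t == 0].mean())  # clearly nonzero

# Within each X1 stratum, the association vanishes (up to noise).
for v in (0, 1):
    s = x1 == v
    print(f"X1={v}:",
          x2[s & (t == 1)].mean() - x2[s & (t == 0)].mean())  # ~ 0
```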
@polarstate · 2 years ago
This lecture series is amazing!! TY ♥️
@grospipo20 · 2 years ago
That is an understatement!
@prfontaine5387 · 2 years ago
Hi Brady ... I have been bothered by the notion of the (Markovian) parents of a variable. It seems that the parents of X are first defined relative to a particular total order "O" over the set of variables (as the parents are a subset of the "O"-predecessors of X). But we can have situations where Y is a non-descendant of X yet not an "O"-predecessor of X (X is lower than Y according to the order "O"). Then it seems hard to see/prove from the factorization that Y is independent of X given the parents of X.

So I have a suggestion/proposition to replace the local Markov hypothesis. We should require that, given the factorization P(V) = Product_i P(X_i | pa_i), there exists a total order "O" compatible with the pa_i, i.e. satisfying (Z, X_i) ∈ O if Z ∈ pa_i. Now if Y is a non-descendant of X_i, there are two cases: either (Y, X_i) ∈ O, and then by minimality of pa_i, X_i is independent of Y given pa_i; or (X_i, Y) ∈ O, and then one can consider an alternate total order O' (also compatible with the parents, as stemming from the factorization) in which (Y, X_i) ∈ O'. Then by the minimality of pa_i, X_i is independent of Y given pa_i.

In short, when Y is a non-descendant of X but greater than X w.r.t. the total order, this means that Y is not comparable to X w.r.t. the partial order stemming from the factorization. That is why we can always define an alternative total order that preserves the factorization and in which this non-descendant Y is placed lower. It seems that Pearl has seen this point (Introduction to Probabilities, Graphs, and Causal Models, page 16: "the product decomposition (1.33) is no longer order-specific since ...").
@chadpark9248 · 4 years ago
Proof of conditional independence in forks: p(x1,x2,x3) = p(x1|x2) p(x2) p(x3|x2) (1); then, if you divide both sides by p(x2), p(x1,x3|x2) = p(x1|x2) p(x3|x2) (2). Is that the correct answer?
@BradyNealCausalInference · 4 years ago
Sure, that works!
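To spell the derivation out for readers skimming the thread (a sketch of the standard fork argument; X2 is the common cause and p(x2) > 0 is assumed):

```latex
% Fork X1 <- X2 -> X3. The Bayesian network factorization gives
%   p(x_1, x_2, x_3) = p(x_2)\, p(x_1 \mid x_2)\, p(x_3 \mid x_2).
% Dividing both sides by p(x_2) (legal whenever p(x_2) > 0):
\[
p(x_1, x_3 \mid x_2)
  = \frac{p(x_1, x_2, x_3)}{p(x_2)}
  = \frac{p(x_2)\, p(x_1 \mid x_2)\, p(x_3 \mid x_2)}{p(x_2)}
  = p(x_1 \mid x_2)\, p(x_3 \mid x_2),
\]
% which is precisely the definition of X1 independent of X3 given X2.
```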
@bripolek1194 · 2 years ago
Hi Brady, thank you for your lecture. By the way, for the first d-separation question, how about the blocking set {W1, M2, X1}? Thank you.
@melbatutor3896 · 4 years ago
Is it called d-separation because it breaks dependence?
@BradyNealCausalInference · 4 years ago
The "d" stands for "directed" or "directional" to distinguish it from separation in undirected graphs. Markov networks are the undirected version of Bayesian networks, and there is a simpler separation condition in Markov networks.
@melbatutor3896 · 4 years ago
Why can't the graph have cycles?
@BradyNealCausalInference · 4 years ago
It can; things just become much more complicated, moving us into differential equations territory. For example, see Mooij et al. (2013): arxiv.org/abs/1304.7920. For some simple intuition, consider the following: if a graph has a cycle A -> B -> C -> A and you want to know the effect of A on C, do you mean the causal association flowing along the path A -> B -> C, or along the path A -> B -> C -> A -> B -> C, or along the path A -> B -> C -> A -> B -> C -> A -> B -> C, etc.? It's a feedback loop.
@konstantinoschristopoulos764 · 2 years ago
The reason I don't like the good-looking jerks example is that its theoretical assumptions are completely flawed. I prefer Richard McElreath's restaurant example, i.e. good restaurants are in nice locations.
@tOo_matcha · 2 years ago
My favorite is the one where the independent nodes are throws of two dice (X and Y) and the collider is the sum of their values (Z). If I observe the sum and the value of one die, then I know the value of the other die, just by computing Y = Z - X. But if I only observe the sum Z, I cannot say for sure what the X and Y values are.
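The dice example is easy to check numerically; here is a quick sketch (assuming numpy):

```python
# Dice collider: X and Y are independent dice, Z = X + Y.
# Conditioning on Z makes X and Y perfectly informative about
# each other (Y = Z - X).
import numpy as np

rng = np.random.default_rng(0)
n = 300_000
x = rng.integers(1, 7, size=n)  # die 1: values 1..6
y = rng.integers(1, 7, size=n)  # die 2: values 1..6
z = x + y                       # collider: X -> Z <- Y

print("corr(X, Y):", np.corrcoef(x, y)[0, 1])  # ~ 0

mask = z == 7                   # condition on the sum
print("corr(X, Y | Z=7):", np.corrcoef(x[mask], y[mask])[0, 1])
# Exactly -1: given Z = 7, knowing X pins down Y as 7 - X.
```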
@tamas5002 · 1 year ago
In Hungarian, the word graph is mostly used only for graphs that consist of vertices and edges. This way we don't mix it up with other charts or functions. 🙂
@whaleshark8700 · 2 years ago
the available men example is 🤣🤣🤣🤣