Subscribe to our channel to get notified when we release a new video.
Like the video to tell YouTube that you want more content like this on your feed.
See our website for future seminars: sites.google.c...
Speaker: Andrew Gelman (Columbia University)
Discussants: Elizabeth Tipton (Northwestern), Avi Feller (Berkeley), Jonathan Roth (Brown), Pedro Sant'Anna (Emory)
Title: Better Than Difference in Differences
Abstract: It is not always clear how to adjust for control data in causal inference, balancing the goals of reducing bias and variance. We show how, in a setting with repeated experiments, Bayesian hierarchical modeling yields an adaptive procedure that uses the data to determine how much adjustment to perform. The result is a novel analysis with increased statistical efficiency compared with the default analysis based on difference estimates. The increased efficiency can have real-world consequences in terms of the conclusions that can be drawn from the experiments. An open question is how to apply these ideas in the context of a single experiment or observational study, in which case the optimal adjustment cannot be estimated from the data; still, the principle holds that difference-in-differences can be extremely wasteful of data. The talk follows up on Andrew Gelman and Matthijs Vákár (2021), Slamming the sham: A Bayesian model for adaptive adjustment with noisy control data.
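
Below is a minimal numerical sketch of the principle described in the abstract, not the model presented in the talk or in Gelman and Vákár (2021). It assumes a simplified setup with J repeated experiments, a hypothetical shared nuisance term gamma in both arms, and a moment-based estimate of the adjustment coefficient in place of a full Bayesian hierarchical fit; all variable names and parameter values are illustrative.

```python
# Sketch: full differencing vs. data-adaptive partial adjustment.
# Simplified illustration only; the actual analysis uses a Bayesian
# hierarchical model rather than the moment-based slope used here.
import numpy as np

rng = np.random.default_rng(0)

J = 200              # number of repeated experiments (assumed)
tau_theta = 1.0      # sd of true treatment effects across experiments
tau_gamma = 0.3      # sd of shared nuisance (small => control is mostly noise)
s_T, s_C = 1.0, 1.0  # measurement noise in treated and control arms

theta = rng.normal(0.0, tau_theta, J)          # true effects
gamma = rng.normal(0.0, tau_gamma, J)          # shared nuisance per experiment
y_T = theta + gamma + rng.normal(0.0, s_T, J)  # treated-arm measurement
y_C = gamma + rng.normal(0.0, s_C, J)          # noisy control measurement

# Default "difference" analysis: subtract the full control measurement.
diff_est = y_T - y_C

# Adaptive adjustment: with repeated experiments, the slope of y_T on y_C
# estimates b = tau_gamma^2 / (tau_gamma^2 + s_C^2), the variance-optimal
# fraction of the control measurement to subtract.
b_hat = np.cov(y_T, y_C)[0, 1] / np.var(y_C, ddof=1)
adj_est = y_T - b_hat * y_C

print(f"estimated adjustment coefficient b = {b_hat:.2f}")
print(f"MSE of difference estimator: {np.mean((diff_est - theta) ** 2):.3f}")
print(f"MSE of adaptive estimator:   {np.mean((adj_est - theta) ** 2):.3f}")
```

With these assumed parameter values the control arm is mostly noise, so the estimated coefficient is close to zero and the adaptive estimator has a markedly lower mean squared error than the full difference, illustrating the abstract's point that difference-in-differences can be wasteful of data.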