Event Details:
Tuesday, November 8, 2022
8:30am - 9:30am PST
This event is open to:
General Public
Free and open to the public
Tuesday, November 8, 2022 [Link to join]
(ID: 996 2837 2037, Password: 386638)
- Speaker: Luke Miratrix (Harvard University)
- Title: A devil’s bargain? Repairing a Difference in Differences parallel trends assumption with an initial matching step
- Discussant: Laura Hatfield (Harvard University)
- Abstract: The Difference in Differences (DiD) estimator is a popular method built on the "parallel trends" assumption that the treatment group, absent treatment, would change "similarly" to the control group over time. To increase the plausibility of this assumption, a natural idea is to match treated and control units prior to a DiD analysis. In this paper, we characterize the bias of matching under a class of linear structural models with both observed and unobserved confounders that have time-varying effects. Given this framework, we find that matching on baseline covariates generally reduces the bias associated with these covariates, when compared to the original DiD estimator. We further find that additionally matching on pre-treatment outcomes has both costs and benefits. On one hand, matching on pre-treatment outcomes will partially balance unobserved confounders, which mitigates some bias. This reduction is proportional to the outcome's reliability, a measure of how coupled the outcomes are with the latent covariates. On the other hand, we find that matching on pre-treatment outcomes also undermines the second "difference" in a DiD estimate by forcing the treated and control groups' pre-treatment outcomes to be equal. This injects bias into the final estimate, creating a bias-bias tradeoff. We extend our bias results to multivariate confounders with multiple pre-treatment periods and find similar results. We summarize our findings with heuristic guidelines on whether to match prior to a DiD analysis, along with a method for roughly estimating the reduction in bias. We illustrate our guidelines by reanalyzing a recent empirical study that used matching prior to a DiD analysis to explore the impact of principal turnover on student achievement.
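For readers new to the setup, the sketch below illustrates the general idea discussed in the abstract: a two-period DiD estimate (the difference of pre-to-post changes between treated and control units), with an optional nearest-neighbor matching step on a baseline covariate before the DiD comparison. This is a minimal illustration under simulated data, not the speaker's method; all variable names (y_pre, y_post, treated, x) and the matching routine are hypothetical.

    # Minimal two-period DiD sketch with an optional pre-matching step.
    # All data are simulated; the true treatment effect is 1.0.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    treated = rng.integers(0, 2, n).astype(bool)          # treatment indicator
    x = rng.normal(treated * 0.5, 1.0)                     # baseline covariate (confounded)
    y_pre = x + rng.normal(0, 1, n)                        # pre-treatment outcome
    y_post = x + 1.0 * treated + rng.normal(0, 1, n)       # post-treatment outcome

    def did(y_pre, y_post, treated):
        """Difference of pre-to-post changes between treated and control units."""
        change = y_post - y_pre
        return change[treated].mean() - change[~treated].mean()

    def match_controls(x, treated):
        """For each treated unit, pick the control unit closest on x (with replacement)."""
        controls = np.flatnonzero(~treated)
        return np.array([controls[np.argmin(np.abs(x[controls] - x[i]))]
                         for i in np.flatnonzero(treated)])

    # Plain DiD on the full sample
    print("DiD, full sample:   ", did(y_pre, y_post, treated))

    # DiD after matching each treated unit to its nearest control on the baseline covariate
    keep = np.concatenate([np.flatnonzero(treated), match_controls(x, treated)])
    print("DiD, matched sample:", did(y_pre[keep], y_post[keep], treated[keep]))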