Knowing What Works: The Case for Rigorous Programme Evaluation

Working Paper: CEPR Discussion Paper DP2826

Author: Christoph M. Schmidt

Abstract: Since interventions by the public sector generally commit substantial societal resources, evaluating the effects and costs of policy interventions is imperative. This Paper explains why programme evaluation should follow well-respected scientific standards and why it should be performed by independent researchers. It then sets out the three fundamental elements of evaluation research: the choice of the appropriate outcome measure, the assessment of the direct and indirect costs associated with the intervention, and the attribution of effects to underlying causes. The Paper proceeds to argue in intuitive terms that the construction of a credible counterfactual situation lies at the heart of the formal statistical evaluation problem. It introduces several approaches proposed in the literature to solve the evaluation problem, based both on experiments and on non-experimental data, and illustrates them numerically.
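To make the evaluation problem referred to in the abstract concrete, the following sketch writes it in the standard potential-outcomes notation of this literature. The symbols and the decomposition are conventional illustrations, not a quotation of the paper's own formalism.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Illustrative potential-outcomes notation; the symbols are assumptions,
% not taken verbatim from the paper.
Let $Y_i^1$ denote the outcome of unit $i$ with the programme, $Y_i^0$ the
outcome without it, and $D_i \in \{0,1\}$ the participation indicator.
The individual effect $\Delta_i = Y_i^1 - Y_i^0$ is never observed, because
each unit reveals only
\[
  Y_i \;=\; D_i Y_i^1 + (1 - D_i)\, Y_i^0 .
\]
A naive comparison of participants and non-participants therefore recovers
\[
  E[Y^1 \mid D=1] - E[Y^0 \mid D=0]
  \;=\;
  \underbrace{E[Y^1 - Y^0 \mid D=1]}_{\text{effect on the treated}}
  \;+\;
  \underbrace{E[Y^0 \mid D=1] - E[Y^0 \mid D=0]}_{\text{selection bias}},
\]
so identification requires either randomisation, which sets the bias term to
zero in expectation, or a non-experimental strategy that constructs a
credible counterfactual for $E[Y^0 \mid D=1]$.

\end{document}
```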

Keywords: counterfactual; experiments; observational studies

JEL Codes: C40; C90; H43


Causal Claims Network Graph

Edges that are evidenced by causal inference methods are in orange, and the rest are in light blue.


Causal Claims

Cause                              | Effect
non-experimental strategies (C90)  | causal inference (C20)
rigorous program evaluation (C90)  | effective and cost-efficient policy interventions (F68)
RCTs (C90)                         | causal relationships (C32)
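As a purely numerical illustration of the contrast these claims draw between experimental and non-experimental identification, the sketch below simulates a hypothetical programme in which participation is driven by unobserved ability. All parameter values are invented for illustration and do not come from the paper.

```python
"""Illustrative simulation: why randomisation identifies the causal effect.

Hypothetical numbers only; nothing here is taken from the paper.
Each unit has two potential outcomes, y0 (without the programme) and
y1 (with it); the true average effect is the constant TRUE_EFFECT.
"""
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
TRUE_EFFECT = 2.0

# Latent "ability" drives both the no-programme outcome and the decision
# to participate, creating selection bias in observational data.
ability = rng.normal(size=N)
y0 = 10.0 + 3.0 * ability + rng.normal(size=N)
y1 = y0 + TRUE_EFFECT

# Observational regime: high-ability units are more likely to participate.
d_obs = (ability + rng.normal(size=N)) > 0
y_obs = np.where(d_obs, y1, y0)
naive = y_obs[d_obs].mean() - y_obs[~d_obs].mean()

# Experimental regime: participation assigned by a coin flip, so treated
# and control groups are comparable in expectation.
d_rct = rng.random(N) < 0.5
y_rct = np.where(d_rct, y1, y0)
rct = y_rct[d_rct].mean() - y_rct[~d_rct].mean()

print(f"true average effect:      {TRUE_EFFECT:.2f}")
print(f"naive observational diff: {naive:.2f}  (effect + selection bias)")
print(f"randomised comparison:    {rct:.2f}  (unbiased for the true effect)")
```

In the observational regime the naive comparison overstates the effect because higher-ability units self-select into the programme; randomising participation removes that selection bias, which is the sense in which RCTs identify causal relationships, while non-experimental strategies instead try to reconstruct the missing counterfactual from observational data.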
