Understanding Algorithmic Discrimination in Health Economics Through the Lens of Measurement Errors

NBER Working Paper No. w29413

Authors: Anirban Basu; Noah Hammarlund; Sara Khor; Aasthaa Bansal

Abstract: There is growing concern that the increasing use of machine learning and artificial intelligence-based systems may exacerbate health disparities through discrimination. We provide a hierarchical definition of discrimination consisting of algorithmic discrimination, arising from predictive scores used for allocating resources, and human discrimination, arising when human decision-makers allocate resources conditional on these predictive scores. We then offer an overarching statistical framework of algorithmic discrimination through the lens of measurement errors, which is familiar to the health economics audience. Specifically, we show that algorithmic discrimination exists when measurement errors exist in either the outcome or the predictors, and there is endogenous selection for participation in the observed data. The absence of any of these phenomena would eliminate algorithmic discrimination. We show that although equalized odds constraints can be employed as bias-mitigating strategies, such constraints may increase algorithmic discrimination when there is measurement error in the dependent variable.
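The abstract's last point can be illustrated with a minimal simulation sketch (not taken from the paper; the group structure, error rates, and threshold rule below are illustrative assumptions). When the observed outcome under-records true positives in one group, a predictor's true-positive rate computed on observed labels overstates how well that group is served, so an equalized-odds constraint fit to the observed labels need not equalize odds with respect to the true outcome:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Two hypothetical groups with the same latent risk distribution.
group = rng.integers(0, 2, n)            # 0 = A, 1 = B
risk = rng.uniform(0, 1, n)              # latent risk score
y_true = (rng.uniform(0, 1, n) < risk).astype(int)

# Outcome measurement error concentrated in group B: true positives with
# lower risk are more often recorded as negatives (e.g., under-diagnosis).
p_flip = np.where((group == 1) & (y_true == 1), 0.6 * (1 - risk), 0.0)
y_obs = np.where(rng.uniform(0, 1, n) < p_flip, 0, y_true)

# A simple score-threshold allocation rule.
pred = (risk > 0.5).astype(int)

def tpr(y, g):
    """True-positive rate of `pred` within group g, using labels y."""
    sel = (group == g) & (y == 1)
    return pred[sel].mean()

tpr_obs_B, tpr_true_B = tpr(y_obs, 1), tpr(y_true, 1)
# Observed labels make group B look better served than it truly is:
# the surviving recorded positives skew toward high-risk (easy) cases,
# inflating the observed TPR relative to the TPR on true outcomes.
```

Because the mismeasured labels inflate group B's apparent true-positive rate, a constraint that equalizes observed-label error rates across groups can leave, or even widen, the gap in true-outcome error rates, which is the mechanism the abstract flags.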

Keywords: algorithmic discrimination; health economics; measurement errors; machine learning; health disparities

JEL Codes: C53; I10; I14


Causal Claims Network Graph

Edges that are evidenced by causal inference methods are in orange, and the rest are in light blue.


Causal Claims

Cause → Effect
measurement errors (C20) → algorithmic discrimination (J71)
endogenous selection (C92) → algorithmic discrimination (J71)
equalized odds constraints + measurement errors (C20) → algorithmic discrimination (J71)