Data Privacy and Algorithmic Inequality

Working Paper: NBER w31250

Authors: Zhuang Liu; Michael Sockin; Wei Xiong

Abstract: This paper develops a foundation for a consumer's preference for data privacy by linking it to the desire to hide behavioral vulnerabilities. Data sharing with digital platforms enhances the matching efficiency for standard consumption goods, but also exposes individuals with self-control issues to temptation goods. This creates a new form of inequality in the digital era—algorithmic inequality. Although data privacy regulations provide consumers with the option to opt out of data sharing, these regulations cannot fully protect vulnerable consumers because of data-sharing externalities. The coordination problem among consumers may also lead to multiple equilibria with drastically different levels of data sharing by consumers. Our quantitative analysis further illustrates that although data is non-rival and beneficial to social welfare, it can also exacerbate algorithmic inequality.

Keywords: Data Privacy; Algorithmic Inequality; Consumer Welfare; Digital Platforms

JEL Codes: D0; E0


Causal Claims Network Graph

Edges evidenced by causal inference methods are shown in orange; the remaining edges are shown in light blue.


Causal Claims

Cause → Effect
data sharing (D16) → matching efficiency for standard consumption goods (D11)
data sharing (D16) → algorithmic inequality (C69)
weak-willed consumers (D11) → data privacy (K24)
data privacy (K24) → access to desired normal goods (D10)
full data sharing conditions (Y10) → welfare gap between strong-willed and weak-willed consumers (D10)
lack of data privacy regulations (K24) → exploitation of vulnerable consumers (D18)
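The cause–effect pairs above form a small directed graph. As a minimal sketch (the edge list is transcribed from the table; the plain-dictionary representation is illustrative, not how the graph figure was produced), the claims can be loaded into an adjacency list and queried:

```python
# Causal-claims edges transcribed from the table above: (cause, effect).
# The parenthesized codes (e.g., D16) are the concept IDs from the table.
edges = [
    ("data sharing (D16)", "matching efficiency for standard consumption goods (D11)"),
    ("data sharing (D16)", "algorithmic inequality (C69)"),
    ("weak-willed consumers (D11)", "data privacy (K24)"),
    ("data privacy (K24)", "access to desired normal goods (D10)"),
    ("full data sharing conditions (Y10)",
     "welfare gap between strong-willed and weak-willed consumers (D10)"),
    ("lack of data privacy regulations (K24)",
     "exploitation of vulnerable consumers (D18)"),
]

# Build an adjacency list: cause -> list of claimed effects.
graph = {}
for cause, effect in edges:
    graph.setdefault(cause, []).append(effect)

# Example query: every effect the paper links to data sharing.
print(graph["data sharing (D16)"])
```

The same edge list could instead be passed to a graph library for plotting (e.g., a directed-graph type), with edge colors keyed to whether the claim is backed by a causal inference method, as in the network graph above.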
