Working Paper: NBER ID: w25819
Authors: Jerry A. Hausman; Haoyang Liu; Ye Luo; Christopher Palmer
Abstract: The popular quantile regression estimator of Koenker and Bassett (1978) is biased if the dependent variable is measured with additive error. Approaching this problem as an errors-in-variables problem where the dependent variable suffers from classical measurement error, we present a sieve maximum-likelihood approach that is robust to left-hand-side measurement error. After providing sufficient conditions for identification, we demonstrate that when the number of knots in the quantile grid is chosen to grow at an adequate speed, the sieve maximum-likelihood estimator is consistent and asymptotically normal, permitting inference via bootstrapping. We verify our theoretical results with Monte Carlo simulations and illustrate our estimator with an application to the returns to education, highlighting changes over time that have previously been masked by measurement-error bias.
Keywords: quantile regression; measurement error; sieve maximum likelihood; returns to education
JEL Codes: C19; C21; C31; I24; I26; J30
| Cause | Effect |
|---|---|
| Measurement error in the dependent variable (C20) | Bias in quantile regression estimates (C21) |
| Compression bias (C46) | Underestimation of returns to education for higher wage earners (I26) |
| Correcting for measurement error (C20) | Increase in estimated returns to education over time (I26) |
| Measurement error in the dependent variable (C20) | Compression bias (C46) |
| Measurement error in the dependent variable (C20) | Upward bias for lower quantiles (C46) |
| Measurement error in the dependent variable (C20) | Downward bias for higher quantiles (C46) |