Working Paper: NBER ID: w28548
Authors: Bryan S. Graham; Fengshi Niu; James L. Powell
Abstract: We study nonparametric regression in a setting where N(N-1) dyadic outcomes are observed for N randomly sampled units. Outcomes across dyads sharing a unit in common may be dependent (i.e., our dataset exhibits dyadic dependence). We present two sets of results. First, we calculate lower bounds on the minimax risk for estimating the regression function (i) at a point and (ii) under the infinity norm. Second, we calculate (i) pointwise and (ii) uniform convergence rates for the dyadic analog of the familiar Nadaraya-Watson (NW) kernel regression estimator. We show that the NW kernel regression estimator achieves the optimal rates suggested by our risk bounds when an appropriate bandwidth sequence is chosen. This optimal rate differs from the one available under iid data: the effective sample size is smaller and the dimension of the regressor vector influences the rate differently.
Keywords: nonparametric regression; dyadic data; minimax risk; uniform convergence; kernel regression
JEL Codes: C14
| Cause | Effect |
|---|---|
| Nadaraya-Watson kernel regression estimator (C29) | optimal convergence rates (C61) |
| dyadic dependence (C29) | slower feasible rate of convergence (F62) |
| choice of estimator (C51) | convergence rates achieved under dyadic dependence (C69) |
| number of sampled units n (C29) | effective sample size for dyadic estimation problems (C83) |
| dimension of the regressor x_i (C51) | relevant dimension for the estimation problem (C51) |
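The dyadic Nadaraya-Watson estimator discussed in the abstract averages the N(N-1) dyadic outcomes y_ij with kernel weights centered at the evaluation point: ĝ(w) = Σ_{i≠j} K_h(w_ij − w) y_ij / Σ_{i≠j} K_h(w_ij − w). A minimal sketch follows, assuming a Gaussian product kernel and a dyad-level regressor w_ij built from unit-level covariates; all function and variable names are illustrative, not taken from the paper:

```python
import numpy as np

def dyadic_nw(y, w, w0, h):
    """Nadaraya-Watson estimate of E[y_ij | w_ij = w0] from dyadic data.

    y  : (N, N) array of dyadic outcomes (diagonal ignored)
    w  : (N, N, d) array of dyad-level regressors w_ij
    w0 : (d,) evaluation point
    h  : scalar bandwidth
    """
    N = y.shape[0]
    mask = ~np.eye(N, dtype=bool)                # drop self-pairs (i, i)
    u = (w[mask] - w0) / h                       # scaled deviations, shape (N(N-1), d)
    k = np.exp(-0.5 * np.sum(u**2, axis=1))      # Gaussian product kernel (constants cancel)
    denom = k.sum()
    return np.dot(k, y[mask]) / denom if denom > 0 else np.nan

# Toy example (illustrative data-generating process, not from the paper):
# y_ij depends smoothly on the symmetric dyad regressor w_ij = x_i + x_j.
rng = np.random.default_rng(0)
N = 200
x = rng.uniform(0.0, 1.0, N)
w = (x[:, None] + x[None, :])[:, :, None]        # shape (N, N, 1), so d = 1
y = np.sin(w[:, :, 0]) + 0.1 * rng.standard_normal((N, N))
est = dyadic_nw(y, w, np.array([1.0]), h=0.1)    # expect a value near sin(1.0)
```

Note that the code only computes the point estimate; the paper's contribution concerns its convergence rate, which is driven by the number of units N rather than the N(N-1) dyads, since outcomes sharing a unit are dependent.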