Machine Learning for Regularized Survey Forecast Combination: Partially-Egalitarian Lasso and Its Derivatives

Working Paper: NBER ID: w24967

Authors: Francis X. Diebold; Minchul Shin

Abstract: Despite the clear success of forecast combination in many economic environments, several important issues remain incompletely resolved. The issues relate to selection of the set of forecasts to combine, and whether some form of additional regularization (e.g., shrinkage) is desirable. Against this background, and also considering the frequently-found good performance of simple-average combinations, we propose a LASSO-based procedure that sets some combining weights to zero and shrinks the survivors toward equality ("partially-egalitarian LASSO"). Ex-post analysis reveals that the optimal solution has a very simple form: The vast majority of forecasters should be discarded, and the remainder should be averaged. We therefore propose and explore direct subset-averaging procedures motivated by the structure of partially-egalitarian LASSO and the lessons learned, which, unlike LASSO, do not require choice of a tuning parameter. Intriguingly, in an application to the European Central Bank Survey of Professional Forecasters, our procedures outperform simple average and median forecasts – indeed they perform approximately as well as the ex-post best forecaster.
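The direct subset-averaging idea described in the abstract — discard most forecasters, average the survivors with equal weights, no tuning parameter — can be sketched on simulated data. This is a minimal illustration, not the authors' exact estimator: the data-generating process, the ranking by in-sample RMSE, and the choice of k are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: T periods, N forecasters of a target y.
T, N, k = 200, 20, 3
y = rng.normal(size=T)

# Forecaster i reports y plus idiosyncratic noise; noise scale varies
# across forecasters, so some are systematically more accurate.
noise_scale = np.linspace(0.2, 2.0, N)
forecasts = y[:, None] + rng.normal(size=(T, N)) * noise_scale

# Training window to rank forecasters, held-out window to evaluate.
train, test = slice(0, 150), slice(150, T)

# Rank forecasters by training-sample RMSE and keep the best k.
rmse_train = np.sqrt(np.mean((forecasts[train] - y[train, None]) ** 2, axis=0))
best_k = np.argsort(rmse_train)[:k]

# Subset average: equal weights over the k survivors (no tuning parameter),
# versus the simple average over all N forecasters.
subset_avg = forecasts[test][:, best_k].mean(axis=1)
simple_avg = forecasts[test].mean(axis=1)

rmse = lambda f: np.sqrt(np.mean((f - y[test]) ** 2))
print(f"simple average RMSE:   {rmse(simple_avg):.3f}")
print(f"best-{k} subset RMSE:  {rmse(subset_avg):.3f}")
```

In this simulated setting the subset average beats the all-forecaster average because the discarded forecasters contribute mostly noise, mirroring the paper's ex-post finding that averaging a small surviving subset is near-optimal.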

Keywords: forecast combination; lasso; machine learning; economic forecasts

JEL Codes: C53


Causal Claims Network Graph

Edges evidenced by causal inference methods are shown in orange; all other edges are shown in light blue.


Causal Claims

Cause → Effect
partially-egalitarian LASSO (peLASSO) method (C51) → better forecast combinations (C53)
better forecast combinations (C53) → improved out-of-sample forecast accuracy (C53)
optimal solution (discarding less effective forecasters and averaging survivors) (C53) → improved out-of-sample forecast accuracy (C53)
peLASSO methods (C59) → reduced out-of-sample RMSE relative to simple averages (C32)
