Tradeoffs Between Global and Local Risks in Nonparametric Function Estimation
Tony Cai, Mark Low, and Linda Zhao
Abstract:
The problem of loss adaptation is investigated:
given a fixed parameter space, the goal is to
construct an estimator that adapts to the loss function,
in the sense that it is simultaneously optimal under global loss
and under local loss at every point. For the class of estimator sequences that
attain the minimax rate for estimating the
entire function over a fixed Besov space, a lower bound is given on their
performance for estimating the function at each point. When the global and
local minimax rates of convergence differ, this bound exceeds the usual
pointwise minimax rate by a logarithmic factor.
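For concreteness, the following display illustrates the typical form of these rates; the specific model and exponents here are our own illustration, not taken from the abstract, and assume the Gaussian white-noise model and a Besov ball $B^{\alpha}_{p,q}(M)$ with $\alpha' = \alpha - 1/p > 0$ (the dense regime):

```latex
% Global minimax rate (integrated squared error):
R_{\mathrm{global}}(n) \asymp n^{-2\alpha/(2\alpha+1)},
% Pointwise minimax rate at a fixed point t_0, with \alpha' = \alpha - 1/p:
R_{\mathrm{local}}(n) \asymp n^{-2\alpha'/(2\alpha'+1)},
% Typical lower bound on the pointwise risk of any
% globally rate-optimal estimator sequence \hat{f}_n:
\sup_{f \in B^{\alpha}_{p,q}(M)}
  E\bigl(\hat{f}_n(t_0)-f(t_0)\bigr)^2
  \gtrsim \Bigl(\tfrac{\log n}{n}\Bigr)^{2\alpha'/(2\alpha'+1)}.
```

When $p < \infty$ the global and local exponents differ, and the logarithmic factor in the last display is the price described above.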
Conversely, a lower bound on the maximum global risk is given
for estimators that attain the optimal
minimax rate of convergence at every point.
An inequality concerning estimation in a two-parameter statistical
problem plays a key role in the proofs. It can be considered a
generalization of an inequality in Brown and Low (1996b) and may be
of independent interest.
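For orientation, the constrained risk inequality of Brown and Low (1996b) is, roughly, of the following schematic form (recalled from memory; the precise statement, constants, and conditions are in their paper). Let $f_0, f_1$ be densities with associated functionals $\theta_0, \theta_1$, put $\Delta = |\theta_1 - \theta_0|$, and set $I^2 = \int f_1^2/f_0 \, d\mu$:

```latex
% Constrained risk inequality (schematic): if the risk at f_0 is small,
E_{f_0}\bigl(\delta - \theta_0\bigr)^2 \le \varepsilon^2
% then the risk at f_1 cannot also be small:
\quad\Longrightarrow\quad
E_{f_1}\bigl(\delta - \theta_1\bigr)^2
  \ge \bigl(\Delta - \varepsilon I\bigr)^2
\qquad \text{whenever } \varepsilon I \le \Delta.
```

A two-point bound of this type is what allows a constraint on the global risk to force a lower bound on the risk at a point.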
A particular wavelet estimator is constructed that is globally
optimal and attains the lower bound on the local risk given
by our inequality.