Variational Estimators in Statistical Multiscale Analysis
by Housen Li
Institution: Georg-August-Universität Göttingen
Year: 2016
Posted: 02/05/2017
Record ID: 2135532
Full text PDF: http://hdl.handle.net/11858/00-1735-0000-0028-875F-F
In recent years, a novel class of variational multiscale statistical approaches, based on so-called multiscale statistics, has gained increasing popularity in applications such as signal recovery, imaging, and image processing, mainly because such methods perform uniformly well over a range of scales (i.e. sizes of features). In contrast, the underlying statistical theory for these methods is still incomplete, in particular with regard to their asymptotic convergence behavior. To narrow this gap, we propose and analyze a constrained variational approach, which we call the MultIscale Nemirovski-Dantzig (MIND) estimator, for recovering smooth functions in the settings of nonparametric regression and statistical inverse problems. It can be viewed as a multiscale extension of the Dantzig selector (Ann. Statist., 35(6):2313–2351, 2007), based on early ideas of Nemirovski (J. Comput. System Sci., 23:1–11, 1986). To be precise, MIND minimizes a homogeneous Sobolev norm under the constraint that the multiresolution norm of the residual is bounded by a universal threshold. The main contribution of this work is the derivation of convergence rates of MIND, both almost surely and in expectation, for nonparametric regression and linear statistical inverse problems. To this end, we generalize Nemirovski's interpolation inequality for the multiresolution norm and Sobolev norms, and introduce the method of approximate source conditions into our statistical setting. Based on these tools, we obtain convergence rates under abstract smoothness assumptions on the truth. For a one-dimensional signal, such assumptions can be translated into classical smoothness classes and source sets by means of the approximation properties of B-splines. As a consequence, MIND attains almost minimax optimal rates simultaneously over a large range of Sobolev and Besov classes, for nonparametric regression of functions and their derivatives.
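In schematic form, the constrained problem described above can be written as follows. The notation here is a sketch and not taken verbatim from the thesis: $D^{k}$ denotes the $k$-th (weak) derivative defining the homogeneous Sobolev norm, $\|\cdot\|_{\mathrm{MR}}$ the multiresolution norm taken over a system $\mathcal{I}$ of intervals, $Y$ the observed data, and $q_n$ a universal threshold.

```latex
% Schematic form of the MIND estimator (notation assumed, not verbatim from the thesis):
% minimize a homogeneous Sobolev norm subject to a multiscale constraint on the residual.
\hat{f}_{\mathrm{MIND}} \in \operatorname*{argmin}_{f}\;
  \bigl\| D^{k} f \bigr\|_{L^{2}}
\quad \text{subject to} \quad
  \bigl\| Y - f \bigr\|_{\mathrm{MR}} \le q_n,
% where the multiresolution norm aggregates normalized local averages of the
% residual over all intervals I in the system \mathcal{I}:
\qquad
\| r \|_{\mathrm{MR}}
  = \max_{I \in \mathcal{I}} \frac{1}{\sqrt{|I|}} \Bigl| \sum_{i \in I} r_i \Bigr|.
```

The constraint enforces that the residual looks like noise simultaneously on every scale, which is the source of the uniform multiscale performance mentioned above.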
Analogous results have also been obtained for certain linear statistical inverse problems, such as deconvolution when the Fourier coefficients of the convolution kernel decay polynomially. Put differently, these results reveal that MIND adapts, to a certain extent, to the smoothness of the underlying true signal. In parallel, we present a similar analysis for a penalized version of MIND, together with its parameter choice via the Lepskii balancing principle. Finally, complementary to the asymptotic analysis, we examine the finite-sample performance of MIND in various numerical simulations. Advisors/Committee Members: Munk, Axel (advisor), Munk, Axel (referee), Haltmeier, Markus (referee), Aspelmeier, Timo (referee), Bahns, Dorothea (referee), Krivobokova, Tatyana (referee), Wardetzky, Max (referee).
Related abstracts:
Proof in Alonzo Church's and Alan Turing's Mathema...: Undecidability of First Order Logic
New Splitting Iterative Methods for Solving Multid...
A Reusable Learning Object Design Model for Elemen...
Finding the Real Odds: Attrition and Time-to-Degree in the FSU College of...
Modelling and Simulation of Stochastic Volatility ...
Radiative Transfer Using Boltzmann Transport Theor...
Modeling Credit Risk and Pricing Credit Derivative...
Canonical Auto and Cross Correlations of Multivari...