
Resource details

Description

For the problem of high-dimensional sparse linear regression, it is known that an $\ell_{0}$-based estimator can achieve a $1/n$ “fast” rate for prediction error without any conditions on the design matrix, whereas in the absence of restrictive conditions on the design matrix, popular polynomial-time methods only guarantee the $1/\sqrt{n}$ “slow” rate. In this paper, we show that the slow rate is intrinsic to a broad class of M-estimators. In particular, for estimators based on minimizing a least-squares cost function together with a (possibly nonconvex) coordinate-wise separable regularizer, there is always a “bad” local optimum such that the associated prediction error is lower bounded by a constant multiple of $1/\sqrt{n}$. For convex regularizers, this lower bound applies to all global optima. The theory is applicable to many popular estimators, including convex $\ell_{1}$-based methods as well as M-estimators based on nonconvex regularizers, including the SCAD penalty or the MCP regularizer. In addition, we show that bad local optima are very common, in that a broad class of local minimization algorithms with random initialization typically converge to a bad solution.
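
To make the abstract's setup concrete, the following is a minimal sketch (not taken from the paper; the design matrix, sparsity level, regularization weight, and solver are illustrative assumptions) of a convex $\ell_{1}$-based M-estimator fit by proximal gradient descent, together with the in-sample prediction error $\|X(\hat{\theta}-\theta^{*})\|_{2}^{2}/n$ whose $1/\sqrt{n}$ versus $1/n$ scaling the paper analyzes.

```python
# Toy sketch (assumptions, not the paper's construction): fit the Lasso objective
#   (1/2n) * ||y - X theta||^2 + lam * ||theta||_1
# by proximal gradient descent (ISTA) and report the in-sample prediction error.
import numpy as np

rng = np.random.default_rng(0)
n, d, k, sigma = 200, 500, 5, 1.0          # samples, dimension, sparsity, noise level (arbitrary)
X = rng.standard_normal((n, d))            # illustrative Gaussian design
theta_star = np.zeros(d)
theta_star[:k] = 1.0                       # k-sparse ground truth
y = X @ theta_star + sigma * rng.standard_normal(n)

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1 (coordinate-wise separable regularizer)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

lam = sigma * np.sqrt(np.log(d) / n)       # standard theoretical scaling for the l1 weight
step = n / np.linalg.norm(X, 2) ** 2       # 1 / Lipschitz constant of the smooth part
theta = np.zeros(d)
for _ in range(2000):
    grad = X.T @ (X @ theta - y) / n       # gradient of the least-squares cost
    theta = soft_threshold(theta - step * grad, step * lam)

pred_err = np.sum((X @ (theta - theta_star)) ** 2) / n
print(f"in-sample prediction error ||X(theta_hat - theta*)||^2 / n = {pred_err:.4f}")
```

The $\ell_{0}$-based estimator that attains the fast rate would instead minimize the least-squares cost over all $k$-sparse vectors, which is computationally intractable in general; the contrast between that estimator and polynomial-time M-estimators of the above form is what the paper's lower bounds address.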

Belongs to

Project Euclid (Hosted at Cornell University Library)  

Author(s)

Zhang, Yuchen - Wainwright, Martin J. - Jordan, Michael I.

Id.: 69702381

Language: English

Version: 1.0

Status: Final

Type: application/pdf

Keywords: Sparse linear regression

Resource type: Text

Interactivity type: Expository

Interactivity level: very low

Audience: Student - Teacher - Author

Structure: Atomic

Cost: no

Copyright: yes

Copyright 2017 The Institute of Mathematical Statistics and the Bernoulli Society

Formats: application/pdf

Technical requirements: Browser: Any

Relation: [References] 1935-7524

Contribution date: 28-Aug-2017

Contact:

Location:
* Electron. J. Statist. 11, no. 1 (2017), 752-799
* doi:10.1214/17-EJS1233

Other resources by the same author(s)

  1. Posteriors, conjugacy, and exponential families for completely random measures We demonstrate how to calculate posteriors for general Bayesian nonparametric priors and likelihoods...
  2. Support recovery without incoherence: A case for nonconvex regularization We develop a new primal-dual witness proof framework that may be used to establish variable selectio...
  3. Latent Marked Poisson Process with Applications to Object Segmentation In difficult object segmentation tasks, utilizing image information alone is not sufficient; incorpo...
  4. Measuring Cluster Stability for Bayesian Nonparametrics Using the Linear Bootstrap 9 pages, NIPS 2017 Advances in Approximate Bayesian Inference Workshop
  5. Bayesian Nonparametric Inference of Switching Linear Dynamical Systems Many complex dynamical phenomena can be effectively modeled by a system that switches among a set of...

Other resources in the same collection

  1. Slice inverse regression with score functions We consider non-linear regression problems where we assume that the response depends non-linearly on...
  2. Dimension reduction-based significance testing in nonparametric regression A dimension reduction-based adaptive-to-model test is proposed for significance of a subset of covar...
  3. High-dimensional robust precision matrix estimation: Cellwise corruption under $\epsilon $-contamination We analyze the statistical consistency of robust estimators for precision matrices in high dimension...
  4. A two stage $k$-monotone B-spline regression estimator: Uniform Lipschitz property and optimal convergence rate This paper considers $k$-monotone estimation and the related asymptotic performance analysis over a ...
  5. Uniformly valid confidence sets based on the Lasso In a linear regression model of fixed dimension $p\leq n$, we construct confidence regions for the u...
