Resource details

Description

For the problem of high-dimensional sparse linear regression, it is known that an $\ell_{0}$-based estimator can achieve a $1/n$ “fast” rate for prediction error without any conditions on the design matrix, whereas in the absence of restrictive conditions on the design matrix, popular polynomial-time methods only guarantee the $1/\sqrt{n}$ “slow” rate. In this paper, we show that the slow rate is intrinsic to a broad class of M-estimators. In particular, for estimators based on minimizing a least-squares cost function together with a (possibly nonconvex) coordinate-wise separable regularizer, there is always a “bad” local optimum such that the associated prediction error is lower bounded by a constant multiple of $1/\sqrt{n}$. For convex regularizers, this lower bound applies to all global optima. The theory is applicable to many popular estimators, including convex $\ell_{1}$-based methods as well as M-estimators based on nonconvex regularizers, including the SCAD penalty or the MCP regularizer. In addition, we show that bad local optima are very common, in that a broad class of local minimization algorithms with random initialization typically converge to a bad solution.
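
For concreteness, the estimator class described in the abstract can be summarized by the following objective (a sketch inferred from the abstract's wording; the symbols $y$, $X$, $n$, $d$, and $\rho_{j}$ are assumed notation, not taken from the paper):

$$\widehat{\theta} \in \arg\min_{\theta \in \mathbb{R}^{d}} \; \frac{1}{2n}\,\lVert y - X\theta \rVert_{2}^{2} \; + \; \sum_{j=1}^{d} \rho_{j}(\theta_{j}),$$

where $X \in \mathbb{R}^{n \times d}$ is the design matrix, $y \in \mathbb{R}^{n}$ is the response vector, and each $\rho_{j}$ is a coordinate-wise separable (possibly nonconvex) penalty. The Lasso corresponds to $\rho_{j}(t) = \lambda\,\lvert t \rvert$, while SCAD and MCP are nonconvex instances; the paper's lower bound says that for this class there is always a local optimum (a global one, when the $\rho_{j}$ are convex) whose prediction error is at least a constant multiple of $1/\sqrt{n}$.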

Belongs to

Project Euclid (Hosted at Cornell University Library)

Author(s)

Zhang, Yuchen - Wainwright, Martin J. - Jordan, Michael I.

Id.: 69702381

Language: English

Version: 1.0

Status: Final

Type: application/pdf

Keywords: Sparse linear regression

Resource type: Text

Interactivity type: Expository

Interactivity level: Very low

Audience: Student - Teacher - Author

Structure: Atomic

Cost: No

Copyright: Yes

Copyright 2017 The Institute of Mathematical Statistics and the Bernoulli Society

Formats: application/pdf

Technical requirements: Browser: Any

Relation: [References] 1935-7524

Contribution date: 28-Aug-2017

Contact:

Location:
* Electron. J. Statist. 11, no. 1 (2017), 752-799
* doi:10.1214/17-EJS1233

Other resources by the same author(s)

  1. Support recovery without incoherence: A case for nonconvex regularization - We develop a new primal-dual witness proof framework that may be used to establish variable selectio...
  2. Latent Marked Poisson Process with Applications to Object Segmentation - In difficult object segmentation tasks, utilizing image information alone is not sufficient; incorpo...
  3. Measuring Cluster Stability for Bayesian Nonparametrics Using the Linear Bootstrap - 9 pages, NIPS 2017 Advances in Approximate Bayesian Inference Workshop
  4. Bayesian Nonparametric Inference of Switching Linear Dynamical Systems - Many complex dynamical phenomena can be effectively modeled by a system that switches among a set of...
  5. A Sticky HDP-HMM With Application to Speaker Diarization - We consider the problem of speaker diarization, the problem of segmenting an audio recording of a me...

Other resources in the same collection

  1. Locally stationary functional time series - The literature on time series of functional data has focused on processes of which the probabilistic...
  2. On misspecifications in regularity and properties of estimators - The problem of parameter estimation by the continuous time observations of a deterministic signal in...
  3. Confidence intervals for the means of the selected populations - Consider an experiment in which $p$ independent populations $\pi_{i}$ with corresponding unknown mea...
  4. Change detection via affine and quadratic detectors - The goal of the paper is to develop a specific application of the convex optimization based hypothes...
