
Resource details


For the problem of high-dimensional sparse linear regression, it is known that an $\ell_{0}$-based estimator can achieve a $1/n$ “fast” rate for prediction error without any conditions on the design matrix, whereas in the absence of restrictive conditions on the design matrix, popular polynomial-time methods only guarantee the $1/\sqrt{n}$ “slow” rate. In this paper, we show that the slow rate is intrinsic to a broad class of M-estimators. In particular, for estimators based on minimizing a least-squares cost function together with a (possibly nonconvex) coordinate-wise separable regularizer, there is always a “bad” local optimum such that the associated prediction error is lower bounded by a constant multiple of $1/\sqrt{n}$. For convex regularizers, this lower bound applies to all global optima. The theory is applicable to many popular estimators, including convex $\ell_{1}$-based methods as well as M-estimators based on nonconvex regularizers, including the SCAD penalty or the MCP regularizer. In addition, we show that bad local optima are very common, in that a broad class of local minimization algorithms with random initialization typically converge to a bad solution.
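The abstract contrasts the "slow" $1/\sqrt{n}$ prediction-error rate of polynomial-time M-estimators with the $1/n$ rate of $\ell_0$-based methods. As a minimal illustrative sketch (not the authors' construction), the following hand-rolled coordinate-descent Lasso shows the kind of $\ell_1$-regularized least-squares M-estimator the paper studies, and computes its in-sample prediction error $\|X(\hat{\beta}-\beta^{*})\|_2^2/n$; the synthetic data, noise level, and the standard $\lambda \sim \sigma\sqrt{\log d / n}$ choice are illustrative assumptions.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for the l1-regularized least-squares M-estimator:
    minimize (1/(2n)) * ||y - X b||^2 + lam * ||b||_1
    """
    n, d = X.shape
    b = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0) / n            # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(d):
            r = y - X @ b + X[:, j] * b[j]       # residual with coordinate j removed
            rho = X[:, j] @ r / n
            # soft-thresholding update for coordinate j
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

rng = np.random.default_rng(0)
n, d, s = 200, 50, 3                             # samples, dimension, sparsity
X = rng.standard_normal((n, d))
beta = np.zeros(d)
beta[:s] = 1.0                                   # sparse ground-truth signal
y = X @ beta + 0.5 * rng.standard_normal(n)

# lam ~ sigma * sqrt(log(d)/n) is the standard theoretical scaling (an assumption here)
lam = 0.5 * np.sqrt(np.log(d) / n)
b_hat = lasso_cd(X, y, lam)

pred_err = np.sum((X @ (b_hat - beta)) ** 2) / n  # in-sample prediction error
print(round(pred_err, 4))
```

For a well-conditioned Gaussian design like this one the Lasso predicts well; the paper's point is that for adversarially chosen designs, no estimator in this M-estimator class can beat the $1/\sqrt{n}$ scaling in the worst case.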

Belongs to

Project Euclid (Hosted at Cornell University Library)


Authors: Zhang, Yuchen; Wainwright, Martin J.; Jordan, Michael I.

Id.: 69702381

Language: English

Version: 1.0

Status: Final

Type: application/pdf

Keywords: Sparse linear regression

Resource type: Text

Interactivity type: Expository

Interactivity level: very low

Audience: Student; Teacher; Author

Structure: Atomic

Cost: no

Copyright: yes

Copyright 2017 The Institute of Mathematical Statistics and the Bernoulli Society

Formats: application/pdf

Technical requirements: Browser: Any

Relation: [References] 1935-7524

Contribution date: 28-Aug-2017


* Electron. J. Statist. 11, no. 1 (2017), 752-799
* doi:10.1214/17-EJS1233

