Non-Asymptotic Analysis of Block-Regularized Regression Problem

Salar Fattahi1, Somayeh Sojoudi2

  • 1University of California, Berkeley
  • 2University of California, Berkeley



Regular Session


10:00 - 12:00 | Mon 17 Dec | Dazzle | MoA01

Optimization I


Given a linear multivariate regression problem with block sparsity structure on the regression matrix, one popular approach for estimating its unknown parameter is block-regularization, where the sparsity of different blocks of the regression matrix is promoted by penalizing their $\ell_{\infty}$-norms. The main goal of this work is to characterize the properties of this estimator under high-dimensional scaling, where the dimension of the problem grows at a rate comparable to, or even faster than, that of the sample size. In particular, this work generalizes the existing non-asymptotic results on special instances of block-regularized estimators to the case where the unknown regression matrix has an arbitrary number of blocks, each with a potentially different size. When the design matrix is deterministic, a sharp non-asymptotic rate is derived on the element-wise error of the proposed estimator. Furthermore, it is proven that approximately the same error rate holds for the block-regularized estimator when the design matrix is randomly generated, provided that the number of samples exceeds a lower bound. The accuracy of the proposed estimator is illustrated on several test cases.
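The estimator described in the abstract minimizes a least-squares loss plus a penalty summing the $\ell_\infty$-norms of the blocks of the regression matrix. As a rough illustration of this idea (not the authors' implementation), the sketch below solves such a problem by proximal gradient descent: the proximal operator of $\lambda\|\cdot\|_\infty$ is computed via the Moreau decomposition as the residual of a Euclidean projection onto the $\lambda$-scaled $\ell_1$-ball. All function names, the block layout, and the step-size choice here are illustrative assumptions.

```python
import numpy as np

def project_l1_ball(v, z):
    # Euclidean projection of vector v onto the l1-ball of radius z,
    # using the standard sort-and-threshold method.
    if np.abs(v).sum() <= z:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # sorted magnitudes, descending
    css = np.cumsum(u)
    idx = np.arange(1, len(u) + 1)
    rho = np.nonzero(u - (css - z) / idx > 0)[0][-1]
    theta = (css[rho] - z) / (rho + 1)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def prox_linf(v, lam):
    # Moreau decomposition: prox of lam*||.||_inf is v minus the
    # projection of v onto the l1-ball of radius lam.
    return v - project_l1_ball(v, lam)

def block_linf_regression(X, Y, blocks, lam, step=None, iters=500):
    """Proximal gradient for
        min_B  0.5*||Y - X B||_F^2 + lam * sum_g ||B[g]||_inf,
    where `blocks` is a list of (row_slice, col_slice) pairs
    partitioning the regression matrix B (illustrative interface)."""
    p, q = X.shape[1], Y.shape[1]
    B = np.zeros((p, q))
    if step is None:
        # 1/L step size, L = squared spectral norm of X (Lipschitz
        # constant of the gradient of the smooth part).
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    for _ in range(iters):
        G = X.T @ (X @ B - Y)            # gradient of 0.5*||Y - XB||_F^2
        B = B - step * G
        for rows, cols in blocks:        # blockwise prox step
            blk = B[rows, cols]
            B[rows, cols] = prox_linf(blk.ravel(), lam * step).reshape(blk.shape)
    return B
```

With a very large `lam`, every block is driven exactly to zero (the prox maps any vector inside the scaled $\ell_1$-ball to the origin), which is the block-sparsity-promoting behavior the penalty is designed for.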

