Sparse reconstructions from few noisy data: analysis of hierarchical Bayesian models with generalized gamma hyperpriors

2020

Abstract

Solving inverse problems with sparsity-promoting regularizing penalties can be recast in the Bayesian framework as finding a maximum a posteriori (MAP) estimate with sparsity-promoting priors. In the latter context, a computationally convenient choice of prior is the family of conditionally Gaussian hierarchical models for which the prior variances of the components of the unknown are independent and follow a hyperprior from a generalized gamma family. In this paper, we analyze the optimization problem behind the MAP estimation and identify hyperparameter combinations that lead to a globally or locally convex optimization problem. The MAP estimation problem is solved using a computationally efficient alternating iterative algorithm. Its properties in the context of the generalized gamma hypermodel and its connections with some known sparsity-promoting penalty methods are analyzed. Computed examples elucidate the convergence and sparsity-promoting properties of the algorithm.
Calvetti, Daniela; Pragliola, Monica; Somersalo, Erkki; Strang, Alexander
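The alternating iterative algorithm mentioned in the abstract can be sketched for the gamma member (r = 1) of the generalized gamma hyperprior family, where both half-steps take simple forms: updating the unknown with the variances fixed is a Tikhonov-regularized least-squares problem, and updating the variances with the unknown fixed has a componentwise closed-form solution. The synthetic problem, variable names, and hyperparameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sketch of the alternating MAP scheme for the conditionally
# Gaussian hierarchical model with a gamma hyperprior (the r = 1 member of
# the generalized gamma family). All problem sizes and parameter values
# here are demonstration assumptions.

rng = np.random.default_rng(0)
m, n = 30, 60
A = rng.standard_normal((m, n)) / np.sqrt(m)   # underdetermined forward map
x_true = np.zeros(n)
x_true[rng.choice(n, size=4, replace=False)] = 3.0
sigma = 0.01                                   # noise standard deviation
b = A @ x_true + sigma * rng.standard_normal(m)

# Gamma hyperprior parameters; eta = beta - 3/2 > 0 is assumed here so that
# each componentwise variance update has a positive closed-form minimizer
beta, vartheta = 1.51, 1e-4
eta = beta - 1.5
theta = np.full(n, vartheta)                   # prior variances of x

for _ in range(50):
    # x-step: minimize ||A x - b||^2/(2 sigma^2) + sum_j x_j^2/(2 theta_j),
    # a quadratic (Tikhonov) problem solved via its normal equations
    x = np.linalg.solve(A.T @ A / sigma**2 + np.diag(1.0 / theta),
                        A.T @ b / sigma**2)
    # theta-step: componentwise closed-form minimizer over theta_j > 0 of
    # x_j^2/(2 theta_j) + theta_j/vartheta - eta*log(theta_j)
    theta = vartheta * (eta / 2 + np.sqrt(eta**2 / 4 + x**2 / (2 * vartheta)))
```

The theta-step formula follows from setting the derivative of the scalar objective x²/(2θ) + θ/ϑ − η log θ to zero, which gives a quadratic in θ with a unique positive root; small reconstructed components drive their variances toward zero, which is the mechanism promoting sparsity.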


Use this identifier to cite or link to this document: http://hdl.handle.net/11585/722909

Citations
  • Scopus: 7
  • Web of Science: 6