
Sparse reconstructions from few noisy data: analysis of hierarchical Bayesian models with generalized gamma hyperpriors / Calvetti, Daniela; Pragliola, Monica; Somersalo, Erkki; Strang, Alexander. - In: INVERSE PROBLEMS. - ISSN 0266-5611. - Print. - 36:2(2020), pp. 025010.1-025010.29. [10.1088/1361-6420/ab4d92]

Sparse reconstructions from few noisy data: analysis of hierarchical Bayesian models with generalized gamma hyperpriors

Calvetti, Daniela; Pragliola, Monica; Somersalo, Erkki; Strang, Alexander
2020

Abstract

Solving inverse problems with sparsity promoting regularizing penalties can be recast in the Bayesian framework as finding a maximum a posteriori (MAP) estimate with sparsity promoting priors. In the latter context, a computationally convenient choice of prior is the family of conditionally Gaussian hierarchical models for which the prior variances of the components of the unknown are independent and follow a hyperprior from a generalized gamma family. In this paper, we analyze the optimization problem behind the MAP estimation and identify hyperparameter combinations that lead to a globally or locally convex optimization problem. The MAP estimation problem is solved using a computationally efficient alternating iterative algorithm. Its properties in the context of the generalized gamma hypermodel and its connections with some known sparsity promoting penalty methods are analyzed. Computed examples elucidate the convergence and sparsity promoting properties of the algorithm.
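For orientation, the conditionally Gaussian hierarchical model with a generalized gamma hyperprior referred to in the abstract typically has the structure sketched below; the notation and scalings are a reconstruction for illustration, not quoted from the paper:

\[
b = Ax + e, \quad e \sim \mathcal{N}(0, \sigma^2 I), \qquad
x_j \mid \theta_j \sim \mathcal{N}(0, \theta_j), \qquad
\pi(\theta_j) \propto \theta_j^{r\beta - 1} \exp\!\big(-(\theta_j/\vartheta_j)^r\big),
\]

so that the MAP estimate minimizes a Gibbs energy of the form

\[
\mathcal{E}(x, \theta) = \frac{1}{2\sigma^2}\|Ax - b\|^2
+ \sum_j \left( \frac{x_j^2}{2\theta_j} + \Big(\frac{\theta_j}{\vartheta_j}\Big)^{r} - \eta \log\theta_j \right),
\qquad \eta = r\beta - \tfrac{3}{2}.
\]

The alternating iterative algorithm mentioned in the abstract can then be pictured as cycling between a quadratic x-update with the variances \(\theta\) fixed and a componentwise \(\theta\)-update with x fixed. Below is a minimal, illustrative Python sketch for the gamma case r = 1, where the \(\theta\)-update is available in closed form; the function name ias_gamma_map and all parameter values are placeholders chosen for the toy example, not taken from the paper.

```python
import numpy as np

def ias_gamma_map(A, b, sigma=0.05, beta=1.6, vartheta=1e-3, n_iter=50):
    """Toy alternating scheme for the gamma hyperprior (r = 1).

    Alternates between
      1) x-update: ridge-type least squares with the variances theta fixed,
      2) theta-update: componentwise closed-form minimizer of the Gibbs energy.
    Hyperparameter defaults are illustrative placeholders.
    """
    n = A.shape[1]
    theta = np.full(n, vartheta)   # initial prior variances
    eta = beta - 1.5               # eta = r*beta - 3/2 with r = 1 (needs beta > 3/2)
    x = np.zeros(n)
    for _ in range(n_iter):
        # x-update: minimize ||A x - b||^2 / (2 sigma^2) + sum_j x_j^2 / (2 theta_j)
        H = A.T @ A / sigma**2 + np.diag(1.0 / theta)
        x = np.linalg.solve(H, A.T @ b / sigma**2)
        # theta-update: positive root of  theta^2 - eta*vartheta*theta - vartheta*x^2/2 = 0
        theta = 0.5 * vartheta * (eta + np.sqrt(eta**2 + 2.0 * x**2 / vartheta))
    return x, theta

# Toy usage: sparse signal, underdetermined Gaussian forward model
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -2.0, 1.5]
b = A @ x_true + 0.05 * rng.standard_normal(40)
x_map, theta_map = ias_gamma_map(A, b)
```

In this toy setting, small values of \(\vartheta_j\) and values of \(\eta\) close to zero tend to concentrate the reconstruction on few nonzero components, which illustrates the sparsity promoting effect discussed in the abstract.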
Files in this product:

2020_IP_Accepted.pdf
Access: open access
Type: Postprint
License: Open Access License. Creative Commons Attribution - NonCommercial - NoDerivatives (CC BY-NC-ND)
Size: 4.99 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/722909
Citations
  • PMC: ND
  • Scopus: 23
  • Web of Science (ISI): 23