Volume 39, pp. 437-463, 2012.

Gradient descent for Tikhonov functionals with sparsity constraints: Theory and numerical comparison of step size rules

Dirk A. Lorenz, Peter Maass, and Pham Q. Muoi

Abstract

In this paper, we analyze gradient methods for minimization problems arising in the regularization of nonlinear inverse problems with sparsity constraints. In particular, we study a gradient method based on the successive minimization of quadratic approximations in Hilbert spaces, motivated by a recently proposed equivalent method in a finite-dimensional setting. We prove convergence of this method under assumptions on the operator that differ from those used in other approaches. We also discuss accelerated gradient methods with step size control and present a numerical comparison of different step size selection criteria for a parameter identification problem for an elliptic partial differential equation.
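The iteration described in the abstract, successive minimization of quadratic approximations of the Tikhonov functional, amounts to a gradient step on the smooth data-fidelity term followed by soft shrinkage. The following is a minimal sketch, not the paper's method: it assumes a linear forward operator A for illustration (the paper treats nonlinear operators), and uses a Barzilai-Borwein step size as one representative step size rule; the operator, data, and parameters below are hypothetical.

```python
# Sketch of iterated soft shrinkage (proximal gradient) for
#   min_u  0.5 * ||A u - y||^2 + alpha * ||u||_1,
# with a linear A standing in for the paper's nonlinear operator.
import numpy as np

def soft_shrink(x, t):
    """Soft-shrinkage operator S_t(x) = sign(x) * max(|x| - t, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, alpha, u0, n_iter=200):
    """Gradient step on the quadratic part, then soft shrinkage."""
    u = u0.copy()
    s = 1.0 / np.linalg.norm(A, 2) ** 2   # initial step: 1 / ||A||^2
    u_old, grad_old = None, None
    for _ in range(n_iter):
        grad = A.T @ (A @ u - y)          # gradient of the smooth part
        if grad_old is not None:          # Barzilai-Borwein step (assumed rule)
            du, dg = u - u_old, grad - grad_old
            denom = dg @ dg
            if denom > 0:
                s = (du @ dg) / denom
        u_old, grad_old = u, grad
        # Minimizer of the quadratic approximation plus the l1 term:
        u = soft_shrink(u - s * grad, s * alpha)
    return u

# Tiny usage example on a random sparse recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
u_true = np.zeros(100); u_true[[3, 17, 60]] = [1.0, -2.0, 0.5]
y = A @ u_true
u_hat = ista(A, y, alpha=0.1, u0=np.zeros(100))
```

An accelerated variant in the spirit of the abstract would insert an extrapolation (momentum) step between iterations; the step size selection criteria compared in the paper would take the place of the Barzilai-Borwein choice assumed above.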


Key words

nonlinear inverse problems, sparsity constraints, gradient descent, iterated soft shrinkage, accelerated gradient method

AMS subject classifications

65K10, 46N10, 65M32, 90C48

