
Comparisons of parameter choice methods for regularization with discrete noisy data

Lukas, M.A. (1999) Comparisons of parameter choice methods for regularization with discrete noisy data. Inverse Problems, 14 (1), pp. 161-184.

Link to Published Version: http://dx.doi.org/10.1088/0266-5611/14/1/014
*Subscription may be required

Abstract

Several prominent methods have been developed for the crucial selection of the parameter in regularization of linear ill-posed problems with discrete, noisy data. The discrepancy principle (DP), minimum bound (MB) method and generalized cross-validation (GCV) are known to be at least weakly asymptotically optimal with respect to appropriate loss functions as the number n of data points approaches infinity. We compare these methods in three other ways. First, n is taken to be fixed and, using a discrete Picard condition, upper and lower bounds on the 'expected' DP and MB estimates are derived in terms of the optimal parameters with respect to the risk and expected error. Next, we define a simple measure of the variability of a practical estimate and, for each of the five methods, determine its asymptotic behaviour. The results are that the asymptotic stability of GCV is the same as for the unbiased risk method and is superior to that of DP, which is better than for MB and an unbiased error method. Finally, the results of numerical simulations of the five methods demonstrate that the theoretical conclusions hold in practice.
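As a minimal illustration of one of the methods compared above, the following sketch shows GCV applied to ordinary Tikhonov regularization on a small synthetic problem. This is not the paper's setup or code; all names, the candidate grid, and the noise level are hypothetical, and the GCV score used is the standard form n * ||(I - H(lambda)) b||^2 / tr(I - H(lambda))^2 with influence matrix H(lambda) = A (A^T A + lambda I)^{-1} A^T.

```python
import numpy as np

def gcv_score(A, b, lam):
    """Standard GCV score for Tikhonov regularization at parameter lam.

    GCV(lam) = n * ||(I - H) b||^2 / (tr(I - H))^2,
    where H = A (A^T A + lam I)^{-1} A^T is the influence matrix.
    (Illustrative sketch, not the paper's implementation.)
    """
    n, p = A.shape
    H = A @ np.linalg.solve(A.T @ A + lam * np.eye(p), A.T)
    resid = b - H @ b
    return n * float(resid @ resid) / (n - np.trace(H)) ** 2

def choose_lambda_gcv(A, b, lams):
    """Return the candidate lambda minimizing the GCV score."""
    scores = [gcv_score(A, b, lam) for lam in lams]
    return lams[int(np.argmin(scores))]

# Hypothetical synthetic problem: Gaussian design, small additive noise.
rng = np.random.default_rng(0)
n, p = 50, 10
A = rng.standard_normal((n, p))
x_true = rng.standard_normal(p)
b = A @ x_true + 0.1 * rng.standard_normal(n)

lams = np.logspace(-4, 2, 40)   # hypothetical candidate grid
lam_gcv = choose_lambda_gcv(A, b, lams)
print("GCV-selected lambda:", lam_gcv)
```

In practice one would minimize the GCV score with a continuous optimizer rather than a fixed grid, and for large problems compute the trace term via the SVD or a stochastic trace estimator rather than forming H explicitly.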

Publication Type: Journal Article
Murdoch Affiliation: School of Mathematical and Physical Sciences
Publisher: Institute of Physics
Copyright: © 1999 IOP Publishing Ltd
URI: http://researchrepository.murdoch.edu.au/id/eprint/15201