Multistep approximation algorithms: Improved convergence rates through postconditioning with smoothing kernels

Gregory E. Fasshauer*, Joseph W. Jerome

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

19 Scopus citations

Abstract

We show how certain widely used multistep approximation algorithms can be interpreted as instances of an approximate Newton method. In an earlier paper, the second author showed that the convergence rates of approximate Newton methods (in the context of the numerical solution of PDEs) suffer from a "loss of derivatives", and that the resulting linear rate of convergence can be improved to superlinear by adapting Nash-Moser iteration for numerical analysis purposes; the essence of the adaptation is a splitting of the inversion and the smoothing into two separate steps. We show how these ideas apply to scattered data approximation as well as to the numerical solution of partial differential equations. We investigate the use of several radial kernels for the smoothing operation. In our numerical examples we use radial basis functions in the inversion step as well.

Original language: English (US)
Pages (from-to): 1-27
Number of pages: 27
Journal: Advances in Computational Mathematics
Volume: 10
Issue number: 1
DOIs
State: Published - 1999

ASJC Scopus subject areas

  • Computational Mathematics
  • Applied Mathematics

