Abstract
Gaussian process (GP) metamodels are widely used as surrogates for computer simulations and physical experiments. The heart of GP modeling lies in optimizing the log-likelihood function with respect to the hyperparameters to fit the model to a set of observations. The complexity of the log-likelihood function, the computational expense of evaluating it, and numerical instabilities all challenge this process, and these issues increasingly limit the applicability of GP models as the size of the training data set and/or the problem dimensionality grows. To address these issues, we develop a novel approach for fitting GP models that significantly reduces computational cost and improves prediction accuracy. Our approach leverages the smoothing effect of the nugget parameter on the log-likelihood profile to track the evolution of the optimal hyperparameter estimates as the nugget is adaptively varied. The new approach is implemented in the R package GPM and compared with a popular GP modeling R package (GPfit) on a set of benchmark problems. Its effectiveness is also demonstrated on an engineering problem: learning the constitutive law of a hyperelastic composite, where the accuracy required in estimating the response gradient necessitates a large training data set.
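The core idea the abstract describes — a larger nugget both regularizes the correlation matrix and smooths the log-likelihood profile, so the optimal hyperparameters can be tracked as the nugget is varied — can be sketched as follows. This is an illustrative reconstruction in Python/NumPy, not the GPM implementation: the squared-exponential correlation, the single length-scale `theta`, the nugget schedule, and the coarse grid search standing in for a real optimizer are all assumptions made for the sketch.

```python
import numpy as np

def neg_log_likelihood(theta, nugget, X, y):
    """Profiled negative log-likelihood of a zero-mean GP with a
    squared-exponential correlation and a nugget on the diagonal."""
    n = X.shape[0]
    # Correlation matrix R_ij = exp(-theta * ||x_i - x_j||^2) + nugget * I.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
    R = np.exp(-theta * d2) + nugget * np.eye(n)
    # Without the nugget, Cholesky can fail here when R is ill-conditioned.
    L = np.linalg.cholesky(R)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    sigma2 = (y @ alpha) / n                      # profiled process variance
    log_det = 2.0 * np.log(np.diag(L)).sum()
    return 0.5 * (n * np.log(sigma2) + log_det)

# Toy data (hypothetical, for illustration only).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(30, 1))
y = np.sin(6.0 * X[:, 0]) + 0.05 * rng.standard_normal(30)

# Track the minimizer of the (smoothed) profile as the nugget shrinks;
# each stage could warm-start the next in a real optimizer.
thetas = np.logspace(-2, 2, 200)
path = []
for nugget in (1e-2, 1e-4, 1e-6):
    nll = [neg_log_likelihood(t, nugget, X, y) for t in thetas]
    path.append(thetas[int(np.argmin(nll))])
```

With a large nugget the profile is smooth and easy to optimize; the resulting estimate then serves as a good starting point once the nugget is reduced toward the value needed for accurate (near-interpolating) prediction.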
Original language | English (US)
---|---
Pages (from-to) | 501-516
Number of pages | 16
Journal | International Journal for Numerical Methods in Engineering
Volume | 114
Issue number | 5
DOIs |
State | Published - May 4 2018
Keywords
- Gaussian process
- computer experiments
- hyperparameter estimation
- ill-conditioned matrix
- nugget parameter
ASJC Scopus subject areas
- Numerical Analysis
- Engineering (all)
- Applied Mathematics