The rapid growth in computing power has made the computational simulation of complex systems feasible, helping researchers avoid physical experimentation that would otherwise be too time-consuming, costly, or even impossible. The first computer experiment appears to have been conducted by a research team headed by Enrico Fermi at Los Alamos National Laboratory in 1953. Since then, scientists in areas as diverse as cosmology, weather modeling, and particle physics have turned to computer simulation of complex systems as a way to learn about their respective processes.

For the person building a computer simulator, it is tempting to model the system under study more and more precisely as more computing power becomes available. Consequently, in many situations the dimensionality of the input to the computer code can be very large. In others, simulating the complex phenomena is computationally expensive, necessitating new design-of-experiments approaches. In still other applications, the output of the simulator may be univariate, multivariate, or functional, and is often a very complex function of the inputs. Much of the pressing need for new methodology related to the design and analysis of computer experiments can be classified into one or more of the following research problems:
  • Model calibration – combining field and computer model data to build better predictive models and estimate unknown physical constants
  • Response surface estimation – using statistical methods to build surrogate models for complex computer simulators that are often difficult to run
  • Computer model integration – estimating characteristics of the computer model output (e.g., means, medians) by integrating the computer model over statistical distributions of the inputs
  • Experiment design – optimally choosing the computer model trials
  • Numerical analysis – extracting features of the computer code, such as global or local optima, or solving inverse problems.
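As a minimal sketch of the surrogate-modeling and design ideas above, a Gaussian-process interpolator can be fit to a handful of runs of a toy one-dimensional simulator placed on a Latin hypercube design. Everything here is an invented illustration: the simulator, the design size, and the correlation parameter are assumptions, not anything from the research described on this page.

```python
import numpy as np

# Toy stand-in for an expensive simulator (hypothetical; any
# deterministic code mapping inputs to outputs plays this role).
def simulator(x):
    return np.sin(2 * np.pi * x) + 0.5 * x

rng = np.random.default_rng(0)

# One-dimensional Latin hypercube design: one run per stratum of [0, 1].
n = 8
X = np.sort((np.arange(n) + rng.uniform(size=n)) / n)
y = simulator(X)

# Gaussian (squared-exponential) correlation function; theta controls
# how quickly correlation decays with distance (value chosen ad hoc).
def corr(a, b, theta=50.0):
    return np.exp(-theta * (a[:, None] - b[None, :]) ** 2)

# Kriging-style predictor: interpolates the simulator runs (up to a
# tiny nugget added to the correlation matrix for numerical stability).
R = corr(X, X) + 1e-8 * np.eye(n)
alpha = np.linalg.solve(R, y)

def predict(xnew):
    return corr(np.asarray(xnew), X) @ alpha

print(predict(np.linspace(0.0, 1.0, 5)))
```

The surrogate can then be evaluated many thousands of times at negligible cost in place of the simulator itself, which is what makes the calibration, integration, and optimization problems above tractable.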
Our research aims to work with practitioners to develop new methods for exploring computer simulators and quantifying uncertainty.
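The integration problem in particular can be made concrete with a short Monte Carlo sketch: the mean of the model output over an input distribution is estimated by averaging the model over draws from that distribution. The quadratic-plus-sine "model" and the standard normal input distribution are assumptions chosen purely for illustration.

```python
import numpy as np

# Hypothetical cheap stand-in for a computer model; for X ~ N(0, 1),
# E[X**2 + sin(X)] = 1 exactly, which the estimate should approach.
def model(x):
    return x ** 2 + np.sin(x)

rng = np.random.default_rng(1)

# Monte Carlo integration: draw inputs from the assumed input
# distribution and average the corresponding model outputs.
x = rng.normal(size=200_000)
estimate = model(x).mean()
print(round(estimate, 2))
```

In practice the expensive simulator would be replaced by a fitted surrogate before averaging, since direct sampling at this scale is rarely affordable.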

Recent Articles and Working Papers

  • Nagy, B., Loeppky, J., and Welch, W. J. (2007), “Fast Bayesian Inference for Gaussian Process Models”, working paper.
  • Linkletter, C. and Sitter, R. R. (2007), “Latent Socio-Spatial Process Model for Social Networks”, working paper.
  • Lu, W., Ranjan, P., Bingham, D., Reese, S., and Williams, B. (2007). “Optimal Experiment Design for Model Calibration”, working paper.
  • Nagy, B., Loeppky, J., and Welch, W. J. (2007), “Correlation Parametrization in Random Function Models to Improve the Normal Approximation of the Likelihood or Posterior”, working paper.
  • Linkletter, C., Bingham, D., Hengartner, N., Higdon, D., and Ye, K. (2006), “Variable Selection for Gaussian Process Models in Computer Experiments”, Technometrics, 48, 478-490.
  • Mease, D. and Bingham, D. (2006), “Latin Hyper-Rectangle Sampling for Computer Experiments”, Technometrics, 48, 467-477.
  • Ranjan, P., Bingham, D., and Michailidis, G. (2006), “Sequential Experiment Design for Contour Estimation from Complex Computer Codes”, to appear in Technometrics.
  • Loeppky, J. L., Bingham, D., and Welch, W. J. (2006), “Issues in Model Calibration”, submitted.
  • Bingham, D., Sitter, R. R., and Tang, B. (2006), “Orthogonal and Nearly-Orthogonal Designs for Computer Experiments”, submitted.