Poster List

Constructing Cascading Latin Hypercubes


CONFERENCE: Design and Analysis of Experiments (DAE) - 2007, Memphis

ABSTRACT: Computer experiments commonly use space-filling designs. As the number of factors increases, the design points become increasingly sparse. Space-filling designs place all the points roughly the same (quite large) distance apart, so when the spatial correlation length is small relative to this spacing, no points lie close enough together to give reliable estimates of the correlation parameters. Handcock (1991) introduced cascading Latin hypercube designs (CLHDs) to alleviate this issue. We develop systematic methods for constructing a rich class of CLHDs.

Click here to see a pdf version
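
To convey the flavor of the construction, here is a minimal sketch of a cascading design: a coarse Latin hypercube whose points are each replaced by a small local Latin hypercube, so the design is space-filling globally while still containing close pairs of points for estimating correlation parameters. The cluster size `scale` and the use of random (rather than systematically constructed) hypercubes are illustrative assumptions, not the poster's method.

```python
import numpy as np

def lhd(n, d, rng):
    """Random Latin hypercube: n points in [0, 1)^d, one point per
    cell in every one-dimensional projection."""
    cells = np.array([rng.permutation(n) for _ in range(d)]).T
    return (cells + rng.random((n, d))) / n

def cascading_lhd(k, m, d, scale=0.05, rng=None):
    """Illustrative cascading LHD: a coarse k-point LHD with a small
    m-point LHD nested in a box of side `scale` around each point."""
    rng = np.random.default_rng(rng)
    coarse = lhd(k, d, rng)
    clusters = []
    for center in coarse:
        local = lhd(m, d, rng)                # local LHD in [0, 1)^d
        pts = center + scale * (local - 0.5)  # shrink and recenter
        clusters.append(np.clip(pts, 0.0, 1.0))
    return np.vstack(clusters)                # (k * m, d) design

design = cascading_lhd(k=10, m=5, d=3, rng=1)
print(design.shape)  # (50, 3): space-filling globally, clustered locally
```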

A New Algorithm for Constructing Orthogonal and Nearly-Orthogonal Arrays


CONFERENCE: Design and Analysis of Experiments (DAE) - 2007, Memphis

ABSTRACT: Orthogonal arrays are frequently used in industrial experiments for quality and productivity improvement. Because of run-size constraints and the required level combinations, an orthogonal array may not exist, in which case a nearly-orthogonal array can be used instead. Both kinds of array can be difficult to find. This poster introduces a new algorithm for constructing orthogonal and nearly-orthogonal arrays with desirable statistical properties and compares it to a pre-existing algorithm.

Click here to see a pdf version
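
Since the abstract does not describe the algorithm itself, the sketch below shows only how near-orthogonality is commonly scored: the J2 criterion of Xu (2002), which orthogonal arrays minimize, so smaller values mean closer to orthogonal. The small example array and the choice of column weights are assumptions for illustration.

```python
import numpy as np
from itertools import combinations

def j2(design, weights=None):
    """J2 criterion (Xu, 2002): the sum over row pairs of the squared,
    weighted count of coinciding entries. With each column's weight set
    to its number of levels, an orthogonal array attains the minimum,
    so J2 doubles as a measure of near-orthogonality."""
    n, m = design.shape
    w = np.ones(m) if weights is None else np.asarray(weights)
    return sum(np.sum(w * (design[i] == design[j])) ** 2
               for i, j in combinations(range(n), 2))

# A 4-run, 3-factor, 2-level orthogonal array (half fraction of 2^3).
oa = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])
print(j2(oa, weights=[2, 2, 2]))  # 24.0; an OA attains the J2 lower bound
```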

Design and Analysis on Non-Convex Regions: Optimal Experiments for Spatial Process Prediction with Application to Industrial Processes


CONFERENCE: Joint Statistical Meetings (JSM) - 2007, Salt Lake City

ABSTRACT: Modeling a response over a non-convex design region is common in many industrial problems. Our research develops design and analysis methodology for experiments on non-convex regions. The approach uses a Gaussian Process (GP) model, with a geodesic distance metric, as the regression function. A novel use of Multidimensional Scaling (MDS) enables us to perform design and analysis on such regions.

Click here to see a pdf version
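
As a rough illustration of the two ingredients named above, the sketch below approximates geodesic distances on a non-convex (U-shaped) region by shortest paths on a neighbor graph, then uses classical MDS to embed the points so that Euclidean distances mimic the geodesic ones; a GP could then be designed for and fit in the embedded coordinates. The region shape, grid resolution, and two-dimensional embedding are assumptions for illustration, not the poster's specific methodology.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import cdist

# Grid points inside a non-convex (U-shaped) region of the unit square.
xx, yy = np.meshgrid(np.linspace(0, 1, 25), np.linspace(0, 1, 25))
pts = np.column_stack([xx.ravel(), yy.ravel()])
keep = ~((pts[:, 0] > 0.3) & (pts[:, 0] < 0.7) & (pts[:, 1] > 0.3))
pts = pts[keep]

# Approximate geodesic distances: connect neighboring grid points, then
# take graph shortest paths so every path stays inside the region.
eucl = cdist(pts, pts)
radius = 1.5 / 24                            # ~1.5 grid spacings
graph = np.where(eucl <= radius, eucl, 0.0)  # 0 = no edge for csgraph
geo = shortest_path(graph, method="D", directed=False)

# Classical MDS: embed the points so that Euclidean distance in the
# embedding approximates geodesic distance in the region.
n = len(pts)
J = np.eye(n) - np.ones((n, n)) / n          # double-centering matrix
B = -0.5 * J @ (geo ** 2) @ J
vals, vecs = np.linalg.eigh(B)
top = np.argsort(vals)[::-1][:2]             # 2-D embedding (an assumption)
embed = vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))
print(embed.shape)  # a GP can now be fit in the `embed` coordinates
```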

Variable Selection for Gaussian Process Models in Computer Experiments


CONFERENCE: Statistical Society of Canada Annual Meeting - 2007, St. John's

ABSTRACT: In many situations, simulation of complex phenomena requires a large number of inputs and is computationally expensive. Identifying the inputs that most impact the system, so that these factors can be investigated further, can be a critical step in the scientific endeavor. In computer experiments, it is common to use a Gaussian spatial process to model the output of the simulator. In this work we introduce a new, simple method for identifying active factors in computer screening experiments. The approach is Bayesian and requires only the generation of a new inert variable in the analysis; however, in the spirit of frequentist hypothesis testing, the posterior distribution of the inert factor is used as a reference distribution against which the importance of the experimental factors can be assessed. The methodology is demonstrated on an application in materials science, a computer experiment from the literature, and simulated examples.

Click here to see a PowerPoint version
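
The sketch below imitates the inert-variable idea in a simplified, non-Bayesian form: an inert input is appended to the design, an anisotropic GP is fit by maximum likelihood (scikit-learn here, rather than the poster's fully Bayesian analysis), and each input's fitted sensitivity is compared against the inert input's as a reference. The toy simulator and all settings are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
n, d = 40, 5
X = rng.random((n, d))
# Only the first two inputs are active in this toy simulator.
y = np.sin(6 * X[:, 0]) + 2.0 * X[:, 1] ** 2

# Append one "inert" input that cannot possibly affect the output.
X_aug = np.column_stack([X, rng.random(n)])

# Anisotropic GP, one lengthscale per input, fit by maximum likelihood.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=np.ones(d + 1)),
                              alpha=1e-6, normalize_y=True,
                              n_restarts_optimizer=5).fit(X_aug, y)

# Inputs whose fitted sensitivity (inverse lengthscale) clearly exceeds
# the inert input's are candidates for "active" factors.
sens = 1.0 / gp.kernel_.length_scale
print("inputs:", np.round(sens[:-1], 3))
print("inert :", np.round(sens[-1], 3))
```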

Modeling of Fine-Scale Features in an Up-Scaled PDE Simulation via Stochastic Closure


ABSTRACT: Computer models can often be run at different degrees of resolution. Consider a simple ocean basin model. The fine-scale simulation (100 x 200 grid) matches “reality” well but is often too computationally demanding to use in practice. The coarse-scale simulation (25 x 50 grid) can be used in large-scale studies but, because of its limited resolution, misses some important fine-scale features. This work aims to develop new methodology for estimating fine-scale realizations from the coarse-scale simulator. The connection between the two scales is established via parallel training runs. Our solution involves a convolution of the coarse-scale computer model with white noise, together with a linear model to up-scale the variability of the coarse-scale model.

Click here to see a PowerPoint version
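
A loose sketch of the closure idea, under stated assumptions: a stand-in coarse field is interpolated to the fine grid, and smoothed (kernel-convolved) white noise, with an amplitude set by a simple linear model, supplies the missing fine-scale variability. In the actual work the coefficients would come from the parallel coarse/fine training runs; here `a` and `b` are placeholders.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

rng = np.random.default_rng(0)

# Stand-in for coarse-scale solver output on the 25 x 50 grid.
coarse = gaussian_filter(rng.standard_normal((25, 50)), sigma=4)

# Interpolate to the fine 100 x 200 grid (factor of 4 each way).
upscaled = zoom(coarse, 4, order=3)

# Closure sketch: smoothed (kernel-convolved) white noise supplies the
# missing fine-scale variability; a linear model in the up-scaled field
# sets its local amplitude. a and b are placeholder coefficients.
a, b = 0.05, 0.10
noise = gaussian_filter(rng.standard_normal(upscaled.shape), sigma=1.5)
emulated_fine = upscaled + (a + b * np.abs(upscaled)) * noise
print(emulated_fine.shape)  # (100, 200)
```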

Latin Hyper-Rectangles for Integration in Computer Experiments


CONFERENCE: Design and Analysis of Experiments (DAE) - 2003, Chicago

ABSTRACT: Latin hypercube sampling is a popular method for evaluating the expectation of functions in computer experiments. However, when the expectation of interest is taken with respect to a non-uniform distribution, the usual transformation to the probability space can cause relatively smooth functions to become extremely variable in areas of low probability. Consequently, the equal-probability cells inherent in hypercube methods tend to allocate too few points to these areas. Here, we introduce Latin hyper-rectangle sampling to address this problem. Latin hyper-rectangle sampling is a generalization of Latin hypercube sampling that allows for unequal cell probabilities.

Click here to see a pdf version
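
To make the contrast concrete, the sketch below compares standard Latin hypercube sampling (equal-probability cells via the inverse CDF) with a hyper-rectangle-style variant that uses equal-width cells and importance weights to correct for their unequal probabilities. The weighting scheme and truncation range are a plausible stand-in chosen for illustration, not necessarily the estimator developed in the poster.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, d = 200, 2
f = stats.norm()                              # target N(0,1) marginals
g = lambda x: np.exp(-np.abs(x).sum(axis=1))  # integrand of interest

def latin_cells(n, d, rng):
    """One random cell index per point in each dimension."""
    return np.array([rng.permutation(n) for _ in range(d)]).T

# Standard LHS: equal-probability cells mapped through the inverse CDF.
u = (latin_cells(n, d, rng) + rng.random((n, d))) / n
est_lhs = g(f.ppf(u)).mean()

# Hyper-rectangle-style variant: equal-WIDTH cells on [-4, 4] in each
# dimension (so cell probabilities are unequal), one point per cell per
# dimension, with importance weights correcting the estimate.
edges = np.linspace(-4.0, 4.0, n + 1)
width = edges[1] - edges[0]
x = edges[latin_cells(n, d, rng)] + width * rng.random((n, d))
q = (1.0 / (n * width)) ** d                  # sampling density of each point
w = f.pdf(x).prod(axis=1) / q                 # importance weights
est_rect = (w * g(x)).mean()

print(est_lhs, est_rect)  # both estimate E[g(X)] for independent N(0,1)s
```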