Technometrics Highlights

Computer Experiments Motivate New Approach to Bayesian Computation

1 August 2012
Hugh Chipman, Technometrics Editor

    Simulation-based techniques for Bayesian computation have seen widespread application over the past two decades. Although flexible, they can be time-consuming, especially when the likelihood is expensive to evaluate. Approximation methods, such as variational Bayesian inference, have shown promise but may not provide sufficient accuracy. In “Bayesian Computation Using Design of Experiments-Based Interpolation Technique,” V. Roshan Joseph develops DoIT, a new approximation method for posterior inference that is quick, accurate, and adaptive.

    By leveraging ideas from the design and analysis of computer experiments, a kriging model is fit to the posterior distribution, giving highly accurate approximations in a variety of applications. The techniques are general, easy to implement, and applicable to many complex Bayesian problems. Using sequential design of experiments, the approximation can be further improved through adaptive addition of basis functions.
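The core idea — treating the (unnormalized) posterior as the expensive output of a computer experiment and interpolating it from a handful of design points — can be sketched in a few lines. The following toy example is my own illustration, not the paper's DoIT algorithm: it fits a simple Gaussian-kernel kriging interpolator to nine evaluations of an unnormalized normal density and checks how well it recovers the density between design points.

```python
import numpy as np

def kriging_interpolate(x_design, y_design, x_new, length_scale=0.5):
    """Simple zero-mean kriging (Gaussian-kernel) interpolator.

    Solves for exact interpolation weights at the design points and
    predicts at new locations. A toy stand-in for a computer-experiment
    surrogate, not the paper's DoIT method."""
    def kern(a, b):
        return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * length_scale**2))
    K = kern(x_design, x_design) + 1e-10 * np.eye(len(x_design))  # tiny nugget
    w = np.linalg.solve(K, y_design)
    return kern(x_new, x_design) @ w

# Unnormalized posterior: standard normal density shape
post = lambda x: np.exp(-0.5 * x**2)

x_design = np.linspace(-3, 3, 9)      # a small "computer experiment" design
y_design = post(x_design)             # in practice, expensive evaluations
x_grid = np.linspace(-3, 3, 201)
approx = kriging_interpolate(x_design, y_design, x_grid)

max_err = np.abs(approx - post(x_grid)).max()
print(f"max absolute error with 9 design points: {max_err:.4f}")
```

Even this naive version interpolates the design points exactly and stays close to the true density everywhere else, which is what makes surrogate-based posterior approximation attractive when each likelihood evaluation is costly.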

    The article is accompanied by several excellent discussions and a rejoinder by the author, which explore a number of extensions of the method. The discussants are Björn Bornkamp, Tirthankar Dasgupta, Xiao-Li Meng, Herbert K. H. Lee, John T. Ormerod, M. P. Wand, David M. Steinberg, and Bradley Jones. The directions explored in the discussions suggest DoIT will provide fertile ground for further research into the use of tools from computer experiments for Bayesian computation.

    The remainder of the issue includes articles about reliability, process monitoring, time series, and compliance testing. The last of these topics is studied in “Compliance Testing for Random Effects Models with Joint Acceptance Criteria,” by Crystal D. Linkletter, Pritam Ranjan, C. Devon Lin, Derek R. Bingham, Richard A. Lockhart, Thomas M. Loughin, and William A. Brenneman. For consumer protection, many governments perform random inspections on goods sold by weight or volume to ensure consistency between actual and labeled net contents. Motivated by a problem from a real manufacturing process, the paper provides an approximation for the probability of sample acceptance that is applicable to processes with one or more known sources of variation via a random effects model. The approach also allows assessment of the item-sampling scheme.

    In “An Improved Bayesian Information Criterion for Multiple Change-Point Models,” Alexis Hannart and Philippe Naveau use a meteorological application to motivate the problem of identifying change-points in a time series. In that problem, systematic shifts due to changes in measurement systems must be removed before the series can be used. Prior information is incorporated in a Bayesian analysis, yielding a closed-form, BIC-like expression for identifying the change-points.
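To see the flavor of BIC-based change-point identification, consider a generic (non-Bayesian) version of the criterion: choose the segmentation that minimizes a residual-sum-of-squares term plus a log(n) penalty per parameter. The sketch below is a standard textbook baseline, not the authors' improved criterion, and the simulated shift is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Series with a systematic mean shift at t = 60
# (e.g., a change of measurement system)
y = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(2.0, 1.0, 40)])
n = len(y)

# BIC with no change-point: one mean parameter
rss0 = np.sum((y - y.mean()) ** 2)
bic0 = n * np.log(rss0 / n) + 1 * np.log(n)

# BIC with one change-point: two means plus the change-point location
best = None
for tau in range(5, n - 5):                      # candidate change-points
    rss = (np.sum((y[:tau] - y[:tau].mean()) ** 2)
           + np.sum((y[tau:] - y[tau:].mean()) ** 2))
    bic = n * np.log(rss / n) + 3 * np.log(n)
    if best is None or bic < best[0]:
        best = (bic, tau)
bic1, tau_hat = best

print(f"BIC(no change) = {bic0:.1f}  BIC(one change) = {bic1:.1f}  tau_hat = {tau_hat}")
```

With a genuine shift in the series, the one-change model wins the BIC comparison and the estimated change-point lands near the true location; the paper's contribution is a criterion of this general shape derived in closed form from a Bayesian analysis.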

    The remainder of the issue is devoted to papers involving process monitoring. In “Monitoring Warranty Claims with Cusums,” by Jerald F. Lawless, Martin Crowder, and Ker-Ai Lee, the monitoring of reliability is considered. Using data from warranty claims on North American vehicles, the paper develops practical monitoring methods designed to allow changes in claim rates to be detected in as timely a manner as possible.
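The CUSUM idea behind such monitoring is simple: accumulate evidence that observed counts exceed what the in-control claim rate predicts, and signal when the accumulation crosses a decision interval. Here is a textbook Poisson CUSUM for detecting a rate increase — illustrative only, with made-up rates and threshold, not the scheme developed in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
lam0, lam1 = 2.0, 4.0                      # in-control and target shifted claim rates
k = (lam1 - lam0) / np.log(lam1 / lam0)    # Poisson CUSUM reference value
h = 5.0                                    # decision interval (tuned to a desired run length)

# Weekly claim counts: in control for 30 weeks, then the rate doubles
counts = np.concatenate([rng.poisson(lam0, 30), rng.poisson(lam1, 30)])

s, signal_at = 0.0, None
for t, x in enumerate(counts):
    s = max(0.0, s + x - k)                # accumulate evidence of a rate increase
    if s > h:
        signal_at = t
        break

print("signal at week:", signal_at)
```

The statistic drifts back toward zero while claims arrive at the in-control rate and climbs once the rate shifts, so the chart signals shortly after the change — the "timely detection" that warranty monitoring aims for.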

    In “Posterior Distribution Charts: A Bayesian Approach for Graphically Exploring a Process Mean,” Daniel W. Apley develops a Bayesian approach for monitoring and graphically exploring a process mean and informing decisions related to process adjustment. Observations are represented as a process mean plus a random error term, and the mean can follow any Markov model. This includes a mean that wanders slowly, one that is constant over periods of time with occasional random jumps, or combinations thereof. The exploratory approach is illustrated with an example from automobile body assembly.

    Matthias Tan and Jianjun Shi also develop Bayesian methods for process monitoring in their paper, “A Bayesian Approach for Interpreting Mean Shifts in Multivariate Quality Control.” The focus here is on identification of the important factors that led to an out-of-control signal when monitoring multivariate quality characteristics. An adaptation of Bayesian variable selection methods from linear regression provides both a probabilistic framework for diagnosis and a mechanism for incorporating expert knowledge.

    The issue closes with “Outlier Detection in Functional Observations with Applications to Profile Monitoring,” by Guan Yu, Changliang Zou, and Zhaojun Wang. Monitoring of profile data (also known as functional data) is becoming increasingly common as sensor technology advances. Before such data can be used to establish a profile-monitoring scheme, outliers must be removed. The paper proposes a new testing procedure based on functional principal component analysis. After deriving the appropriate null distributions, the authors demonstrate the test on real data from a manufacturing process.
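As a rough illustration of the principal-component idea — a simplified score-distance check on simulated profiles, not the authors' test procedure or its null distributions — one can project centered profiles onto their leading components and flag the profile with an extreme score:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 50)

# 20 in-control profiles: a common sine shape plus smooth random variation and noise
profiles = np.array([np.sin(2 * np.pi * t)
                     + rng.normal(0, 0.1) * np.cos(np.pi * t)
                     + rng.normal(0, 0.05, t.size)
                     for _ in range(20)])
profiles[7] += 0.8 * t                 # contaminate one profile with a drift outlier

# Functional PCA via the SVD of the centered data matrix
centered = profiles - profiles.mean(axis=0)
U, s_vals, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U[:, :2] * s_vals[:2]         # scores on the first two components

# Flag the profile with the largest standardized squared score distance
dist2 = (scores**2 / scores.var(axis=0, ddof=1)).sum(axis=1)
print("flagged profile:", int(np.argmax(dist2)))
```

The contaminated profile dominates one of the leading components and stands out in score distance; the paper replaces this ad hoc flagging with a formal test whose null distribution is derived.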

