
Profile Monitoring Highlighted in August Issue

1 August 2010
David M. Steinberg, Technometrics Editor

    Statistical process control, first developed by Walter Shewhart some 80 years ago, has proved to be of great value for monitoring industrial processes. Advances in data collection capabilities have generated new challenges for process monitoring and stimulated much recent research in creating appropriate monitoring procedures.

    One interesting direction is profile monitoring, in which the characteristic tracked is actually a relationship (or profile) between two or more variables. For example, an important property of an aluminum electrolytic capacitor (AEC) is its dissipation factor, which is measured as a function of temperature across a rather wide range of temperatures. It is the entire temperature-dissipation factor relationship that is of interest for process monitoring.

    The feature article in the August issue, by Peihua Qiu, Changliang Zou, and Zhaojun Wang, is “Nonparametric Profile Monitoring by Mixed Effects Modeling.” This article extends previous methods for profile monitoring in two important directions. First, the authors use nonparametric regression to model the profile, whereas most methods have been limited to parametric models. Second, the approach accommodates possible correlation among the measurements within each profile (a common feature of actual profile data), whereas existing approaches treat these measurements as statistically independent.

    The authors propose a novel control chart that incorporates local linear kernel smoothing and uses a nonparametric mixed-effects model to represent within-profile correlation. The chart also accounts for serial correlation between profiles via an exponentially weighted moving average scheme. The proposed control chart is fast to compute and convenient to use. Numerical examples show it works well in various cases, including the AEC production process.
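    To make the monitoring idea concrete, here is a minimal Python sketch, not the authors' nonparametric mixed-effects chart: each simulated profile is reduced to an assumed deviation statistic against a fixed in-control reference curve, and the statistics are accumulated with an exponentially weighted moving average and compared against a control limit.

# Minimal EWMA profile-monitoring sketch (illustrative assumptions throughout).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)                      # design points within a profile
reference = np.sin(2 * np.pi * x)                  # assumed in-control mean profile

def profile_statistic(y, ref):
    """Average squared deviation of an observed profile from the reference."""
    return np.mean((y - ref) ** 2)

lam, L = 0.2, 3.0                                  # EWMA weight and control-limit width
stats = []
for t in range(200):
    shift = 0.3 if t >= 150 else 0.0               # a mean shift after profile 150
    y = reference + shift + rng.normal(0.0, 0.1, size=x.size)
    stats.append(profile_statistic(y, reference))

stats = np.asarray(stats)
ewma = np.zeros_like(stats)
ewma[0] = stats[0]
for t in range(1, stats.size):
    ewma[t] = lam * stats[t] + (1 - lam) * ewma[t - 1]

# Control limit estimated from the first 100 (assumed in-control) profiles.
mu0, sd0 = stats[:100].mean(), stats[:100].std(ddof=1)
ucl = mu0 + L * sd0 * np.sqrt(lam / (2 - lam))
signals = np.flatnonzero(ewma > ucl)
print("first out-of-control signal at profile:", signals[0] if signals.size else "none")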

    The article is accompanied by several excellent discussions and a rejoinder by the authors, which help illuminate further areas of application and possible directions for future research. The discussants are Daniel Apley, Hugh Chipman, R. Jock MacKay, Stefan Steiner, Fugee Tsung, William Woodall, Jeffrey Birch, and Pang Du.

    The Technometrics session at JSM featured this article and its discussions.

    Gauge repeatability and reproducibility (R&R) studies are widely used to assess measurement system variation. In their article, “Leveraged Gauge R&R Studies,” Ryan Browne, Jock MacKay, and Stefan Steiner propose an alternative study design that offers improved estimates of the relevant variance ratios. The new plan, called a leveraged gauge R&R study, is conducted in two stages. In the baseline stage, a sample of parts is selected and each part is measured once. The extreme parts are then deliberately selected for the second stage and measured a number of times by each operator. For a fixed number of operators and total number of measurements, the authors show that good performance is obtained from leveraged plans with a baseline size roughly half the total number of measurements. The article demonstrates the advantages of the leveraged plan over the standard plan by comparing the standard deviations of the estimators of the parameters of interest.
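    As a rough illustration of the two-stage design (not the authors' estimators), the following Python sketch simulates a baseline stage, deliberately selects the extreme parts for remeasurement, and forms naive variance estimates from the two stages; all sample sizes, rates, and the absence of operator effects are assumptions.

# Hedged simulation of a leveraged two-stage gauge study (naive estimators only).
import numpy as np

rng = np.random.default_rng(1)
sigma_part, sigma_gauge = 1.0, 0.3                 # true process and measurement SDs
n_baseline, n_extreme, n_ops, n_reps = 30, 6, 3, 2

# Stage 1 (baseline): each sampled part is measured once.
true_parts = rng.normal(0.0, sigma_part, n_baseline)
baseline = true_parts + rng.normal(0.0, sigma_gauge, n_baseline)

# Stage 2: deliberately remeasure the most extreme baseline parts.
half = n_extreme // 2
order = np.argsort(baseline)
chosen = np.concatenate([order[:half], order[-half:]])
stage2 = (true_parts[chosen][:, None, None]
          + rng.normal(0.0, sigma_gauge, (n_extreme, n_ops, n_reps)))

# Naive estimates: gauge variance from within-part spread in stage 2,
# total variance from the single baseline measurements.
gauge_var = stage2.var(axis=(1, 2), ddof=1).mean()
total_var = baseline.var(ddof=1)
part_var = max(total_var - gauge_var, 0.0)
print(f"estimated gauge/total variance ratio: {gauge_var / (gauge_var + part_var):.3f}")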

    The next two articles present methods for computing tolerance limits. The first, by David Hoffman, is titled “One-Sided Tolerance Limits for Balanced and Unbalanced Random Effects Models.” The approach is valid for general random effects models with normal data, in both balanced and unbalanced data scenarios. It exploits an approximation to the noncentral t distribution and modified large sample methods for constructing confidence bounds on functions of variance components. An alternative bootstrap-adjusted limit also is proposed. Simulation results indicate the analytical limit is generally somewhat conservative, but is often less conservative than an existing analytical approach and may provide substantially shorter interval lengths, particularly when the sample size is small and the desired confidence is high. The bootstrap-adjusted limit generally maintains the nominal confidence level and yields shorter interval lengths, but it can be anticonservative for small sample sizes.
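    For orientation, the classical one-sided tolerance limit for a simple i.i.d. normal sample can be computed from the noncentral t distribution, as in the Python sketch below; the random-effects extension studied in the article is considerably more involved and is not reproduced here.

# Classical one-sided upper tolerance limit for an i.i.d. normal sample.
import numpy as np
from scipy import stats

def upper_tolerance_limit(x, content=0.90, confidence=0.95):
    """Limit exceeding at least `content` of the population with `confidence`."""
    n = len(x)
    ncp = stats.norm.ppf(content) * np.sqrt(n)          # noncentrality parameter
    k = stats.nct.ppf(confidence, df=n - 1, nc=ncp) / np.sqrt(n)
    return np.mean(x) + k * np.std(x, ddof=1)

rng = np.random.default_rng(2)
sample = rng.normal(10.0, 2.0, size=25)
print(f"90%/95% upper tolerance limit: {upper_tolerance_limit(sample):.2f}")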

    For a product manufactured in large quantities, tolerance limits play a fundamental role in setting limits on the process capability. The next paper, by Takeshi Emura and Hsiuying Wang, is titled “Approximate Tolerance Limits Under Log-Location-Scale Regression Models in the Presence of Censoring.” Existing methods for setting tolerance limits in life test and reliability experiments focus primarily on one-sample problems. This work extends tolerance limits to life test experiments that include covariates. A method for constructing approximate tolerance limits is proposed for the widely used log-location-scale regression models. The method is based on an application of the large-sample theory of maximum likelihood estimators, modified by a bias-adjustment technique to enhance small-sample accuracy. The proposed approximate tolerance limits are shown to attain the nominal coverage probability asymptotically under the assumption of “independent censoring,” which includes type I and type II schemes. Simulation studies are conducted to assess finite-sample properties.

    The method is illustrated with two data sets. R code for implementing the proposed method is available online.
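    The following Python sketch illustrates the general setting with a censored lognormal (log-location-scale) regression fit by maximum likelihood and a naive plug-in quantile limit; the covariate, the censoring scheme, and the content level are assumptions, and the article's bias adjustment is not reproduced.

# Censored lognormal regression by maximum likelihood, with a plug-in quantile.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(3)
n = 100
x = rng.uniform(0.0, 1.0, n)                       # hypothetical covariate (e.g., stress)
log_t = 2.0 - 1.0 * x + 0.5 * rng.normal(size=n)   # true model: mu = 2 - x, sigma = 0.5
t = np.exp(log_t)
c = rng.uniform(5.0, 20.0, n)                      # independent right-censoring times
obs = np.minimum(t, c)
event = (t <= c).astype(float)                     # 1 = failure observed, 0 = censored

def neg_loglik(theta):
    b0, b1, log_sigma = theta
    sigma = np.exp(log_sigma)
    z = (np.log(obs) - (b0 + b1 * x)) / sigma
    # failures: log density of the standardized log time; censored: log survival
    ll = event * (stats.norm.logpdf(z) - np.log(sigma)) + (1 - event) * stats.norm.logsf(z)
    return -ll.sum()

fit = optimize.minimize(neg_loglik, x0=np.zeros(3), method="Nelder-Mead")
b0, b1, sigma = fit.x[0], fit.x[1], np.exp(fit.x[2])

# Naive plug-in lower limit at covariate x0: the estimated 10th percentile of lifetime.
x0 = 0.5
lower = np.exp(b0 + b1 * x0 + sigma * stats.norm.ppf(0.10))
print(f"plug-in 10th-percentile lifetime at x = {x0}: {lower:.2f}")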

    Javier Cano, Javier M. Moguerza, and David Ríos Insua present “Bayesian Reliability, Availability, and Maintainability Analysis for Hardware Systems Described Through Continuous Time Markov Chains.” Reliability, availability, and maintainability (RAM) modeling is an important aspect of the analysis of hardware systems. Markov models are often useful, especially for systems that evolve through several states, some of which are ON states (in which the system continues to function) and the rest OFF states. This article provides RAM analyses of such systems within a Bayesian framework, addressing both short-term and long-term performance. The approach is illustrated via an analysis of data from a university enterprise resource planner.
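    As a minimal illustration of the underlying Markov machinery (not the article's Bayesian analysis), the Python sketch below computes long-run availability for an assumed two-state ON/OFF system by solving for the stationary distribution of the generator matrix; the failure and repair rates are illustrative.

# Steady-state availability of a CTMC: solve pi Q = 0 with sum(pi) = 1.
import numpy as np

lam, mu = 0.01, 0.5                 # assumed failure and repair rates (per hour)
Q = np.array([[-lam, lam],          # state 0 = ON, state 1 = OFF (under repair)
              [ mu, -mu]])

# Stationary distribution: append the normalization constraint to Q^T pi = 0.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

on_states = [0]
print(f"long-run availability: {pi[on_states].sum():.4f}")   # equals mu / (lam + mu)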

    The generalized Pareto distribution (GPD) has been widely used to model exceedances over thresholds, such as flood levels of rivers. However, it is difficult to obtain good estimates of the parameters of the GPD. In “Improving on Estimation for the Generalized Pareto Distribution,” Jin Zhang extends an earlier Technometrics paper by Zhang and Stephens (2009) that proposed a new estimation method for the parameters of the GPD. That method is free from the theoretical and computational problems suffered by traditional estimation approaches. In terms of estimation efficiency and bias, it outperforms other existing methods in common situations, but it may perform poorly for heavy-tailed distributions. This article develops modifications that substantially improve the method's ability to adapt to heavy-tailed distributions.
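    For comparison purposes only, the Python sketch below fits a GPD to simulated exceedances by plain maximum likelihood using scipy; the Zhang and Zhang-Stephens estimators discussed in the article are not implemented here, and the true parameter values are assumptions.

# Baseline GPD fit by maximum likelihood (location fixed at zero for exceedances).
from scipy import stats

data = stats.genpareto.rvs(c=0.2, scale=1.5, size=500, random_state=42)
shape, loc, scale = stats.genpareto.fit(data, floc=0)
print(f"MLE shape (xi): {shape:.3f}, scale: {scale:.3f}")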

    Multivariate binary data arise in a variety of settings. In “Likelihood Analysis of Multivariate Probit Models Using a Parameter Expanded MCEM Algorithm,” Huiping Xu and Bruce A. Craig propose a practical and efficient computational framework for maximum likelihood estimation of multivariate probit regression models. Their approach uses the Monte Carlo EM (MCEM) algorithm, with parameter expansion to complete the M-step, to avoid direct evaluation of intractable multivariate normal orthant probabilities. The parameter expansion not only enables a closed-form solution in the M-step, but also improves efficiency. Using simulation studies, the authors compare their approach to the MCEM algorithms developed by Chib and Greenberg (1998) and Song and Lee (2005), as well as the iterative approach proposed by Li and Schafer (2008). The new approach is illustrated by application to a study on drivers’ perceptions of headlight glare.
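    The structure of the multivariate probit model is easiest to see from its data-generating mechanism, sketched below in Python with assumed dimensions, covariates, and latent correlation; the MCEM estimation itself is not shown.

# Multivariate probit data generation: binary responses are the signs of
# correlated latent normals whose means depend on covariates.
import numpy as np

rng = np.random.default_rng(5)
n, d = 500, 3                                   # observations and binary outcomes
beta = np.array([[0.5, -1.0],                   # one coefficient row per outcome
                 [0.0,  0.8],
                 [-0.3, 0.4]])
R = np.array([[1.0, 0.5, 0.3],                  # latent correlation matrix
              [0.5, 1.0, 0.4],
              [0.3, 0.4, 1.0]])

X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate
latent = X @ beta.T + rng.multivariate_normal(np.zeros(d), R, size=n)
Y = (latent > 0).astype(int)                    # observed multivariate binary data
print("marginal response rates:", Y.mean(axis=0).round(2))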

    The final article, by Shifeng Xiong, is titled “Some Notes on the Nonnegative Garrote.” The main result is that, compared with other penalized least squares methods, the nonnegative garrote (NG) comes with a natural choice of penalty function, based on an estimator of the prediction risk. Furthermore, two natural and easy-to-compute estimators of the tuning parameter are proposed, corresponding to AIC and BIC, respectively. This indicates that, to select tuning parameters, it may be unnecessary to optimize a model selection criterion multiple times. Several reasonable NG estimators with natural tuning parameters are proposed for settings with multicollinearity and other problems. The good properties of the NG are illustrated by simulation results. The NG also is used to analyze data from a study conducted to determine the composition of acid rain.
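    A small Python sketch of the nonnegative garrote in its penalized form appears below; the simulated data and the fixed tuning parameter are illustrative, and the article's AIC- and BIC-type choices of the tuning parameter are not implemented.

# Nonnegative garrote: shrink OLS coefficients by nonnegative factors c_j that
# minimize squared error plus lambda * sum(c_j).
import numpy as np
from scipy import optimize

rng = np.random.default_rng(6)
n, p = 100, 5
X = rng.normal(size=(n, p))
true_beta = np.array([2.0, 0.0, -1.5, 0.0, 0.0])
y = X @ true_beta + rng.normal(size=n)

beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
Z = X * beta_ols                                  # column j scaled by its OLS estimate
lam = 5.0                                         # assumed tuning parameter

def objective(c):
    resid = y - Z @ c
    return resid @ resid + lam * c.sum()

res = optimize.minimize(objective, x0=np.ones(p), bounds=[(0.0, None)] * p)
beta_ng = res.x * beta_ols
print("garrote shrinkage factors:", res.x.round(3))
print("garrote estimates:        ", beta_ng.round(3))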
