
Experimental Design Featured in February Issue

1 February 2014
Peihua Qiu, Technometrics Editor

    The February 2014 issue of Technometrics features a discussion article titled “Screening Strategies in the Presence of Interactions,” by Danel Draguljić, David C. Woods, Angela M. Dean, Susan M. Lewis, and Anna-Jane E. Vine. The article describes novel strategies for screening designs, which are widely used in the discovery and development of high-quality products and processes. Screening designs are used mainly to identify the active factors that have the greatest effect on the measured response. In practice, certain factors must function efficiently together, so it is vital that a screening strategy can identify active interactions as well as main effects. The authors explore, extend, and compare several screening strategies that allow investigation of interactions, giving insight into how the approaches might work in practice. Strategies that use supersaturated designs and group screening are investigated, together with several methods of shrinkage regression and Bayesian analysis. The article includes discussion by Michael Hamada, Christine Anderson-Cook, William Brenneman, William Li, Ji Zhu, Philip Scinto, Robert Wilkinson, Zhen Wang, and Andrew Rose and a rejoinder by the authors.
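
    As a rough illustration of the shrinkage-regression side of such a comparison, the sketch below simulates a small two-level experiment with more candidate effects than runs and uses the lasso to flag likely active main effects and two-factor interactions. The design, factor names, and active effects are all made up for illustration; this is not the authors' code or data.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

# Hypothetical +/-1 design with 10 factors in only 14 runs: supersaturated
# once two-factor interactions are included in the model.
n_runs, n_factors = 14, 10
X = rng.choice([-1.0, 1.0], size=(n_runs, n_factors))

# Expand to main effects plus all two-factor interactions (10 + 45 = 55 terms).
expand = PolynomialFeatures(degree=2, interaction_only=True, include_bias=False)
Z = expand.fit_transform(X)
labels = expand.get_feature_names_out([f"x{i+1}" for i in range(n_factors)])

# Simulated response: factors 1 and 3 are active, together with their interaction.
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + 2.5 * X[:, 0] * X[:, 2] + rng.normal(0, 0.5, n_runs)

# The lasso (one of the shrinkage methods compared in the paper) selects a sparse
# set of candidate active effects despite there being far more terms than runs.
fit = LassoCV(cv=5).fit(Z, y)
active = [(lab, round(coef, 2)) for lab, coef in zip(labels, fit.coef_) if abs(coef) > 0.1]
print(active)
```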

    In recent years, there has been a surge of interest in using nested space-filling designs for a wide range of applications, including multi-fidelity computer experiments, sequential evaluations, multi-step functional fitting, and linking parameters. In these applications, some factors are often believed to be more important, or to deserve more attention, than others. In the paper titled “Asymmetric Nested Lattice Samples,” Peter Z.G. Qian, Mingyao Ai, Youngdeok Hwang, and Heng Su propose a new class of space-filling designs, called asymmetric nested lattice samples. The new designs can divide different axes at different scales of fineness, and the authors demonstrate that this flexibility is useful in the situations mentioned above, where some factors are more important than others.
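
    The nesting idea can be illustrated with a deliberately simple toy construction: midpoint lattices in which each axis of a coarse design is refined by an odd factor, so that every coarse point reappears in the fine design, with different axes refined to different degrees. This full-grid sketch only shows the nesting and the asymmetry; the designs proposed in the paper are randomized space-filling samples, not full grids.

```python
import numpy as np
from itertools import product

def midpoint_lattice(levels):
    """Full-grid midpoint lattice: axis j is cut into levels[j] equal cells,
    and a point is placed at the centre of every cell."""
    axes = [(np.arange(n) + 0.5) / n for n in levels]
    return np.array(list(product(*axes)))

# Toy nested pair in two factors: the coarse grid uses 3 x 2 cells; the fine
# grid refines the axes by odd factors 3 and 5, so every coarse point is also
# a fine point, and the two axes are divided at different scales of fineness.
coarse = midpoint_lattice([3, 2])
fine = midpoint_lattice([3 * 3, 2 * 5])

is_nested = all(any(np.allclose(c, f) for f in fine) for c in coarse)
print(coarse.shape, fine.shape, "nested:", is_nested)
```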

    The paper titled “A Note on Dominating Fractional Factorial Two-Level Designs with Clear Two-Factor Interactions” by Ulrike Grömping considers the problem of selecting a two-level fractional factorial design that allows estimation of all main effects and some specified two-factor interactions (2fis). The paper concentrates on the “dominating designs” introduced, but not pursued, by Wu, Mee, and Tang (2012, Technometrics, 191–197). The author shows that, when looking for a design in which the required 2fis are clear (i.e., not aliased with any main effects or other 2fis), it suffices to search the complete catalog of dominating designs.

    Recurrent events are commonly seen in applications related to failure, repair, and replacement of industrial components or physical infrastructure. Observed recurrent event data are often interval-censored (i.e., the events are known only to occur within certain intervals). Statistical analysis of interval-censored recurrent event data is discussed in the paper titled “Parametric Estimation for Window Censored Recurrence Data” by Yada Zhu, Emmanuel Yashchin, and J.R.M. Hosking. The authors derive the likelihood function for a model in which the distributions of inter-recurrence intervals in a single path need not be identical and may depend on covariate information. The proposed method is demonstrated in a case study using the water distribution system maintenance records of a major U.S. city.
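
    To make the interval-censoring ingredient concrete, the sketch below maximizes a likelihood in which each inter-recurrence time is known only to lie in an interval, so its contribution is the probability the fitted distribution assigns to that interval. The data and the Weibull assumption are invented for illustration; the paper's window-censoring likelihood is considerably more general, allowing non-identical gap distributions and covariates.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Hypothetical interval-censored gaps: each inter-recurrence time is known
# only to lie between left[i] and right[i].
left = np.array([0.5, 1.0, 2.0, 0.0, 3.0])
right = np.array([1.5, 2.5, 4.0, 1.0, 5.0])

def neg_log_lik(params):
    shape, scale = np.exp(params)          # log scale keeps both parameters positive
    cdf = lambda t: weibull_min.cdf(t, shape, scale=scale)
    prob = np.clip(cdf(right) - cdf(left), 1e-12, None)  # P(gap falls in its interval)
    return -np.sum(np.log(prob))

fit = minimize(neg_log_lik, x0=np.log([1.0, 2.0]), method="Nelder-Mead")
print("Weibull shape, scale:", np.exp(fit.x))
```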

    In “Applying Control Chart Methods to Enhance Data Quality,” L. Allison Jones-Farmer, Jeremy D. Ezell, and Benjamin T. Hazen discuss the important problem of data quality. As Big Data becomes an increasingly prominent topic in the scientific community, data quality is especially important to address. The authors examine the data quality problem systematically and propose statistical process control techniques as viable tools for monitoring and improving data quality. A case study involving an aircraft maintenance data set is discussed in detail.
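
    As a simple, hypothetical example of the idea, the sketch below applies a p-chart to the proportion of records failing a validity check in each daily batch; a batch whose error rate falls outside the control limits is flagged for investigation. The batch sizes and error counts are made up, and the paper discusses a much broader set of charts and data-quality dimensions.

```python
import numpy as np

# Hypothetical daily batches of records: n[i] records arrive on day i, of which
# x[i] fail a validity check (missing or malformed fields).
n = np.array([200, 220, 210, 195, 205, 215, 230, 225, 240, 210])
x = np.array([  6,   5,   9,   4,   7,   6,  25,   8,   7,   5])

p = x / n
p_bar = x.sum() / n.sum()                      # overall nonconforming rate
sigma = np.sqrt(p_bar * (1 - p_bar) / n)       # batch-specific standard error
ucl = p_bar + 3 * sigma                        # 3-sigma control limits
lcl = np.clip(p_bar - 3 * sigma, 0, None)

for i, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
    flag = "OUT OF CONTROL" if (pi > hi or pi < lo) else ""
    print(f"day {i:2d}: p = {pi:.3f}  limits = [{lo:.3f}, {hi:.3f}]  {flag}")
```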

    The paper titled “A Functional Time Warping Approach to Modeling and Monitoring Truncated Degradation Signals” by Rensheng R. Zhou, Nicoleta Serban, and Nagi Gebraeel focuses on statistical analysis of degradation signals. The authors present a flexible modeling framework for characterizing degradation signals that can be observed only up to a pre-specified failure threshold. Within this framework, a novel method is proposed for obtaining real-time predictions of the residual lifetime of engineering components deployed in the field. The proposed method is tested using vibration-based degradation signals from a rotating machinery experiment, as well as simulated degradation signals.
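
    A single-path caricature of the threshold-crossing idea is given below: fit a simple trend to the portion of a degradation signal observed so far and extrapolate to the failure threshold to obtain a residual-life estimate. The signal, threshold, and straight-line trend are all assumptions made for illustration; the paper's functional time-warping framework models whole collections of truncated signals far more flexibly than this.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical partially observed degradation signal, sampled hourly, with a
# pre-specified failure threshold of 10 units (all values made up).
t = np.arange(0, 60.0)                                 # hours observed so far
signal = 1.0 + 0.12 * t + rng.normal(0, 0.2, t.size)   # degradation path
threshold = 10.0

slope, intercept = np.polyfit(t, signal, 1)            # straight-line trend
t_fail = (threshold - intercept) / slope               # predicted threshold crossing
print(f"predicted failure at t = {t_fail:.1f} h, "
      f"residual life = {t_fail - t[-1]:.1f} h")
```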

    In the aerospace industry, failing to detect defects in engine or airframe components can lead to disaster. In such applications, nondestructive evaluation is widely used to detect defects or flaws. The standard statistical method for analyzing the resulting data is a simple linear regression between the signal response variable and explanatory variable(s) such as defect size. For some applications, this simple empirical approach is inadequate. An important alternative is to use knowledge of the physics of the inspection process to provide information about the underlying relationship between the response and the explanatory variable(s). In the paper titled “Physical Model Assisted Probability of Detection of Flaws in Titanium Forgings Using Ultrasonic Nondestructive Evaluation,” Ming Li, William Q. Meeker, and R. Bruce Thompson describe a set of physical model-assisted analyses to study the capability of two ultrasonic testing inspection methods to detect synthetic hard alpha inclusion defects in titanium forging disks.
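
    For readers unfamiliar with the standard empirical setup the authors improve upon, the sketch below fits the usual log-log regression of signal amplitude on defect size and converts it into a probability-of-detection curve by asking how often the predicted signal exceeds a fixed decision threshold. All numbers (defect sizes, threshold, noise level) are fabricated for illustration and are unrelated to the titanium forging study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical signal-response-versus-defect-size data on the log scale.
a = np.exp(rng.uniform(np.log(0.2), np.log(2.0), 60))            # defect size
ahat = np.exp(0.8 + 1.1 * np.log(a) + rng.normal(0, 0.3, 60))    # signal response

res = stats.linregress(np.log(a), np.log(ahat))
resid_sd = np.std(np.log(ahat) - (res.intercept + res.slope * np.log(a)), ddof=2)

# Probability of detection: chance that the signal for a defect of a given size
# exceeds a fixed decision threshold (1.5, an arbitrary choice for illustration).
threshold = np.log(1.5)
def pod(size):
    mean = res.intercept + res.slope * np.log(size)
    return stats.norm.sf(threshold, loc=mean, scale=resid_sd)

for size in (0.3, 0.6, 1.0, 1.5):
    print(f"defect size {size:.2f}: estimated POD = {pod(size):.3f}")
```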

    Robust estimation is the theme of the next two papers. One major motivation for developing robust statistical methods is to diminish the impact of outliers on the resulting estimators. In the paper titled “Robust Estimators of the Generalized Loggamma Distribution,” Claudio Agostinelli, Alfio Marazzi, and Victor J. Yohai propose two families of robust estimators of the three parameters of the generalized loggamma distribution. One nice feature of the proposed estimation procedures is that they can be applied to other three-parameter distribution families whose parameters characterize the location, scale, and shape of the distribution, such as the three-parameter log-Weibull family.

    In another article, titled “Robust Constrained Clustering in Presence of Entry-Wise Outliers,” Alessio Farcomeni proposes a robust heteroscedastic model-based clustering method for settings where outliers arise component-wise. One major idea in the proposed method is the use of snipping, which discards selected entries (dimensions) of an observation and uses the remaining entries for estimation. An expectation-maximization–type algorithm is developed for inference, and its convergence properties are studied.

    Motivated by an example of high-dimensional HIV-1 drug resistance data, the paper titled “The Cluster Elastic Net for High-Dimensional Regression with Unknown Variable Grouping” by Daniela M. Witten, Ali Shojaie, and Fan Zhang proposes the cluster elastic net, which applies less shrinkage to the coefficients of clusters of highly correlated features that are associated with the response. Instead of assuming the clusters are known a priori, the cluster elastic net infers clusters of features from the data, based on the correlation among the variables and their association with the response.
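
    The sketch below fits an ordinary elastic net to data containing one cluster of highly correlated active features, which is the natural baseline for the paper's proposal; the cluster elastic net itself is not implemented in scikit-learn and is not shown here. It goes further than this baseline by inferring the clusters from the data and sharing shrinkage within each inferred cluster. The data-generating setup below is entirely hypothetical.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(2)

# Hypothetical data: columns 0-4 form a highly correlated cluster that drives
# the response; the remaining columns are noise features.
n, p = 100, 30
z = rng.normal(size=n)
X = rng.normal(size=(n, p))
X[:, :5] = z[:, None] + 0.1 * rng.normal(size=(n, 5))   # correlated cluster
y = X[:, :5].sum(axis=1) + rng.normal(0, 1, n)

# Ordinary elastic net as a point of comparison.
fit = ElasticNetCV(l1_ratio=0.5, cv=5).fit(X, y)
print("cluster coefficients:", np.round(fit.coef_[:5], 2))
print("noise coefficients  :", np.round(fit.coef_[5:10], 2))
```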

