
Technometrics Highlights: Regression Diagnostics, Control Charts, and Bayesian Methods Featured in August Issue

1 August 2014
Peihua Qiu, Technometrics Editor

Linear mixed models are effective in many applications, but standard estimation methods for them are sensitive to outlying observations. Such influential observations can completely distort an analysis and lead to inappropriate conclusions. In the paper titled “Case-Deletion Diagnostics for Linear Mixed Models,” Jianxin Pan, Yu Fei, and Peter Foster propose a case-deletion diagnostic method for identifying influential observations in linear mixed modeling. Their models are broad in the sense that arbitrary covariance structures can be specified for the random effects and random errors. The performance of the proposed diagnostics is evaluated through extensive simulation studies. In the paper titled “Point-Wise and Simultaneous Tolerance Limits Under Logistic Regression,” Zachary Zimmer, DoHwan Park, and Thomas Mathew propose methodologies for constructing both point-wise and simultaneous upper tolerance limits for the binomial distribution when a logistic regression model describes the dependence of the success probability on covariates. Their methods rest on the observation that the required upper tolerance limits can be obtained from upper confidence limits for the success probability.
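The observation driving the second paper, that an upper tolerance limit can be read off from an upper confidence limit for the success probability, can be sketched in a few lines. The coefficient values, the standard error, and the normal approximation on the logit scale below are all illustrative assumptions, not the authors' exact construction:

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def upper_tolerance_limit(x, beta0, beta1, se_logit, n, content=0.90, z=1.645):
    """Upper tolerance limit for a Binomial(n, p(x)) response with roughly
    90% content at roughly 95% confidence (z = 1.645 is the normal quantile)."""
    # Upper confidence limit for p(x), computed on the logit scale and
    # back-transformed; the logistic model gives logit p(x) = beta0 + beta1*x.
    eta_u = beta0 + beta1 * x + z * se_logit
    p_u = 1.0 / (1.0 + math.exp(-eta_u))
    # The tolerance limit is the smallest k whose CDF at p_u covers the content.
    k = 0
    while binom_cdf(k, n, p_u) < content:
        k += 1
    return k

k = upper_tolerance_limit(x=1.5, beta0=-2.0, beta1=0.8, se_logit=0.25, n=50)
```

A larger standard error inflates the upper confidence limit for the success probability and hence the tolerance limit, which is the intended conservatism.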

In the field of textured image analysis, researchers are typically interested in three problems: texture discrimination, classification, and segmentation. To this end, efficient modeling of the second-order properties of a textured image, including its stationarity, is important. In the literature, many methods have been proposed for modeling the structure of textured images when the image intensity process is either stationary or nonstationary. In practice, however, we usually do not know whether a given image is stationary, so it is difficult to tell which existing method should be adopted for a given set of images. In the paper titled “A Test of Stationarity for Textured Images,” Sarah L. Taylor, Idris A. Eckley, and Matthew A. Nunes develop a test of stationarity for random fields defined on a regular lattice, motivated by a problem arising from texture analysis. Their approach is founded on the locally stationary two-dimensional wavelet process model for lattice processes. The approach is illustrated with pilled fabric data, where it is shown to identify visually subtle departures from stationarity. In the paper titled “The Inverse Gaussian Process as a Degradation Model,” Zhi-Sheng Ye and Nan Chen systematically investigate the application of inverse Gaussian (IG) processes in degradation modeling. An IG process is shown to be a limiting compound Poisson process, which gives it a meaningful physical interpretation in modeling product degradation under a random environment. By treating the IG process as the first passage process of a Wiener process, the authors propose two approaches to incorporating random effects. These random effects models are analytically tractable and can be efficiently estimated from degradation data. With these attractive properties, the class of IG processes greatly enriches the family of stochastic degradation models.
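To give a feel for the IG process as a degradation model, here is a minimal simulation sketch assuming a linear mean function, so the increment over [s, t] is IG(mu·(t−s), lam·(t−s)²), and using the standard Michael-Schucany-Haas generator for IG variates. The parameter values are hypothetical and this is not the authors' random effects formulation:

```python
import math
import random

def rand_ig(mu, lam, rng):
    """One Inverse Gaussian(mu, lam) draw (Michael-Schucany-Haas algorithm)."""
    y = rng.gauss(0.0, 1.0) ** 2
    x = (mu + mu * mu * y / (2 * lam)
         - (mu / (2 * lam)) * math.sqrt(4 * mu * lam * y + (mu * y) ** 2))
    return x if rng.random() <= mu / (mu + x) else mu * mu / x

def ig_degradation_path(mu, lam, times, rng=None):
    """Cumulative degradation from independent IG increments. Every increment
    is positive, so the path is monotone increasing, a natural property for
    wear-type degradation."""
    rng = rng or random.Random(0)
    level, path, prev = 0.0, [], 0.0
    for t in times:
        dt = t - prev
        level += rand_ig(mu * dt, lam * dt * dt, rng)
        path.append(level)
        prev = t
    return path

path = ig_degradation_path(mu=1.0, lam=2.0, times=[1.0, 2.0, 3.0, 4.0, 5.0])
```

Monotonicity is the key contrast with a Wiener-process degradation model, whose sample paths can decrease.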

Exploring multivariate data in a nonparametric setting is an important and challenging topic in many disciplines of research. Estimating an unknown multivariate function becomes difficult as the dimension increases, due to the curse of dimensionality. In the paper titled “A Piecewise Single-Index Model for Dimension Reduction,” Tianhao Wang and Yingcun Xia propose a piecewise single-index model as a new dimension reduction approach to improve the estimation efficiency of nonparametric regression modeling. Under this approach, the sample space is first partitioned adaptively into several regions, and different single-index models are then estimated from the observations in each region. Numerical studies show the proposed approach is capable of accommodating complicated data structures and making accurate predictions.

The next two papers discuss two interesting problems in statistical process control (SPC). In recent years, control charts with variable sampling rate (VSR) schemes have attracted considerable attention from statisticians. Under a VSR scheme, the sampling rate can change over time, depending on the current and prior sampling results. In the paper titled “Statistical Process Control Using Dynamic Sampling Scheme,” Zhonghua Li and Peihua Qiu propose a continuously variable sampling scheme based on a quantitative measure of the likelihood of a process distributional shift at each observation time point. The resulting CUSUM chart with a variable sampling scheme is then combined with an adaptive procedure for determining its reference value, and the chart is shown to be effective in detecting a wide range of unknown shifts. In the article titled “Time-Between-Event Control Charts for Sampling Inspection,” Liang Qu, Zhang Wu, Michael B. C. Khoo, and Abdur Rahim discuss time between events (TBE) charts, or T charts. Currently, almost all studies on T charts focus on applications under 100% inspection. However, due to limitations in resources and working conditions, sampling inspection has to be adopted in many SPC applications. This paper studies four T charts under sampling inspection.
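The idea behind a variable sampling scheme can be illustrated with a toy one-sided CUSUM in which the next sampling interval shrinks as the chart statistic grows, so the process is sampled more often when there is more evidence of a shift. The thresholds and the linear interval rule here are illustrative assumptions, not the authors' procedure:

```python
def cusum_dynamic_sampling(obs, mu0=0.0, sigma=1.0, k=0.5, h=4.0,
                           d_max=5.0, d_min=0.5):
    """One-sided CUSUM whose next sampling interval shrinks as the chart
    statistic grows. Returns (statistics, next_intervals, signal_index or None)."""
    c, stats, intervals = 0.0, [], []
    for i, x in enumerate(obs):
        c = max(0.0, c + (x - mu0) / sigma - k)   # standard CUSUM update
        stats.append(c)
        # Larger statistic -> more evidence of a shift -> sample sooner,
        # down to a floor of d_min time units between samples.
        intervals.append(max(d_min, d_max * (1.0 - c / h)))
        if c > h:
            return stats, intervals, i            # out-of-control signal
    return stats, intervals, None

stats, intervals, signal = cusum_dynamic_sampling([0.0, 0.0, 3.0, 3.0, 3.0])
```

In this toy run the chart samples at the slowest rate while the process looks in control, tightens the interval as soon as the shifted observations arrive, and then signals.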

Traffic monitoring and forecasting is a challenging problem that has received much attention. GPS-enabled smartphones, which can generate high-quality position and speed data, provide new opportunities for highway traffic monitoring and forecasting. Motivated by this kind of problem, in the paper titled “Modeling Conditional Distributions for Functional Responses, with Application to Traffic Monitoring via GPS-Enabled Mobile Phones,” Kehui Chen and Hans-Georg Müller propose a novel approach to functional regression modeling in which the entire distribution of the functional responses is modeled conditionally on predictors.

The next three papers propose Bayesian methods for handling three different problems. In the paper titled “Partition-Based Priors and Multiple Event Censoring: An Analysis of Rosen’s Fibrous Composite Experiment,” John Grego, Shuang Li, James Lynch, and Jayaram Sethuraman present a Bayesian analysis of the component strength data obtained from Rosen’s experiments. One notable feature of these data is the multiple event censoring that occurs when the censoring mechanism depends on preceding observations. This arises naturally in load-sharing systems: component failure under increased load can initiate a series of component failures due to load transfer from the failed components, causing the subsequent component strengths to be interval censored. The paper titled “Bayesian Uncertainty Quantification for Subsurface Inversion Using a Multiscale Hierarchical Model” by Anirban Mondal, Bani Mallick, Yalchin Efendiev, and Akhil Datta-Gupta considers a Bayesian approach to nonlinear inverse problems in which the unknown quantity is a random field. The authors show that this inverse problem is well-posed under their Bayesian framework. They also develop a two-stage reversible jump MCMC algorithm to handle the computational challenges. Recurrent event data for repairable systems have been studied widely in the literature. Existing methods often rely on simplified model assumptions, including the widely used Poisson process assumption, to quantify the dynamic risk of failure, yet the literature on testing these assumptions is essentially absent. In the paper titled “A Bayesian Nonparametric Test for Minimal Repair,” Li Li, Timothy Hanson, Paul Damien, and Elmira Popova develop a Bayesian nonparametric test to detect departures from such assumptions. The test statistic is a pseudo-Bayes factor comparing two models: the usual Poisson process model under the null hypothesis and a two-stage generalization of the Poisson process model under the alternative.

Numerical methods such as finite element analysis (FEA) are commonly used to simulate real-world phenomena like soil erosion and climate change. These methods usually have a tuning parameter (e.g., the mesh density in FEA) that controls the numerical accuracy and computational cost. It can therefore be beneficial to run the FEA with two choices of mesh density and balance numerical accuracy against computational cost. In the paper titled “Surrogate Modeling of Computer Experiments with Different Mesh Densities,” Rui Tuo, C. F. Jeff Wu, and Dan Yu develop a framework for studying this problem and propose a class of nonstationary Gaussian process models that link the outputs of simulation runs with different mesh densities, making better use of the data for modeling and prediction.
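As a rough illustration of linking outputs at two mesh densities, here is a toy two-fidelity sketch. It is not the authors' nonstationary Gaussian process model: it assumes the fine-mesh output is approximately a scaled coarse-mesh output plus a smooth discrepancy, estimates the scale by least squares, and smooths the residual with a Gaussian kernel. All names and numbers are hypothetical:

```python
import math

def fit_two_fidelity(xs, coarse, fine, bandwidth=0.5):
    """Fit fine(x) ~ rho * coarse(x) + delta(x) from paired runs at inputs xs.
    Returns a predictor that needs only a coarse-mesh run at a new input."""
    # Least-squares scale linking the two fidelities.
    rho = sum(c * f for c, f in zip(coarse, fine)) / sum(c * c for c in coarse)
    resid = [f - rho * c for c, f in zip(coarse, fine)]

    def predict(x, coarse_at_x):
        # Gaussian-kernel smoothing of the discrepancy at the new input.
        w = [math.exp(-((x - xi) / bandwidth) ** 2) for xi in xs]
        delta = sum(wi * ri for wi, ri in zip(w, resid)) / sum(w)
        return rho * coarse_at_x + delta

    return predict

predict = fit_two_fidelity([0.0, 1.0, 2.0], [1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

The appeal of this kind of linkage is that cheap coarse-mesh runs carry most of the signal, and only the discrepancy must be learned from the expensive fine-mesh runs.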

