
May Issue Features Advances in Reliability

13 May 2010

Aparna V. Huzurbazar and Brian J. Williams examine the use of flowgraph models for analyzing recurrent event data. Their article, “Incorporating Covariates in Flowgraph Models: Applications to Recurrent Event Data,” proposes a framework for linking covariates into the branch transition models of a flowgraph, with applications to recurrent event data in systems reliability settings. A flowgraph is a generalized transition graph originally developed to model total system waiting times for semi-Markov processes. By incorporating covariates, the authors expand the focus of flowgraph models and enrich the toolkit of data analysis methods available for complex stochastic systems. The ideas are illustrated with two applications, one in reliability and one in medicine.
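Flowgraph methodology combines branch transmittances (a passage probability times the branch waiting-time MGF) via Mason's rule to obtain the MGF of the total system waiting time. A minimal sketch of that machinery, assuming exponential branch times, a single feedback loop, and a log link for the covariate — the rates, coefficients, and loop probability below are invented for illustration, not taken from the paper:

```python
import math

def exp_mgf(rate):
    """Moment generating function of an Exponential(rate) branch waiting
    time: M(s) = rate / (rate - s), valid for s < rate."""
    return lambda s: rate / (rate - s)

def total_mgf(s, x, beta0=-0.5, beta1=0.8, p_loop=0.4):
    """MGF of the total waiting time for a hypothetical three-state
    flowgraph: state 0 passes directly to absorbing state 2 with
    probability 1 - p_loop, or takes the feedback loop 0 -> 1 -> 0 with
    probability p_loop.  The covariate x enters the 0 -> 1 branch rate
    through a log link, in the spirit of covariate-linked branch models.
    Mason's rule with one loop L gives T(s) = P(s) / (1 - L(s))."""
    m01 = exp_mgf(math.exp(beta0 + beta1 * x))  # covariate-dependent branch
    m10 = exp_mgf(1.5)                          # return (repair) branch
    m02 = exp_mgf(2.0)                          # direct-passage branch
    return (1 - p_loop) * m02(s) / (1 - p_loop * m01(s) * m10(s))

def expected_total_time(x, h=1e-5):
    """E[T] is the derivative of the MGF at s = 0 (central difference)."""
    return (total_mgf(h, x) - total_mgf(-h, x)) / (2 * h)
```

Because the loop is traversed a geometric number of times, E[T] has the closed form 1/λ02 + (p/(1−p))(1/λ01 + 1/λ10), which the numerical derivative reproduces; raising the covariate speeds the 0 → 1 branch and shortens the expected total waiting time.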

Bending load tests are used to assess the strength of brittle materials such as ceramics for spacecraft and aircraft. The sizes of the test units and the fraction of units at each size affect the efficiency of these tests. Kazuyuki Suzuki, Toshie Nakamoto, and Yohtaro Matsuo address the test design problem in their article, “Optimum Specimen Sizes and Sample Allocation for Estimating Weibull Shape Parameters for Two Competing Failure Modes.”

Examining data on both fracture strength and location and taking account of both internal and surface cracks, the authors find that tests with two specimen sizes and a maximal ratio of volumes provide the most precise estimation. Tests with an equal number of specimens at each size achieve precision close to that of the optimal allocation. The proposed tests can drastically reduce both the number and total volume of specimens when compared to a conventional “single-specimen size test” without any reduction in the precision of the estimators.
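The design gain from widely separated specimen sizes can be seen in the classical Weibull weakest-link size effect, in which characteristic strength scales as a power of specimen volume with exponent −1/m. A toy sketch (σ0, v0, and the volumes are invented; the paper's actual criterion is the asymptotic precision of the maximum likelihood estimators under competing surface and internal failure modes):

```python
import math

def characteristic_strength(volume, m, sigma0=400.0, v0=1.0):
    """Weibull weakest-link model:
    F(sigma; V) = 1 - exp(-(V/v0) * (sigma/sigma0)**m),
    so the 63.2% characteristic strength is sigma0 * (v0/V)**(1/m)."""
    return sigma0 * (v0 / volume) ** (1.0 / m)

def shape_from_two_sizes(v1, s1, v2, s2):
    """Invert the size effect: the Weibull shape is the slope
    m = ln(v2/v1) / ln(s1/s2) of log-volume against log-strength.
    A larger volume ratio v2/v1 spreads the two characteristic strengths
    further apart, which is the intuition behind why two-size tests with
    a maximal ratio of volumes pin down m precisely."""
    return math.log(v2 / v1) / math.log(s1 / s2)
```

For example, with shape m = 10, an eightfold volume ratio lowers the characteristic strength by the factor 8^(−1/10), and the two strengths recover m exactly.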

The next Technometrics article, by Martin L. Hazelton, considers “Statistical Inference for Transit System Origin-Destination Matrices.” The problem is to make inferences about the matrix of origin-destination (O-D) trip rates for a transit system based on counts of the passengers boarding and alighting at each stop. The observed data provide only indirect information about the O-D rates through a highly indeterminate system of linear equations, and calculation of the model likelihood is computationally prohibitive for even moderately large systems, so the article adopts a sampling-based Bayesian approach instead.

Existing work on the wider problem of O-D traffic rate estimation for general transport networks has failed to produce an efficient sampling methodology for sizable applications. However, this work derives a suitable MCMC algorithm by generating candidate trip vectors directly from the feasible set using a Markov model of passenger behavior. The resulting sampler moves freely around the posterior support without any need for explicit specification of the feasible trip set. This methodology is applicable regardless of whether the O-D matrix is assumed to possess any given structure. The methods are illustrated through analysis of a case study on O-D trip rates for a bus service in the San Francisco Bay Area.
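Hazelton's key computational idea is generating candidate trip vectors directly from the feasible set. A simplified stand-in for such a proposal mechanism (the function name and the uniform choice of origins are my own; the paper uses a Markov model of passenger behavior): track the onboard pool stop by stop and let each alighting passenger pick an origin from it, so every draw is automatically feasible.

```python
import random

def sample_trip_matrix(boardings, alightings, seed=0):
    """Draw one origin-destination trip matrix consistent with per-stop
    boarding and alighting counts.  Passengers alight before new ones
    board, and each alighter's origin is drawn uniformly from the pool
    currently on board, so the draw always lands in the feasible set."""
    rng = random.Random(seed)
    n = len(boardings)
    trips = [[0] * n for _ in range(n)]
    onboard = []  # origin stop of each passenger currently on the vehicle
    for stop in range(n):
        for _ in range(alightings[stop]):
            origin = onboard.pop(rng.randrange(len(onboard)))
            trips[origin][stop] += 1
        onboard.extend([stop] * boardings[stop])
    return trips
```

Within MCMC, draws like these serve as proposals whose acceptance ratio corrects for the proposal distribution; the returned matrix is strictly upper triangular because every trip's origin stop precedes its destination stop.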

The issue closes with three articles on experimental design. The first of these, by Pi-Wen Tsai and Steven G. Gilmour, examines “A General Criterion for Factorial Designs Under Model Uncertainty.” The work is motivated by two industrial experiments in which rather extreme prior knowledge was used to choose a design. Building on these examples, the authors study the QB criterion, which incorporates experimenters’ prior knowledge to improve estimation in as many models as possible. They show how the criterion generalizes and applies to different types of designs, explore its relationships to other criteria in different situations, and demonstrate that QB provides a bridge between alphabetic optimality and aberration. The two case studies illustrate the potential benefits of the QB criterion, and R programs for calculating QB are available online as supplemental materials.
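The spirit of such a model-robust criterion can be sketched by averaging a standard efficiency measure over candidate models, each weighted by its prior probability. The sketch below uses a D-style score, det(X'X); it illustrates the weighting idea only and is not the paper's QB formula (their supplemental R programs compute the real criterion):

```python
def det(M):
    """Determinant by Gaussian elimination with partial pivoting
    (adequate for the tiny information matrices used here)."""
    M = [row[:] for row in M]
    n, d = len(M), 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        if abs(M[p][i]) < 1e-12:
            return 0.0
        if p != i:
            M[i], M[p] = M[p], M[i]
            d = -d
        d *= M[i][i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
    return d

def model_averaged_score(design, candidate_models, priors):
    """Weight det(X'X) for each candidate model (a tuple of column
    indices into the design) by the experimenter's prior probability
    for that model: a design that scores well protects estimation
    across all the models the experimenter considers plausible."""
    score = 0.0
    for model, w in zip(candidate_models, priors):
        X = [[row[j] for j in model] for row in design]
        xtx = [[sum(r[a] * r[b] for r in X) for b in range(len(model))]
               for a in range(len(model))]
        score += w * det(xtx)
    return score
```

For the full 2^2 factorial, every candidate submodel is orthogonally estimable, so the score is simply the prior-weighted sum of the individual determinants.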

Sunanda Bagchi develops a novel twist on orthogonal main effect plans (OMEPs) in her article, “Main Effect Plans Orthogonal Through the Block Factor.” Many industrial experiments require different numbers of levels for different factors. OMEPs for such asymmetrical experiments often require a large run size. If blocking is needed, run size usually becomes even larger. This article shows that there are situations where the use of blocks may actually be helpful in finding an OMEP with a small run size. For example, the article shows how to set up a saturated OMEP for a 3^(3n) × 2^(3n) experiment in 3n blocks of size 4 each, for every Hadamard number n. In each of these plans, the three-level factors are nonorthogonal to the block factor but are pairwise “orthogonal through the block factor.” The two-level factors are orthogonal to the block factor.

The issue concludes with an article by Kenneth J. Ryan and Dursun A. Bulutoglu on “Minimum Aberration Fractional Factorial Designs with Large N.” Ryan and Bulutoglu extend our knowledge of two-level minimum aberration (MA) regular fractional factorial (FF) designs with large run sizes. They extend the catalog of Xu (2009), adding 36 new MA designs: N = 256 (m = 29–36 and 100–108 factors), N = 512 (m = 26–29), N = 1024 (m = 25–28), N = 2048 (m = 24–32), and N = 4096 (m = 25–26). Such design enumeration problems are notoriously difficult for large N and/or m.

Ryan and Bulutoglu brought the newly solved problems within computational reach by changing the isomorphism check component of Xu’s algorithm. They present a new, compact graph to solve regular design isomorphism problems and use the program nauty (McKay, 2007) to solve the corresponding graph isomorphism problems.
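For intuition, two two-level designs are isomorphic when one can be obtained from the other by row permutations, column permutations, and level switches. A brute-force check over those relabelings (function names are my own invention) is possible only for tiny cases, which is exactly why reducing the question to canonical graph labelling with nauty matters for large N:

```python
from itertools import permutations, product

def canonical_rows(design):
    """Sort rows so that row order is irrelevant to the comparison."""
    return tuple(sorted(tuple(row) for row in design))

def are_isomorphic(d1, d2):
    """Exhaustively test column permutations and column-wise level
    switches for two-level (+/-1) designs.  This k! * 2**k search
    explodes quickly; nauty-style canonical labelling of an equivalent
    graph replaces it for designs of realistic size."""
    k = len(d1[0])
    target = canonical_rows(d2)
    for perm in permutations(range(k)):
        for signs in product((1, -1), repeat=k):
            relabeled = [[signs[j] * row[perm[j]] for j in range(k)]
                         for row in d1]
            if canonical_rows(relabeled) == target:
                return True
    return False
```

As a check, the two half fractions of the 2^3 factorial defined by I = ABC and I = −ABC are isomorphic (switch the levels of one column), while a degenerate four-run array with repeated rows is not isomorphic to either.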
