Discussions, Teacher’s Corner Highlight the February Issue
John Stufken, Editor, The American Statistician
The February 2010 issue of The American Statistician opens with a discussion of the article “Desired and Feared—What Do We Do Now and Over the Next 50 Years?” by Xiao-Li Meng, which appeared in the August 2009 issue. Thoughtful and thought-provoking contributions to the discussion are made by Robert Easterling (“Passion-Driven Statistics”), David Fox (“Desired and Feared—Quo Vadis or Quid Agis?”), Roger Hoerl and Ronald Snee (“Moving the Statistics Profession Forward to the Next Level”), Brian Kotz (“Thoughts on the Importance of the Undergraduate Statistics Experience to the Discipline’s [and Society’s] Future”), Frank Soler (“Who Is Teaching Introductory Statistics?”), Richard Cleary and Samuel Woolford (“Response to ‘Desired and Feared’”), and Elart von Collani (“Response to ‘Desired and Feared—What Do We Do Now and Over the Next 50 Years?’ by Xiao-Li Meng”). The discussion closes with Xiao-Li Meng’s “Rejoinder: Better Training, Deeper Thinking, and More Policing.”
The regular sections in this issue also contain some very interesting contributions. The Teacher’s Corner starts with a contribution to the discussion of p-values, with a strong emphasis on effect size, by Richard Browne (“The t-Test p Value and Its Relationship to the Effect Size and P(X>Y)”). This is followed by an entertaining article by Marios Pavlides and Michael Perlman (“On Estimating the Face Probabilities of Shaved Dice with Partial Data”), who take us on a journey of surprises that grew out of their correspondence with Persi Diaconis. Next is “Elementary Statistical Methods and Measurement Error,” by Stephen B. Vardeman, Joanne R. Wendelberger, Tom Burr, Michael S. Hamada, Leslie M. Moore, J. Marcus Jobe, Max D. Morris, and Huaiqing Wu, who make the case that we should pay more attention to sources of physical variation when teaching statistical methods, starting at the introductory level.
The final paper in this section, “Resequencing Topics in an Introductory Applied Statistics Course,” by Christopher J. Malone, John Gabrosek, Phyllis Curtiss, and Matt Race, proposes changes to the sequence in which core statistical concepts are presented in an introductory applied statistics course.
The first paper in the General section, “A Model for an Interdisciplinary Undergraduate Research Program” by Julie Legler, Paul Roback, Kathryn Ziegler-Graham, James Scott, Sharon Lane-Getaz, and Matthew Richey, is inspired by the May 2009 TAS article by Brown and Kass and discusses the philosophy and practice of the Center for Interdisciplinary Research at St. Olaf College. T. D. Stanley, Stephen B. Jarrell, and Hristos Doucouliagos (“Could It Be Better to Discard 90% of the Data? A Statistical Paradox”) argue that, in the face of publication selection bias, inference may be improved by discarding a large portion of the data. In “Beyond the Quintessential Quincunx,” Michael A. Proschan and Jeffrey S. Rosenthal show how a modification of the quincunx can be used to introduce and explore basic statistical and probabilistic concepts.
Finally, Henry S. Lynn, Zhanjian Dong, and Zhe Mu make good on what they promise in the title of their article, “Comparison of Software Algorithms for Calculating REML Wald Type Confidence Limits for the Between Group Variance Component in a Small Sample One-Way Random Effects Model Example.”