
Reviewing Results of the JSM Presenter Satisfaction Surveys, 2010–2012

1 April 2013
Jean Opsomer, 2014 JSM Program Chair, and Melissa Francis, Colorado State University

    The ASA has conducted a “satisfaction survey” of presenters after each JSM for several years. The survey includes questions adapted for the various roles and focuses on the satisfaction of the speaker behind the lectern or the presenter in front of the poster. This survey has been conducted unchanged over the last three years, covering the JSMs in Vancouver (2010), Miami Beach (2011), and San Diego (2012). This gives us a unique opportunity to evaluate results over a longer period and, in particular, look for changes over time.

    The analysis discussed below can be considered a follow-up to the analysis Dave Judkins performed on the 2009 JSM presenter survey (see the May 2010 issue of Amstat News). One important difference from Judkins' analysis is that we did not have access to the microdata, so it was not possible to look for determinants of satisfaction level beyond the type of presentation and the year. As was the case for 2009, this survey was conducted rather informally, with an email sent to all presenters with valid email addresses and no attempt to correct for nonresponse effects. So when interpreting the results, caveat emptor …

    Results

    With those disclaimers out of the way, let us take a look at the results. The total sample size across years and presenter roles was 3,187, ranging from 1,021 in 2010 to 1,095 in 2012. On the whole, corroborating the findings of Judkins for 2009, respondents were generally satisfied with their presentation experience. Figure 1 shows the breakdown of responses to the question, “Please relate your satisfaction with the presentation experience to the likelihood of [participating in the same role in the future] at JSM.” Only a small number of respondents, 80 out of 3,187, expressed serious dissatisfaction. While we would prefer that number to be lower, or even 0, the overall conclusion appears to be that the large majority of presenters are satisfied or very satisfied with their experience.

     

    Figure 1: Breakdown of responses to the question, “Please relate your satisfaction with the presentation experience to the likelihood of [participating in the same role in the future] at JSM.”

    When we consider the answers to the satisfaction question broken down by year, as shown in Table 1, a similar picture emerges. The percentages are remarkably consistent across the years, and despite the large sample sizes, a chi-squared test of independence between year and answer category fails to reject the hypothesis of independence. While it might be somewhat surprising to observe no differences in satisfaction between what were arguably different locales, it should be remembered that this question concerns the presentation experience, not JSM as a whole.
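
    For readers who want to reproduce this kind of check, the test just described is an ordinary chi-squared test of independence applied to the year-by-satisfaction contingency table (Table 1 below). A minimal sketch in Python, using scipy and placeholder counts rather than the actual survey tallies:

        import numpy as np
        from scipy.stats import chi2_contingency

        # Year-by-satisfaction contingency table (rows: 2010, 2011, 2012;
        # columns: response categories ordered from least to most satisfied).
        # These counts are made-up placeholders, NOT the actual tallies in Table 1.
        table = np.array([
            [30, 250, 740],   # 2010 (placeholder)
            [30, 260, 780],   # 2011 (placeholder)
            [25, 255, 815],   # 2012 (placeholder)
        ])

        chi2, p_value, dof, expected = chi2_contingency(table)
        print(f"chi-squared = {chi2:.2f}, df = {dof}, p-value = {p_value:.3f}")
        # A large p-value is consistent with the finding above: no detectable
        # change in the satisfaction distribution across the three years.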

     

    Table 1—Satisfaction by Year

    Things become a little more interesting when we consider the answers to the satisfaction question broken down by presenter role, with the results shown in Table 2. The distributions of respondents in the three invited and topic‐contributed categories (presenters, panelists, and discussants) are similar, with a small percentage reporting serious dissatisfaction. The poster presenters and contributed paper presenters are markedly different, however, and a chi‐squared test of independence between the presenter categories and the satisfaction levels strongly rejects the hypothesis of independence (glossing over the issues with the chi‐square approximation due to small sample sizes in some of the cells).
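
    The small-cell caveat just mentioned can also be sidestepped with a Monte Carlo version of the test: hold both margins of the role-by-satisfaction table (Table 2 below) fixed, repeatedly shuffle the satisfaction labels across respondents, and compare the observed chi-squared statistic to its permutation distribution. A rough sketch, again with placeholder counts standing in for the real entries:

        import numpy as np
        from scipy.stats import chi2_contingency

        # Role-by-satisfaction table (rows: invited, topic-contributed,
        # discussant/panelist, contributed paper, poster; columns ordered
        # from least to most satisfied). Placeholder counts only, NOT the
        # actual Table 2 entries.
        table = np.array([
            [ 5, 120, 400],
            [ 4,  90, 300],
            [ 3,  35, 110],
            [10, 400, 590],
            [30, 250, 170],
        ])

        # Observed statistic and the expected counts under independence.
        chi2_obs, _, _, expected = chi2_contingency(table, correction=False)

        # Expand the table into one (row, column) pair per respondent, then
        # shuffle the column labels. Both margins are preserved, so the
        # expected counts stay fixed across permutations.
        rows = np.repeat(np.arange(table.shape[0]), table.sum(axis=1))
        cols = np.concatenate([np.repeat(np.arange(table.shape[1]), r) for r in table])

        rng = np.random.default_rng(seed=1)
        n_perm = 10_000
        exceed = 0
        for _ in range(n_perm):
            perm_table = np.zeros_like(table)
            np.add.at(perm_table, (rows, rng.permutation(cols)), 1)
            stat = ((perm_table - expected) ** 2 / expected).sum()
            exceed += stat >= chi2_obs

        p_sim = (exceed + 1) / (n_perm + 1)
        print(f"observed chi-squared = {chi2_obs:.2f}, Monte Carlo p-value = {p_sim:.4f}")

    Because both margins are held fixed, the expected counts only need to be computed once, which keeps the simulation cheap.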

     

    Table 2—Satisfaction by Presenter Role

    As was the case in 2009, a much higher proportion of poster presenters expressed some level of dissatisfaction, with more than 7% reporting serious dissatisfaction and almost 60% reporting only moderate satisfaction. Among the contributed paper presenters, the percentage of seriously dissatisfied respondents was low, roughly in line with those in the invited and topic-contributed categories, but the percentage reporting only moderate satisfaction was substantially higher than in those categories.

    The satisfaction results by presenter type are what one might expect. Invited and topic-contributed session participants appear most satisfied with their presentation experience, regardless of their specific roles (presenter, panelist, or discussant). Contributed paper presenters are somewhat less satisfied, but not dramatically so. Poster presenters are the least satisfied, with two-thirds expressing some level of dissatisfaction.

    The results for the poster presenters are somewhat disheartening, because the ASA and the other sponsoring societies have been viewing poster sessions as a possible way to handle the increasing size of JSM and the corresponding growth in the number of participants interested in presenting their work during the meetings. There is a silver lining, however, which we see once we look at the responses broken down by year and presenter role. Table 3 shows the results.

     

    Table 3—Satisfaction by Year and Presenter Role

    While the sample sizes are becoming quite small in many of the cells, the results for most presenter roles are fairly consistent over time. But unlike respondents in the other categories, the poster presenters exhibit a clear improving trend over these three years, going from almost 1 in 8 strongly dissatisfied in 2010 to a satisfaction profile in 2012 that is essentially the same as that of the contributed paper presenters. It may be that JSM participants are becoming more comfortable with the poster format, both as presenters and as session attendees. Or, we cannot exclude the possibility that these observed results are caused by selection bias and are, in fact, masking a consistent level of dissatisfaction each year. Whatever the reason for this apparent improvement, it certainly shows it is too early to write off posters as one of the mechanisms to rein in the number of parallel sessions at JSM.
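
    One informal way to probe whether the drop in serious dissatisfaction among poster presenters is more than noise would be to compare the 2010 and 2012 poster counts directly; with cells this small, an exact test is more defensible than the chi-squared approximation. A minimal sketch with made-up counts standing in for the Table 3 values:

        from scipy.stats import fisher_exact

        # 2x2 table for poster presenters only: rows are years (2010, 2012),
        # columns are (seriously dissatisfied, all other responses).
        # Illustrative placeholder counts, NOT the actual Table 3 values.
        poster_2x2 = [
            [12, 88],   # 2010: roughly 1 in 8 seriously dissatisfied (placeholder)
            [ 3, 97],   # 2012 (placeholder)
        ]

        odds_ratio, p_value = fisher_exact(poster_2x2, alternative="two-sided")
        print(f"odds ratio = {odds_ratio:.2f}, exact p-value = {p_value:.3f}")

    Of course, with nonresponse uncorrected, even an exact p-value would not address the selection-bias concern raised above.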

    Personally, I am cautiously optimistic that the poster format can be made to work for JSM, even though statisticians, as a profession, do not have a long history of presenting posters. The JSM Program Committee, through new initiatives such as the “speed sessions” for poster presenters in Montréal, intends to continue exploring ways to make the poster format more attractive.
