Reference: Stowell, J. R., & Nelson, J. M. (2007). Benefits of electronic audience response systems on student participation, learning, and emotion. Teaching of Psychology, 34(4), 253-258.
Summary: In this short article, the authors present the results of a study comparing three in-class response methods: clickers, response cards, and hand-raising. About 140 introductory psychology students, mostly first-year undergraduates, were assigned to four groups. The first group experienced a “standard lecture” with no formal response mechanism; the other three experienced lectures enhanced with one of the formal response mechanisms (clickers, response cards, or hand-raising). The authors analyzed the students’ participation and accuracy rates on the in-class questions, their performance on a post-lecture quiz, and their responses to the Academic Emotions Questionnaire (AEQ), the last of these to investigate the impact on students’ affective responses.
Key findings are as follows:
- The response card and clicker methods increased student response rates over the hand-raising method (response cards 97%, clickers 100%, hand-raising 76%).
- All three groups involving formal response methods reported significantly less boredom during lecture than the “standard” lecture group.
- The clicker group appeared to answer in-class questions more honestly than the response card and hand-raising groups. This was the authors’ conclusion after noting that the percent of questions answered correctly using clickers more closely mirrored the percent of questions answered correctly on the post-lecture quiz. (There was a 22% drop in accuracy from during-lecture to post-lecture for clickers versus a 38% drop for hand-raising and 40% drop for response cards.)
Comments: The first two findings aren’t particularly surprising, but they are encouraging. The third finding is the most interesting one. Anecdotal evidence from instructors has certainly indicated that the hand-raising method leads to a “bandwagon” effect, in which students change their responses after seeing how their peers respond, making it difficult to use that method to accurately assess student understanding. However, I hadn’t heard until this article that the bandwagon effect was also present in the response card method, a method with a fairly long history in the teaching of psychology.
It’s worth noting that in this study, the lecturers did not practice any kind of agile teaching; they simply stated the correct answer to each question after the students voted. Although there was no significant difference in performance among the four groups on the post-lecture quiz, it’s quite possible that the clicker group would improve were agile teaching implemented. By using response data to guide classroom lecture and discussion in ways that focus on student difficulties, I believe that students are likely to learn more. If that’s the case, and if clickers provide more accurate data on student difficulties, then that would be a strong argument for using clickers to improve student learning, not just to increase participation and reduce boredom.
One variable not discussed in the article is the type of question asked with the various response methods and on the post-lecture quiz. It’s possible that the effects seen here would differ across question types: factual, conceptual, application, and so on.
Finally, I like the use of the Academic Emotions Questionnaire here to explore the affective aspects of clickers. I’m glad to know about this instrument.