Article: Stuart, Brown, & Draper (2004)

Reference: Stuart, S. A. J., Brown, M. I., & Draper, S. W. (2004). Using an electronic voting system in logic lectures: One practitioner’s application. Journal of Computer Assisted Learning, 20(2), 95–102.

Summary: In this case study, the authors describe the use of clickers in a 140-student course on philosophical logic, taught by the lead author, a course that often meets with resistance or indifference from philosophy students. Clickers were introduced to encourage attendance and participation. Interestingly, clicker questions were not prepared in advance but asked spontaneously, drawing on the instructor’s experience posing oral questions in class. Classic peer instruction was used occasionally, and the authors note that convergence toward correct answers on second votes was often remarkable.

Types of clicker questions included application questions, confidence questions (of the form “Do you understand this proof?”), and conceptual questions. Responses to some confidence questions surprised the instructor by revealing what students did and did not find difficult. The article includes an extended example of this phenomenon.

One interesting approach taken by the instructor was to pose an open-ended question, solicit a response from a student volunteer, and then ask the other students whether they agreed or disagreed with the proposed answer. The instructor found that this method didn’t work well when the class’s “conscientious responders” were the ones volunteering, since students would agree or disagree based on the volunteer’s track record rather than on the merits of the response itself. However, the instructor found that the method prompted students to consider their own answers in light of the one offered by the volunteer, which seemed promising.

The authors also surveyed the students in the course about their impressions of using clickers. The most-cited advantages included the ability for instructors to find out what students don’t understand (rather than relying on assumptions about student understanding), the ability for students to check their own understanding and compare their performance with that of their peers, and the anonymity clickers afford students worried about embarrassment.

The instructor also tried using clickers in a lecture on the philosophy of mind in another course. This didn’t go as well because (a) the instructor had too much material to cover to leave time for interactive voting and (b) the instructor wasn’t able to generate questions on the fly as easily as in the logic course.

Comments: The use of confidence questions is interesting here. Certainly, asking students to rate their confidence in understanding concepts and performing tasks can serve a useful purpose, but I worry about students misjudging their own confidence levels. In the example in the article, the authors note their surprise at finding that students rated one topic as more difficult than another. I would have been interested to know how well students actually performed on questions related to these topics. I think the approach used by Dennis Jacobs in the chemistry department at the University of Notre Dame, asking students to answer a content question and then rate their confidence (high, medium, or low) in their answer, provides a more complete picture of student learning than confidence ratings alone.

I’m impressed by the instructor’s ability to ask on-the-fly questions regularly during this course. I think most instructors prefer to plan their questions in advance, but I can see some instructors really enjoying this more spontaneous approach. The method of asking students to agree or disagree with an answer provided by a student volunteer has a lot of potential; it would be worth exploring the conditions under which it works best.

What do you think? Do you see value in “predictive” confidence questions, ones that ask students to assess their confidence in answering a question without actually answering it? Have you tried the method of having students agree or disagree with an answer put forth by a student volunteer?
