Students on the Move – A Kinesthetic Classroom Response System

I can’t remember how I stumbled upon this, but back in 2008 three University of California, Berkeley students, Sohyeong Kim, Nathan Gandomi, and Kate Smith, prototyped an interesting kinesthetic classroom response system they called Students on the Move.  Instead of clickers (or smartphones, etc.), students were given joysticks.  They were told to push their joysticks forward if they felt the lecture should pick up the pace and to pull their joysticks backward if they were confused and thought the instructor should slow down.  These feedback data were visualized on the instructor’s computer screen as a set of circles, one for each student.  As a student pushed forward on the joystick, that student’s circle would move upward and turn green.  As the student pulled back, the circle would move downward and turn red.  More details and some photos can be found in the students’ project report [PDF].

The idea of opening up a backchannel for students to give this kind of one-dimensional feedback (speed up / slow down) isn’t new.  Turning Technologies’ “moment to moment” feature does something similar, producing a moving line graph that shows the average student response to a question on, say, a scale of 1 to 5.  The line graph updates in real time as long as the slide stays on the screen.  You might have seen something similar during the 2008 US presidential debates, when CNN aggregated the moment-to-moment reactions of likely voters using a comparable dial-based system.

You can also generate similar data using i>clicker if you leave the results display visible while voting is open.  For example, in my book I profile Adam Rich, who teaches biology at SUNY-Brockport.  He’ll pose a clicker question and have students respond, and change their responses, throughout a class discussion of the question.  He’ll monitor the current voting results on the i>clicker base unit during this time and keep the discussion going until the results indicate the students have reached consensus on the correct answer.

While the idea behind Students on the Move isn’t new, these students’ implementation of the idea has some very creative elements.  Using a joystick as the response device creates a more kinesthetic experience for students than they get with a clicker.  Moreover, the joystick lets you treat student feedback as a continuous variable instead of the discrete one produced by clicker-based systems.  That is, instead of having students respond with 1, 2, 3, 4, or 5, you can have students respond with any real number between 1 and 5, for a richer set of responses.  I don’t know of any other classroom response systems that provide this kind of continuous data.
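
To make the continuous/discrete distinction concrete, here’s a minimal sketch of the rescaling involved, assuming the joystick driver reports the y-axis as a float in [-1, 1] (as consumer game controllers typically do); the function name and scale endpoints are mine, not the project’s:

```python
def axis_to_response(axis_y, lo=1.0, hi=5.0):
    """Map a joystick y-axis reading to a continuous response value.

    Assumes the driver reports the axis as a float in [-1.0, 1.0],
    with -1.0 fully pulled back ("slow down") and +1.0 fully pushed
    forward ("speed up").  Returns a real number in [lo, hi]; the
    neutral, spring-centered position maps to the midpoint.
    """
    axis_y = max(-1.0, min(1.0, axis_y))           # clamp noisy out-of-range readings
    return lo + (axis_y + 1.0) / 2.0 * (hi - lo)   # linear rescale to [lo, hi]

# Neutral stick -> 3.0; full forward -> 5.0; full back -> 1.0
assert axis_to_response(0.0) == 3.0
assert axis_to_response(1.0) == 5.0
```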

The joystick has another useful feature: it returns to its neutral state when it’s not pushed forward or pulled back.  I’ve talked to a few researchers interested in this kind of backchannel tool, and one of their concerns was that students would forget to change their response back to neutral after indicating a more extreme response.  The joystick handles that problem nicely, resulting in fewer of what the Berkeley students call “false positives.”  (It doesn’t handle one of these researchers’ other concerns: the challenge of prompting students to respond as appropriate without interrupting the flow of the lesson.)
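
If you wanted that spring-return behavior in a system without a physical spring (clickers, phone apps), you could approximate it in software by decaying stale responses toward neutral.  A minimal sketch, my own idea rather than anything from the project report:

```python
import time

def effective_value(last_value, last_update, half_life=30.0, now=None):
    """Decay a stale response toward neutral (0.0), mimicking in software
    the spring return that the joystick provides physically.

    last_value:  the response the student last submitted, in [-1.0, 1.0]
    last_update: Unix timestamp of that submission
    half_life:   seconds for the response to decay to half strength
    """
    now = time.time() if now is None else now
    elapsed = max(0.0, now - last_update)
    return last_value * 0.5 ** (elapsed / half_life)

# A full "speed up" (+1.0) submitted 60 s ago, with a 30 s half-life, now counts as 0.25.
assert abs(effective_value(1.0, 0.0, half_life=30.0, now=60.0) - 0.25) < 1e-9
```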

The other key innovation here is the visualization of the aggregate data.  Instead of using a line graph or a bar chart, Students on the Move produces this very interesting set of circles that float up and down and change color as students provide feedback.  I don’t know if it’s necessary to change both position and color as students respond, since both visual elements convey the same information (though redundant encoding does make the display easier to read at a glance, and for colorblind viewers), but I like the creativity in this idea.  I can imagine that watching most of the circles drop to the bottom of the screen during a tough part of a lecture is fairly dramatic.
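
As a rough illustration of how one might map a single student’s feedback value to both a vertical position and a color, here’s a sketch in the spirit of the description above; the report doesn’t publish the implementation, so the gray-to-green and gray-to-red blending is my guess at the behavior:

```python
def circle_style(value, height=400):
    """Map one student's feedback value in [-1.0, 1.0] to a circle's
    vertical position and RGB color.

    +1.0 ("speed up")  -> top of the display, green
     0.0 (neutral)     -> vertical center, gray
    -1.0 ("slow down") -> bottom of the display, red
    """
    value = max(-1.0, min(1.0, value))
    y = int((1.0 - (value + 1.0) / 2.0) * height)   # screen y grows downward
    if value >= 0:                                   # blend gray -> green
        g = value
        color = (int(128 * (1 - g)), int(128 + 127 * g), int(128 * (1 - g)))
    else:                                            # blend gray -> red
        r = -value
        color = (int(128 + 127 * r), int(128 * (1 - r)), int(128 * (1 - r)))
    return y, color

assert circle_style(1.0) == (0, (0, 255, 0))     # pushed forward: top, green
assert circle_style(-1.0) == (400, (255, 0, 0))  # pulled back: bottom, red
```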

The Berkeley students assert in their project report that “traditional” classroom response systems are designed for teachers, not students:

Although many studies show the positive effect of a CRS, some studies demonstrate that students feel stressed and frustrated by being constantly assessed. Our study and interviews indicate that this negativity is due to the fact that these classroom response systems are mainly designed for teachers. We argue that classroom response systems should be redesigned to include the needs of students, as well as the instructors, in order to best benefit both.

It’s true that some instructors use classroom response systems just to give quizzes and take attendance, but as I’ve argued here before, using clickers merely to monitor students is problematic (for the reasons the Berkeley students note above), and instructors who do so miss out on some of a classroom response system’s key benefits: student engagement and “agile teaching.”  On the other hand, many instructors use clickers to engage students in meaningful explorations of course content and practice “agile teaching” by altering the course of a lesson based on student feedback.  In the hands of these instructors, I would argue, clickers are very useful to the students as well as to the instructor.

I think the real difference between a “traditional” system and one like Students on the Move is that in a traditional classroom response system, the instructor is the one who opens voting.  With Students on the Move (and the other, similar systems mentioned above), the students can respond whenever they like.  The feedback generated is, in a sense, student-paced rather than instructor-paced.  That distinction is why I would classify Students on the Move as a backchannel system, and I’ve begun to think of backchannel systems as complementary to more instructor-paced systems.  I see comments from time to time suggesting that having students engage in backchannel conversations (say, via Twitter) during class can replace clickers.  I don’t think that’s the case, because the two kinds of classroom response systems (student-paced and instructor-paced) serve different, but complementary, functions in the classroom.

More on the relationship between backchannel and clickers later!  For now, I’ll end with a few questions about Students on the Move.

  1. Does the system provide rich data to the instructor after class?  For instance, it would be useful to have time-stamped data available to match up with a recording of the class to determine what components of the lecture triggered particularly positive or negative reactions from students.  (See the logging sketch after this list.)
  2. Does it make sense to show students the feedback visualization during class?  I’ve talked to a couple of i>clicker instructors who have accidentally displayed the results chart while voting was still open.  They both remarked on how quickly the students converged on the same answer, like lemmings.  I wonder if Students on the Move would produce similar counterproductive behaviors.
  3. How might instructors respond to this kind of feedback data during class?  That is, what kinds of “agile teaching” decisions might an instructor make?  Certainly, if most of the students say “slow down!” then it makes sense to slow down and ask students for questions.  (The Berkeley students wisely point out that their system doesn’t provide instructors with information on why students are asking for a slower pace, just how many of them are doing so.)  But what if students are split?  What then?  Or if students are asking for a faster pace, but the instructor hasn’t prepared to go that fast?
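
On that first question: the project report doesn’t say whether responses are logged, but time-stamped logging would be straightforward to add.  A minimal sketch, with a hypothetical CSV layout and student IDs of my own invention:

```python
import csv
import time

def log_response(writer, student_id, value):
    """Append one time-stamped feedback sample so the log can later be
    aligned with a recording of the class session."""
    writer.writerow([f"{time.time():.3f}", student_id, f"{value:.2f}"])

# Hypothetical usage: one row per sample as students move their joysticks.
with open("feedback_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["unix_timestamp", "student_id", "value"])
    log_response(writer, "student_01", -0.8)  # a strong "slow down" signal
```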

I’m glad I found out about Students on the Move.  If you can help me figure out how I found out about it, I would appreciate it!  Also, if you know of anything that resulted from this student project beyond the fall 2008 semester, please let me know.

Image: “In Control” by Flickr user Steve Snodgrass / Creative Commons licensed
