EDUCAUSE Day Four

On the final day of the EDUCAUSE Annual Conference, I attended a session titled “Growing and Sustaining Student Response Systems at Large Campuses: Three Stories” presented by Christopher Higgins of the University of Maryland, Nancy O’Laughlin of the University of Delaware, and Michael Arenth of the University of Pittsburgh.  The presenters’ slides are available, and Inside Higher Ed ran a story on the session, too.

University of Maryland

A few years ago, three different classroom response systems were in use at the University of Maryland, so the IT office got together with Undergraduate Studies and the Center for Teaching Excellence to form a review committee, which recommended adopting the TurningPoint system.  Key factors in that decision included keeping student data on campus (because of FERPA), cost to students, integration with PowerPoint and Maryland’s course management system, and reporting options.

They now have over 12,000 clickers in the system with at least 75 faculty members using clickers, mostly with courses in business and the natural sciences.  (Some departments purchased their own sets, so IT isn’t sure how many faculty are using clickers in these departments.)  I haven’t spoken with many business faculty about how they use clickers, although the business and management section of my bibliography is one of the larger ones.  I might try to track down a couple of Maryland business faculty to find out how they are using clickers.

Challenges included registering student clickers, which required running two different registration systems for a while.  Also, the software isn’t as robust on Macs, which poses a problem for some faculty.  Going from 10 classrooms with receivers and software to 150 in a single semester was challenging, too!  And TurningPoint’s receivers needed upgrading last academic year, which posed some logistical problems.

Currently, the IT office handles technical support for faculty using clickers, while the Center for Teaching Excellence handles training and promotion.  The two units seem to work well together, offering joint training sessions that have gone over well.  IT finds it necessary to have a staff member devoted almost entirely to clicker support at the start of a semester.

Christopher Higgins is particularly excited about TurningPoint’s new ResponseWare Web system, which enables any Web-enabled device (laptop, iPhone, etc.) to function as a clicker.  He likes that the system leverages existing hardware that can also perform other functions, and that the Web system is cheaper: $20 per student per year or $40 per student for four years.  Christopher found that many students took advantage of an Apple promotion this fall to purchase iPod Touches and iPhones along with their Mac laptops, so a lot of students at Maryland have devices that can run the new TurningPoint system.

University of Delaware

The adoption committee at Delaware included not only faculty members and IT staff, but also staff from the assessment office and, I think, students as well.  (I may have misheard that last point.)  They standardized on Interwrite PRS and spent the summer of 2006 training faculty and installing receivers and software in all classrooms with at least 75 seats.  (They now have receivers and software in all classrooms with at least 35 seats, which covers most of the classrooms on campus.)  By that fall semester, about 3,600 students and 40 faculty were using clickers.  More faculty started using clickers in the fall of 2007, but this year there are relatively few faculty new to clickers, since most faculty have heard about them by now and decided whether or not to use them.

Clickers are popular in courses in the natural sciences, as well as in psychology, political science, and nursing.  Many first-year undergraduate courses use clickers, which means that faculty teaching “downstream” courses are now more likely to use clickers as well, since most of their students already own the devices.

Clickers are used in non-academic settings on campus, too.  Residential Life uses them to collect information on student experiences and opinions in the dorms.  The library and the office of assessment use them, as well.

Challenges to the support of classroom response systems on campus included a move to a new unique student identifier.  The Interwrite PRS system allows students to enter and store their unique identifiers on their clickers, but it took some work to have all the students request a new unique identifier on the Delaware Web site.  Other challenges included handling new versions of the software and a move from one course management system (WebCT) to another (Sakai).

One process Nancy mentioned that I particularly liked is that when faculty request clickers for their courses from the bookstore, there’s a checkbox on the form that asks them if they are new to using clickers.  Faculty who check this box are then sent resources by Nancy’s office and added to Nancy’s mailing list.  This helps faculty connect to useful pedagogical and technical resources and helps Nancy know who’s using clickers on campus.

Nancy also mentioned that she’s found it helpful to give faculty members their own receivers so they can practice as much as they need to outside of the classroom.  She finds that students know when their teachers aren’t comfortable with a technology, so time for practice is important.

Another point Nancy made was that the code of student conduct at Delaware has been amended to mention clickers.  Students are to respond for themselves, not on behalf of other students.  She indicated that faculty appreciate having this clause in the code since it means there’s a process they can follow if they suspect students of cheating by bringing other students’ clickers to class.

University of Pittsburgh

Things at Pittsburgh have been a little more chaotic.  A review committee consisting of IT staff, facilities staff, and registrar staff decided in 2003 not to adopt a single system campus-wide.  As a result, a few different systems are in use on campus today.  There’s now some movement toward standardizing on eInstruction, but there doesn’t seem to be a central decision-making office to enforce that choice, so faculty are still free to use other systems.

Clickers are popular in biological sciences, physics, nursing, and pharmacy.  Also, the School of Social Work uses them frequently in their gambling addiction counselor program.  I wouldn’t mind talking to some of those faculty to find out how they use clickers in that setting.

Michael Arenth named a few challenges they’ve faced at Pittsburgh, including managing faculty expectations (particularly for faculty who get excited about clickers but don’t plan for the time necessary to learn the systems), cheating (students who bring other students’ clickers to class to cheat on attendance grades), and set-up between classes, since until recently they hadn’t been installing systems in classrooms.

I believe Michael said that Pittsburgh was still using infrared clicker technology until fairly recently, when they switched to radio frequency.  (Most people I’ve talked to made the switch a couple of years ago.)  He noted that the IT group on campus had to approve the use of radio frequencies for this purpose.  I hadn’t heard of this kind of approval before, so I found this point interesting.

Common Issues

All three campuses have surveyed faculty and students about clickers, and they used some common questions to enable comparisons among the three campuses.  They found that faculty frequently use clickers to measure student comprehension, measure student opinion, obtain anonymous responses, monitor attendance, and facilitate quizzes.  The presenters spoke only briefly about these results, and it was unclear to me to what extent faculty use comprehension or opinion questions to generate small-group or classwide discussion or to practice “agile teaching” by responding to the results of clicker questions during class.  I was, however, happy to see that clickers were used more for formative assessment (measuring comprehension and opinions) than summative assessment (quizzes and tests), since I think that’s where clickers really shine.

An audience member at the presentation asked about the student response to clickers.  The panel indicated that students like the interactivity that classroom response systems provide.  They confirmed what I’ve now heard from multiple sources: students want to see some value added to their learning experience as a result of the clickers.  If a faculty member just asks a question and quickly moves on, there’s no interactivity and little impact on student learning.  Students don’t respond well to this.

Finally, I spoke with Danny Sohier of Université Laval in Québec after the session.  His school is using clickers to conduct end-of-semester course evaluations during class.  They found that online course evaluations resulted in low response rates, a problem I’ve heard about from many institutions.  They now use clickers to collect student responses to multiple-choice evaluation questions during class in some courses, inviting students to respond to open-ended questions online outside of class.  Danny indicated that this arrangement is working pretty well.  I might follow up with him to learn more about this process.

That’s it for my notes on this session.  I was glad to see a clicker session on the agenda at EDUCAUSE.  I was a little surprised at the number of audience members who asked questions at the end of the session and at the nature of those questions.  It seems there are a lot of institutions that are still just starting to work on adoption and support issues.  That indicates to me that use of classroom response systems will continue to grow over the next few years.
