I’ve revised a number of mathematics courses over the years. This fall, I get to revise a MOOC.
Last year, I co-taught “An Introduction to Evidence-Based Undergraduate STEM Teaching,” a massive open online course (MOOC) distributed through Coursera. Having consulted with several Vanderbilt instructors on the design of such courses during Vanderbilt’s first year partnering with Coursera, I was eager to try my hand at a MOOC myself. Not that I was by myself! The course had nine lead instructors from five different universities, not to mention support from three grant-funded graduate fellows, the Vanderbilt Institute for Digital Learning, and staff from the CIRTL Network, a group of research universities collaborating in the preparation of future STEM faculty, the target audience for the course. The MOOC was a CIRTL Network production, funded as it was by an NSF WIDER grant with five CIRTL Network leaders, including me, as PIs.
I’ve blogged about the coordination required to produce a MOOC across so many institutions and instructors, and I’ve blogged about the local learning communities we helped create and support through the MOOC. In today’s post, I want to share some of the changes we’re making to the course for its second offering, which starts September 28th. But first, a few highlights from the Fall 2014 offering…
By the Numbers
We had 5,908 learners enroll in the course. Of course, enrollments in a MOOC are like points in Whose Line Is It Anyway; they don’t matter. More relevant is the fact that we had 4,009 active participants. These learners did something beyond enroll; they watched a video or took a quiz or browsed the discussion forums. Over 4,000 active participants isn’t bad for a course with such a targeted audience: STEM grad students and postdocs interested in improving their teaching.
Of those 4,009 active participants, 566 earned a Statement of Accomplishment. That meant scoring at least a 70% on the course quizzes, and, for the subset who earned Statements “with distinction,” scoring at least 70% on the peer-graded assessments, too. That’s a completion rate of 14% of all active participants, better than your average MOOC. (And don’t tell me that’s a terrible rate compared to traditional college courses. With no cost and no credits on the line, MOOC completion rates are a different beast.)
According to the pre-course survey, which had a 36% response rate, most of our participants were in our target audience, with 28% of them graduate students and 18% post-docs. Current STEM faculty were well represented, as well, composing 29% of course participants. Almost all of the participants (99%) already had a college degree, which makes perfect sense given our audience. The majority of participants came from the United States, but 25% lived elsewhere.
Our participants put the “S” in STEM, with 39% working in the biological sciences and 20% in the physical sciences. Engineers made up 12% of the participants, with earth sciences, mathematics, and computer science on the board with 5%, 5%, and 4%, respectively. Again, that’s from the pre-course survey (completed during the first two weeks of the course), but the disciplines identified in discussion forum posts support this distribution.
Feedback from the course participants was generally positive. (More on the “generally” below when I discuss revisions.) Here’s one of my favorite quotes from a student:
“I really loved the course and I will definitely be using the skills that I’ve acquired. I’m teaching my first class next semester, and I’ve already been putting these new tools you’ve provided me with to good use. I’m much more excited and less nervous about teaching now!”
MOOC-Centered Learning Communities
So far, these are the kinds of stats (and quotes) you see for most MOOCs. Our course had an additional layer, one we called MOOC-Centered Learning Communities (MCLCs). Since our target audience, STEM grad students and postdocs, came in clusters (at research universities), we encouraged participants to form local learning communities, meeting regularly throughout the run of the course to go deeper with the course material. We also reached out to colleagues in the CIRTL Network and at institutions outside the Network, encouraging them to facilitate MCLCs.
As best we can tell, 42 colleges, universities, and research institutes hosted MCLCs. Most were in the US, but we know of communities in the UK, Australia, and Germany, too. Most of the MCLCs were composed of grad students and post-docs, but some were all faculty. Several of the MCLCs were part of credit-bearing, graduate-level courses in STEM teaching on their respective campuses. According to the post-course MCLC facilitator survey (which featured responses from 24 facilitators), the average community size was 10, which means that approximately 420 of our 4,000 active participants (or 10%) were in local learning communities.
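For the numerically inclined, that back-of-envelope estimate works out like this (a quick sketch using the numbers reported above; the variable names are mine):

```python
# Back-of-envelope estimate of MCLC participation, using figures from the post.
institutions = 42        # colleges, universities, and research institutes hosting MCLCs
avg_community_size = 10  # average size from the post-course facilitator survey
active_participants = 4009

mclc_participants = institutions * avg_community_size   # estimated learners in MCLCs
share = mclc_participants / active_participants         # fraction of active participants

print(mclc_participants)           # 420
print(round(share * 100))          # 10 (percent)
```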
We have pretty good evidence, based on Coursera platform data and pre- and post-course surveys, that participants in local learning communities were twice as likely to complete the MOOC. That’s probably why our completion rate (14%) was above average. Certainly, we heard some very positive things from our MCLC facilitators. Here are just two quotes:
“Being at a research intensive institution, it’s been difficult to find and connect with other graduate students who are interested in education and teaching as a future career. This learning community allowed us to meet and connect and, hopefully, will serve as the start of an education interest group.”
“Our participants reported that the primary benefit of the MCLC was accountability to actually watch the videos and stay current with the course, and secondarily growing our local STEM teaching community.”
Each MCLC facilitator was sent a facilitator’s guide, which included information on learning goals, videos, and discussion prompts from the MOOC itself, along with suggested activities for in-person meetings that would build on the online learning activities. Facilitators reported finding the guides very useful, especially the suggested activities. In turn, we asked facilitators to share highlights from their local meetings. As an instructor of a 4,000-student, wholly online course, I found it quite gratifying to see how some of our students were applying the principles and practices introduced in the course!
What’s different for the Fall 2015 offering of “An Introduction to Evidence-Based Undergraduate STEM Teaching”? Lots.
The biggest change is that we’ve split the course into two smaller courses. By far, the biggest complaint we received about the Fall 2014 offering was that there was just too much of it. Most everyone said that the course learning activities (videos, discussions, activities, assignments) were very high quality, but asking participants to spend 8-10 hours per week (or more!) on the course was unrealistic. In hindsight, that was obvious. What STEM grad student or post-doc has that kind of time for a non-credit course on effective teaching? We aimed for 8-10 hours per week because that kind of time on task multiplied by an eight-week course came out to something equivalent to a one- or two-credit traditional course. That’s the scope we were aiming for, but, as so many participants said, it was just too much.
We’ve taken the eleven content modules from the Fall 2014 version of the course and distributed them across two eight-week courses. The first, still titled “An Introduction to Evidence-Based Undergraduate STEM Teaching,” will run this fall, September 28 through November 19. (Sign up here!) That course will focus on a few key principles of learning, crafting learning objectives, matching those objectives with assessment of learning, developing a framework for active learning in the classroom, and practicing inclusive teaching. The final deliverable (peer-graded, naturally) will be an annotated lesson plan through which participants apply what they’ve learned in the course.
The second course, titled “Advancing Learning through Evidence-Based STEM Teaching,” will consider learning through diversity, teaching-as-research, the flipped classroom, and a variety of specific active learning strategies, including cooperative learning, problem-based learning, peer instruction, and inquiry-based labs. The final project will involve the design of a teaching-as-research project that participants could implement in a future classroom.
If you’re keeping track, that’s four original MOOC modules in the first course and four in the second. Our module on lecturing has been reconfigured into one on the flipped classroom, the module on student motivation has been folded into the module on learning principles, and the module on learning through writing has been retired. (Long story, lots of lawyers, don’t ask.) And we’re developing four new modules, one on active learning that features highlights from existing modules, two on teaching-as-research to serve as a frame for the second course, and one on learning through diversity, which is one of CIRTL’s core ideas.
All that to say, we’re aiming for 4-5 hours per week of time on task for the courses. Each course features six weeks of “content,” with videos and various learning activities, one module per week in most cases, plus two weeks devoted to the final peer-graded projects. That’s sixteen weeks in total, but with the smaller weekly time commitment, we’re expecting greater participation and completion.
Red Pill, Blue Pill
Inspired by the “red pill, blue pill” approach used in the Data, Learning, and Analytics MOOC (#DALMOOC) co-taught by George Siemens, for the second offering of the course, we’re giving participants two very different ways to “complete” the course. (I’m not sure that the “red pill, blue pill” metaphor really works, given how it was used in The Matrix, but George Siemens used it, so I’m going with it.)
Blue Pill: If you’re interested in an instructor-guided path through the course, take the blue pill. In this case, that means your grade in the course will be the average of your quiz grades. Since the course quizzes are reasonably comprehensive, to do well on this path, you’ll need to watch the course videos and work through the course activities as laid out by the instructors. I’m guessing that most participants will take the blue pill, because most are leaning on the instructors to point them to key concepts and practices in STEM teaching. The downside of the blue pill (the instructor-guided path) is that the course quizzes don’t go very high up Bloom’s Taxonomy. Most of the questions are targeted at the “understand” and “apply” levels, which means the blue pill is comprehensive, but not that deep when it comes to practice and feedback.
Red Pill: If you’re interested in a more self-directed path through the course, take the red pill. Your grade in the course will be the average of your grades on the peer-graded assignments. The peer-graded assignments will be focused on key learning objectives, certainly, but they’ll allow learners to meet those objectives in different ways. For instance, the final project in the first course is an annotated lesson plan. If you can put together a lesson plan that meets the criteria on the rubric for this assignment, you’re good to go. You don’t necessarily need to watch all the videos to do so, so the red pill allows learners to pick and choose from the course learning activities to suit their individual interests and needs. Plus, the peer-graded assignments go a little deeper than the course quizzes and feature qualitative feedback from other learners. This means that the red pill is not as comprehensive, but involves deeper practice and feedback.
Functionally, your course grade will be the maximum of your quiz score and your peer-graded assignment score. You’re more than welcome to take both paths, completing quizzes and peer-graded assessments. You’ll likely learn more! Plus, you don’t have to choose in advance. You’re free to pursue both paths at first, and focus on one later during the course. We think the red pill / blue pill approach (thanks again, George!) will help more students get out of the course what they need, and lead to higher completion rates. Not that completion rates are all that (see above), but many students appreciate the feeling of accomplishment that comes with completing a course.
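If it helps to see the grading rule spelled out, here’s a minimal sketch (the function names are mine, not Coursera’s actual implementation, and I’m assuming the 70% threshold from the first offering carries over):

```python
# Sketch of the "red pill / blue pill" grading rule described above.
# Hypothetical helper functions; not Coursera's actual code.

def course_grade(quiz_scores, peer_scores):
    """Final grade is the max of the average quiz score and the
    average peer-graded assignment score. Either list may be empty."""
    def avg(scores):
        return sum(scores) / len(scores) if scores else 0.0
    return max(avg(quiz_scores), avg(peer_scores))

def earns_statement(quiz_scores, peer_scores, threshold=0.70):
    """Assumes the 70% bar from the Fall 2014 offering."""
    return course_grade(quiz_scores, peer_scores) >= threshold

# A "red pill" learner who skips the quizzes but does well on
# the peer-graded assignments still completes the course:
print(earns_statement([], [0.85, 0.90]))  # True
```

The point of the `max` is that neither path can hurt you: doing both sets of assessments can only raise your grade.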
I’ll mention two more changes to the next offering of the course, one local and one global.
Half the Battle
Last year, given where we were in our production schedule, when most MCLC facilitators agreed to host local learning communities, they were going on faith that the course would be an interesting and useful one. We weren’t in a position to release course content (especially videos) early to facilitators, and the facilitator’s guide itself was a just-in-time kind of thing. Our facilitators made it clear in their post-course survey that they would have appreciated earlier access to course materials. This time around, we can do better!
Most of the videos from the first MOOC are now available on the course website (just click on Course Content), so facilitators — and learners — are welcome to preview this key part of the course. And our revised facilitator’s guide is almost done, so we should have that in the hands of our facilitators a couple of weeks before the MOOC begins. Also, this time around we have one of our grant-funded fellows, Noah Green, acting as coordinator for our MCLC facilitators. He’s already getting them organized, and he’ll be hosting a virtual Q&A with a few veteran facilitators on September 15th from 3 to 4pm Central.
If you’re at all interested in hosting a local learning community for the fall offering of the course, please visit the course website for more information and to sign up to get our facilitator’s guide.
After the Fall 2014 offering of the course, I was struck by a number of comments made by MCLC facilitators. Here are two examples:
“Some students engaged in the online forum earlier on in the MOOC; however, as time went on, it seemed apparent that the online discussion board was not used by our participants.”
“We were all very pressed for time, and I think most of us prioritized the in-person MOOC over the online forum since it was nice to be able to discuss things in person.”
It makes a lot of sense that participants in a learning community would reach out locally to discuss what they were learning in the course before going online to connect with the global learning community organized by the MOOC. But I can’t help but think that this is a lost opportunity. As I mentioned above, one of CIRTL’s core ideas is learning through diversity. There’s real value in interacting with someone interested in STEM teaching who is studying or teaching at a different institution. For instance, consider that many STEM grad students are interested in learning what it’s like to teach at a community college or small liberal arts college. We had faculty members from such institutions participate in the MOOC. Can’t we find a way to connect these two groups?
When I shared the above comments at an EDUCAUSE Learning Initiative conference back in February, my friend and mentor Gardner Campbell said simply, “Networked learning is hard.” He should know — he’s been facilitating networked learning communities of various kinds for years. His comment led me to consider the kind of experience our target audience, STEM grad students and post-docs, might have with online, networked learning. It’s likely pretty minimal, certainly when compared to the experiences of their colleagues in the digital humanities. In the world of DH, it’s common practice to connect with other scholars through Twitter and blogs. I don’t think that’s the case in most of the STEM disciplines.
My goal for the upcoming offering of the course is to design a series of networked learning activities, scaffolded to help participants develop their skills at learning online from colleagues. I still need a catchy name for these activities (“networked learning activities” is on point, but not that memorable), but I have a draft list of the activities themselves. We’ll have participants connect with each other on the Coursera discussion boards in targeted ways. We’ll ask participants to investigate Twitter and other platforms for social learning. We’ll have learning communities send representatives to virtual forums to exchange highlights from their local conversations. We might hack Coursera’s peer-graded assignment tools to create “flash” feedback sessions that last just a couple of days. And we’ll ask participants to come up with their own experiments in networked learning and report on the results.
All these activities will be optional, of course, but for those interested in developing their lifelong learning skills and in hearing perspectives they might not hear locally, I think they’ll be valuable.
The Fall 2015 offering of “An Introduction to Evidence-Based Undergraduate STEM Teaching” starts September 28th. I hope you’ll join us.