Everywhere I look online these days, I see educators talking about the challenges that artificial intelligence (AI) poses to teaching and learning. While recent discussions have focused on AI writing generators, like Playground and ChatGPT from OpenAI, I’ll start this post with an example of an AI image generator. Dan Cohen, dean of the libraries and professor of history at Northeastern University, blogged the following earlier this month:
In my digital history class we used new AI tools to see how easy it would be to forge historical documents, e.g., to create fake photographs of D-Day, and the results were scary.
Over on Mastodon, I asked Dan what he meant by “scary.” He replied, “Very easy to deceive non-specialists.”
Davidson College’s Mark Sample posted the following image, saying, “I dunno, I’d say more cute than scary.”
This interchange points to at least one important observation about AI and teaching: We are going to have to start teaching our students how AI generation tools work. I don’t think Mark Sample’s image will convince anyone that the fictional character Paddington Bear was on the beach at D-Day, but the image Dan Cohen shared is a more subtle fakery. There weren’t any tanks in the D-Day invasion force, right? And consider something I learned about ChatGPT at a conversation hosted by Dave Cormier and Brenna Clarke Gray: The text corpus used to train the tool leans heavily on Reddit posts, which will naturally favor certain kinds of outputs over others. Just as we once needed to understand and explain how Wikipedia or Google search works, we now need to treat AI generation tools as part of the digital and information literacy we teach. Our librarian colleagues are great resources for this, but faculty can’t depend on librarians to carry the full load of digital and information literacy. Instructors in all kinds of fields need to start thinking about how to incorporate this kind of literacy instruction into their courses.
One way to help students understand how AI generators work is to have them play around with these tools, as Dan Cohen did in his digital history course. Students will be savvier consumers of AI-generated media if they have some experience as creators of it. A group of writing and rhetoric faculty at the University of Mississippi has spent the last semester asking students to use AI tools in just this way. One of those faculty, Marc Watkins, wrote a guest post on Inside Higher Ed last week titled “AI Will Augment, Not Replace.” After having students use AI writing and research tools as part of scaffolded assignments, Marc and his colleagues talked with their students about the experience. Students generally found the tools helpful, noting that Elicit’s AI research tool and Fermat’s counterargument generator helped them approach their paper topics from new perspectives.
This leads to a second important observation about AI and teaching, a point that Watkins makes in his post: When used intentionally, AI tools can augment and enhance student learning, even towards traditional learning goals. When I talked with University of Mississippi’s Robert Cummings on my podcast, he mentioned that the pop duo YACHT used AI generators in the creation of their recent album. The duo fed their own lyrics and music into AI tools and used the output to inform, but not dictate, their new compositions. YACHT member Claire L. Evans said in an interview, “Ultimately, we did that by cobbling together a bunch of different approaches, picking and choosing the best of what the technology could offer and combining it with far more analog strategies for composition and arrangement.” When Cummings heard this story, he told me it was a light bulb moment. “That’s how we can use [AI] in the writing classroom. The way forward… is to look at a tool, figure out what the strengths and weaknesses of the tool are, and design specific interactions with specific composing purposes.”
I’m reminded of teaching linear algebra years ago shortly after the “answer engine” Wolfram Alpha debuted. Using the computational power of Mathematica, access to a number of databases, and (these days) AI tools, Wolfram Alpha can provide correct and comprehensive answers to all kinds of math problems. I invited my linear algebra students to use the tool on homework assignments and exams to handle some of the more tedious computations one finds in linear algebra. For instance, a lot of linear algebra problems involve modeling a situation with a matrix, using a row-reduction technique to simplify that matrix, then interpreting the results in light of the original problem. Row reduction is tedious and error-prone when done by hand, but Wolfram Alpha can handle it easily. I asked my students to row reduce by hand one decently sized matrix on their first homework assignment so they would have a feel for the row reduction operations, and then I never asked them to do that again.
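For readers who haven’t done row reduction in a while, here’s a minimal, pure-Python sketch of the Gauss-Jordan elimination that a tool like Wolfram Alpha performs behind the scenes. (The three-equation system below is a made-up example for illustration, not one from my course.)

```python
# A pure-Python sketch of Gauss-Jordan row reduction, the tedious
# computation the post describes handing off to Wolfram Alpha.

def rref(rows):
    """Return the reduced row echelon form of a matrix (list of lists)."""
    m = [list(map(float, r)) for r in rows]
    nrows, ncols = len(m), len(m[0])
    pivot_row = 0
    for col in range(ncols):
        if pivot_row >= nrows:
            break
        # Find a row with a nonzero entry in this column to use as pivot
        pivot = next((r for r in range(pivot_row, nrows) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[pivot_row], m[pivot] = m[pivot], m[pivot_row]
        # Scale the pivot row so the pivot entry becomes 1
        scale = m[pivot_row][col]
        m[pivot_row] = [x / scale for x in m[pivot_row]]
        # Eliminate this column's entry from every other row
        for r in range(nrows):
            if r != pivot_row and m[r][col] != 0:
                factor = m[r][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m

# Augmented matrix for: x + y + z = 6,  2x - y + z = 3,  x + 2y - z = 2
solved = rref([[1, 1, 1, 6], [2, -1, 1, 3], [1, 2, -1, 2]])
print([round(row[-1], 9) for row in solved])  # → [1.0, 2.0, 3.0]
```

Working through those scale-and-eliminate steps by hand even once, as I asked my students to do, makes it clear why handing the bookkeeping off to a tool frees attention for the modeling and interpretation steps that actually matter.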
Inviting my students to use Wolfram Alpha to do their row reduction meant, among other things, that I could ask different kinds of questions on my exams. I could ask students to model a more complex situation with a matrix, then use that matrix to solve a problem, knowing that the tool could handle the worst of those computations. Students could focus on the more challenging and interesting steps in that problem solving process: setting up the matrix based on the problem and interpreting the results of the computation. This leads me to my third observation about AI tools and teaching: We will need to update our learning goals for students in light of new AI tools, and that can be a good thing. As John (Why They Can’t Write) Warner has been writing this past month, and I’m paraphrasing here, if robots can write a decent English essay, then maybe we shouldn’t be asking our students to write decent English essays. We should be asking them to do something else, something more meaningful in their writing than generating five-paragraph essays. Warner has been arguing for a transformation in the teaching of writing for years, and these AI developments have illustrated very clearly why that transformation is needed.
There’s a lot more to say on this subject, of course, but I hope these three observations are useful as you make sense of this new technology landscape. Here they are again for easy reference:
We are going to have to start teaching our students how AI generation tools work.
When used intentionally, AI tools can augment and enhance student learning, even towards traditional learning goals.
We will need to update our learning goals for students in light of new AI tools, and that can be a good thing.
How each of these plays out in individual classrooms will vary widely, of course. I would love to hear what you’re doing in your teaching in response to these new AI tools. Feel free to leave a comment here on the blog, or reach out to me through my contact form.