Generative AI as a Learning Technology

On last week’s episode of the Intentional Teaching podcast, I talked with educators and authors James Lang and Michelle D. Miller about ways we might rethink our assignments and courses in light of new generative AI tools like ChatGPT. Since we situated that conversation in the context of other technological disruptions to teaching and learning, including the internet and Wikipedia, perhaps it was inevitable that one of us drew a comparison to the advent of handheld calculators in math classes.

Jim pointed out that we don’t just hand kindergartners calculators and expect them to do anything useful. We have to teach kids numeracy skills before they start using calculators so that they understand what they’re doing when they use them. In the same way, Jim argued, we shouldn’t have first-year composition students use ChatGPT to help them outline their essays, since those students need to develop their own pre-writing and outlining skills. The chatbot might produce a sensible first draft, but it would also short-circuit the student’s learning. ChatGPT might be more appropriate for experienced writers, who can use the tool to save time, just as a more experienced math student would use a calculator for efficiency.

I generally agree with this analysis, but I had a different kind of experience using calculators in school. When I learned calculus my senior year, we used graphing calculators regularly both in and out of class. My memories are admittedly a little hazy now, but I believe that there was something of a symbiotic relationship between learning the concepts of calculus and learning how to use a calculator to manage the calculations of calculus. For instance, we might try to come up with a polynomial that had certain roots or certain slope properties, then graph our function using the calculator to see if we were correct. The tool provided feedback on our math practice, while we also got better at using the tool.

[Photo: four people in the woods looking at distant birds through binoculars and cameras]
That’s me there with the telephoto lens on a bird walk.

Here’s another analogy: photography. I’m an amateur photographer. (Actually, I once got paid $75 to photograph an event, so technically I’m a professional photographer.) When I was learning photography, there was a lot of conceptual learning about light, depth of field, and composition, but there was also the practical learning of how to use my digital camera and what all its knobs and buttons did. As I experimented with taking pictures, my use of the camera helped sharpen my understanding of the relevant concepts of photography. And my better understanding of those concepts in turn informed the ways I used the knobs and buttons on the camera to take better photos.

Might AI tools like ChatGPT serve a similar role, at least for students with a certain level of foundational writing skills? It’s already quite easy to ask ChatGPT (or Bing or one of the other chatbots powered by large language models) to draft a piece of writing for you and then to give it feedback or corrections to incorporate. As an example, check out this post by John Warner, in which he coaches ChatGPT to write a short story and then make it a better story. John is already an accomplished writer, but might a more novice writer use prompt refinement in this way to develop their own writing skills, much like I would use a bunch of different settings on my camera to take the same photo so I could better learn what those settings do?

All metaphors are wrong (to quote my colleague Nancy Chick), and none of the analogies I’ve laid out here are perfect. But I think there is some value in thinking about ChatGPT and its peers as similar to digital cameras or graphing calculators, technologies we can use to learn skills and sharpen our craft even as we learn to manipulate the tools themselves.
