How to... use AI to teach some of the hardest skills
When errors, inaccuracies, and inconsistencies are actually very useful
The AI panic is spreading in education. Released near the end of a school semester, without guidance or clear suggestions for educators, ChatGPT has seen a rapid adoption that might be among the largest, fastest transformations in education. It instantly challenges the entire existence of a valuable type of assignment, the essay, along with dozens of other assignment types in fields from programming to poetry. Cheating is an obvious outcome. Evaluations may require a return to oral exams and writing longhand in exam books. Or maybe AI-written text will be identifiable by other AIs. Or assignment types will adjust to the new reality. Either way, the cheating problem will be solved, eventually.
So I don’t want to discuss that.
Instead, I want to discuss the opportunity provided by AI, because it can help us teach in new ways. The very things that make AI scary for educators (its tendency to make up facts, its lack of nuance, and its ability to produce excellent student essays) can be used to make education better.
This isn’t for some future theoretical version of AI. You can create assignments, right now, using ChatGPT, that will help stretch students in new ways. We wrote a paper with the instructions. You can read it here, but I also want to summarize our suggestions. These are obviously not the only ways to use AI to educate, but they solve some of the hardest problems in education, and you can start experimenting with them right now.
The Transfer Problem
The goal of education is straightforward: to permanently change someone’s knowledge about the world.
In order to do that, students need to be able to transfer what is learned in the classroom to other contexts. Transfer is difficult because it requires deep understanding of a concept. Initially, when students learn about a new concept in one context, they often fail to recognize that concept when they encounter it again in a new context. For instance, in a math class (context 1), students may learn how to compare percentages and decimals, but when faced with a food label (context 2) or a medical decision (context 3), they may fail to apply that knowledge. This is the case because a) students tend to focus on the concrete aspects of any given problem or situation and b) applying knowledge to a new context requires a deep understanding of the underlying structure of a concept. To use what they previously learned, students need to recognize that the former problem (from math class) is the same problem (ah, just like in math class!) in a new context.
It is hard to prove that transfer occurs, and harder still to teach people to transfer knowledge. But we think AI can help, because it is both very good at making stuff up and somewhat bad at doing so accurately.
AI is a cheap way to provide students with many examples, some of which may be inaccurate, or need further explanation, or may simply be made up. For students with foundational knowledge of a topic, you can use AI to help them test their understanding, and explicitly push them to name and explain inaccuracies, gaps, and missing aspects of a topic. The AI can provide an unending series of examples of concepts and applications of those concepts and you can push students to compare examples across different contexts, explain the core of a concept, and point out inconsistencies and missing information in the way the AI applies concepts to new situations. You can use the confident errors of AI to your advantage and ask students to explore AI’s output and then do the hard work of improving that output.
The basic idea is to have students ask the AI to create scenarios that apply a concept they learned in class: Create a Star Wars script illustrating how a bill becomes a law. Show how aliens might use the concept of photosynthesis to conquer Earth. Write a rap that uses metaphors. Then, ask the students to critique and dive deeper into these models, and potentially suggest improvements.
There is more in the paper, including prompts, but the ability to generate endless convincing (but slightly wrong) examples of a concept being applied allows us to approach transfer in a new way.
Breaking the Illusion of Explanatory Depth
We think we understand the world far more than we do, which makes us less willing to learn.
This is called the illusion of explanatory depth, a cognitive bias that occurs when an individual overestimates their understanding of a concept or phenomenon. This bias is often characterized by a person's ability to provide a detailed and seemingly knowledgeable explanation of a subject despite an actually rather limited understanding of it. For instance, most of us can’t quite explain how a car engine works, how a fridge works, or even how a pencil is made. But we are under the illusion that we have a depth of understanding about the topic. Students, too, can easily fall for this illusion, assuming that they understand how something works when, in fact, they have only a shallow understanding of the topic.
Breaking the illusion requires confronting one’s own ignorance, which can be an upsetting and humbling experience, and hard for teachers to pull off. So let AI do it for you.
We created an assignment where students ask the AI to explain a particular concept step by step, something AI is very good at. Students should then improve this output by adding information, considering the order of the steps, and re-thinking the depth of their knowledge about the topic. Here, we are using AI to come up with steps in a process so that students can critique and improve upon a process. The prompt can include something that students feel they understand well or something complex that will require additional research or numerous steps to make whole.
Again, the details are in the paper, but students are asked to add, remove, and combine steps, using their own research, until they get an understanding of the process or approach, and realize how complicated it really is while achieving some mastery over it.
Practicing Evaluation: The Power of Teaching Someone
When students hear you explain and discuss a concept, they often feel that they understand what you mean, but that feeling isn’t always accurate. One powerful way to turn concepts from theory into practice is to teach someone else, to evaluate their work, and to give concrete and timely advice about how to improve. As any teacher knows, the act of assessing and evaluating someone else’s work and teaching someone else improves our own knowledge of a topic.
By acting as a “student,” the AI can provide essays about a topic for students to critique and improve. The goal of this exercise is to have the AI produce an essay based on a prompt and then to “work with the student” as they steadily improve the essay by adding new information, clarifying points, adding insight and analysis, and providing evidence. We take advantage of the AI’s tendency to simplify complex topics, and its lack of insightful analysis, as a backdrop for the student to provide evidence of understanding.
In this assignment, you’ll give students an essay prompt for the AI, and it will be their job to give the AI suggestions for improvement. They’ll paste in the original essay, their suggestions, and the final output. The process will push them to think critically about the content and articulate their thoughts for improvement in a clear and concise manner. They may need to seek out additional information to fill the gaps the AI essay might be missing, or double-check the “facts” that the AI presents.
If nothing else, they will recognize how hard teaching and evaluating the work of others can be!
Embracing the power of AI for learning
On a Tuesday, I introduced my class to ChatGPT. By Thursday, over half the class admitted to using it in a wide variety of ways. No one admitted to cheating, but students were ingenious in their applications: they used the AI for feedback on essays, suggestions on topics, as a tutor to explain concepts, and as a way of finding and correcting errors in assignments that ranged from literature to engineering. Educators face a new world of ubiquitous AI use in class, and the dominant stories are often ones of fear: what does this mean for education? The answers are not clear yet, but it is important that we consider how the ability to quickly and cooperatively generate content can be used to boost pedagogy, even as it threatens old methods.
We hope you will take a look at the draft paper, which includes a lot of detail on how to set up and assess these assignments, and share suggestions in the comments below about how you are using AI to teach, or to learn.
If you think your students don’t know about ChatGPT, or aren’t using it, you are probably wrong.