How to Use ChatGPT as an Example Machine

February 19, 2023


Chris Mah

Listen to the interview with Chris Mah and Sarah Levine:

Sponsored by Listenwise and Wipebook


This post was co-written by Sarah Levine.

Editor’s Note: This post takes a pragmatic approach to ChatGPT while acknowledging some of the problems that have already arisen with the tool. Other ethical issues with the tool’s development have been brought to light in recent weeks that are not mentioned here. It is our belief that teachers are most effective when they are informed and empowered, so we urge you to stay mindful of these complexities when making decisions for your own practice.

You have probably already heard about or tinkered with ChatGPT (the “GPT” stands for “generative pre-trained transformer”). ChatGPT is a chatbot (or “bot”) powered by artificial intelligence (AI). You can have a conversation with it, prompt it to write essays, create recipes, make medical diagnoses, mimic famous authors, and code software. Its outputs are impressively human-like. In just five days, it gained one million users, a milestone that took Facebook ten months to achieve.

ChatGPT comes with risks: Students can now copy and paste a school writing assignment into the bot’s dialogue window, and the bot will spit out an essay in a couple of minutes. Also, because ChatGPT gets most of its content and genre knowledge from the web, it can generate sexist, racist, or blatantly false information, all in an authoritative tone.

As literacy researchers and former high school teachers, we recognize these risks. BUT: we don’t think we can avoid ChatGPT and its family of AI tools. It is not just a future vision of learning; it is already here. And we are only scratching the surface of what AI will do in the next five to ten years. So this post is about one particular way that teachers can use ChatGPT to support kids’ learning: Using ChatGPT as an example machine.

ChatGPT’s ability to quickly produce a lot of examples of different kinds of texts makes it uniquely suited to learning strategies like concept attainment, inductive learning, or the one we’ll focus on here, called contrasting cases.

Why Multiple Examples Are So Important in Learning

The basic idea behind strategies like contrasting cases or concept attainment is that one example is not enough to help people learn about a thing (like an idea or a process). People need to compare multiple, contrasting examples.

Not coincidentally, ChatGPT is built on a similar type of machine learning. For example, for a computer to recognize an image of a dog, it has to process thousands of images of dogs — but also needs to process images of “not dogs,” like cats, foxes, and wolves — to accurately construct a general definition and model of dog. In other words, to know what a dog is, you also need to know what it is not. 
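
For readers curious how that kind of example-and-non-example learning looks in code, here is a toy sketch using scikit-learn. The animals and features are made up for illustration, and this is nothing like the scale or architecture of ChatGPT’s actual training; it only shows that a classifier needs both positive and negative examples to learn a boundary.

```python
# Toy illustration: a classifier can only learn what "dog" means if its
# training data also contains "not dogs." (Made-up features, illustration only.)
from sklearn.linear_model import LogisticRegression

# Features: [weight in kg, ear floppiness 0-1, howls 0/1]
animals = [
    [30, 0.9, 0],  # labrador  -> dog
    [8,  0.8, 0],  # beagle    -> dog
    [25, 0.7, 1],  # husky     -> dog
    [4,  0.1, 0],  # cat       -> not dog
    [7,  0.2, 1],  # fox       -> not dog
    [40, 0.1, 1],  # wolf      -> not dog
]
labels = [1, 1, 1, 0, 0, 0]  # 1 = dog, 0 = not dog

model = LogisticRegression().fit(animals, labels)
print(model.predict([[20, 0.6, 0]]))  # classify a new, unseen animal
```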

Across disciplines, teachers use multiple contrasting examples to help students develop mental models of processes and concepts. Social studies teachers provide examples of different types of governments, English teachers provide examples of different literary interpretations, and science teachers give examples to illustrate different types of energy. Students can sequence, rank, or categorize these examples in ways that help them understand underlying concepts. 

But developing these examples can take a lot of time. ChatGPT can generate any number of contrasting examples in a few minutes. Then teachers can revise and perfect those examples, and have more time to dedicate to other creative lessons. 

How to Get Good Examples from ChatGPT

To use ChatGPT as an example machine, teachers first need to know how to prompt it effectively. ChatGPT will do exactly what it is prompted to do and nothing more, so it’s important to give it precise instructions. Here are the steps of a general approach we have been using to prompt the bot (for anyone who prefers to script the workflow, a short code sketch follows the list):

1. Establish criteria: Think about the elements that define a strong example of the thing you want kids to learn about (you can draw from your rubrics or final assignments to do this).

2. Prompt: Prompt ChatGPT to create an example of whatever concept, process, or phenomenon you are trying to teach.

3. Review and adapt: Review the bot’s output and adapt it as you see fit.

4. Prompt for contrasting cases: Prompt it to create several non-examples, incorrect examples, or ambiguous examples. These are your contrasting cases.

5. Review and adapt: Review the outputs again and adapt them as you see fit.
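
Here is that sketch, a minimal illustration using the OpenAI Python client (the pre-1.0 openai package interface is shown, and the API key and model name are placeholders). The prompts are illustrative rather than the exact ones from our screenshots, and the chat window at chat.openai.com works the same way with no code at all.

```python
# Minimal sketch of the five-step workflow through the OpenAI API
# (pre-1.0 "openai" package interface). Steps 1, 3, and 5, establishing
# criteria and reviewing/adapting, still belong to the teacher.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

messages = []  # keeping the running conversation lets the bot "remember" earlier prompts

def ask(prompt):
    messages.append({"role": "user", "content": prompt})
    response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    return reply

# Step 2: prompt for examples that meet your criteria.
print(ask("List five real-life examples of kinetic energy, one sentence each."))

# Step 4: prompt for contrasting cases in the same conversation.
print(ask("Now list five real-life examples of potential energy, one sentence each."))
```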

Here’s an example. With this one, our goal was to help students learn the underlying principles of kinetic and potential energy. We used ChatGPT to generate a range of different examples of kinetic and potential energy. Kids could sort these examples into categories and then explain their choices. In the screenshots below, the text next to the yellow icon is our prompt, and the text next to the green icon is ChatGPT’s response. Following the process we introduced earlier:

Step 1 – Establish criteria: We wanted students to know that potential energy is energy that is stored, and kinetic energy is energy in motion.

Step 2 – Prompt: We asked ChatGPT to generate a list of five real-life examples of kinetic energy.

Step 3 – Review and adapt: These examples looked pretty good. The fifth one — waterfall cascading down a cliff — might be a good example of both kinetic and potential energy. We could keep this one if we think our kids are ready for some ambiguity in the sorting exercise.

Step 4 – Prompt for contrasting cases: Next, we asked ChatGPT to generate examples of potential energy. These serve as our contrasting cases. (When prompting ChatGPT, you can basically talk to it as you would a person. For the most part, it remembers your previous prompts.)

Step 5 – Review and adapt: These examples also looked pretty good. We decided that we wanted to add one more ambiguous example that would increase the complexity of the contrasting cases. So we prompted ChatGPT:

We revised that example to simply read: “A car moving down a hill.” That way, when we included it in our sorting exercise, students would have to do the work of defining the kinds of energy represented in the example. 

Reviewing the outputs again, we noticed that three examples were very similar: a car moving down the highway (kinetic), a roller coaster at the top of the hill (potential), and a car moving down a hill (both). The similarities made these three examples a great illustration of contrasting cases, because students would need to come up with a nuanced rationale for why they represent different types of energy. 

We could now mix up these examples on a Padlet or Google Slide, or print each one out on a separate piece of paper. Then students can debate with one another about each example, categorize them, and explain their choices. In their explanations, they would be constructing criteria that characterize each type of energy.
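
If you want to prepare the sorting exercise itself, a few lines of Python can shuffle the adapted examples into unlabeled cards plus a teacher answer key. The example text below paraphrases the items from this walkthrough and is illustrative only.

```python
# Turn labeled examples into a shuffled, unlabeled sorting deck plus an answer key.
import random

examples = [
    ("A car moving down the highway.", "kinetic"),
    ("A roller coaster at the top of a hill.", "potential"),
    ("A waterfall cascading down a cliff.", "both"),
    ("A car moving down a hill.", "both"),  # the ambiguous case we added
]

random.shuffle(examples)

for i, (text, _) in enumerate(examples, start=1):
    print(f"Card {i}: {text}")  # cards for students, no labels

print("\nAnswer key:")
for i, (_, label) in enumerate(examples, start=1):
    print(f"Card {i}: {label}")  # key for the teacher
```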

Using the Example Machine for Different Learning Goals

Here are some ways to use ChatGPT to generate contrasting cases to help students understand underlying conceptual principles and build mental models. We have organized our examples based on three common learning goals: learning about processes, concepts, and arguments. 

Examples to Explore Processes

Students learn about processes in many subjects. For example, they learn the steps for balancing chemical equations, solving mathematical inequalities, describing mitosis, or explaining how a bill becomes a law in the U.S. One way to help students learn the underlying logic of different processes is to generate contrasting cases of correct and incorrect sequences. 

In the example below, we ask ChatGPT to provide a few contrasting cases to teach young students about the life cycle of a seed. 

Step 1 – Establish criteria: First, we considered the steps we wanted students to understand.

Step 2 – Prompt: We prompted the bot to generate an accurate representation of that cycle. We also prompted the bot to write at the level of a five-year-old, so we could ensure the explanation was accessible.

Step 3 – Review and adapt: We checked the bot’s output to make sure it was accurate. We liked the initial result, so we kept it without revision.

Step 4 – Prompt for contrasting cases: In process learning, students need to know both what happens and in what order. A logical activity is to have students move the steps from an incorrect order into the correct one. Normally, we would re-organize the steps manually, but ChatGPT can do it faster.

We decided we also wanted a second contrasting case, so we asked ChatGPT to describe the seed’s life cycle using the correct order, but leaving out steps. Now students would have to figure out what steps were missing: sprout growth and seed production. Again, this was something we would typically do ourselves, but giving the task to ChatGPT saved us some time.

Step 5 – Review and adapt: We liked these examples and decided to keep them without revision. 

Just as with the activity about potential and kinetic energy, we could now present these contrasting cases to students either digitally or on paper, ask them to identify the most accurate representations, and then explain why. In doing so, students will start to develop “rules” about both the events in a seed’s life and their order. 

Using slight prompt modifications, teachers can change the order, accuracy, and completeness of the contrasting cases to deepen students’ thinking about complex processes. 
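
As a concrete illustration, here are the kinds of prompt variations we mean. The wording is ours, not the exact prompts from the screenshots, and each one could be pasted into the chat window or sent with a helper like the ask() function sketched earlier.

```python
# Illustrative prompt variations for contrasting cases of a process.
base_prompt = ("Explain the life cycle of a seed in five short numbered steps, "
               "written so a five-year-old can understand it.")

contrasting_case_prompts = [
    "Rewrite those steps in a mixed-up, incorrect order.",               # wrong order
    "Rewrite the correct version, but leave out two of the steps.",      # incomplete
    "Rewrite the correct version, but make one step factually wrong.",   # inaccurate
]

for prompt in [base_prompt] + contrasting_case_prompts:
    print(prompt)
```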

Adaptations for other subject areas:

Examples to Explore Concepts

Teachers can also use ChatGPT to teach students to articulate principles that underlie complex concepts like authorial style, statistical variance, or separation of powers. 

One of ChatGPT’s coolest tricks is its ability to adopt the style of different writers or historical figures. It can also write in the style of different genres, like a cover letter, advertisement, or movie trailer. Creating contrasting cases with combinations of these outputs can make for a powerful teaching tool. In the example below, we wanted students to understand how different literary elements can contribute to authorial style, and to consider how those elements might affect their emotional responses to texts. We asked ChatGPT to produce texts in the style of different authors — all authors that were familiar to our students. 

Using these contrasting cases, we could then ask students to match the review with the author, point to and name specific features of the writing that helped them identify the authors’ styles, and then talk about the effects created by those different styles. 

Step 1 – Establish criteria: We wanted students to understand that an author’s style includes a combination of word choice, sentence structure, and attitude toward a topic. We wanted the examples to have different emotional effects. 

Step 2 – Prompt: It would be easier for students to focus on style if the content of the text was similar. So we asked the bot to write about something students would be familiar with. We decided on a review of McDonald’s in the style of three authors. 

Step 3 – Review and adapt: In this review, ChatGPT used phrases like “dreary and tempestuous” and “wandering,” which work well to represent Poe’s language. The bot also did a good job creating a mood of gloom and foreboding. But we realized we wanted to emphasize Poe’s punctuation (lots of exclamation points, for example). So we adapted our prompt.

Well, the bot went a little overboard with the exclamation points, so we might revise one more time. 

Step 4 – Prompt for contrasting cases: Next, we wanted to generate our contrasting cases, so we asked ChatGPT to rewrite the reviews in the style of other authors that our students have read. We started with Sandra Cisneros.

Step 5 – Review and adapt: We didn’t think Cisneros’ characters would moralize about McDonald’s, so we asked the bot to revise. We also realized that a lot of Cisneros’ work is written in present tense. Making the change to present tense would give our students another element of style to consider and analyze. We prompted the bot again.

We might go for another round of revision, but we think this one is beginning to capture the style of Cisneros’ narrators. 

We tried for another example, this time turning to Zora Neale Hurston and Their Eyes Were Watching God for our model. 

We thought some of the phrasing was a little obvious, and when we compared this example with some of Hurston’s prose, we noticed that Hurston sometimes includes short, simple sentences in between longer ones. We adapted our prompt. 
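
Pulling the style walkthrough together, here is roughly how the sequence of prompts and revisions looked; this is a paraphrase of the rounds described above rather than the exact screenshot text, and each round adds a constraint we decided we wanted after reviewing the output.

```python
# Paraphrased sequence of style prompts, one revision per round of review.
style_prompts = [
    "Write a short review of McDonald's in the style of Edgar Allan Poe.",
    "Rewrite it, emphasizing Poe's punctuation, including exclamation points.",
    "Now rewrite the review in the style of Sandra Cisneros.",
    "Rewrite it in present tense, and without moralizing about McDonald's.",
    "Now rewrite the review in the style of Zora Neale Hurston, mixing short, "
    "simple sentences in between the longer ones.",
]

for prompt in style_prompts:
    print(prompt)
```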

Now, here is another example where we prompted ChatGPT to provide some contrasting cases, this time in the context of a social studies unit on forms of government. ChatGPT’s first response was pretty good, but we will show you one way we had to revise our initial prompt to generate more useful responses.

Step 1 – Establish criteria: We wanted students to know that in a representative democracy, citizens vote in free and open elections to choose government leaders. In representative democracies, the rule of law applies equally to all citizens.

Step 2 – Prompt:

Step 3 – Review and adapt: In our first attempt, ChatGPT wrote a pretty accurate description. However, since we ultimately wanted students themselves to identify the types of government described in each example, we needed ChatGPT to remove the term “representative democracy.” We also wanted it to add the criterion that the rule of law applies to all citizens. So we had to revise our prompt.

Step 4 – Prompt for contrasting cases: Next, we needed ChatGPT to generate contrasting cases. In other words, we wanted it to describe other types of governments besides representative democracy, including monarchy and communism. Importantly, because we wanted students to understand representative democracy, we also wanted a contrasting case that showed a different type of democracy. So, we prompted ChatGPT to include a description of direct democracy. Having learned our lesson in our last prompting attempt, we also prompted the bot to leave out the names of the governments themselves.

Step 5 – Review and adapt: We liked these examples, but after reading them through a few times, we decided that they might be boring for kids. We decided to revise them so that each offered a specific scenario that students could connect with different forms of government.

We could print these examples on cards, give them to students, and ask them to identify the kinds of governments described in these examples. We could also go a step further and ask students to rank the descriptions based on the degree of power given to people in each example. This would require students not only to identify the characteristics of representative democracies but also to understand their relationship to other forms of government. There are many variations on these types of sorting activities, and using contrasting cases will help students build rich mental models. ChatGPT magnifies our ability to use contrasting cases by creating useful examples more quickly than we could by ourselves.

Adaptations for other subject areas:

Examples to Explore Arguments 

One of the most important skills students learn in all subject areas is the ability to evaluate and construct arguments. In English class, students might write essays about whether Troy, from August Wilson’s Fences, is a sympathetic character; in social studies, they might consider multiple causes and effects of World War I; in biology, they might make their case for whether a virus is a living organism. We can use ChatGPT to generate examples of arguments with contrasting structures or types of evidence, and then students can evaluate each argument’s effectiveness. For such a lesson design, we would use the five steps we described earlier:

  1. Establish criteria for a strong argument
  2. Prompt
  3. Review and adapt
  4. Prompt for contrasting cases, including weaker or stronger evidence, or different structures
  5. Review and adapt again

We could also use the bot to address a long-standing problem when it comes to teaching students to build arguments: Sometimes we ask students to write an argument about a text, but we do not give them anyone to argue with. In this case, ChatGPT can provide a contrasting case for a student to push back against.

For example, imagine that a high school class has just finished reading Fences. The teacher would like kids to consider the role of the fence as a potential symbol in Wilson’s play. Instead of asking students to write an argument about the role of fences in the play, the teacher could offer students a ChatGPT-generated argument that students could respond to.

This argument in itself becomes a contrasting case for students to reflect on and respond to. They can agree with some of the interpretation, build on some parts of it, and refute other parts. In this way, their argument becomes more authentic. They might even direct their response to the bot. 

We might also offer students an opposing argument about Fences:

With these two arguments in hand, we could ask students to debate the merits of each, search the text for supporting evidence, and write about perspectives beyond just these two. Doing so would help students consider multiple interpretations of complex texts and develop and refine their own arguments. 
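
For teachers who want to script this, here is an illustrative pair of prompts for generating the two opposing arguments. The wording is ours, not taken from the post’s screenshots, and the prompts could be pasted into the chat window or sent with the ask() helper sketched earlier.

```python
# Illustrative prompts for two opposing, debatable interpretations of the fence.
fences_prompts = [
    "In one paragraph, argue that the fence in August Wilson's 'Fences' symbolizes "
    "protection and a desire to hold the family together. Do not quote any lines; "
    "refer to events in general terms.",
    "Now, in one paragraph, argue the opposite: that the fence symbolizes the "
    "barriers Troy builds between himself and the people he loves.",
]

for prompt in fences_prompts:
    print(prompt)
```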

Adaptations for other subject areas:

Before ChatGPT, teachers might spend hours searching for or writing suitable examples of arguments like these. Now, we are learning how to prompt ChatGPT to generate these arguments to suit our teaching purposes. 

What’s Next?

There’s so much more this example machine can do to help us save time and employ our creativity so that we can better serve our students. Here are a few additional prompts that create other kinds of examples:
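
A few illustrative possibilities (hypothetical wording, offered as stand-ins rather than the exact prompts pictured in the post):

```python
# Hypothetical example-generating prompts in the same spirit (not the originals).
more_prompts = [
    "Write three thesis statements about the causes of World War I: one strong, "
    "one vague, and one that is only a statement of fact.",
    "Write two correctly balanced chemical equations and two unbalanced ones, "
    "without labeling which is which.",
    "Write the same short paragraph about a thunderstorm twice: once in a fearful "
    "tone and once in a joyful tone.",
]

for prompt in more_prompts:
    print(prompt)
```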

With all of its possibilities in mind, a reminder: There are lots of ways that ChatGPT can create problems for teachers. The technology is still young, and it does not know when it is wrong. For instance, when we first asked ChatGPT to generate interpretations of Fences, it made up lines from the play. When we asked it to talk about the short story “Linoleum Roses,” by Sandra Cisneros, it added characters that do not exist in that story, and offered unsupportable interpretations of the text. Part of our job now is to help students understand both what the tool can do and what it can’t.


Want some ideas for what to DO with these examples once you have them? Three instructional strategies that would work beautifully with multiple examples like these are contrasting cases, concept attainment, and inductive learning.



14 Comments

  1. Mike Szczepanik says:

    I love the examples and non-examples ideas. I tried it out with a concept from AP Biology to see the complexity ChatGPT could generate.

    I asked ChatGPT the following: Write an essay using the example of insulin to explain the signaling transduction pathway. In the essay, make several mistakes.

    ChatGPT’s response:
    “Insulin is a hormone that’s produced by the liver to help regulate blood sugar levels in the body. When insulin binds to its receptor, it activates a series of intracellular events that lead to glucose metabolism.

    The first step in insulin signaling is the binding of insulin to its receptor. This receptor is a protein found on the cell membrane and has two subunits, an extracellular alpha subunit and an intracellular beta subunit. When insulin binds to the alpha subunit, it causes a conformational change that activates the beta subunit.

    Next, the beta subunit activates a protein called protein kinase A (PKA). PKA is a kinase that phosphorylates other proteins, leading to a series of downstream effects.

    One of the primary targets of PKA is glycogen synthase, which is an enzyme responsible for the synthesis of glycogen. PKA phosphorylates glycogen synthase, which inactivates it and prevents it from synthesizing glycogen.

    Another downstream target of PKA is glucose transporter 4 (GLUT4). GLUT4 is a protein that’s responsible for transporting glucose into cells. When activated by PKA, GLUT4 is inhibited, which prevents glucose from entering cells and leads to elevated blood sugar levels.

    In addition to PKA, insulin also activates a protein called phosphatidylinositol 3-kinase (PI3K). PI3K is a protein that phosphorylates phosphatidylinositol, which leads to the activation of another protein called Akt.

    Akt is a protein kinase that’s involved in a variety of cellular processes, including glucose metabolism. When activated by insulin, Akt activates glycogen synthase and promotes the synthesis of glycogen.

    However, in people with diabetes, the insulin signaling pathway is impaired, leading to elevated blood sugar levels. This can be caused by a variety of factors, including a lack of insulin production or decreased sensitivity to insulin.

    In conclusion, insulin signaling is a complex process that involves the activation of several intracellular proteins in response to insulin binding to its receptor. While insulin is essential for regulating blood sugar levels in the body, impaired insulin signaling can lead to serious health consequences, such as diabetes. By understanding the insulin signaling pathway, we can develop effective therapies for treating and preventing diabetes.”

    Then I asked ChatGPT: “What were the mistakes in the previous response?”

    ChatGPT’s response: “The previous response intentionally contained several mistakes, including:

    1. Insulin is primarily produced by the pancreas, not the liver.
    2. Insulin binds to its receptor on the cell membrane, not the nuclear membrane.
    3. The activated beta subunit of the insulin receptor has intrinsic tyrosine kinase activity, not protein kinase A (PKA).
    4. PKA is not a downstream target of the insulin receptor.
    5. PKA does not inhibit GLUT4; rather, it activates glycogen synthase kinase 3 (GSK3), which inhibits glycogen synthase.
    6. Insulin activates PI3K, not the other way around.
    7. Akt is activated by PI3K, not insulin directly.
    8. Akt promotes glycogen synthesis by activating glycogen synthase and inhibiting GSK3, not by directly activating glycogen synthase.
    9. Diabetes is not caused by impaired insulin signaling alone; it can also be caused by other factors such as genetics and lifestyle choices.”

    This can be a powerful tool to use! I could give the students ChatGPT’s mistake-ridden response and have the students find the mistakes.

    It further solidifies my belief that assessing students’ knowledge requires limiting their access to tech when they generate their responses. That doesn’t mean I think there shouldn’t be opportunities to use tech during assessments; rather, educators must know the purpose of the assessment and create the conditions necessary to elicit responses that show what students know and are able to do.

    • Margaret Harris-Shoates says:

      Thanks for sharing this example, Mike. Having clarity about the purpose of any assessment is definitely essential.

  2. Linda Oshman says:

    In order to get numerous examples for an English assessment, I first had to create an essay that included the criteria we were looking for. I then input the essay into ChatGPT and asked it to turn the essay into a rap, a diary entry, a talk show, and a monologue. Some were better than others and some required tweaking, but overall, it saved us so much time!

  3. Deborah Hoggan says:

    You mentioned that you receive a lot of spam comments. So as a real human, I thought I would drop you a line to express my admiration for what you are doing.
    Thank you for these podcasts and the information on your website. You have been an important inspiration for me over the last few years.
    Keep doing what you are doing.

    • Margaret Harris-Shoates says:

      Thank you for your kind words, Deborah! Jenn will be so pleased to know that the site has inspired you.

  4. It’s exciting to see how technology can help students develop critical thinking skills and help them improve their writing abilities.

    However… I do have some concerns about the ethical implications of using a language model like ChatGPT in the classroom. While the article acknowledges that the tool can sometimes generate incorrect information, it’s important to consider the potential consequences of relying too heavily on AI-generated content. For instance, if students become accustomed to using ChatGPT to generate arguments, they may not learn the critical thinking skills necessary to evaluate and construct arguments on their own.

    Furthermore, while the article mentions that ChatGPT is not connected to the internet, there is no guarantee that it will remain that way in the future. Some people argue that connecting ChatGPT to the internet could make studying easier, as the model could access a wider range of information. However, this could also make students lazier, as they would no longer need to do research on their own.

    I wrote an article about ethical use of ChatGPT for studying, if you’re interested: https://edumentors.co.uk/blog/chatgpt-for-students-ethical-and-effective-ai-powered-learning/

  5. Sarah Levine says:

    Thanks for this thoughtful comment. Completely agree that students (and everyone else) need to remember that current versions of ChatGPT do not actually know anything; they are just amazing versions of autocomplete.

  6. Enrique says:

    Hello! Even since the writing of this blog post, ChatGPT has exploded into so many applications across various sectors of our society. I view it as almost a new complement to all the tools we use. Like Microsoft Office, I see ChatGPT as a new platform that will be used in so many career fields. As this post and interview show, we as teachers are using it to amplify our practices. My one question would be: how are we to scaffold this across grade levels? If a district or middle and high school were to adopt teaching the use of it, within any subject, where and when do we teach certain skills associated with ChatGPT and other AI? What are some best practices I should be considering?

    • Margaret Harris-Shoates says:

      Enrique, you pose a great question! Teaching students the skills associated with understanding, using, and developing AI tools definitely requires intentional planning and scaffolding. To start building a toolkit of best practices, you may want to check out some of the posts saved to this Pinterest board. If you scroll through, there are several resources specific to teaching about AI.

      A couple of other great sites to explore are ISTE’s Artificial Intelligence in Education page and AI4K12.org’s Activity Resource Guides for Teaching Artificial Intelligence in K-12. These resources are both linked in the artificial intelligence section of the 2023 Teachers’ Guide to Tech, if you happen to have it. If not, the 2024 version will be released in January!

      You may also want to keep an eye on any computer science resources provided by your state’s department of education. Several states are moving towards incorporating AI concepts directly into their computer science standards and will be providing guidance specifically to support teaching AI. I hope this helps!

  7. Enrique says:

    Margaret,

    Thank you so much for your insightful feedback. I appreciate the timely and thoughtful response, along with the helpful resources. I most appreciate the advice to keep an eye out as states implement certain standards. This all helps a lot!

  8. Adam Agamenoni says:

    Hello, I really enjoyed reading about the successes and challenges that you shared about using ChatGPT in the classroom. I have found a lot of success using ChatGPT to create practice problems or quizzes in math. My dad is also a teacher and originally showed me ChatGPT. I ran into some similar problems when I was first using it, where problems weren’t correct or weren’t totally correct. I brought this up with my dad and we talked about the importance of writing successful prompts. It is interesting to think about, but writing prompts for AI is probably going to be an important skill in the future.
    All in all, I think ChatGPT has a solid place in the classroom, but it is important to still check over things and stay involved as the human element of the activity. Thank you for sharing!

  9. Emile Bigirande says:

    ChatGPT is very important nowadays
