April 25, 2024
I went to lunch with my friend Lynn this week. She told me her son Miles was suspended for a day because he used ChatGPT to write an essay. He’s in seventh grade.
My daughter’s school has blocked all AI on her school computers; she can’t use ChatGPT at all. Lynn’s son’s school allows students to use the tool for research.
There are so many things wrong with these scenarios.
Let’s look at the issue of Miles getting suspended. He’s the only one who used the technology to write an essay (or the only one who got caught). I was proud of him for his ingenuity: he found a tool that could save him time and used it in the proper way. Since ChatGPT was broadly launched, we have enjoyed discovering what it can do for us as professionals; we have tried it out to help us wordsmith, write drafts, create marketing, write reports, and so much more. Miles used a skill that will be useful in the real world, and he got in trouble for it.
Miles’ school does allow students to access ChatGPT, but only for research. Let’s get one thing clear: CHATGPT IS NOT A RESEARCH TOOL. It is generative AI, which means it makes things up; it is not reliable. It doesn’t spit out facts; it generates what it “thinks” you want based on patterns in the vast amounts of data it was trained on. It’s perfect for writing essays; it’s terrible for research. Miles used the tool exactly as it’s intended.
Now let’s consider what he was supposed to be learning by writing his own essay. If I were to guess, he was supposed to have some research or knowledge to share, understand the format of an essay — introduction, body, conclusion — and convey his thoughts in an effective way. We learn to write essays so that we can learn to communicate effectively.
In order to get an essay that conveys the information he intends to share, he has to feed ChatGPT prompts that include that information and be able to verify that the information the tool spits out is correct.
✔️ Research or knowledge to share
In order to turn in an essay in the correct format, he needs to follow the directions, understand the format, edit ChatGPT’s output accordingly, and ensure that the final document meets the requirements.
✔️ Understand the format
Lastly, he needs to double-check that the information is correct, edit the generated text into his own words and voice, and turn it in. The essay needs to effectively communicate his ideas, opinions, or research.
✔️ Effective communication
I would argue that by using ChatGPT he still gains all of the understanding the essay assignment was meant to develop, and he does it while learning a tool that is useful in the real world.
If education is meant to give students the skills and experiences they need to be successful in the world after graduation, then we should be supporting their learning and their use of these tools while in school. Instead of telling students to “do it the way it’s always been done” or “because I said so,” we need to be teaching them effective use of the tools at their disposal. Their jobs will not be entirely replaced by AI, but they will be replaced by people who know how to use AI.
How could schools do this better? How can schools more appropriately incorporate (or acknowledge) ChatGPT?
- Use AI the way it was meant to be used. I’m sure there are research-focused AI tools out there, but ChatGPT is not one of them.
- Find ways to integrate the AI into the assignment.
- Have students turn in their prompts, the AI’s first draft, their research and verification of facts, their notes, and their final product. That is how we use ChatGPT in the working world. Our role as educators is to teach students the proper way to use the tools and how to think about the implications and outcomes while they learn the format and the information. We can do that with AI.
- Another way to think of it: have the students write their paper independently, then have them ask ChatGPT to write the counterargument, what comes next in the story, or how not to do a thing; have it write the opposite of whatever your students have written, and then have the students write a reflection on the two pieces and their experience.
- Have the students write using ChatGPT, have them edit the output into their own words, then run the essays through Turnitin (or another AI-detection tool) and see which ones get flagged. Do this several times throughout the year and have the students estimate the probability that the tool will flag the AI-generated text. They will learn to use two tools and understand how to make their writing more effective and human.
The school needs to understand the specific use of the tool, identify the outcomes it is seeking, and support learning in all its forms to reach those outcomes. Teaching students to use generative AI for research runs counter to its best use at best, and is extraordinarily dangerous at worst.
We need to teach students how to use AI responsibly and for its intended purpose, and we need to think about how they will be using it in the world as we develop school policies around its use.
And I’m still proud of Miles for using the tools at his disposal.
You may also be interested in reading more articles written by Tanya Sheckley for Intrepid Ed News.
Tanya, I understand what you are saying, and in the context of the writing goals you outline, what you are saying makes sense. However, I think the goals you outline are incomplete and actually miss the most important aspects of what students are “supposed to be learning” when they write. Because I wrote an essay about this issue, I cite it here instead of rewriting it: “Artificial Writers: The Brain-Snatchers.” (I don’t know how to insert a link here, but the essay is on the Intrepid Ed News site.)
I listened to an interview on NPR with a student from Brown who described his experience using GPT to write an essay. At one point, he said that the program did in 30 seconds the thinking that might have taken him a half hour or more. And that’s my point: Developing the thinking–the learning and discoveries–is the primary goal that many of us writing teachers have for our students (in addition to the ones you mention). Turning that thinking over to GPT during the formative years is a bad idea. Perhaps, once students develop this ability, they can use GPT as you suggest they must for their future jobs, but my own sense is that GPT will not improve the human ability to think and discover; GPT will undermine it by usurping it.