
Why Do Students Write?


Something seemed off.

In December 2022, just one month after ChatGPT was released, Furman University professor Darren Hick was grading papers for a class he was teaching on Immanuel Kant’s categorical imperative. Hick, a teacher of art, ethics, law, and copyright, was naturally always on the lookout for cheating. Often, cheating is obvious: a student has lifted studies or information from the internet and cobbled it together, producing a paper that practically announces itself as plagiarism.

This paper did not do that.
Rather, it was clean. Too clean.

The closer he read, the more glaringly problematic it became. What struck him most was how confidently the student stated incorrect information as fact. Hick began to play detective. He Googled passages from the paper to see what might turn up, but the searches came back empty.

“At this point I had heard about ChatGPT, but it was a brand-new technology,” he said.

He created an OpenAI account and tried to reverse engineer the prompt a student might have used to generate such a polished essay. That is when he learned something unsettling: large language models sometimes hallucinate, confidently presenting false information as fact. After some digging, Hick was all but certain the paper had been written by AI.

Talk about horror.

Artificial intelligence had arrived, along with every teacher’s worst nightmare. Hick realised he was facing a technology that could write essays, potentially removing the learning from the process of writing altogether. So he did what any concerned educator might do and posted a warning on Facebook.

There was reason to be afraid, he wrote. GPT technology posed an existential threat to education as we knew it. His post captured the fear many people experienced when first encountering ChatGPT’s capabilities. The AI could generate essays, summaries, and analysis that were nearly indistinguishable from human writing. It was not traditional plagiarism, because the text was usually novel, but it raised serious questions about academic integrity. The temptation to outsource assignments to AI threatened to make teaching vastly more difficult.

What were educators supposed to do?

As one of the first professors to publicly flag these dangers, Hick watched his post go viral. Media outlets began calling. Within days, he was internationally known for catching a student using ChatGPT, effectively becoming the new academic sheriff in town (Smith, 2022).

The only issue was that Hick would never be able to catch students who used ChatGPT more subtly and took the time to fact-check its output. He could keep trying to spot AI-written papers, or try to ban ChatGPT outright, but ultimately the task was futile.

If students were not going to change how they used ChatGPT, perhaps Hick needed to change how he approached it.

Instead of spending all his time policing essays, he could lean into a more courageous approach and explore how generative AI might actually help students become better learners. Other educators reached the same impasse and turned it into an inflection point, one that forces us to revisit why we assign writing in the first place.

At its most basic level, writing is communication. It requires structured thinking, fluency with language and grammar, and crucially, something meaningful to say.

Editorial writing requires forming and justifying an opinion. Research papers require digesting existing literature and ideally building on it with original analysis. Journalism requires interviewing people and investigating sources. These are not skills ChatGPT can replace on its own.

If a teacher’s goal is to assess structured thinking, grammar, or basic argumentation, a traditional take home essay may no longer be the best tool. An in class, proctored essay, perhaps written over multiple sessions, might better ensure the work is genuinely the student’s own.

But if the goal is to assess research ability or investigative skills, is using ChatGPT necessarily bad? After all, AI cannot conduct experiments, interview sources, or observe events. In fact, these tools will increasingly be used in the workplace. Should students not learn how to use them effectively while still in school?

From conversations with friends and family, it seems the traditional state school system has not fully made this leap yet. For younger students, learning to structure thoughts and communicate clearly will always be essential. Writing is often the best way to practise this, and these skills translate directly into oral communication. Students who avoid developing them will quickly be caught out in the real world.

However, once students have mastered these fundamentals and move into the later years of high school and beyond, it becomes pivotal that they learn how to use AI tools to enhance their work. Entering university without familiarity with these tools will soon be a disadvantage.

And even with younger children, AI can still be used to strengthen these fundamental skills: structuring an argument and communicating clearly and grammatically.

New education focused AI platforms are emerging that aim to strike a middle ground. These tools do not write essays for students. Instead, they act as intelligent guides. A student might ask, “What themes should I consider when writing about The Great Gatsby?” The AI can suggest ideas such as the American Dream, social class, and symbolism, but the student still does the thinking and writing.

When the AI platform shares the final output of the student/AI collaboration with the teacher, it can also report on the process and the degree to which the AI assisted, giving the teacher a much clearer sense of a student’s strengths and areas for improvement. Such an app might report back:

“We worked on the paper for about four hours. Flynn initially had trouble coming up with a thesis, but I was able to help him by asking some leading questions. The outlining went pretty smoothly; I just had to help him make sure the conclusion really brought everything together. Flynn did most of the writing. I just helped him tidy up the grammar and strengthen his argument in the third paragraph. Based on the rubric for the assignment, I’d recommend Flynn get a B+. Here is a detailed breakdown of how I rated this paper against the dimensions of the rubric.”

It would also be difficult for a student to cheat with ChatGPT in this context. If they used ChatGPT to write the essay and simply pasted the text into their assignment, the app would tell the teacher, “We didn’t work on this essay together; it just showed up, so we should be suspicious.” Used this way, AI is not something education should fear.

For younger students, teachers should prioritise in class writing where the work is unquestionably their own. But once students have learned the basics, we must allow and teach the responsible use of generative AI. Taking the moral high ground will not stop the technology. It will only disadvantage those who do not learn to use it.

Spell checkers once seemed like cheating. Grammar tools such as Grammarly, which can rephrase entire paragraphs, are now completely normal. AI will follow the same path.

The real losers will not be the students who use AI. They will be the ones who were never taught how to use it well.

Bibliography

Smith, J. (2022) ‘Professor catches student cheating with ChatGPT: “I feel abject terror”’, New York Post, 26 December.
