Some Duke faculty members are worried that academic dishonesty will become more common with the emergence of OpenAI’s ChatGPT, a service that allows users to ask questions of a highly trained artificial intelligence model and receive fairly robust responses.
ChatGPT has taken the world by storm with its ability to answer open-ended questions, draft email templates, work through philosophical quandaries, write poems, solve math problems and more at a basic level.
Because of ChatGPT’s writing capabilities, some Duke professors believe that the platform is able to complete some of their students’ class assignments without much trouble. Other professors believe that the technology is not advanced enough yet to emulate high-quality student work, particularly for longer assignments such as research papers.
The Director of Undergraduate Studies Advisory Board met on Jan. 31 and discussed the use of ChatGPT in classrooms. According to meeting notes obtained by The Chronicle, faculty are permitted to ban ChatGPT’s use in their classes, but the University has not yet taken an official stance on the matter.
So how are professors adapting to the new tool that could potentially solve students’ homework woes?
Eileen Chow, associate professor of the practice of Asian and Middle Eastern Studies, started by feeding some of her assignments from previous classes into ChatGPT. She said that the output that she received “felt like a ‘C’ paper.”
Professor of History Thomas Robisheaux agreed with Chow.
“[The essays] tend to be lame, blandly written, not able to engage critically [with] first-hand evidence or offer an original argument — the outputs are blended non-offensive points of view,” Robisheaux wrote in an email.
“What’s missing is the original, unique, creative voice of a researcher-author. But I do think some students in a crisis may find it [to be] a quick solution,” he wrote.
Chow is concerned that while ChatGPT might produce mediocre work currently, the artificial intelligence will improve until its output is indistinguishable from student-generated responses. This has prompted her to revise her syllabus.
“I did take out my original second assignment, which is to write a short essay on thematic and historical analysis, and I'm actually making [it] an oral defense,” she said.
However, Chow said that the assignment she changed builds off of weekly forum posts, which will remain the same.
“I suppose if someone wants to cheat, they could just have [artificial intelligence] write [the] weekly forum posts, but the point is you have to defend,” she said. “So, I feel like the energy of the learning is defending and critiquing and revising.”
Both Chow and Robisheaux believe that the goal of college classes, particularly in the humanities, is to teach critical thinking. But the two have different thoughts on the utility of the tool to achieve this goal.
“Offloading thinking and organizing — even if the content is mostly lame from what I gather — to a machine simply makes it impossible to improve one’s own thinking and writing and research abilities, which have to be repeatedly practiced to become effective,” Robisheaux wrote. “One learns through practice, practice, practice.”
While Robisheaux wrote that relying on ChatGPT will “degrade the critical writing & thinking capacity of a student” and “undermine the real goal of their college education,” Chow is not opposed to its use for assignments that test critical thinking.
“I feel like if it's a reality, then you have to kind of have a conversation [about] it,” she said. “You teach to the platform rather than avoid it or pretend it doesn't exist.”
In Chow’s opinion, whether or not using ChatGPT is a form of cheating depends on the assignment in question.
She is open to the idea of students using ChatGPT to do the more mundane tasks of writing, such as gathering facts for an interpretive assignment that asks students to then distill and synthesize the information.
“If you cited [ChatGPT], it's your work because you're still taking that, just like you would do if you read on Wikipedia, and then be able to evaluate, critique, revise,” Chow said.
When the goal of an assignment is to learn a “fact-based set of things,” such as material that requires memorization, Chow is unsure whether using ChatGPT constitutes cheating.
“I would like [future] doctors to not have just faked it. They should have learned stuff,” she said.
Adway S. Wadekar is a Trinity junior and former news editor of The Chronicle's 119th volume.