“It’s not fair to them,” says Cryer.
More than three years after ChatGPT’s debut, generative AI has become part of everyday life, yet professors and students are still figuring out whether, and how, they should use it, especially in humanities courses.
A recent study shows that many college students are diving right in: According to a survey conducted last July by Inside Higher Ed and Generation Lab, about 85% of students used AI for coursework, including brainstorming ideas, planning papers and preparing for exams. Approximately 19% of students also reported using AI to write entire essays.
More than half of students who used AI for coursework had mixed feelings about it, reporting that it sometimes helped them, but it could also make them think less deeply.
Aisa Tarana, a recent college graduate, was in her first year at the University of Minnesota Twin Cities when ChatGPT was launched. She says she started using the chatbot for small tasks, like suggesting research topics.
But Tarana says she eventually stopped using AI because it made her feel like “I was outsourcing my thinking, and that felt really weird.”
That’s exactly what Cryer is worried about.
After spending a sabbatical studying generative AI, he came to his own conclusion: Cryer believes that educators should use AI tools as little as possible in their teaching.
“It seems like one of the main purposes of these tools is to keep you from having to think so hard,” he says.
Cryer says he now spends more time convincing his students of the value of putting in the work to become better writers. He says he explains to them that the purpose of their education is the process, not the product — because society doesn’t need more college essays. “What we need is for students to go through the process of writing research papers so they can become better thinkers, so they can put together a compelling argument, so they can distinguish between a good source and a bad source,” says Cryer.
And if students rely on AI to do their work for them, Cryer says, they could cheat themselves out of the education they signed up for.
A professor who sees value in generative AI
In Charlotte, North Carolina, Leslie Clement says she has come to see generative AI as a powerful collaborator that can improve student learning.
“We encourage (students) to use it because we know they will use it, but to use it responsibly,” says Clement, a professor of English, Spanish and Africana Studies at Johnson C. Smith University, a historically Black university.
Clement says she allows students to use AI to create plans for their papers, get feedback on ideas and compare different sources of information.
Clement also created a course called “The African Diaspora and AI,” which looks at how AI is impacting people of African descent globally, including the hazardous mining of cobalt, a key component in AI technologies, in the Democratic Republic of Congo. The course also covers the potential future benefits of AI, as well as the contributions of Black researchers and scientists.
“We’re looking at Afrofuturism, how students can use these tools to reimagine their future,” says Clement.
She says her goal has always been to foster critical, ethical, and inclusive thinking — and she wants her students to apply those skills when using AI tools.
“I want students to not only use the tools for good, but also to question them,” says Clement.
The AI study buddy
A few hours northeast of Clement, in Durham, North Carolina, pre-med student Anjali Tatini has found her own ways to use AI for good. Tatini is double majoring in global health and neuroscience, and says AI tools have helped her better understand some of the complex topics she’s been studying.
Take last semester, when Tatini, a 19-year-old sophomore at Duke University, was confused by some concepts in a biology course. She turned to Gemini, Google’s AI chatbot, for help.
“I’d be like, ‘That’s the concept — can you explain what it means?'” Tatini recalls. “And it would just respond to me. And if it was too high-level, I could ask it to tone it down a little bit, which was very helpful.”
In other classes, like chemistry, Tatini says she used AI to create practice problems to help her prepare for exams; in a marketing class she used it to brainstorm ideas; in statistics, she used it to help her generate lines of code for data analysis.
Having an on-demand tutor is helpful, Tatini says, because she can’t always meet with her tutors in person.
“I have work, I have other classes, I have clubs. I don’t have time to always do all these office hours,” she says. “So it’s nice to have something that’s in my spare time that can respond to me in the same way that maybe a human would.”
Tatini draws the line at having AI write for her. She says she will use these tools to outline and organize her ideas, but the actual writing is all hers.
“If I’m releasing something, I want it to be something I can proudly say is mine. So I would never use AI to write something because it wouldn’t sound like me.”
“What you produce is like a fingerprint on the world”
Nearby in Chapel Hill, Hannah Elder, a 21-year-old student at the University of North Carolina, also takes pride in owning her writing assignments.
“I’m such a big believer in cultivating your own thoughts and being able to articulate them,” she says.
Elder is a pre-law student taking a mix of courses, including public policy and philosophy classes. She says she uses generative AI to review her work and check it against course rubrics.
But Elder says she would never use it to write or generate ideas for her.
Learning how to articulate her own ideas and beliefs and communicate them through writing has been one of the most valuable parts of her college experience, Elder says. She worries that if students rely on AI to do this for them, they won’t learn to think for themselves.
“I still use notebook paper (for) all my notes because I just believe so strongly that what you write down and create is like a fingerprint on the world. And I think in a way that’s getting lost,” says Elder.
Still, Elder doesn’t think the solution is to ban AI entirely.
“We can’t deny that it’s going to be part of (the college experience),” she says.
She wants educators to integrate AI learning into curricula so that students learn to see the line between beneficial and harmful use.
“If teachers incorporate it in a responsible way through academia,” she says, “I think it will be seen less as a cheat code and more like, ‘Oh, here’s the reality, here’s how I can use it well, and here’s how it can help me.’”
