Local colleges talk ChatGPT, other bots and how students might use them
Published 8:00 am Saturday, August 26, 2023
By: Charlotte Reames and Olivia Johnson
When Stephen Waers attended a technology conference last year, he heard author Malcolm Gladwell describe the future of AI as the “first true revolution of his lifetime.”
Waers, chief academic officer and chief information officer for Point University, has seen AI integrate with daily life. From ‘People you may know’ to ‘Items others have bought’ suggestions, AI has a hand in making our everyday lives easier.
“Artificial intelligence has been around for over a decade, and it’s been mostly invisible to us,” Waers said. “Most of the large tech platforms are built on machine learning algorithms.”
But just in the past five years, the tool has become more readily available to consumers.
Waers has watched AI pervade the college sphere in particular. Online coursework has had a helping hand from autocorrection, predictive text and even tools that generate entire essays.
Point University English Professor Taylor Bowman still remembers the first time he caught a student using AI for an essay. When Bowman realized the student couldn’t define the concepts in their own paper, the student admitted it wasn’t their work.
But how do instructors determine if an assignment was written by a bot if their student is less forthcoming?
According to Waers, they don’t and they probably shouldn’t.
“Trying to prevent students using AI is kind of quixotic — tilting at windmills,” Waers said. “It’s like playing a game of Whack-a-Mole.”
In fact, Waers believes that the time it takes to police the “cousin to plagiarism” could be better spent putting it to use for instruction.
“It takes a lot of time to prove that a student plagiarized and a lot of time to figure out how to punish them,” Waers said. “And that doesn’t result in better learning for the students generally.”
Working with AI instead of against it
Bowman, who serves on the curriculum committee, said he most often catches students using AI when the system doesn’t respond to the prompt accurately. One assignment prompted students to write about the significance of their favorite passage from a text they read in class. The AI wrote a summary of a different story by the same author.
“Especially if they haven’t read the material they’re supposed to read or thought carefully about the assignment they’re trying to do, they don’t know when the AI has done a bad job of addressing the assignment,” Bowman said.
At their core, AI chatbots make predictions based on a large language model, Waers said. ChatGPT uses patterns in language and statistics to determine the most probable sequence of words in response to a prompt.
“If AI is primarily about prediction,” Waers said, “then one of the ways we can design learning to account for it is to introduce unpredictable elements into our assignments.”
To address the use of AI, LaGrange College has decided to take the bull by the horns and incorporate some AI technology into its academics. Dr. Justin Thurman, director of the Writing Center and professor of English, said he has integrated AI into his classroom.
“In terms of my first-year writing classes, students were surprised that I addressed AI use and have policies in our syllabus directed toward the use of generative AI. If anything, I think AI has improved my classes a little bit. This is the reality of the world and the reality of the labor force students will be entering. So, I’d say it’s a net positive,” Thurman said.
Educators can also take advantage of AI limitations. One of the shortcomings of AI chatbots is that responses are based on the information trained into their model.
ChatGPT only has data up to September 2021 and cannot search the web in real-time, so it cannot generate accurate sources. It is also limited by its impartial programming. It won’t discuss partisan political topics or predict future political or sports events.
Many professors like Bowman try not to be punitive toward the dishonest use of AI in assignments. Thurman said he uses any instance of academic dishonesty as a teaching tool.
“My approach to plagiarism has not changed at all since generative AI jumped on the scene,” Thurman said. “It’s always been: ‘bring in the student and let’s have a conversation.’ Let’s make it a teachable moment and see if they’re willing to add value to what they copied or if they’re repeating verbatim. If they’re not willing to do that, then that’s a choice.”
How AI is helping faculty
This semester, Point University hosted a faculty meeting on how to navigate academics in the age of AI. Waers said that, when used well, AI can foster better instruction, and in some cases already does.
“Part of the point of academic research is to try and understand things that are actively happening in the world,” Bowman said. “And so there will have to be someplace for AI in the discussion somewhere.”
Point University’s online education program uses AI-enhanced programs to track student activity logs and completion. That way educators know when academic intervention is needed.
“AI can actually offload occasional ‘busy work,’ which allows us to spend more time doing things that only humans can do,” Waers said.
Though some educators are still deciding what that role might be, Thurman said AI has many possibilities in education.
“In the future, I see it being a possibility to be a learning aid, almost like a partner to students if we can get them to engineer prompts, that it’s where it can be more of a help and not a replacement for them or not a proxy for them,” Thurman said.
Thurman said last year during his scriptwriting class, students actively used ChatGPT for exercises like generating character names, plotting and putting together lists of scenes that would serve as starting points for class discussions.
“I would tell them, ‘GPT is a reflection of what’s the most common and popular stuff on the internet.’ We talk about it being a reflection and how writers want to come up with new things. So, when it’s suggesting you go right, what would happen if you take the scene left? I like what it’s brought to my classes,” Thurman said.
By evaluating and reflecting on AI output, students learn to engage with concepts on a deeper level; they have to edit and tweak the work the AI provided.
“For a writing class, there are lots of options, lots of room for creativity,” Bowman said, acknowledging that it would be harder to circumvent in a math class.
Thurman believes educators should have deeper classroom discussions on the ethics, copyright, and labor implications of AI. But ultimately, the hard work of learning still needs to fall on students.
“I mean, they call it the humanities for a reason, right?” Bowman said. “It really does require some humanity to think about the various kinds of ideas that literature engages.”