At 3 a.m., overwhelmed and only a few clicks away from relief, more and more students across the University are feeling the urge to turn to artificial intelligence platforms to complete their assignments.
The boundary between cheating and the constructive use of AI in the classroom, however, has yet to be clearly defined.
The task of drawing this line falls on the UM Undergraduate Honor Council. Established in 1986, the Council hears cases of students accused of plagiarism, cheating, collusion and academic dishonesty.
“While we, the Honor Council, do not have a written policy on AI, we view it as plagiarism and are supporting the faculty members’ stance on usage within their classroom as outlined in their syllabus,” the Council said in a statement to The Miami Hurricane.
Violations are broken into three categories (Class Violations I, II or III) depending on the severity of the actions. Nearly a year after the launch of ChatGPT, the Council has not yet heard an AI Class III Violation, traditionally for repeat offenders, or any appeals of a Class I or II violation.
“There have been Academic Integrity incidents submitted that involve AI that were resolved on the Academic Integrity Committee level as they fell under the Class I violation of plagiarism,” they said.
With nearly no precedent set by the Honor Council, the guidelines for AI usage have almost entirely been led by individual professors. While some have banned the technology outright, others are finding ways for it to strengthen their curriculum.
Following a discussion with her students at the beginning of the semester, English professor Dr. Pamela Hammons decided to alter her syllabus to allow for the incorporation of AI technology.
“My students and I discussed AI at length early in the semester, and I learned from [my students] about how they use it when they write,” Hammons said. “I discovered, for example, that they sometimes use AI as a tool to help them play with, reflect upon and develop their ideas for writing assignments.”
Within her class syllabus, Hammons has made two key changes.
“The first change is to make clear that there is no penalty for using AI if it helps, as long as they take an additional step in their learning process by reflecting explicitly upon and sharing how they have used the tool,” Hammons said.
“The second change is meant to allow for ways of engaging deeply with literature that encourage a greater individual connection to it that might—for some students—be more purposeful than writing in a strictly analytical manner and that might discourage overreliance on AI.”
These emotional and creative responses are less easily replicated by generative language models like ChatGPT, she noted. Hammons also specifies that students are required to detail how AI was utilized in the writing process for any applicable assignments.
Dr. Paul Russel Shockley, a professor of philosophy and religious studies in the College of Arts and Sciences, has also allowed students to open up a dialogue with ChatGPT.
“What motivated me to incorporate AI is the idea to recognize the educational resources that AI can provide, but also to be able to critique the claims and the positions, etc., that AI gives,” he said.
He plans to have students engage in discussions with chatbots and reflect on the AI-generated responses.
Shockley believes that AI has the potential to allow students to think critically in new ways by bouncing ideas off the constantly evolving, collective intelligence of the internet. According to him, AI has the ability to go beyond the individual mind and, with this strength, it creates an interesting educational opportunity for students.
“No matter if we are engaging AI, or engaging each other, critical thinking is paramount,” Shockley said. “The focus should be on the application of critical thinking skills, regardless of the medium.”
Isabella Lozano, a senior creative advertising and sports administration major, has taken advantage of AI to aid her in creating visuals. For a recent mockup assignment, she gathered photos of a train cabin seat and a bookstore, then used AI visual tools to merge the two together. She instructed the AI to add legs to the seat and a rug under the coffee table, creating a lifelike image from parts of other real photos.
“It is like any other system. I would fully say that I used AI to make this, but I made it technically,” Lozano said. “I use this to help me.”
In many of her creative design classes, AI is encouraged, but students are required to cite any AI technology used. The inclusion of a citation may also help mark the difference between plagiarism and using AI as a helping hand, she explained.
“If you use it, you have to cite it. And if you don’t, and if you get caught, then that is plagiarism,” Lozano said.
Because policies differ from professor to professor, she advises that the Honor Council approach AI incidents on a case-by-case basis until a broader consensus is reached on its role in the collegiate experience.
While what constitutes acceptable use of AI remains up for debate, it seems likely that AI will eventually be as essential to the learning process as the internet is today.
“Whether we like it or not, this is the way the world is going. We must embrace it and learn how to use it,” said Anthony Miles, a senior at the University of Miami majoring in motion pictures.
Jenny Jacoby contributed to this story.