In an era of rapidly advancing technology, it is becoming harder to distinguish original thought from generated text. Artificial intelligence has made that conversation even louder, especially in the realm of college admissions.
The conversation tends to split along two lines – accessibility and ethics.
The college admission process is often regarded as a strenuous one, and each year acceptance rates seem to drop, including at the University of Miami.
With a 19% admission rate at UM as of last year, it’s evident that the process is becoming more competitive every day.
This competitive nature can lead students and families to look for extra guidance, and many hire private college counselors if they have the financial means to do so. These counselors help students with essays, test scores and GPAs, while specifically targeting a major part of the application – the personal statement.
In roughly 600 words, students are meant to humanize themselves to the college of their dreams.
The applicant’s goal is to display why they are the best fit for the school and what makes them different from the rest.
This process of writing the perfect essay can take countless hours and sleepless nights, and the guidance of a private counselor is a secret weapon not everyone can afford.
Following the Supreme Court decision that eliminated affirmative action in college admissions this past summer, the personal statement has become an invaluable window into an applicant's background, arguably carrying more weight than any other part of the application.
The ruling directly impacts many applicants who are struggling financially or come from first-generation, low-income backgrounds. Without the funds to pay for a counselor, many students turn to ChatGPT, a chatbot powered by an AI large language model (LLM).
Katrin Hussmann Schroll, the Associate Dean of Admissions for the University of Miami’s Law School, is well aware of the changes AI might bring to the admission process. As this upcoming admission cycle is the first in which AI will be a factor, the outcomes are still unknown.
“Many students have used consultants for years, and AI can bridge the gap from an equity perspective. However, using AI and consultants on your application is noticeable when there is a polished piece of work on the personal statement, and the LSAT or GRE writing skills don’t match,” said Schroll. “Ultimately, law schools want to welcome students who have strong writing, reading and analytical skills, and are ready to take on this new academic challenge.”
AI has the potential to become a useful tool in any application process, and Jennifer Khan, an assistant professor in the School of Education and Human Development, offers a distinct perspective on how it intersects with the education system.
“On the student side, I expect ChatGPT or other generative AI tools will be used by applicants in the same ways they are being used in classrooms; to assist with writing,” said Khan. “Specifically for writing essays, whether to come up with ideas or to get feedback, or in perhaps the least desired way by admission administrators and educators, the actual writing of the text.”
According to Khan, these uses have raised concerns that relying on ChatGPT in admissions, especially for something as intimate as a personal statement, calls into question the ethics of using AI to earn a competitive spot at a top-tier school.
The question of ethics also rests on the fact that ChatGPT does not function like a human mind. The tool can certainly take the information it is given and generate an essay, yet it strips away the authentic, personal voice of the student.
That voice matters because the personal statement is a major part of the admission process, and admissions offices are looking for original thought.
“The admissions office is looking for original ideas in writing, because the point is to get to know each applicant, so when applicants use AI, they’re doing themselves a disservice. AI is a powerful tool that supports writing beyond grammar corrections. However, when it crosses the line of original idea and authentic thought we have a problem,” said Schroll.
This inauthentic voice generated by AI can also become a disadvantage to applicants. If too many students use ChatGPT for their application essays, there is a greater potential for admission counselors to notice the generic voice of the AI tool.
“College admissions officials will likely need to employ the same kinds of changes that educators are having to make to their courses, creating prompts that are more personal, applied, and verifiable,” Khan said.
This concern has led universities nationwide to look into extra steps to limit the technology’s unethical use.
Many universities, including UM, are now looking into new systems to detect AI-generated text, differentiating between sentences written by humans and those generated by tools like ChatGPT.
“I would imagine, if admissions were to use AI technologies, the same discriminatory biases that have been reported in the job market for filtering applications, that particularly negatively affect applicants from underrepresented or minority communities, will persist if used in a similar way for college level admissions,” Khan said.
The use of AI in admissions is a novel phenomenon, and with little research yet on its effects on the admission process, this dialogue still has significant room to evolve.
With the college admission process as competitive and expensive as it is today, it is key to understand whether applicants are utilizing AI as a free admissions counselor or as a cop-out.
“It really just depends on how these tools are used. But if they are used without caution and for decision-making, that’s where I see the risks,” Khan said.