With the end of affirmative action, generative AI could ‘democratize’ admissions by giving students who don’t have tutors or counselors a leg up
Emily Bobrow
Sun 27 Aug 2023 12.00 BST
Chatter about artificial intelligence mostly falls into three basic categories: anxious uncertainty (will it take our jobs?); existential dread (will it kill us all?); and simple pragmatism (can AI write my lesson plan?). In this hazy, liminal, pre-disruption moment, there is little consensus as to whether generative AI is a tool or a threat, and few rules for using it properly. For students, this uncertainty feels especially profound. Bans on AI and claims that using it constitutes cheating are now giving way to concerns that AI use is inevitable and probably should be taught in school. Now, as a new college admissions season kicks into gear, many prospective applicants are wondering: can AI write my personal essay? Should it?
Ever since the company OpenAI released ChatGPT to the public in November, students have been testing the limits of chatbots – generative AI tools powered by language-based algorithms – which can complete essay assignments within minutes. The results tend to be grammatically impeccable but intellectually bland, rife with cliche and misinformation. Yet teachers and school administrators still struggle to separate the more authentic wheat from the automated chaff. Some institutions are investing in AI detection tools, but these are proving spotty at best. In recent tests, popular AI text detectors wrongly flagged articles by non-native English speakers, and some suggested that AI wrote the US constitution. In July OpenAI quietly pulled AI Classifier, its experimental AI detection tool, citing “its low rate of accuracy”.
Preventing students from using generative AI in their application essays seems like shoving a genie back in a bottle, but few colleges have offered guidance for how students can use AI ethically. This is partly because academic institutions are still reeling from the recent US supreme court ruling on affirmative action, which struck down a policy that had allowed colleges to consider an applicant’s race in order to increase campus diversity and broaden access to educational opportunity. But it is also because people are generally confused about what generative AI can do and whom it serves. As with any technological innovation in education, the question with AI is not merely whether students will use it unscrupulously. It is also whether AI widens access to real help or simply reinforces the privileges of the lucky few.
These questions feel especially urgent now that many selective colleges are giving more weight to admissions essays, which offer a chance for students to set themselves apart from the similarly ambitious, high-scoring hordes. The supreme court’s ruling further bolstered the value of these essays by allowing applicants to use them to discuss their race. As more colleges offer test-optional or test-free admissions, essays are growing more important.
In the absence of advice on AI from national bodies for college admissions officers and counselors, a handful of institutions have entered the void. Last month the University of Michigan Law School announced a ban on using AI tools in its application, while Arizona State University Law School said it would allow students to use AI as long as they disclose it. Georgia Tech is rare in offering AI guidance to undergraduate applicants, stating explicitly that tools like ChatGPT can be used “to brainstorm, edit, and refine your ideas”, but “your ultimate submission should be your own”.
According to Rick Clark, Georgia Tech’s assistant vice-provost and executive director of undergraduate admission, AI has the potential to “democratize” the admissions process by allowing the kind of back-and-forth drafting process that some students get from attentive parents, expensive tutors or college counselors at small, elite schools. “Here in the state of Georgia the average counselor-to-student ratio is 300 to one, so a lot of people aren’t getting much assistance,” he told me. “This is a real opportunity for students.”
Likening AI bans to early concerns that calculators would somehow ruin math, Clark said he hopes Georgia Tech’s approach will “dispel some misplaced paranoia” about generative AI and point a way forward. “What we’re trying to do is say, here’s how you appropriately use these tools, which offer a great way for students to get started, for getting them past the blank page.” He clarified that simply copying and pasting AI-generated text serves no one because the results tend to be flat. Yet with enough tweaks and revisions, he said, collaborating with AI can be “one of the few resources some of these students have, and in that regard it’s absolutely positive”.
Although plenty of students and educators remain squeamish about allowing AI into the drafting process, it seems reasonable to hope that these tools could help improve the essays of those who can’t afford outside assistance. Most AI tools are relatively cheap or free, so nearly anyone with a device and an internet connection can use them. Chatbots can suggest topics, offer outlines and rephrase statements. They can also help organize thoughts into paragraphs, which is something most teenagers struggle to do on their own.
“I think some people think the personal application essay shouldn’t be gamed in this way, but the system was already a game,” Jeremy Douglas, an assistant professor of English at the University of California, Santa Barbara, said. “We shouldn’t be telling students, ‘You’re too smart and ethical for that so don’t use it.’ Instead we should tell them that people with privileged access to college hire fancy tutors to gain every advantage possible, so here are tools to help you advocate for yourselves.”
In my conversations with various professors, admissions officers and college prep tutors, most agreed that tools like ChatGPT are capable of writing good admissions essays, not great ones, as the results lack the kind of color and specificity that can make these pieces shine. Some apps aim to parrot a user’s distinctive style, but students still need to rework what AI generates to get these essays right. This is where the question of whether AI will truly help underserved students becomes more interesting. In theory, AI-generated language tools should widen access to essay guidance, grammar checks and feedback. In practice, the students who might be best served by these tools are often not learning how to use them effectively.
The country’s largest school districts, New York City public schools and the Los Angeles unified school district, initially banned the use of generative AI on school networks and devices, which ensured that only students who had access to devices and the internet at home could take advantage of these tools. Both districts have since announced they are rethinking these bans, but this is not quite the same as helping students understand how best to use ChatGPT. “When students are not given this guidance, there’s a higher risk of them resorting to plagiarism and misusing the tool,” Zachary Cohen, an education consultant and middle school director at the Francis Parker School of Louisville, Kentucky, said. While his school, like some others in the private sector, teaches students how to harness AI to brainstorm ideas, iterate on essays and sniff out inaccurate dreck, few public schools have a technology officer on hand to navigate these new and choppy waters. “In this way, we’re setting up marginalized students to fail and wealthier students to succeed.”
Writing is hard. Even trained professionals struggle to translate thoughts and feelings into words on a page. Personal essays are especially hard, particularly when there is so much riding on finding that perfect balance between humility and bravado, vulnerability and restraint. Recent studies confirming the very real lifetime value of a degree from a fancy college merely validate concerns about getting these essays right. “I will sit with students and ask questions they don’t know to ask themselves, about who they are and why something happened and then what happened next,” said Irena Smith, a former Stanford admissions officer who now works as a college admissions consultant in Palo Alto. “Not everyone can afford someone who does that.” When some students get their personal statements sculpted by handsomely paid English PhDs, it seems unfair to accuse those who use AI of simply “outsourcing” the hard work.
Smith admits to some ambivalence about the service she provides, but doesn’t yet view tools like ChatGPT as serious rivals. Although she suspects the benefits of AI will redound to those who have been taught “what to ask and how to ask it”, she said she hopes this new technology will help all students. “People like me are symptoms of a really broken system,” she said. “So if ChatGPT does write me out of a job, or if colleges change their admissions practices because it becomes impossible to distinguish between a ChatGPT essay and a real student essay, then so much the better.”