Professors Discuss How ChatGPT May Reshape College Classrooms




A.I. may forever change higher education. Dom Fou/Unsplash

It’s the end of August, which means fall semester is right around the corner. But this year, educators are more anxious about the school year than their students. When ChatGPT first erupted onto the scene last winter, no one knew quite what to expect. Many saw it more as a flashy new tech toy than a time bomb set to revolutionize society.


Educators were in for a rude awakening during the spring semester, when the use of generative A.I. technology snowballed until it couldn’t be ignored. Many teachers were appalled to realize a large percentage of the papers they were grading were written not by their students but by large language models (LLMs). The world has shifted, and higher education has been scrambling to catch up. Because the spring semester caught so many professors and institutions off guard, it’s fair to say that this fall, despite being the second grading period in our new post-ChatGPT world, will be our first real indication of what education might look like going forward.


This summer, Observer spoke with three experts about the impact of ChatGPT on the future, and present, of higher education. They expressed different ideas about the promises and perils of the controversial new technology, which underscore how in flux these questions—and any possible solutions—are at the moment.


As hard as it may be to hear, the truth is A.I. is here to stay. “I don’t care if you’re teaching writing, math, law or chemistry, nobody can ignore this technology. We have to all be planning for it,” Inara Scott, an environmental law professor at Oregon State University, told Observer.


The possibility of keeping A.I. entirely out of the classroom seems to be shrinking, much to many professors’ dismay. Even so, few advocate banning the technology, although their reasons vary. Some say not pursuing it would be a waste, while others believe A.I. will be an integral part of the future of work and that it’s educators’ duty to prepare their students for the world after college.


“It would be malpractice for us to not be teaching our students how to use it appropriately and effectively,” said Scott.


Even professors skeptical of the technology rarely recommend A.I. bans, often because they simply don’t work. While A.I. detection tools, lock-down programs and proctoring systems were originally viewed as educators’ last line of defense for ensuring the originality of student-submitted work, many teachers now harbor serious doubts about the efficacy and safety of these measures. Scott attested to this, as did Marc Watkins, a writing lecturer at the University of Mississippi, who pointed to OpenAI’s recent shutdown of its text classifier as proof that such instruments are ineffective at best and downright dangerous at worst.


The world is thinking more concretely about the role A.I. will play in higher education, but that doesn’t mean our lives, values and professions have to undergo massive shifts to accommodate the new technology. Rather, we need to begin the process of addressing it in a productive manner.


There are myriad routes to take, especially in higher education, and professors need to find the approach that works best for them. Whether you’re enthusiastic about the ChatGPT revolution, convinced it augurs the death of civilization, or just a bit A.I.-curious, here are some suggestions from those in the know about practical steps educators could take moving forward.


A.I. could make personalized education a reality


Perhaps not surprisingly, leaders in education tech are quite optimistic about the promise of using generative A.I. for pedagogy. John Katzman, the founder of The Princeton Review and Noodle, an ed-tech company, believes ChatGPT can and should become central to how professors approach their jobs and stands to change education for the better.


“The Holy Grail is personalized education,” Katzman told Observer. “Imagine that every textbook was written for you. When you’re ready for chapter four, it writes you a chapter four [attuned] to your reading level, focusing on what parts are really interesting to you.”


Scott said something similar: “If I have a class of 60 students, I can’t interact individually and engage in this method with each one of them. An A.I. can.”


Katzman believes that, rather than banning A.I. from the paper-writing process, professors should incorporate it early and often. For example, teachers could ask students to prompt a chatbot to write a first draft and then revise and fact-check it before turning it in, along with a full log of the chatbot conversations and drafts that produced the final product.


For professors going this route, the basis for grading the assignment would not be the paper alone but the dialogue the student had with the bot, their ability to improve on the bot-generated draft, and how the piece was sculpted into something coherent and persuasive. In that sense, Katzman said he has no problem with assignments “where 80 percent of it was written by a bot, [so long as] you’re cleaning it up, you are looking for hallucinations, you’re looking for ways to tighten it.”


It’s worth noting Katzman doesn’t envision these papers making up large percentages of students’ final grades. Instead, they should be intended to teach them what good writing looks like, how to recognize it, and what skills are necessary not only for writing but also for writing in an A.I. world, which Katzman believes will be our future.


Another approach Katzman recommends is to capitalize on the increased efficiency only possible through A.I. integration.


“Instead of [saying] ‘By Tuesday, I want a four-page essay,’ it’s ‘By Tuesday, I want a 20-page essay, and I want it to be good,’” he said. “You have to use the LLMs to spit that out, but you’ve got to be thoughtful about how you structure the whole thing, and I can demand that you are so much more productive than somebody would have been two years ago.”


Balancing automation and critical thinking


Inara Scott of Oregon State University said A.I. has the potential to change how she teaches and how students learn. For example, if A.I. can provide students with massive amounts of content, educators should shift from solely teaching content to giving students the means to curate the content they receive from chatbots, she said.


“In a law class, I used to spend an entire class period essentially lecturing about something like employment discrimination, let’s say,” Scott said. “Going forward, what I may do is have a 10-minute framing activity: I may have to give students a hypothetical and tell them, ‘Okay, I want you to go into the chatbots and use your A.I. and come up with three different answers to this question, and then we’re going to regroup and learn from what you got.'”


The process in the classroom will require students to “go back and forth into A.I. and learn from it,” she added.


Like Katzman, Scott is also a proponent of introducing dialogues with chatbots throughout the writing process. But rather than having A.I. write the first draft, Scott envisions A.I.-integrated paper-writing as a continuous conversation with a chatbot.


“I could say, I want you to come up with two or three different theses and put each one into A.I. and have it generate an outline. I’d like you to take that outline and edit it so that it reflects five external sources. And then I want you to put it back in and ask the A.I. to write some text, and so on,” Scott said.


By engaging in an ongoing conversation with A.I., the student is “writing the paper” while “learning from the A.I. and external sources,” Scott said. “You’re still engaging in critical thinking. What you’re creating doesn’t look like the same process that I went through when I used to write papers, but that doesn’t mean it’s not a valid thinking process.”


A non-intrusive way to integrate A.I. in the classroom


Watkins, who teaches writing at the University of Mississippi, recommends bringing A.I. into the classroom as more of a conversation topic than an essential tool for now, as many students seem to be receptive to talking about the downsides of the tech. Last spring, Watkins found students were taken aback when he showed them how frequently and disastrously chatbots could hallucinate, and it sparked numerous class-wide conversations about the danger of A.I. overtaking writing and communication.


“Even when I had a very lax policy about using it in the spring, about half the class said A.I. freaked them out too much,” Watkins said.


Rather than asking students to produce drafts on ChatGPT and then refine the results, an alternative approach would be to integrate A.I. into the writing process at a later stage, Watkins said. For example, he would tell his students to write a first draft and then ask an A.I. to help polish it up.


“What I really like are tools that are designed to help use different forms of assistance beyond just writing,” Watkins said.


There are speech recognition programs that help students summarize lectures and organize notes and reading assistance programs that scan uploaded PDFs and allow users to ask questions about the document, making it possible for students to understand text beyond their educational level. Watkins said these tools are “game-changing for education, especially with students who have disabilities.”


Both Scott and Watkins emphasized the potential of A.I. to reorient higher education (and possibly society at large) away from so-called “objective” means of assessing intelligence and success and towards valuing learning itself.


“I think as class sizes have risen, resources have shrunk, you start to rely more and more on these more outcome-based metrics to try to measure that, and here’s where I think A.I. is really exciting,” Scott said. “A.I. might be the way for us to actually push back on that way of assessing and interacting with students.”


Watkins concurred, “We’ve been so focused for the past almost 50 years on telling everyone ‘you have to have a college degree and everything else to have a life’ that we’ve lost sight of the fact that part of the reason you go to college isn’t to get that degree, but to teach yourself to ask questions about the world. And I think generative A.I. will maybe drive a much deeper conversation about what that question is.”





