Community members consider impact, ethics, possible applications of AI technology, ChatGPT



This illustration of a computer with ChatGPT, a pencil and paper represents how, with the rise of AI technologies, educators are starting to think about their impact on student writing and learning, especially in humanities classes. OpenAI released ChatGPT in November, a bot that can generate human-like responses to user prompts. Graphic Design by Audrey Chang.

By Audrey Chang, Editor-in-Chief

Whether using Grammarly, unlocking a phone with Face ID, navigating with Google Maps, scrolling a personalized feed on Instagram or viewing recommendations from Netflix, artificial intelligence is behind it all, impacting people’s daily lives.

One of the most recent examples of advances in AI is ChatGPT, a chatbot developed by OpenAI and released Nov. 30. According to OpenAI, ChatGPT is designed to interact “in a conversational way,” and “answer followup questions, admit its mistakes, challenge incorrect premises and reject inappropriate requests.” Users input questions or prompts, and ChatGPT generates responses. Some students have been using the technology to write essays and assignments, submitting the output as their own work.

OpenAI also launched DALL·E in January 2021, which uses AI to generate visual creative works from language prompts. InstructGPT, a similar model from OpenAI, is likewise designed to “follow an instruction in a prompt and provide a detailed response.”

With AI technologies, ChatGPT in particular, drawing on large amounts of data, vocabulary and information to respond to prompts, many educators across universities and high schools have started considering how these tools might impact the way they assess student learning.

History Department Chair Bethany Neubauer said Archer humanities departments are in the process of understanding and evaluating possible challenges or applications of AI technology. She said humanities classes traditionally have many writing assignments outside of class, and ChatGPT may change the way they assess students.

“We’re still definitely figuring it out, as are probably every other school and university and college — this is getting a lot of discussion at every level of education right now, but especially in the humanities because ChatGPT is so good at generating written responses to even pretty sophisticated prompts,” Neubauer said. “While we are figuring this out, certainly there are a few different things that teachers can do to try to ensure that the work that they’re seeing is really the students’ own work.”

English Department Chair Brian Wogensen said the English and history departments met to discuss approaches to AI with regard to academic integrity and its possible applications in the classroom.

“There were really two prongs to our conversation about it, and it’s clear that we need to address AI in the classroom and as it relates to student work and chatbots,” Wogensen said. “One prong is ‘How do we approach talking to students about academic integrity as it relates to this?’ And the other prong is ‘What are the implications and possibilities of AI chatbots for learning for education?'”

According to a New York Times article on restructuring education amid the rise of AI, some public school systems, including those in New York City and Seattle, have banned ChatGPT on school Wi-Fi networks and devices to prevent cheating. However, students have still found ways around the bans to access the tool.

Colleges and universities, on the other hand, have focused more on changing the way classes are taught, believing that blocking the tool would be ineffective and would infringe on academic freedom. Reflecting on the banning approach, Neubauer said she doesn’t think it would align with Archer’s philosophy.

“One advantage that we have at Archer is that there is a fair amount of trust between teachers and students, so I think the first place to start is really conversations with students about why teachers wouldn’t want them to use AI tools like this,” Neubauer said. “I know that some school districts have blocked ChatGPT at least on school devices … I don’t think that that’s in keeping with our philosophy as a school — we don’t just say, ‘don’t do this.’ We have conversations about it.”  

I really think the place to start is with what’s our philosophy of teaching and talking with students about that openly, and talking about where does this fit in? And why would it be problematic?

— History Department Chair Bethany Neubauer

Wogensen said devoting time to discussing why someone might plagiarize, rather than how they would do it, is more important to the conversation about AI and academic honesty.

“The chatbots just make it that much more accessible to make quick and poor choices. I spent a lot of time with my students talking to them about why someone might plagiarize. ChatGPT is the how. Now, the why is, I think, much more interesting and much more important,” Wogensen said. “Is it because they don’t feel good enough or their work’s good enough? Or they’re comparing it to other people? Is [it] because they just don’t have time? … And so dispelling some of those or getting students to engage with some of those questions of the why [is important].”

Chair of the Honor Education Council Rachel Heyman (’23) said she wants to discuss AI and its ethical aspects at a future HEC meeting, especially because the topic came up in her English class.

“Plagiarism is something that is super easy … there’s so many different roots of it, and I think discussing what it means for something to be your own is very important and the impact that it has in taking someone else’s work and how that makes you feel and the guilt and all of that emotion surrounding it as well as the ethics,” Heyman said. “Knowing that they exist and being aware that they exist and using them, as opposed to preventing people from even looking at them, is [a] very interesting approach. I think that immersing them into our curriculum can be really interesting … if done properly and if people use it responsibly.”

As AI technology advances, schools are considering how it could play a role in the classroom. Neubauer said one possibility would be using AI tools to generate an outline or a starting point for an assignment, while ensuring the actual writing draws from students’ own abilities. Wogensen and Neubauer both said another possibility is using examples generated by chatbots as comparative tools and having students discuss their strengths and weaknesses and how they could be made more creative or interesting.

Neubauer said this is a conversation that applies to all classes and levels of education, not just humanities. However, for the humanities specifically, she said she has been thinking about designing or tweaking prompts in a way that allows for more student voice and experience, as well as considering the possibility of moving towards oral presentations and exams.

“Another thing that teachers can be thinking about is ways to bring in more of the student voice in the writing prompts,” Neubauer said. “I think [all departments are] in slightly different places, but all confronting some of the same issues around how do we make sure that we are really assessing what our students know. We’re also designing our courses in a way that could potentially make use of some of the technology in places where we think it’s appropriate.”