Human innovation has always been driven, at least partially, by the desire to make life easier. From the invention of the wheel to the modernization of agriculture to the creation of the printing press, tools that simultaneously reduce effort and increase efficiency have been extraordinarily successful and even revolutionary throughout history. The latest addition to this lineage of tools is artificial intelligence, or AI. Those who grew up without the Internet will often say that information that once required a full day in the library to obtain can now be accessed instantly at our fingertips, but it goes far beyond that. Today, AI-powered tools can generate fully-fledged essays or research papers with, quite literally, the click of a button.
This poses significant challenges to academic institutions. Not only can such tools hinder learning, but the information they generate can be erroneous or misleading. The Fieldston School website states that its students become “active learners and engage in vital discourse in an atmosphere of intellectual discipline and creativity fostered by a community of dedicated teachers.” However, upholding this promise becomes increasingly difficult when so many students have access to AI tools that undermine their opportunities to engage meaningfully with their work.
In an article on the role of AI in education published in May 2024, Alkis Karmapolis and Constantine Svoronos describe an “epidemic of AI” that plagues our school: “According to Assistant Principal for Student Life and Chair of the Academic Integrity Board (AIB) Grace Yun, there are currently more cases of students cheating on schoolwork using chatbots than ever.”
In its efforts to mitigate the detrimental effects of AI use in school, uphold academic integrity and fulfill the claims stated on its website, Fieldston has adopted a department- and class-specific approach to managing AI use rather than an outright schoolwide ban.
Charlotte Selous, the Upper School Ethics and Technology Lead, explains, “Because it’s still a fairly new technology that’s available to the general public, we are still figuring out what an AI policy will look like. Some departments across the board will say it’s not acceptable, but others are very willing to embrace the use of AI, so the project is ever developing. What we’ve landed on is that individual teachers can decide whether or not it’s acceptable within their own classes, and I’m working with department chairs to create a list of acceptable and non-acceptable uses for each department.”
Upper School Principal Dr. Stacey Bobo further explains the necessity of this approach due to the unique needs of different academic disciplines: “[AI use] varies based on what you are doing, and each subject is so different,” she says. “Teachers know where each of you are and what each form is capable of doing. In science and math, they tend to allow you to use it a little more. But in English and history, where you are still learning how to research and write papers, they may have very specific skills they want you to learn.”
To better understand the distinct policies across departments, I interviewed Fieldston department chairs about how their disciplines are navigating AI use in the classroom. Here’s what they said.
History
In the History Department, AI use is not permitted for any part of the process on any assessment. The policy was amended at the end of last year: initially, the department considered allowing AI for creating outlines or organizing notes and thoughts, but it ultimately concluded that these are critical skills students need to develop on their own.
Chair Miriam Paterson explains, “We arrived at the current policy because the way we see it, all of the things that the AI is doing for you, we feel it is important for you to do yourself. We want students to think about the questions and problems we present them and to be able to complete all the different steps of the process independently because we are trying to help you build the skills of problem solving, critical thinking, and strategic planning for outlines, essays, projects, and other exercises. You will not develop those skills if we allow AI to do those things for you. So, that’s why we don’t allow it.”
While the policy is currently a blanket prohibition, the department is open to revisiting the role of AI if a convincing argument is presented. Paterson acknowledges, “There are arguments being made about the productive use of AI in the classroom for teachers and students, but we have not yet heard a compelling case made for its usefulness. We have not found that it’s helping the students; rather, it’s just doing the work for them. We remain open minded, and if someone were to make a good argument, we would be willing to rethink our policy and innovate both curriculum and pedagogy in interesting and meaningful ways that involve AI. Until then, it is not allowed.”
English
The English Department has a similar stance. According to English teacher Vincent Drybala, “AI is not to be used without explicit permission. Usually that is a discussion between the teacher and the class.”
Drybala himself recently integrated AI into a class in which students studied “The Three-Body Problem,” a science fiction novel by Liu Cixin that “employs different discussions and uses of AI.” “Given that we actually read a science fiction book that dealt with very realistic scientific conundrums, actually employing a scientific solution to an essay made a lot of sense,” he says.
Math
The Math Department regards AI as a form of collaboration, meaning that for assignments where collaboration is prohibited, the use of AI is also not allowed. Stephen Chu, Chair of the Math Department, remarks that the policy is likely to evolve as AI technology advances: “It’s probably going to change pretty rapidly because AI itself is changing pretty rapidly.”
Chu also acknowledges the potential benefits of certain AI tools. Although he would “strongly discourage” students from using the free version of ChatGPT, other AI chatbots like the paid version of ChatGPT or Khan Academy’s AI, Khanmigo (read about it in this article), can generate problem sets or walk students through various methods for solving problems. However, he isn’t endorsing these services and recommends other resources like consulting a teacher or using a textbook.
“I think it is an interesting tool to help you understand mathematics – it’s like a conversation partner,” Chu says. However, he cautions against overreliance, noting that it can frequently generate incorrect answers. “Right now, you have to use it with care,” he adds.
Science
Science Department Chair Paul Church advises, “With any assignment, if there’s ever a question, clarify with your teacher beforehand whether or not you are allowed to use AI for it.” In most cases, using AI for a lab report or something similar would constitute cheating, and the result may not even be accurate.
Church compares AI to Wikipedia in its early years: “When Wikipedia first came out, students would use it as ‘the source’ and believed that whatever was in Wikipedia was true and the ultimate authority on whatever they were looking up. You run into similar problems when using AI. AI makes mistakes, so you can’t take what it says as if it’s the absolute truth. It’s perhaps a place to get started when you are doing research, but you have to have the skill to go back to original sources and suss out what’s real and what’s not real.”
However, Church, like Chu, does see potential benefits in using AI as a study tool. “It’s a good source for creating questions when you are studying for a test, particularly application-type questions,” he says. “Other sources like Quizlet are fine and create decent multiple-choice questions, but they don’t ask the kind of in-depth questions we ask in science class.” Church suggests that students may use AI tools as a supplement when preparing for exams that require deeper applications of concepts.
But, echoing Chu’s perspective, Church warns about potential inaccuracies: “AI can be used as a resource for gathering information, but it has to be used cautiously. You can’t use it blindly and assume what it says is true.”
In any case, the role AI will play both at Fieldston and in the broader world of academia is still being defined. Some members of the faculty, like Yun, foresee more in-person assessments such as essays and tests conducted in class, as Karmapolis and Svoronos report in their article. Others, like Drybala, may discourage students from using it to complete their work while appropriately integrating it into their curriculum. The Math and Science Departments may utilize it more if it becomes more accurate and reliable.
For now, Selous advises students: “School is a great place to learn to find your voice and to learn how to research in a low-stakes environment. As long as you’re using it in line with your teachers’ policies and in a way that generates your creativity and maintains your ability to create your own thoughts about situations, then it’s great. But if you are just using it as a way to create work for you, then I would say that’s an inappropriate use of it and I would stay away from it.”
The question remains: can AI be leveraged in a way that enhances learning, or does it simply replace it?