Op-Ed: Human Connection and a Request to Bush
- Clara Thorsen
Last summer, at my press job, my coworkers and I reviewed countless applications for a teen writer position. Among the writing samples, the most jarring were also the driest: immaculate grammar, generic phrasing, and safe, middling viewpoints. These interruptions to the genuine, imperfect applications were unpleasant, synthetic, and unacceptable.
Since the beginning of the AI boom in my freshman year, generative AI has felt inescapable. My humanities classes, however—history, English, language, and art—have sometimes felt like a rare escape. Before I graduate, I ask Bush to restrict generative AI in our humanities education for both students and faculty, because its presence obstructs and devalues authentic connection within our small community.
In a 2023 newsletter, musician Nick Cave criticized the ease and speed that AI brings to songwriting. “In the story of the creation,” he wrote, “God makes the world, and everything in it, in six days. On the seventh day he rests. The day of rest is significant because it suggests that the creation required a certain effort on God’s part, that some form of artistic struggle had taken place. This struggle is the validating impulse that gives God’s world its intrinsic meaning. The world becomes more than just an object full of other objects, rather it is imbued with the vital spirit, the pneuma, of its creator.” The same relationship plays out when AI enters a classroom. A teacher’s AI-produced assignment doesn’t hold any spirit, and in turn, a student has no reason to pour their spirit into their submission. Likewise, a teacher has no reason to pour their spirit into grading a student’s AI-generated submission. It’s very difficult for inspiration to flourish in such a suffocating environment.
A student at another Seattle independent school told me that his teacher admitted to using AI for grading. He put it simply: “Having AI grade my work makes me want to have AI do my work.” In a small community, your values have a lot of power—you must anticipate that your community will reflect your spirit to some extent. AI assignments also reduce creativity and passion to a grade. When the flow of human connection is obstructed, an assignment is just an assignment: a checkbox with no deeper meaning. In my years at Bush, the humanities department has brought me invaluable connections. For assignments, I’ve interviewed local business owners, learned about my family, found un-Googleable history facts, and met one of my closest friends after going to the Writing Center. If I had entered Bush with generative AI use normalized, I might have chosen the prevalent, easy route and missed out on all of those connections—because why wouldn’t I have taken the path of least resistance? Would I have ever become a writer? Would I have ever joined The Rambler or applied for my job?
Generative AI will always be an option, but it is our responsibility as community members not to choose it. If we, teachers and students alike, do choose it, we will lose the human connection that comes from effort. At that point, I would get more of a humanities education by walking outside and talking to a stranger than by using or responding to generative AI. Ultimately, AI creates a deficit of wonder, creativity, imagination, mistakes, and inspiration—the core of experiential education.
As I begin my last semester here, I ask Bush to restrict the use of generative AI in the humanities department on all levels: student and faculty, prompts, grading, and submissions. Using AI not only harms your intelligence and contributes to harmful exploitation; it also shows the community around you that such use is acceptable. Hard things are hard. All of us collectively need to choose the hard path, because it’s the spirited path—and the human path.
