Morals vs. Convenience: What Lessons Is Bush Really Teaching Us?
- Chloe Gabrielson
- 1 day ago
- 4 min read
It was 5:21 pm on a mundane Monday night. I sat down and pulled up my French homework: Speakology: pratiquez le dialogue avec l’IA (practice the dialogue with AI). What followed was one of the least productive ‘conversations’ of my life, during which I half-intentionally claimed to have experienced homelessness, got asked what I personally was doing to stop coastal erosion in Côte-Aignan (a city I’ve never heard of), and was told that my two-word response “Je pense” (I think) was a great idea for solving the homelessness crisis.
Let’s take a step back, shall we?
The language department seems to have acquired a new, shiny toy. Speakology AI is a platform designed to help students efficiently practice speaking by using, as its name suggests, AI. It asks questions, replies instantaneously, grades you, corrects you, does your laundry, raises your children, etc. The student response has been mixed; Aylenn Collazos ‘27 sums up the general consensus pretty well with “It’s just a little weird.” As a reluctant user of the platform, I agree! It’s just a little weird. The way it always asks you a question is strange. The way it claims to have all the same hobbies and interests as you is strange. The way it interrupts conversations about homelessness to correct your pronunciation of “without housing” is strange. Weird isn’t bad, though, and weird certainly doesn’t seem to be reason enough for the language department to let go of its new prize.
AI can do what nothing else can. It’s fast, it’s efficient, and it’s a way to practice speaking a language as homework that doesn’t involve extensive work on the teacher’s side. In contrast, Boomalang—a platform where you talk to real native speakers—takes time, money, and scheduling (I, for one, have managed to miss not one but two of these appointments). In-class oral quizzes take an entire class period, with five minutes of talking on the students’ side. Conversation is a difficult thing to practice, and AI helps like no other.
Well. I guess that’s all. We have no choice! AI can do a thing we can’t do exactly the same with people! We just have to use it. Sorry folks. The show is over. Somebody build a new data center. Get ready to stop taking showers. Brace for the energy bill spike. Ignore those reports that AI platforms were trained by exploited workers making literal cents overseas.
Those among us who haven’t been asking ChatGPT to summarize every reading might have noticed a strange wording choice of mine: “AI can do a thing we can’t do exactly the same with people.” Besides the multitude of ethical concerns, I would also propose this: Speakology AI doesn’t even work. It doesn’t teach you to talk.
There are the weird, unnerving social cues, sure, but it also fundamentally isn’t teaching you to have a conversation. People speak differently from how they write, but it corrects you as if you had written down what you said. At one point in my conversation, I stuttered and repeated two words while trying to think of the right way to finish a sentence. In a real conversation, that would be perfectly acceptable; I restated the beginning of the sentence to provide clarity after taking a second to think. The AI, however, told me that I should have only said it once.
Within a few minutes of ‘talking’ to it, I could tell the way I was speaking had changed. I was waiting up to a minute just planning what I was going to say before recording my answer. I was rereading what it had said three or four times. I was talking slower and more deliberately, so the microphone could pick up my voice better. I wasn’t saying anything interesting, either; it didn’t care about the conversation, so why should I?
Conversations with AI, shockingly, teach you to speak like AI. I’m sure some of us have one friend who has already gotten a bit too used to using ChatGPT for every query and is speaking a little funny these days. It’s more dangerous with foreign languages, though, because we talk to so few people using that language already—often, only the teacher of your class and your fellow classmates.
And… circling back to the ethical concerns. It’s incredibly frustrating to go from one class, where we receive a four-page document with twenty sources on the environmental damage and negative mental health consequences of using AI, to another where we are forced to use it. It’s frustrating to be expected to discuss climate change in French, to bring up important moral conversations across the language barrier, and then stop thinking about the environment when it becomes complicated. The ethical concerns don’t disappear when it becomes convenient to ignore them. At the very least, the students want a choice. I’ve heard from my fellow students: “Bush is a place of privilege, and we should [use] our funds ethically;” “It ends up feeling more like an interrogation… it doesn’t feel like an effective way to boost our speaking skills;” and “It’s ironic for a class that had a unit on climate change to be using a tool that damages our environment.” We don’t like it. The planet doesn’t like it. It doesn’t even work. Why is it still here? Because it’s easy?
Is that the lesson Bush wants to teach us?