
I assigned a writing prompt a few weeks ago that asked my students to reflect on a time when someone believed in them or when they believed in someone else.
One of my students began to panic.
“I have to ask Google the prompt to get some ideas if I can’t just use AI,” she pleaded and then began typing into the search box on her screen, “A time when someone believed in you.”
“It’s about you,” I told her. “You’ve got your life experiences inside of your own mind.”
It hadn’t occurred to her — even with my gentle reminder — to look within her own imagination to generate ideas. One of the reasons why I assigned the prompt is because learning to think for herself now, in high school, will help her build confidence and think through more complicated problems as she gets older — even when she’s no longer in a classroom situation.
She’s only in ninth grade, yet she’s already become accustomed to outsourcing her own mind to digital technologies, and it frightens me.
When I teach students how to write, I’m also teaching them how to think. Through fits and starts (a process that can be both frustrating and rewarding), high school English teachers like me help students get to know themselves better when they use language to figure out what they think and how they feel.
Unfortunately, it’s becoming harder to teach them that their ideas have value because they’ve subcontracted out their minds to their screens. They get their news on TikTok and YouTube and do their shopping based on ads they see in between the videos they watch.
One of my students told me there was no point to writing anymore for my class because now “AI just does it for us.” He doesn’t value the writing process because — despite how hard I’m trying — he’s constantly being bombarded with messages that he shouldn’t.
Whether it’s an advertisement for Grammarly on YouTube encouraging my students to add its new Chrome extension on their Google Docs or a video on TikTok enticing them to download the latest version of ChatGPT, my kids are constantly inundated with carefully curated messages that encourage them to be passive consumers in the classroom.
The messaging is so strategically targeted that it can give my students a false sense of who they are while deepening their dependence on these products.
It’s working. I see it every day. One of my 12th-graders told me he “can’t write even one sentence without Grammarly.”
“ChatGPT is right there with me all the time,” another student said, “like a friend.”
Many educators allow AI in the classroom for preliminary tasks, like building an outline or a first draft, so that students won’t feel they’re sneaking behind their teacher’s back. However, students also use AI for other assignments — like writing full essays — and claim that work as their own.
Some teachers have begun to require their students to write their essays by hand using a pen and blue books. “At least they’re off screens and writing their own words,” a colleague said to me recently. “Still, it feels as though we’re no longer teaching writing,” she continued. “It’s a diluted form of communication.”
If you believe, as I do, that writing is thinking — and thinking is everything — things aren’t looking too good for our students or for the educators trying to teach them.
In addition to teaching high school, I’m also a college instructor, and I see this behavior in my older students as well.
One of my undergrad students used AI to write all four essays assigned last quarter. It was easy to tell because the papers he turned in were full of generalizations expressed in boring, yet grammatically correct, sentences. When he came to class, he didn’t contribute to discussions because he hadn’t read the articles the essay prompts were designed to make him think critically about. When I asked if he’d used AI to do the work for him — even though the syllabus prohibited its use for these kinds of assignments — he said he did nothing wrong. He “did the assigned work,” he told me.
The “work” he did was to feed essay prompts into an AI generator and watch his papers be produced in just seconds. He didn’t engage with the material, yet he felt he came to class prepared because he did something. He didn’t do nothing.
During class I could see the light from his laptop screen reflected in his eyes, which widened as he scrolled. I thought I’d be angry. I’d spent hours preparing for this three-hour class. But I wasn’t mad. Instead, I became overwhelmed with sadness. In that moment — and there are thousands of such moments in an educator’s career — I felt I could not reach him. I couldn’t get to know him.
We’d spent hours together and never had a real conversation. When I asked him questions about his life, he replied with one-word responses. He needed the credit hours, but he contributed nothing and wanted nothing from me except a good grade. It’s not just the technology that’s hurting us, but the ideology of a transactional teacher-student relationship that privileges results over experience.
I’m old enough to remember class discussions before Big Tech made its way into schools, when students’ eyes widened as they realized something new for the first time — when ideas were born and developed in a classroom instead of via a superficial 15-second video that unfolds passively on a screen. These moments still happen, but they occur less and less frequently each year as our students become more dependent on what Big Tech companies offer them.
It’s certainly not my college student’s fault. Like my high school students, he’s been trained to be a passive consumer rather than an imaginative, innovative thinker when he’s in school. As a sophomore in college, he’s probably been using some form of AI to do his work since he was halfway through high school. He likely doesn’t know what it feels like to turn in writing he did himself — to own it and to take pride in the thought that went into it. I’m sure he isn’t aware of what he hasn’t felt. How could he be?
What a terrible disservice we’ve done to our youth. We expect them to monitor when and when not to use some of the most enticing technology we’ve ever encountered, yet we aren’t giving them convincing reasons not to turn to AI every time they face a challenge of any kind.
As a society, we’re not talking enough about the long-term effects on our kids, or about what it will mean for an entire generation of students to sail through school without learning the rudimentary skills designed to teach them how to think and problem-solve on their own.
And we’re not listening to the educators who are concerned. Seventy-two percent of college professors who said they’re aware of ChatGPT are concerned about its impact on cheating, but many of us don’t know what to do about it and don’t have the support we need to push back against it.
Meanwhile, educational institutions continue to uncritically embrace AI, moving at lightning speed to bring it into classrooms without thinking through its dangers, limitations and consequences. AI wasn’t designed for schools, yet we continue to operate as though it was.
Things are, however, looking great for Big Tech companies, which are earning billions of dollars by getting our students to outsource their minds — and, with them, the ability to think critically.
The monthly revenue of OpenAI, the company that created ChatGPT, hit $300 million in August 2024, up 1,700% from the beginning of 2023. Google, which has made its way into practically every classroom in the U.S., U.K. and India through its educator products, is now worth over $2 trillion. Yet, as these companies’ profits continue to soar, public schools remain massively underfunded in every state in the U.S.
When I assigned that essay prompt to my ninth-graders a few weeks ago, one of my students asked me, “How am I supposed to answer this?”
“Think about your own life,” I said. “Use language to write about it.”
“Nah,” he said. “I’m just going to zone out.”
I’ll continue to try to reach my students. It’s my job, and I’m passionate about it. However, the time and energy spent debating the merits of doing work without AI — or trying to root out when AI is being used without permission — take away time and energy that could be used for teaching the things I’ve been hired to teach.
Still, I will do whatever I can in hopes of making my students see the value in not always using AI, and what is possible without it. But I wonder — as do many of my colleagues who teach high school and college courses — to what end educators are fighting this uphill battle, trying to convince students why learning — and life — without always relying on technology should matter.
“Talk to you later,” my student said as he turned his attention to Google on his school-issued Chromebook, as if he were being sucked into his screen by some inescapable gravitational pull. His desk is 2 feet from mine, but in that moment, he was a universe away.
Liz Rose Shulman’s work has appeared in Slate, The Boston Globe, Newsweek, Los Angeles Review and Tablet Magazine, among others. She teaches English at Evanston Township High School and in the School of Education and Social Policy at Northwestern University.