Artificial Intelligence Is Already Changing How Teachers Teach

A smartphone displaying the ChatGPT logo sits on a computer motherboard in an illustration taken Feb. 23, 2023. (Reuters)

Mike Harris knew it could take a full day to design a new class he was teaching at a Kansas high school. But before he started last winter, he turned to artificial intelligence, curious about how it might help. He asked ChatGPT to outline a 16-week drama class at a large urban school, using his state’s standards.

It took two or three minutes.

Then he pressed again: Could it now craft daily lesson plans, tied to state and school district standards, for 90-minute classes every other day? In minutes, they appeared.

“To me, that’s the wonder of the tool,” said Harris, who’s in his 10th year as a teacher in Wichita. “This is one of those once-in-a-millennia technology changes.”
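
For readers curious what such a request looks like under the hood, ChatGPT's maker, OpenAI, exposes its models through a programming interface as well as the chat window. A minimal, hypothetical sketch in Python might look like this; the model name and prompt wording are illustrative, not Harris's actual input:

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

# Illustrative prompt modeled on the request described in the article.
prompt = (
    "Outline a 16-week high school drama course for a large urban school, "
    "aligned to the state theatre standards. Then draft daily lesson plans "
    "for 90-minute classes that meet every other day."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any chat-capable model would do
    messages=[
        {"role": "system", "content": "You are an experienced drama teacher and curriculum designer."},
        {"role": "user", "content": prompt},
    ],
)

print(response.choices[0].message.content)
```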

As AI jolts education, public school teachers and university professors are discovering that it is not just for students. Educators are using it to help develop tests, generate case studies, write emails and rethink teaching strategies.

Increasingly, they see AI as both friend and foe – with the capacity to enrich learning, spur creativity and save time on tasks, even as it raises alarms. Some public schools have blocked access to it, including those in New York City and Los Angeles, citing concerns about student learning and cheating.

The technology is not always accurate, sometimes making mistakes or inventing facts. But many educators say they use it with caution, sticking to topics they know deeply and watching for errors.

Facebook groups about AI have drawn teachers and professors from across the globe – looking for tips, posting about successes, raising ethical quandaries. More than 250,000 people have signed on to one of the more popular pages, “chatGPT for Teachers,” which launched this spring.

Harry Pickens, an instructional designer, has watched the exploding interest in awe. He wishes AI had been available to his mother, who was a teacher and spent long evenings on lesson plans and student papers. “Most teachers get into it because they really love kids,” Pickens said. “I see ChatGPT as one tool to get back to the things that really matter.”

In the months after ChatGPT was released in late 2022, K-12 teachers began to discover its potential to produce rubrics, lesson plans, quizzes, classroom management strategies, step-by-step answers to math problems and more. Some AI tools, including ChatGPT, are free, while others require payment.

Sarah Alvick, an English and social science teacher outside Spokane, Wash., found the generative bot can also be especially helpful for students who struggle with writing and might otherwise spend days staring at a blank page, unable to come up with an idea.

“You’ll have a kid who sits for a whole week, saying, ‘I don’t know what to write about,'” she said.

This spring, many of her sixth-graders used AI to help write letters to public officials. Alvick used it to rewrite readings about current events at different levels so that her 12-year-olds could grasp various topics, including Pride Month.

In the coming school year, students may be able to use ChatGPT to decide on the structure of an essay, she said, or to get feedback on something they’ve written – with one key provision: “only using it to assist you, not using it to do it for you.” Students who cheat with it will lose a lot, she said.

“I worry about a loss of critical thinking,” she said.

But she has seen it pay off for an after-school Dungeons & Dragons club she advises. Her students use it to help write missions, and it’s working well, she said. “It’s really good at organizing language,” she said.

For some, AI has led to greater creativity, said Melissa Wright, the executive director of the Center for Engaged Pedagogy at Barnard College: assignments that are less robotic and stray further from the beaten track. Some faculty use it to reach multiple types of learners, asking for three different examples of a concept to find one that might click.

In university computer science classes, professors have told students to compare the code they write with that generated by AI. In English classes, they’ve encouraged students to use the bots to get started on their essays.

Most essays generated by ChatGPT aren’t good enough to satisfy a college professor, said Wright, who is also an affiliate faculty member in English at Barnard. But some faculty allow students to use chatbots for more formulaic writing such as policy memos. In a photography class, she said, an instructor told students that they could use the technology to help generate artist statements explaining their projects, as long as they were transparent about the process.

Professors are also turning to the technology for more routine tasks. Wright said one of her colleagues uses AI to rewrite emails to students when he’s feeling annoyed with them, asking ChatGPT to lighten his tone.

“Humans are still needed,” she said, to check the work. But the bots save plenty of time and energy.

Stuart Geiger, who teaches communication and data science at the University of California at San Diego, completely redesigned a class on the social and cultural implications of data and AI after ChatGPT burst onto the scene. He talked to the students about how large language models work. He required students to use AI for an assignment and then evaluate the result as though they were professors grading a student’s work. In some cases, he said, students got full marks for an essay with errors generated by the bot – because the students identified the problems and cited authoritative sources to provide accurate information.

His policy now requires students to write a reflective passage about the writing process alongside each assignment, documenting what sources they’re using and how a site such as Google Scholar or a certain database might skew their results. They can use AI but must provide their full dialogue with the bot – to show how it helped them get past writer’s block or refine an idea in creative writing.

One of the dangerous things about ChatGPT is that it sounds so authoritative, he said. The bot can answer every question – but not always correctly. In his research, he has found that many people seem to accept its answers at face value.

“There’s a reason we have therapists, doctors, stockbrokers, accountants,” he said. AI isn’t about to replace that expertise, but it can echo the language they might use. “I’m always surprised,” he said, “at what it does not refuse to answer.”

Ronak Shah, a science teacher in Indianapolis, introduced the technology to students last spring, challenging them to show they knew more than the bot.

“You guys don’t want a robot to be smarter than you,” he teased his seventh-graders. They jumped at the challenge.

Shah gave them a five-paragraph essay the software wrote, asking his students to edit and improve it. Some found words that they considered inaccessible or imprecise. One found a quote that appeared to have been completely made up. Some said it was a good essay.

Shah said it was probably worth a B.

More often, Shah uses ChatGPT to “level” science texts to individual student reading levels, which he said is “super helpful, especially when you have students who are super far behind.” The finessing helps students grasp science concepts without navigating dense vocabulary, he said.

One example: When he teaches plate tectonics, he first shows a time-lapse depiction of the growth of the Himalayan mountains, then asks that students build a mountain range using modeling clay. Students are often interested by that point. But the reading that follows – about Mount Everest and climbing expeditions – can become frustrating if it’s mismatched to their skill level.

Now, Shah can paste a passage into ChatGPT and it’ll return a version at a grade level or Lexile of his choosing. Doing it without the bot could take hours, he said.
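
Shah’s leveling trick boils down to a rewrite request with a target reading level attached. A hypothetical sketch of the same idea through OpenAI’s Python client follows; the passage, target level and model name are placeholders, not his actual materials:

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

passage = """Mount Everest has drawn climbing expeditions for a century,
but the mountain's altitude, weather and terrain make every attempt
dangerous..."""  # placeholder text, not an actual classroom reading

target = "a fourth-grade reading level"  # placeholder; a Lexile band works too

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {
            "role": "user",
            "content": (
                f"Rewrite the following science passage at {target}. "
                "Keep every fact and concept; simplify only the vocabulary "
                f"and sentence structure.\n\n{passage}"
            ),
        }
    ],
)

print(response.choices[0].message.content)
```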

Many teachers remain reluctant to use the technology, but Shah expects that to steadily change. For him, it is an imperative.

“At the very minimum, I want to expose students to it, so they know it exists, and I want to make sure they know what it can be used for and the fact that I use it too,” he said. It’s akin to the advent of calculators, he said: At first, the device was seen as a way to cheat, but ultimately it became a tool for working through complicated math problems.

Caleb Curfman typically asks students in one of his classes at Northland Community and Technical College in Minnesota to design an ideal government.

This spring, they asked ChatGPT instead.

The students were surprised when the first response ended with “it must be a democracy,” he said. Many laughed. But it led to a conversation about the potential for bias in AI, as well as a discussion of what government is really for.

When they asked ChatGPT for a Socratic critique of the argument that the best government would be a parliamentary system with a president and a Bill of Rights like the United States has, it spat out questions such as: How would this system address the potential concentrations of power? Are there checks and balances in place to prevent abuses of power? How would this system ensure representation and participation of diverse groups in society? Does it adequately protect minority rights and prevent majority tyranny? How would this system promote transparency, accountability and effective governance? Are there mechanisms in place to prevent corruption or ensure responsive decision-making?

It enriched the discussions in class, Curfman said. And that’s not all. “The projects have become better,” he said. “They’re trying to stump it. It added a little bit of competition” that brought energy to a class assignment.
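
The Socratic critique Curfman’s class requested follows a simple pattern: constrain the bot to ask probing questions rather than hand out answers. A hypothetical sketch of such a prompt through OpenAI’s Python client, with illustrative wording throughout:

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

student_argument = (
    "The best government would be a parliamentary system with a president "
    "and a Bill of Rights like the United States has."
)  # stands in for a student's submitted proposal

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {
            "role": "system",
            "content": (
                "Offer a Socratic critique: respond only with probing questions "
                "about the argument's weaknesses, never with answers or opinions."
            ),
        },
        {"role": "user", "content": student_argument},
    ],
)

print(response.choices[0].message.content)
```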

ChatGPT made its teaching debut at Bear River Middle School in the spring, shortly before school ended. There, ninth-graders were studying the classic novel “To Kill a Mockingbird,” and the generative bot took on the persona of the unsavory character Bob Ewell.

Students asked why he was racist.

“Now, hold on just a minute there,” the bot-as-Ewell said. “Who says I’m a racist? Just ’cause I had some issues with that Atticus Finch fella and accused Tom Robinson of somethin’ he didn’t do doesn’t mean I’m a racist. I was just tryin’ to protect my family and my reputation.”
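
A persona like that can be set up with a single instruction telling the model whom to play. Here is a minimal, hypothetical sketch using OpenAI’s Python client; the wording and model name are illustrative, not the school’s actual setup:

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

persona = (
    "You are Bob Ewell from 'To Kill a Mockingbird.' Stay in character, speak "
    "in his voice, and answer students' interview questions as he would."
)  # hypothetical system prompt, not the school's actual wording

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {"role": "system", "content": persona},
        {"role": "user", "content": "Why are you racist?"},
    ],
)

print(response.choices[0].message.content)
```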

The Ewell exchange was part of a lesson that started with brainstorming about themes. Students then broke into small groups, did interviews with characters via ChatGPT, and eventually worked on posters, group presentations and a written assignment.

The familiar was made fresh with the new.

“We wanted to show kids how much it could do,” said Tyrell Neal, an English language arts coach at Bear River. “We were finding that kids only know one thing about AI – that it can type papers for them or do their homework for them.”