Learning Curve

Harnessing AI to support educators

New York Teacher
Erica Berger

Educators gather at UFT headquarters for a workshop at the National Academy for AI Instruction, launched by the AFT and the UFT.

As artificial intelligence moves into public education faster than policy can keep up, educators are being asked to navigate powerful new tools with little guidance. To help fill that void, the American Federation of Teachers has released “Commonsense Guardrails for Using Advanced Technology in Schools,” a framework meant to ensure AI supports teaching and learning rather than undermines it.

AI’s rapid rise has left many of us struggling to keep pace. ChatGPT reached 100 million users just three months after its debut in November 2022, far outstripping the early adoption rate of the internet or the personal computer. Today, AI is embedded in industries from health care to banking, and education is no exception. Estimates show that up to 84% of K–12 students use AI for schoolwork, with even higher rates in middle and high schools. Educators are adopting it, too. A 2025 study found that 60% of teachers use AI in their work, with one-third using AI on a weekly basis.

Yet with few or no federal regulations, consumer protections or industry guidelines, educators face a “Wild West” where they must sort out AI’s risks and potential on their own. The AFT’s new guardrails are designed to support teachers and students in navigating AI and demonstrate how this new technology can be wisely used in education.

Created by and for educators, the framework comprises nine core values (see sidebar).

“Given the potential benefits and risks of AI, educators and policymakers must work together to make sure it’s used safely and responsibly,” said AFT President Randi Weingarten in her introduction to the framework. The core values, she said, “ensure these new technologies support teaching and learning, not control them.”

The guardrails are a focal point of workshops now underway at the National Academy for AI Instruction at UFT headquarters in Manhattan. “Every session we lead begins with the nine core values,” said Vincent Pilato, the academy’s chief operating officer. “They aren’t abstract principles — they’re practical expectations designed to protect both teachers and their students.”

The guardrails, he said, “give educators the confidence to explore AI’s potential while knowing how to keep their classrooms safe, ethical and focused on learning.”

Natalia Babushkina, an ENL and math teacher at John Dewey HS in Brooklyn, attended the academy's inaugural in-person workshop in October. She uses AI in her work to help with lesson planning, to differentiate materials and to scaffold algebra instruction for her English language learners.

The core values, she said, address the pitfalls of AI use in education.

“It’s very important to teach AI literacy,” she said. Incorrect use of AI can “result in either no learning or permanent frustration” for students. Students should not use AI as a crutch or “magic tool,” she said, but should be encouraged to see it as “a partner, a tutor or an editor.”

Babushkina noted that many educators are trying AI tools on their own without sharing their experiences. She would like to see more formal communities established “where educators can come together and collaborate,” especially by subject or grade level.

Brian Cheng, a school librarian at PS 229 in Queens who attended the same AI workshop, already incorporates information literacy into his work with students and sees AI literacy as a natural extension. Though the subject matter may switch from misinformation in advertisements to spotting fake AI-generated videos, the critical thinking strategies he teaches remain the same.

“We have to build students up to have a thirst to know the truth,” he said.

If a piece of content provokes a strong emotional reaction, he teaches students to pause and do some research before sharing it. “It takes some time to think about it and find some resources that are credible,” he said.

Cheng believes students also need to understand the environmental impact of AI — one of the AFT’s guardrails. “Before you go ahead and make that video, think about the actual impacts it might have on the environment and the energy it’s taking to create it,” he said.

“There’s so many things we need to teach students to be mindful of,” Cheng said.

“Our jobs have always been about molding the students to be good citizens, online and in person,” he said. With AI, he added, “our value is to teach them how to navigate it and be responsible.”