As artificial intelligence makes its way into schools, a paradox is emerging.
Many educators, concerned about cheating and shortcuts, are trying to limit student use of A.I.
At the same time, teachers are increasingly using A.I. tools themselves, both to save time on rote tasks and to outsource some of their most meaningful work, like grading essays and tutoring struggling students.
That tension has prompted some difficult ethical questions. For example, is it fair to use A.I. to grade student essays, if you’ve prohibited students from using A.I. to write them?
School leaders are grappling with these dilemmas as they confront a barrage of marketing claims around how A.I. could “transform,” “personalize” and “accelerate” learning.
A.I. “is already being used by the majority of teachers and students,” said Jennifer Carolan, a former history teacher and founder of Reach Capital, a venture capital firm that invests in A.I. learning tools.
But as the technology works its way into schools, some educators say they are concerned that tech companies are pouring resources into A.I. applications, like tutoring bots, that disrupt the human relationships at the core of teaching and learning — instead of creating tools to ease the bureaucratic burdens that shift adults’ attention away from children.
Cheating, or Homework Help?
Among middle school students, word has gotten out about a solution for tricky math assignments. If you take a photograph of a problem and feed it into one of several free A.I. apps, the software will show you the correct answer and break the solution down step by step.
It’s easy to then copy those steps out, exactly as if you had solved the problem by hand.
Alex Baron, an administrator at E.L. Haynes Public Charter School in Washington, D.C., said he considered the widely used math apps a form of cheating.
But he acknowledged that he has found some compelling uses of A.I. in his own work. For instance, he can analyze students’ academic and behavioral data, and then split them into groups for targeted support.
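The article does not describe the software Mr. Baron uses for that analysis. As a minimal sketch of the kind of grouping he describes, assuming hypothetical score and behavior columns and the scikit-learn library, a clustering approach might look like this:

```python
# Illustrative only: the data fields and group count here are
# hypothetical, not from E.L. Haynes.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Each row is one student: [reading score, math score, absences, referrals]
students = np.array([
    [72, 65, 3, 1],
    [88, 91, 0, 0],
    [55, 60, 8, 4],
    [90, 85, 1, 0],
    [60, 58, 6, 3],
    [78, 80, 2, 1],
])

# Put the columns on a common scale, then split into three support groups.
features = StandardScaler().fit_transform(students)
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

for row, group in zip(students, groups):
    print(f"reading={row[0]} math={row[1]} absences={row[2]} "
          f"referrals={row[3]} -> support group {group}")
```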
Several of the popular math apps that concern Mr. Baron are owned by Google, including Photomath and Google Lens.
Robert Wong, Google’s director of product management for learning and education, said the tools are invaluable for students whose parents cannot help them with math homework.
He suggested that cheating had less to do with access to A.I. than with “other factors, like, are students engaged in the class?”
Indeed, educators experimenting with A.I. are seeking to solve the perennial problem of how to get students more excited about learning.
In Llano, Texas, Maurie Beasley, a school district technology administrator, has suggested to teachers that they use A.I. to personalize assignments.
With activities written by a chatbot, some students can work on a word problem about velocity using the example of a speeding baseball, while others consider a dancer leaping through the air.
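The district's prompts are not shown in the article. As a hypothetical illustration, personalizing an assignment can be as simple as filling each student's stated interest into a prompt template before sending it to a chatbot:

```python
# Hypothetical prompt template for chatbot-personalized word problems.
# The wording and interests are illustrative, not from the Llano district.
PROMPT = (
    "Write a one-paragraph physics word problem about velocity for an "
    "8th grader, using {interest} as the scenario. End with the question "
    "'What is the average velocity?' and do not include the answer."
)

def personalized_prompt(interest: str) -> str:
    """Fill the template with a single student's stated interest."""
    return PROMPT.format(interest=interest)

for interest in ["a speeding baseball", "a dancer leaping through the air"]:
    print(personalized_prompt(interest))
    print("---")
```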
Then there are the gray areas.
Teaching Transparency
In Providence, R.I., middle school history teacher Jon Gold has found generative A.I. useful in lesson planning.
He trained ChatGPT by feeding it dozens of pages of curriculum materials he wrote over many years. That helped the bot spit back useful material. It can edit a long reading assignment down to three paragraphs for a short exercise, or create dummy essays that illustrate for students the difference between an effective essay and one that lacks supporting evidence.
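Mr. Gold's exact workflow is not detailed in the article. The sketch below shows one way the condensing task could be scripted, assuming the OpenAI Python client (openai 1.0 or later); the model name, prompts, and file are illustrative assumptions, not his actual setup:

```python
# A minimal sketch of the condensing task described above, assuming the
# OpenAI Python client. Everything specific here is hypothetical.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def condense_reading(full_text: str) -> str:
    """Ask the model to cut a long reading down to three paragraphs."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice
        messages=[
            {"role": "system",
             "content": "You are editing middle school history readings. "
                        "Preserve key facts and dates; keep the reading "
                        "level steady."},
            {"role": "user",
             "content": "Condense this reading to exactly three paragraphs "
                        "for a short in-class exercise:\n\n" + full_text},
        ],
    )
    return response.choices[0].message.content

# Usage (hypothetical file):
# print(condense_reading(open("unit3_reading.txt").read()))
```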
Transparency is key, he said. He explains to students exactly how he has used A.I., in part to model ethical use.
Asking a chatbot to summarize notes into a study guide is a good idea, for example. But he does not want students using A.I. to draft essays or conduct research. He tells them that finding various sources of information and synthesizing them in writing is key to learning history.
He also talks with students about knotty ethical issues around how chatbots rely on copyrighted material and consume an immense amount of energy.
“I am more pro-A.I.-literacy than I am pro-A.I.-use,” said Mr. Gold, who teaches at Moses Brown School, a private Quaker academy.
Writing Help (for Teachers)
Writing is one of the most challenging tasks for students, which is why it is so tempting for some to ask A.I. to do it for them. In turn, A.I. can be useful for teachers who would like to assign more writing but have limited time to grade it.
Companies like MagicSchool and Brisk Teaching already offer A.I. products that give instant feedback on student writing.
Automated scoring is also happening in high-stakes scenarios — on exams that determine whether students graduate from high school.
In 2020, the state of Texas signed a five-year, $391 million contract with the education technology firm Cambium Assessment, in part to deliver automated scoring of student writing.
The technology is not generative A.I. with access to the open internet, but instead uses an older form of artificial intelligence trained on human-graded writing samples.
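Cambium's system is proprietary, but the general recipe the article describes (a model fit to human-graded samples) has a well-known baseline form. Here is a toy sketch, assuming scikit-learn and an invented dataset of scored essays:

```python
# Generic illustration of the older, non-generative approach: a model
# trained on human-graded writing samples. Not Cambium's actual system;
# the essays and scores below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Toy training data: essays with scores assigned by human graders (0-4).
essays = [
    "The author argues clearly and supports each claim with evidence.",
    "I like the story. It was good. The end.",
    "The essay presents a thesis, cites two sources, and concludes.",
    "It was about a dog and stuff happened.",
]
human_scores = [4, 1, 3, 0]

# TF-IDF features plus ridge regression: a classic scoring baseline.
scorer = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), Ridge(alpha=1.0))
scorer.fit(essays, human_scores)

new_essay = "The writer states a claim and backs it with one source."
print(round(float(scorer.predict([new_essay])[0]), 2))
```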
Earlier this year, Dallas school officials complained after some questions on state tests were graded by the software, and scores were lower than district leaders expected. When the district submitted about 4,600 student writing samples for regrading, about 2,000 received a higher score.
Jake Kobersky, a spokesman for the Texas Education Agency, said the adjustments were minor in the context of Dallas’s 71,000 writing samples. He said the state remained confident in the technology.
Studies show that human grading of writing, too, is prone to bias and error — and that advanced forms of generative A.I. may be accurate and consistent graders of simple writing tasks.
The Dallas superintendent, Stephanie Elizalde, acknowledged that even as she raised concerns about the state’s use of automated scoring, her own district has embraced A.I. It has used the technology to grade students’ practice Advanced Placement essays, she said, and to assist staff members in summarizing documents and analyzing large sets of data.
Dallas students are already using A.I. to conduct research, she said, and learning about the importance of verifying information they receive from chatbots.
“It’s irresponsible to not teach it,” she said of A.I. “We have to. We are preparing kids for their future.”
A.I. in Schools Is Big Business
Over the past two years, companies working at the nexus of artificial intelligence and education have raised $1.5 billion, according to an analysis by Reach Capital, the venture firm.
The biggest players in education technology, like Google, Microsoft and Khan Academy, are also heavily promoting A.I. for student research, tutoring and teacher lesson planning.
Mr. Wong, of Google, said the company’s vision for A.I. is to provide “a tutor for every learner and a T.A. for every teacher.”
Google’s Gemini chatbot, for instance, can probe students with questions that prompt them to demonstrate and practice what they know.
School leaders are parsing which of these technologies might become essential and which should be passed up.
Mr. Baron, the administrator in Washington, D.C., said one company had pushed an A.I. product that watches video footage of teachers teaching, and offers feedback.
Teacher observation and evaluation are among the most complex and important tasks he does, he said. Instead of outsourcing that job, he would prefer an A.I. tool that helped him maintain his school’s master schedule and search for substitutes — chores that routinely eat up hours of his day.
“I really want the teacher to be reading students’ work and helping them become better writers. I want school leaders to observe teacher practice,” he said.
Ms. Carolan, the Reach Capital partner, said marketing around A.I. could sometimes be “overly aggressive.” She added, “It’s important for everybody to be facile with the language and technology and be able to evaluate these products.”
But many educators — and policymakers — are not deeply conversant in A.I.
Los Angeles, for example, hired an inexperienced start-up to create an A.I. chatbot for students and families. The effort failed and the company’s chief executive was charged with fraud.
Many individual teachers are finding themselves navigating this terrain on their own, determining when A.I. is the enemy and when it could be a friend.
Mike Sullivan, a middle school math teacher in Brockton, Mass., estimated that about half of his students use problem-solvers like Google Lens. Some confine their use to homework help, but he has also caught students using A.I. tools during in-class quizzes.
“It’s just too easy,” Mr. Sullivan said. The experience has made him rethink the wisdom of relying on computers so heavily in the classroom.
Still, he would love it, he added, if he had access to an A.I. tool that would make his own life easier, by taking student work produced on paper and seamlessly transferring it to a digital grade book.
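No such product is named in the article; purely as a hypothetical sketch of what Mr. Sullivan describes, assuming the pytesseract OCR library and Pillow, with invented file names and score format, the pipeline might look like this:

```python
# Hypothetical sketch: scan paper quizzes and append scores to a digital
# grade book. Assumes pytesseract (with the Tesseract binary installed)
# and Pillow; the file names, score format, and CSV layout are invented.
import csv
import re

import pytesseract
from PIL import Image

def score_from_scan(image_path: str) -> str | None:
    """OCR a scanned quiz and pull out a 'Score: N' line, if present."""
    text = pytesseract.image_to_string(Image.open(image_path))
    match = re.search(r"Score:\s*(\d+)", text)
    return match.group(1) if match else None

def append_to_gradebook(student: str, image_path: str,
                        gradebook: str = "gradebook.csv") -> None:
    """Add one row per student; flag scans the OCR could not read."""
    score = score_from_scan(image_path)
    with open(gradebook, "a", newline="") as f:
        csv.writer(f).writerow([student, score or "NEEDS REVIEW"])

# Usage (hypothetical files):
# append_to_gradebook("J. Smith", "scans/quiz_smith.png")
```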