Imagine this: A new technology has arrived, drawing enormous public discussion. Among the questions is how it might be used in schools. Advocates call for its widespread adoption, saying it will revolutionize schooling and motivate students. Others are wary, fearing it will make kids lazy.
The year is 1922; the leading advocate is Thomas Edison, who has a special place in my heart for founding what became General Electric (GE) in my hometown of Schenectady, NY. He promised that his invention, the motion picture, “is destined to revolutionize our educational system”—even replacing textbooks.
A century later, it’s safe to say that Edison’s revolution didn’t play out exactly as he pictured (nor justify his critics’ worst fears), but it surely had profound effects. Movies didn’t replace the written word—nothing can—but they put new tools at teachers’ fingertips. Like waves of technological innovation that followed, movies also presented schools with crucial choices about their responsible use in ways that both benefit and protect students.
Introducing generative AI
And now comes generative AI (GenAI), best recognized as the chatbots that have exploded into popular awareness. Once again, schools will be a place for crucial decisions. Companies like ours—the largest provider of learning solutions to K-12 schools—have a role here, and I believe we should publicly state principles to keep students and teachers at the center of GenAI development.
Half of educators say they are already using generative AI, and that it is saving them time. Recent studies suggest that teachers spend over 50% of their time on non-teaching tasks—imagine what could be possible if they spent more of that time directly connected to students and teaching.
That idea, I believe, only scratches the surface of AI’s potential benefits. AI tools can enhance teachers’ productivity by helping them plan lessons and activities, convert text into presentations, and create summaries of texts—just to name a few. AI tools can also enhance students’ literacy learning with personalized learning experiences—such as providing teachers with suggested feedback and revision guidance on student writing.
It’s exciting, but AI will earn the trust of schools, teachers, families, and education leaders only if it’s used with wisdom, guidelines, and safeguards that ensure it genuinely supports teachers, benefits students, and never compromises children’s privacy or safety. That’s why we’re outlining five recommended principles we believe should guide the responsible adoption of AI technologies in K-12 schools.
Keep teachers at the center
The teacher-student relationship is crucial. We believe in a “high-tech, high-touch” approach in which technology supports, rather than mediates, this connection. Teachers are closest to the educational experience, and their voices must inform the development of new technologies intended to serve them.
Teachers will need support and professional development to build “artificial intelligence literacy” so they can use the technology effectively in the classroom. Most educators (76%) say they need training on ethical AI use and its integration into the classroom.
Uphold student privacy, safety, and well-being
Protecting student privacy and data is non-negotiable. Existing federal laws provide strong protections that must extend to the new uses of data that GenAI makes possible. Many state laws also protect children’s and students’ privacy, and third-party organizations must uphold and promote data privacy and student safety.
Lawmakers should ensure that existing laws and regulations properly account for and clarify how these levers can be used or applied to GenAI.
Ensure responsible and ethical use
Families need to understand how GenAI is being used in schools—without being overwhelmed with information that’s too detailed or technical to understand. Federal and state policymakers should work with AI experts to determine appropriate disclosure requirements and provide guidance for how districts and schools can access the information they need about GenAI systems they choose to use.
Encourage continuous evaluation and improvement
Systemic integration of AI into education technology and practice requires analysis of which strategies work, for whom, and why. Creating a culture of ongoing evaluation and improvement will ensure the technologies genuinely support teaching and learning. Even these trials must include guardrails to protect student privacy, safety, and well-being.
Prioritize accessibility and inclusivity
As classrooms become more diverse in demographics and learning needs, GenAI tools can equip teachers with personalized approaches, recommendations, and supplemental materials to meet each student’s needs. As new bias, equity, and accessibility considerations emerge with the use of GenAI, regulations need to evolve.
Our schools, like our society, face the task of defining guardrails for a field that’s evolving with astonishing speed. Policymakers and companies like ours must put empathy, safety, and privacy at the forefront to maximize the benefit these technologies can bring to teaching and learning.
Jack Lynch is CEO of HMH.