Generative AI in Schools: What Leaders Need to Know Now

In a recent episode of The Table, I had the pleasure of sitting down with Sarah Hanawald, executive director of the Association for Academic Leaders and a former academic dean with more than 25 years of experience in independent schools. Sarah is one of the most thoughtful voices in the field right now when it comes to generative AI and education, and our conversation covered a lot of important ground — educator resistance, policy pitfalls, the real risks to students, and what school leaders can do to get ahead of this.

If you work in an independent school, this episode is worth your time. And if you don’t have time to listen, here are the insights that I think every school leader needs to hear.

The “Policy Whack-A-Mole” Problem

When I asked Sarah to describe where independent schools currently stand with generative AI, she didn’t pull any punches. “Schools are in a policy whack-a-mole phase,” she told me. “They react to something that happens, they add a new phrase in the policy or do an update, rather than thinking strategically.”

This reactive posture is understandable — generative AI is evolving faster than most institutions can keep up with. But Sarah makes an important distinction between problems that are complicated and problems that are complex. A complicated problem has a lot of moving pieces, but each one has a known solution. A complex problem requires iteration. You make a decision, watch its impact, and adjust. Generative AI is firmly in the complex category, which means there is no policy document that will hold. Schools need to build the capacity to move with it, not just write rules about it.

Why Educator Resistance Comes from the Right Place — But Still Causes Harm

One of the things I appreciated most about Sarah’s perspective is her genuine empathy for educators who are struggling with this shift. Resistance to AI, she explained, is rarely about laziness or antagonism. It runs much deeper than that.

“An educator’s professional and personal identity gets tied to their ability to guide student growth,” she said. “It’s not the content mastery — it’s that skill of passing knowledge to the next generation that AI destabilizes.” When you’ve spent decades watching students learn and grow through your guidance, it’s understandable to feel like something fundamental is being threatened.

Sarah also acknowledged the legitimate ethical concerns: bias baked into AI models, copyright violations in how those models were trained, and real environmental costs around data center energy and water use. These are valid concerns, and educators are right to name them. But she’s clear that opting out entirely is not a healthy response, because the cost of that decision lands on students.

What’s Actually at Risk When Schools Opt Out

This is where the conversation became especially important for school leaders to hear. Students are already using these tools. That’s not a future concern — it’s the present reality. And when the knowledgeable adults in their lives disengage, students navigate these tools alone.

“When adults opt out, students navigate alone,” Sarah said. “They shortcut their own learning, and in some cases they’re headed into dangerous territory — mental health, legal risks, their wellbeing.” She drew a direct parallel to what schools learned (often too late) about social media and online gaming: adult guidance matters enormously, and absence doesn’t make the problem disappear.

There’s also a larger competitive picture. As Sarah and I discussed, generative AI is not a technology that emerged in the US and then slowly spread outward. It arrived everywhere, all at once. Students whose schools engage thoughtfully with AI will be better prepared for a college environment and workforce where AI fluency is increasingly expected. Those whose schools refuse to engage will be asked to catch up on their own, without the scaffolding schools are uniquely positioned to provide.

The Assessment Reckoning Schools Aren’t Talking About

One of Sarah’s more striking observations was about what we lose when AI can simulate the outputs of learning. For years, schools have used student-produced text, images, and sound as a proxy for understanding what’s happening inside a student’s mind. AI complicates that, and it forces a question schools haven’t fully grappled with yet.

“We have to find other ways to figure out how students are growing,” she said. She was careful not to frame this as a crisis, but as a reckoning that was probably overdue. Special educators, she noted, have been pointing out for years that assessing learning through standardized outputs has always missed a lot of students. This moment may require schools to build richer, more varied ways of understanding student thinking — which is ultimately a better outcome, even if getting there is difficult.

Practical First Steps for Academic Leaders

When I asked Sarah what academic leaders can do right now, she was direct: get hands on with the tools yourself, and then carve out real time and space for your faculty to do the same. Not a mention in a faculty meeting. Not a suggestion to explore on their own time. Dedicated, supported, structured time to experiment.

She recommends CoLab, a group of independent school teachers designing and sharing prompts that go well beyond surface-level AI use. “It takes hands-on experience,” she said. “Get a little frustrated with it. Think about your relationship with your AI tool and how you want it to work with you.”

For leaders managing faculty through this change, Sarah offered a reminder that will resonate with anyone who has navigated institutional change before: change is a whole-organization process, but it happens one person at a time. Large faculty meetings and top-down announcements will not move people. What works is individual conversation — affirming an educator’s expertise and identity, then helping them think through how AI fits into what they already do well.

Using Mission as a Guide, Not Fear as a Driver

Perhaps the most clarifying thing Sarah said during our conversation was this: schools that approach generative AI from a place of fear tend to overcorrect. They pull back toward highly controlled, low-stakes assessments. They try to eliminate the variable rather than work with it.

“We have a mission that helps us prepare kids for a future we don’t really understand,” she said. “Our missions will really guide us if we let them.”

That reframe matters. Most independent school missions say something about preparing thoughtful, creative, adaptable young people for an uncertain world. That isn’t at odds with engaging with AI — it’s a mandate for it. Schools that let mission lead, rather than fear, are the ones most likely to find approaches that hold up over time.

Why Independent Schools Are Uniquely Positioned to Lead

Sarah closed our conversation with something worth sitting with. She believes independent schools are better positioned than almost any other type of institution to navigate generative AI thoughtfully — and potentially to model what this looks like for the broader education sector.

The reason? Community proximity. Independent school boards are close to their communities in a way that large school systems and corporations aren’t. That closeness allows for genuine dialogue about values, wellbeing, and appropriate pacing. It allows schools to make mistakes, learn from them, and correct course quickly. “We can afford to make some mistakes, re-correct, and see how we can help kids launch,” Sarah said. “We do that really well.”

That’s not complacency — it’s confidence in the model. Independent schools that lean into that strength, engage their communities honestly, and lead with their missions are the ones that will come through this transition in a position to be proud of.

One Thing to Try This Week

Sarah left listeners with a challenge that I think is genuinely worth trying: go to the paid generative AI tool you use most and ask it what it’s learned about you. Then bring it a problem you wouldn’t have thought to approach with AI before, and ask how it would suggest you handle it — given what it knows about you. Then ask it a clarifying question.

Simple. Specific. And a good reminder that these tools are more capable than a lot of us have tested them to be.

If you want to go further, explore what the Association for Academic Leaders offers — from asynchronous courses for AI beginners to workshops on building custom bots and monthly AI meetups for academic leaders who want to stay current together. Sarah and her team are doing important work, and independent school leaders would do well to know about it.

Bridget Johnson, Founder, Deans' Roundtable

Bridget Johnson, a former associate executive director, has worked in education for much of her career, primarily in independent schools and nonprofits. As a former dean of students and director of special programs, she has helped schools expand their offerings while maintaining their core values. Bridget now works as the founder of the Deans’ Roundtable and an independent consultant helping educational institutions implement data-driven strategies that support their unique missions.