How Can European Schools Innovate Under The EU AI Act?

Artificial intelligence is being used in classrooms around the world, but how can schools in Europe adopt it while navigating the EU Artificial Intelligence Act?
A recent conference on AI in European schools brought together a group of education leaders to address this question. Educators around the world followed their discussions as they explored AI adoption. How to lead. How to adapt. How to prepare schools for a future that looks set to be very different from today.
The EU Artificial Intelligence Act
The Act entered into force in August 2024. As of June 2025, key provisions are already in effect, and the bulk of obligations apply from August 2026. Its goal is to ensure that AI systems are safe, fair and transparent.
The Act classifies AI by risk. Tools that pose an unacceptable threat to rights or safety are banned outright. High-risk systems must meet strict criteria around transparency, data governance, human oversight and security. Even systems considered lower risk have to meet new transparency rules.
The stakes are high. Non-compliance could cost companies up to €35 million or 7% of global turnover, whichever is higher. The rules apply to any provider or deployer whose system reaches users in the EU, regardless of where the firm is established.
The law has prompted debate, especially in the United States. American firms and policymakers have raised concerns over competitiveness, innovation and data transparency. There’s talk of regulatory overreach.
So, how should EU schools respond? How do they move forward in a way that meets this new standard, while still putting students and learning at the center?
That’s exactly what the conference set out to explore.
Starting with Urgency
As host of the event, I opened with urgency. I work strategically with schools worldwide, and with some governments, on AI adoption. I began by outlining current global developments in AI and emphasising the innovative mindset education must adopt in order to move forward with hope.
I asked the educators in the audience to lead with purpose but also acknowledged that change is emotional. It’s tiring. And not everyone feels ready. Leaders can’t ignore that. They need to support their teams with empathy, not just plans. If we can build trust, we can build momentum.
It’s not just about adding tech to what we already do or leaping recklessly into new trends.
Unpacking the EU AI Act in Education
Speaking next was Matthew Wemyss, author of AI in Education: An EU AI Act Guide and a leading figure on the EU AI Act in schools. His session was a practical primer on how schools can get started with the EU AI Act. He walked educators through what they needed to understand and do to begin the path to compliance.
The law does not treat all AI the same. Some AI tools present minimal or limited risk. AI systems used for determining student access to educational programs, evaluating learning outcomes, assessing appropriate education levels or monitoring student behavior during tests are listed as high-risk and carry stricter rules.
Wemyss was clear: compliance is not optional. But the Act is not just about avoiding fines. It is a framework to support responsible and transparent use of AI in education.
He framed his message around three key actions: assess, review and comply. Schools need to start by auditing what AI is already in use. That means knowing which tools are being used, what they actually do and who is responsible for them. This includes not just formal platforms, but also tools with built-in AI features used informally by staff.
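To make the audit step concrete, here is a minimal sketch of what such an inventory might look like if a school kept it in code. The record fields and the example entries are illustrative assumptions, not prescriptions from the Act or from Wemyss' talk:

```python
from dataclasses import dataclass

@dataclass
class AIToolRecord:
    name: str               # the tool or platform
    purpose: str            # what it actually does in practice
    owner: str              # who is responsible for it
    formally_adopted: bool  # or used informally by staff
    affects_students: bool  # touches learning, access or assessment

# Hypothetical entries for illustration only.
inventory = [
    AIToolRecord("Essay feedback assistant", "drafts formative feedback",
                 "Head of English", True, True),
    AIToolRecord("Slide generator", "builds lesson outlines",
                 "Individual teachers", False, False),
]

# Tools that affect learning or access deserve scrutiny first.
for tool in sorted(inventory, key=lambda t: t.affects_students, reverse=True):
    priority = "high" if tool.affects_students else "low"
    print(f"{tool.name}: owner={tool.owner}, review priority={priority}")
```

Even a simple register like this forces the questions Wemyss raised: what the tool does, who owns it and whether it touches student learning or access.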
From there, Wemyss encouraged schools to examine how those tools are being used. Are decisions fair? Are outputs explainable? Is human judgement involved? Schools should not take vendor claims at face value. If an AI tool affects student learning or access, leaders need to understand how it works. If providers aren’t compliant, the school faces compliance risks as a deployer.
Compliance, he explained, is not a checklist. It means building systems that are ethical, safe and appropriate to each school’s context. What is necessary in one setting may not apply in another. Even when using third-party systems, schools remain responsible as deployers. “Ask hard questions,” he said. “Get the clear documentation you need about compliance measures.”
He also urged schools to appoint someone who can lead on AI governance. Not just someone technical, but someone who can understand the ethical dimension and translate regulation into daily practice.
Wemyss’ closing message was practical: start now, but start smart. “You do not need to solve everything at once,” he said. “But you do need to know what you are working with.” Schools should be aiming for compliance by August 2026. Leaving it too late invites rushed decisions and overlooked risks.
Strategy Over Hype
Next, author and educational consultant Philippa Wraithmell took the conversation in a new direction. She’s worked with schools from Dubai to Dublin, helping them use digital tools with purpose. Her big message: don’t confuse activity with strategy.
AI isn’t helpful just because it exists. It’s helpful when it’s tied to a goal. Wraithmell showed how some schools are doing this well. They’re not just using AI to speed up grading. They’re using it to personalize support. To build better lesson plans. To give teachers real insights into how students learn.
But none of that happens by accident. It takes planning. It takes training. It takes trust. Wraithmell stressed that trust has to start with the teachers. If they don’t feel confident, the tech won’t stick. That’s why she recommends starting small. Pilots. Coaching. Time to reflect. And always, space for teachers and students to build together.
One of the most practical pieces of advice she shared was a simple decision matrix. For every AI idea, schools should ask: does this support learning goals? Is the data safe? Do teachers feel confident using it? If it doesn’t check all three boxes, they wait.
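The matrix is simple enough to express as a rule. Here is a minimal sketch, assuming the three questions map to yes/no checks; the function and argument names are illustrative, not Wraithmell's wording:

```python
# A minimal sketch of the three-question decision matrix.
def should_pilot(supports_learning_goals: bool,
                 data_is_safe: bool,
                 teachers_feel_confident: bool) -> str:
    """Proceed only when all three boxes are checked; otherwise wait."""
    checks = (supports_learning_goals, data_is_safe, teachers_feel_confident)
    return "pilot" if all(checks) else "wait"

# Example: a tool with unresolved data-safety questions stays on hold.
print(should_pilot(True, False, True))  # -> wait
```

The point is the all-or-nothing logic: a tool that fails any one of the three tests waits, no matter how impressive it looks.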
Her strongest point came near the end. “If your AI strategy doesn’t include the whole school community,” she said, “then it’s not really a strategy.”
Informed Governance
Al Kingsley MBE spoke last. He’s been in education leadership for decades, both in tech and in schools, and is a prolific author. He brought focus. He talked about governance, the part of school leadership that too often stays in the background.
Kingsley explained that schools need more than leadership. They need structures that support good decisions. Who approves new tools? Who monitors their impact? Who makes sure policies stay up to date?
He laid out a maturity model that boards can use to evaluate their readiness. Are they passive? Reactive? Strategic? Most sit somewhere in the middle. Kingsley challenged them to move further. He reminded everyone that if the people making decisions don’t understand AI, they’ll end up letting someone else decide for them.
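As a rough illustration, that self-assessment can be expressed as a simple scale. The level names echo Kingsley's model; the pairing of each level with one of the governance questions above is my own assumption:

```python
from enum import IntEnum

class BoardMaturity(IntEnum):
    PASSIVE = 1    # decisions about AI simply happen to the board
    REACTIVE = 2   # the board responds only after tools arrive
    STRATEGIC = 3  # the board sets direction before adoption

# Governance questions from the session, used here as prompts for the next step.
next_step = {
    BoardMaturity.PASSIVE: "Who approves new tools?",
    BoardMaturity.REACTIVE: "Who monitors their impact?",
    BoardMaturity.STRATEGIC: "Who makes sure policies stay up to date?",
}

current = BoardMaturity.REACTIVE  # illustrative self-assessment
print(f"Level: {current.name}. Next question: {next_step[current]}")
```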
He pushed for ongoing training. Leaders and governors need time and space to learn. Otherwise, the school will move forward with a digital blind spot.
He also stressed the need to bring parents into the conversation. Families want reassurance. They want to know how AI is used. And why. Kingsley said schools must be ready to explain both. Not with jargon, but with clarity. With examples. With honesty.
Mindset Over Tools
What tied the entire session together was not a single answer. It was a mindset. AI is here. But whether it becomes a tool for change or a source of confusion depends on how schools respond.
This is a moment for education to ask better questions. What do our students need? What do our teachers need? What do we want learning to feel like?
The schools that thrive will not be the ones that move the fastest. They’ll be the ones that move with intent.
This means setting goals before downloading tools. It means listening to teachers before writing policies. And it means being honest about what’s working and what isn’t.
So what now?
Use what you have. Learn what you don’t know. Invite your whole community in.
And do it all like the future depends on it.
Because it does.