There’s a reasonable chance that right now, somewhere in your school or trust, a member of staff is using an AI tool to write a report, plan a lesson, summarise meeting notes, or draft a parent communication. They might not have mentioned it. They might not even think it’s worth mentioning.
That is not necessarily a problem. It is simply where many schools now find themselves. AI tools have become easy to access, easy to test and, in some cases, hard to ignore. For staff managing heavy workloads and stretched budgets, any tool that promises to save time will naturally attract attention.
The issue for school leaders, MAT executives and governors is that adoption has outpaced any formal conversation about whether, how, or under what conditions schools should be using them.
Once tools begin to influence day-to-day operations, schools need proper policy, oversight and accountability in place.
The reality on the ground
Research from the Department for Education, alongside findings from teaching unions, points to the same trend: many teachers are already using AI in everyday work — creating resources, drafting feedback, summarising information and cutting down admin. In most cases, this is informal experimentation rather than a planned rollout.
There are real benefits here — time saved, workload reduced, resources generated faster. But benefit and risk can coexist, and where there is no policy, there is no consistency. Data may be entered into third-party platforms, content may be used without proper review, and student use can raise questions about whether work is genuinely their own. One department may be cautious, another more relaxed. One individual may understand the risks well, another may not have thought about them at all.
Why governance can’t wait
AI governance in schools is not an optional extra. Once staff begin using tools in live school environments, there needs to be clarity about what is acceptable, what is not, and who is responsible for oversight.
There are several areas where the absence of clear guidance creates genuine risk.
Data protection is perhaps the most immediate concern. Staff may paste information into an AI tool without fully considering where that data goes, how it is processed, or whether it could be retained. Schools need to know which tools can be used, what data can and cannot be entered, and how third-party providers are handling that information.
Even where names are removed, schools still need to think carefully about what may be identifiable. Schools need a consistent approach so staff are supported in making those judgements.
Accuracy, appropriateness and student use raise their own questions. AI-generated content is not infallible. If inaccurate, inappropriate or incomplete output is used in teaching, communication or operational decisions without human review, the consequences could be serious.
There is also the question of how students themselves are using AI, and what responsible use looks like in practice, particularly in relation to homework, coursework, online safety and digital literacy.
Responsible and equitable use also matters. If AI tools become part of teaching and learning, schools need to think about whether all students have equal access, whether bias may affect outcomes, and whether AI-assisted work is being assessed fairly.
None of these risks mean schools should avoid AI. They mean schools need a clear and consistent approach to using it well.
The role of leadership and governance
It may be tempting to treat AI as an IT issue or leave it with an enthusiastic early adopter. In practice, it reaches much further than that. AI adoption in schools touches teaching and learning, safeguarding, data protection, staff development, communications and operational practice. It is a whole-school leadership issue.
For senior leaders and governors, the priority is setting direction and asking the right strategic questions. What AI tools are currently in use across the school or trust? Is there a policy in place? What guidance are students receiving? How is compliance being monitored? How often will this approach be reviewed?
Governors in particular do not need detailed technical expertise, but they do need assurance that the school is taking a considered, documented approach and that appropriate policies are in place and reviewed regularly.
From informal use to formal policy
Before an AI policy is written, schools need a clear view of what is already in use, where it is being used, and which questions are starting to surface. In some schools, those questions will centre on data protection. In others, they may be more closely tied to safeguarding, assessment, workload or student use.
A formal approach also needs some agreed principles behind it. Without that, policy can quickly become a list of permissions and restrictions without much coherence. For most schools, that conversation is likely to include data protection, safeguarding, transparency, professional judgement and fairness. Our AI Policy – Key Considerations document highlights the main considerations schools are likely to want on the table at this stage.
That discussion also needs to reflect the reality of staff experience. In many schools, AI use has not arrived through a leadership decision, but through individual staff testing what is useful in practice. Formal policy is more likely to be workable when it takes account of that, while still setting clearer boundaries and expectations.
Student use also needs to be considered. A policy that focuses only on staff use is unlikely to answer some of the more difficult questions schools are now facing, particularly around homework, coursework, assessment, online safety and digital literacy.
And, like the technology itself, this is unlikely to stand still for long. Any school putting policy in place now will need to treat it as something to return to, refine and review, rather than a document that can be written once and left alone.
A sensible next step for schools and trusts
The schools and trusts that respond well to AI are unlikely to be the ones that rush in first. More likely, they will be the ones that take a measured approach, put governance in place early and give staff clear guidance on how to use these tools safely and responsibly.
That work starts with leadership, but it should be supported by the right procurement and governance structures too. In many cases, the greater risk is not simply choosing the wrong tool, but adopting AI without the right structure, oversight and safeguards around it.
As a DfE-approved ICT procurement framework, Everything ICT can support schools and trusts in making technology decisions through a compliant, well-governed process that reflects the realities of education.
In many schools, the conversation around AI is already under way, whether formally or informally. The priority now is making sure policy, oversight and decision-making keep pace with it.