AI use grows in school trusts, but impact remains unclear

Multi-academy trusts are increasingly turning to artificial intelligence to support teaching, learning and school management, but evidence of its impact remains limited, according to new research from the Education Policy Institute (EPI).

The study explores how multi-academy trusts (MATs) are adopting AI, drawing on a review of existing research and discussions with trust leaders and sector experts. While AI tools are already embedded in many everyday school systems, the report finds wide variation in how strategically and effectively they are being used.

MATs reported using AI to help with lesson planning, assessment, communication and administrative tasks such as financial forecasting and HR. Advocates argue these tools can reduce teacher workload and enable more personalised learning, particularly for pupils with special educational needs or those who speak English as an additional language. However, participants warned that time savings do not always translate into reduced workload, and raised concerns about bias, transparency and reduced human interaction.

The research highlights different approaches to AI adoption. Some trusts favour bottom-up experimentation led by teachers, while others take a more centralised approach, embedding AI oversight within existing safeguarding and data protection policies. Leaders stressed that successful use of AI depends on clear educational goals, strong digital literacy and robust accountability measures.

Despite government investment in AI tools and infrastructure, many trust leaders described a gap between AI’s promise and its real-world impact. A lack of clear national guidance has left MATs navigating what one participant described as a “wild west” of AI products, often with limited evidence to inform decision-making.

The report also raises legal and ethical concerns, including data protection, algorithmic bias and growing inequalities between schools with differing levels of digital infrastructure. Smaller trusts, in particular, face challenges in evaluating AI tools and negotiating with private providers.

EPI’s report calls for stronger government oversight, clearer guidance on AI and digital literacy, and greater collaboration across the sector. It recommends that larger trusts support smaller ones, and that further independent research be prioritised to ensure AI adoption benefits pupils, teachers and schools alike.