AI in education

The Department for Education (DfE) has outlined its position on Generative Artificial Intelligence (AI/GenAI) in education, highlighting both the transformative potential of these tools and the responsibilities that schools, colleges and policymakers must uphold to ensure technology is used safely and effectively. 

In this blog, we break down the key messages from the DfE’s guidance and explore what they mean for the future of teaching, learning and governance across the education sector. 

At Price Bailey, our dedicated Internal Audit Team are now offering ‘AI in education’ as a new internal audit review option. You can contact us about this here.

What is Generative AI?

Generative AI refers to tools that can create new content, including text, images, code, simulations, audio and video, based on large datasets they have been trained on. Well-known examples include ChatGPT, Microsoft Copilot, Google Gemini, and similar large language model (LLM)‑powered applications. These tools are designed to respond to prompts, complete written tasks, answer questions and generate content in a human‑like way.

How can AI support education?

The DfE highlights several key benefits:

  • Reducing administrative burden

Teachers and leaders face significant administrative pressure. AI can streamline tasks such as drafting emails, producing lesson resources, summarising information and generating templates.

  • Enhancing feedback and personalised support

Research shows high‑quality feedback has one of the greatest impacts on student outcomes. Generative AI tools can support educators by helping them provide tailored feedback quickly and consistently.

  • Supporting teaching and learning

GenAI can be integrated into lesson planning, knowledge checking, differentiation, and resource creation. It can help educators better understand pupil progress, enabling personalised learning pathways. It can also create quizzes and revision materials.

  • Creating accessible and inclusive learning environments

Generative AI can adapt content to meet diverse learning needs, for example, by converting text into alternative formats or simplifying complex language.

What are the risks of AI in education?

The DfE is clear that, despite these benefits, generative AI can still produce content that is:

  • Inaccurate or misleading 
  • Biased 
  • Inappropriate or unsafe 
  • Out of date 
  • Infringing on intellectual property 
  • Convincingly wrong (known as “hallucinations”)

AI tools generate content based on vast training datasets, not the national curriculum, so teachers must apply professional judgement and verify accuracy. Whilst AI can support expertise, it cannot replace it.

What the DfE expects when using AI safely and effectively 

The guidance encourages a strategic, risk-based approach to adoption:

  • Clear governance and accountability 
  • Robust testing and compliance 
  • Risk assessments 
  • Staff training and awareness 

The DfE also warns that generative AI could increase the sophistication of cyberattacks, making strong cybersecurity practices more important than ever. 

Does your academy trust’s risk register include AI-specific risks? If not, it is advisable to add them to a watch list within the risk register.

What governance around AI in schools should look like

AI governance is essentially the framework that ensures AI is used safely, ethically, and effectively. In schools, that means balancing innovation with responsibility and making sure decisions are transparent, consistent, and aligned with statutory duties. 

Below are the core pillars of strong AI governance in education.  

Clear policies and leadership ownership

Schools need explicit policies that set out: 

  • How AI can and cannot be used  
  • Expectations for staff and students 
  • Rules for homework, assessment, and academic integrity 
  • Procedures for evaluating new AI tools 

Leadership teams should own these decisions. The DfE’s early research shows that schools benefit from having AI leads or “AI champions” who coordinate safe adoption and support colleagues.

Does your academy trust have an AI policy in place? 

Risk assessment before adoption

Before any AI tool is used, schools should carry out a structured risk assessment covering: 

  • Safeguarding 
  • Data protection 
  • Age-appropriateness 
  • Bias and misinformation risks 
  • Cybersecurity 
  • Copyright and intellectual property 
  • Impact on teaching and learning 

Strong safeguarding and filtering controls

AI tools must be covered by the same filtering and monitoring systems that apply to all online content in schools. This includes: 

  • Blocking unsafe AI tools 
  • Monitoring student interactions with AI 
  • Ensuring age-restricted tools are not accessible

Data protection and privacy compliance

Schools must comply with UK GDPR and data protection law. That means: 

  • Not uploading personal data into AI tools without a lawful basis 
  • Ensuring student work is not used to train AI models 
  • Checking where data is stored and processed 
  • Reviewing vendor privacy policies 

Copyright and intellectual property controls

Schools must ensure: 

  • They do not use AI tools trained on unlicensed content in ways that breach copyright 
  • Students’ original work is protected 
  • Staff understand how AI-generated content can and cannot be reused 

This is an area where many schools need additional training. 

Academic integrity and assessment policies

AI changes how students complete work, so governance must include: 

  • Clear rules on acceptable vs. unacceptable AI use 
  • Updated homework and coursework policies 
  • Staff training on identifying AI-generated work 
  • Assessment design that reduces opportunities for misuse 

Ongoing monitoring and review

AI evolves quickly. Governance must be: 

  • Reviewed regularly 
  • Updated as tools change 
  • Responsive to new risks 
  • Informed by staff and student feedback 

Staff training and digital literacy

Governance only works if people understand it. Schools should invest in: 

  • Staff training on safe AI use 
  • Guidance for students 
  • Parent communication 
  • Professional development on AI literacy 

The DfE emphasises that confident staff are essential for safe adoption. 

AI Policy engagement calendar

Embedding responsible AI use across the year in your academy trust can look like this.

A PDF copy of this calendar is available to download here.

Internal Audit services provided by Price Bailey 

Want to strengthen your academy trust’s AI governance? 

At Price Bailey, our dedicated Internal Audit Team are now offering ‘AI in Education’ as a new internal audit option. This review focuses on the governance, oversight and use of AI across an academy trust, evaluating whether policies, controls and practices support safe, compliant and effective implementation.

Closing remarks  

When leadership teams put the right structures in place, AI becomes a powerful tool that reduces workload, supports learning, and prepares students for a digital future. If you have any questions, feel free to reach out to us using the form below. 

We always recommend that you seek advice from a suitably qualified adviser before taking any action. The information in this article only serves as a guide and no responsibility for loss occasioned by any person acting or refraining from action as a result of this material can be accepted by the authors or the firm.

Have a question for our Academies team? Ask us below...