Schools should be funded to run trials of potentially effective AI to check it does actually boost pupils’ outcomes, government experts have said.
A long-term strategy on the use of generative artificial intelligence (AI) such as ChatGPT in schools is also needed, the government’s open innovation team said today.
The Department for Education had asked the team to explore the opportunities and risks for AI in education, including proposals on what needs to change.
Writing for Schools Week today, academies minister Baroness Barran said: “The challenge is to make sure the enormous potential of AI can be put to work in schools, while keeping children safe from its risks.”
In November, 42 per cent of primary and secondary teachers had used generative AI in their role – a rise from 17 per cent in April.
Separately, the DfE has also updated its school technology standards on how devices should be accessible for pupils.
Here’s our round-up of everything you need to know …
1. ‘Flipped learning’ could increase
Experts warn a long-term generative AI strategy is needed to set “the direction” of travel. Long-term planning should explore how AI could change education models, including implications for the role of teachers and classroom-based learning.
For example, “flipped learning” may become more pronounced, experts said. This is where students engage with learning materials outside of the classroom and come to a lesson with basic knowledge to participate in more “interactive activities”.
This strategy should be “future-proofed to keep pace with technological advancement”.
They also recommend forums made up of students, experts and practitioners to share knowledge about future changes in AI.
2. Give schools funding to evaluate ed tech impact
Experts said there is a “growing need” for a larger evidence base to help educators make informed decisions about the effectiveness of genAI tools.
Key evidence gaps include its impact on pupils’ outcomes, especially for disadvantaged children and those with SEND.
Ministers should set “metrics that matter”, such as student outcomes over engagement, and ensure tools are pedagogically grounded and can be routinely evaluated.
This will require incentives and resources, as schools are “unlikely to do this themselves” and the ed tech sector has a “vested interest” in showing effectiveness.
They suggest making funding available for schools to run evaluations, as well as building on existing schemes such as the Oak National Academy curriculum quango.
3. Research funding needed to help teachers detect AI
As AI-enabled academic malpractice rises and becomes more sophisticated, it will become harder for teachers to identify its use, experts warn.
They say research funding is needed to support the development of tools for reliably detecting AI-generated work, as well as other initiatives that could help.
This includes watermarking, which embeds a recognisable unique signal into AI creations.
Safety, privacy and data protection accreditations could help reassure users.
4. Consider how to prevent ‘digital divide’
The curriculum should be updated to reflect how students use AI, or to integrate AI tools as an explicit part of learning and assessment.
It should also be changed to meet employer needs going forward, which will require collaboration between employers, government, awarding bodies and educators.
But experts warn generative AI could exacerbate “the digital divide” in education and there is already an emerging difference between state and independent schools’ use of the technology.
Government should consider how to support access for all teachers and students, they said. Evidence-informed guidance and advice should be easily accessible through trusted platforms.
5. ‘Be transparent on impact evidence’, Keegan tells edtech firms
Experts warn more research is needed to better understand the intellectual property implications of genAI, including the potential infringement of IP rights through the data fed into generative AI models.
Traditional educational publishers could be left behind, the report warns, as teachers and students turn to generative AI to produce educational resources.
“Support for educational publishers may be needed to ensure we have a sustainable publishing sector underpinning the education system,” it adds.
Speaking today at the BETT show, education secretary Gillian Keegan also said “we should have the same expectations for robust evidence in edtech as we do elsewhere in education.
“Ed tech businesses should be leading the way – being transparent with buyers and promoting products based on great evidence of what works.”
What schools need to know from the updated tech guidelines…
Last week, the DfE said schools should now assign a senior leadership team member to be responsible for digital technology, as part of updates to its technology standards guidance.
They should then create a minimum two-year strategy including what devices might need to be refreshed or replaced. Laptops should be safe and secure as well as energy efficient.
In another update today, schools were told devices and software should support the use of accessibility features including for disabled students.
Websites should be accessible for everyone and digital accessibility should be included in a school’s policy.
Perhaps the education system also needs to evaluate how it assesses learning, moving away from tasks that can be replicated with generative AI and towards hands-on, in-person, discussion-based activities.