Artificial Intelligence (AI) represents the most seismic disruption to our lives since the industrial revolution and the invention of the internet. It has the power to change our lives for the better, but it also risks dislocating much of the connective tissue that binds our social fabric.
AI offers unparalleled, rapidly evolving opportunities to improve learning and achievement for all our young people, reduce workload meaningfully for our teachers and leaders, and transform how we address issues such as poor attendance and behaviour.
I asked ChatGPT how an incoming UK government should create policies to support the use of AI in education. You can read its answer – generated in less than a minute – here. In addition, the government recently published a report on Generative AI in education, outlining the views of experts, teachers and leaders about the technology which will empower – or overwhelm – us.
For me, five key policy decisions emerge.
First, policymakers should clearly outline the ethical parameters for use of AI in schools. There needs to be transparency and a common, informed understanding – among pupils, parents, teachers and regulators – of what is permitted and what is proscribed.
This isn’t just about coursework plagiarism; it’s also about how we use AI to make decisions, share and analyse data, generate learning and assessment materials and evaluate everything that a school does. The government is best placed to determine the universal expectations that ensure fair and equitable use of AI.
Second, the government should develop high-quality training materials, accessible online, for teachers and leaders on how to make safe and effective use of AI to support learning, reduce workload and manage data. Such training would be mandatory, would set a minimum standard for professional practice in our schools and would require speedy and consistent adoption.
Third, the government should work urgently with Oak National Academy, school groups and examination boards to create safe and effective tools for personalised tutoring, so that no young person leaves school illiterate or innumerate.
Too many children, disproportionately from the most deprived communities, lack access to consistently good-quality teaching during their time at school. AI could deliver a curriculum that teaches our children how to read, write and do arithmetic, tailored to the precise gaps in their learning and paced to suit their individual needs. A child could draft a story and the AI tutor could show them, within seconds, how to improve it.
By the end of this decade, every child should have an AI tutor from the age of five. Making such tools available free of charge to families across the country could help eradicate educational inequality far more effectively than several decades of policy and funding have done. (I am emphatically not advocating that AI supersede human interaction or that home education become the default model for the future.)
Fourth, if all our children are to use AI effectively, benefit from the opportunities it affords and not succumb to its risks, we must ensure they are digitally literate. AI will sort out the mechanics of coding for us, but all our young people need to leave school knowing how AI can be used to improve their life chances.
The most successful future citizens will possess a core body of knowledge – cultural capital – coupled with the digital, creative and social skills to make use of the technological opportunities that present themselves. We need to design and deliver curricula in all our schools to equip these future leaders.
Finally, AI needs to be for everyone. Legislation must ensure that all AI tools are as inclusive as possible, so that learners and teachers with disabilities or with sensory and cognitive needs can benefit from transformational technology in an equitable manner. AI tools must also reflect the richness and diversity of our society while jettisoning its prejudices and biases.
If we are to ensure that AI is the force for good we all wish it to be, it is essential to anticipate and prevent flaws rather than try to minimise collateral damage after the cyber genie is released from the bottle.