
Digital divides are something I have been discussing for a while. They generally aren't anything new, although I always use the plural rather than the singular divide. This is due to my belief that it isn't a simple single divide but multiple inter-related divides, including access to hardware, high-speed internet, support, and more. In discussions of AI I have worried about it adding yet another divide, but speaking recently at an Edexec Live event got me thinking a bit more broadly.
AI closing divides
Maybe AI might close divides rather than open them. If we consider teaching staff, perhaps AI in teachers' hands will allow teachers generally to be more creative and engaging with lesson content. So rather than only some students benefitting from creative teachers (those who are artistic, musically creative, etc., with the skills to turn this into lesson content), AI will put these capabilities into more teachers' hands. You can create something artistic without necessarily being artistic yourself, as long as you have the ideas and can outline them to generative AI. I think back to teaching during an OFSTED inspection many years ago, when I did a lesson on relative vs. absolute cell referencing in Excel, using the game of battleships to get the concept across. I had the skills to make this engaging with video content and more, but I would suggest that at the time, some 20 years ago, I was in the minority. Fast forward to today and video and image content can easily be created using AI, putting the potential to create interesting, engaging content in the hands of more teachers than ever before.
We also need to look at student work, such as coursework. Those students who struggle to get started, or who need support finessing and checking their work, suddenly have AI tools available to help. Those students taught in English where it is their second or maybe third language now have tools to translate content. Students with SEND also have AI tools which can help, and this help amounts to reducing or even removing divides which previously existed. In one discussion after my session at the Edexec event, we were talking about coursework and marking, with the suggestion that the gap between the best and the worst work will be narrowed through AI. This may lead to a need to refine marking boundaries, expectations, or even the assessment methodologies as a whole, but whichever way you look at it, it is a reduction in some divides.
AI growing the divide
The likely big issue is the socioeconomic divide and access to AI tools and the required devices, infrastructure, and support. This access will be uneven. But I wonder whether it falls to schools to solve socioeconomic issues which stretch way beyond schools, into access to health support, opportunities beyond school, positive family cultures, and more. We do want to seek to address this, but I am not sure schools have it within their power.
What schools do have within their power is to address the divide which may grow between students at schools engaging with AI and those at schools seeking to prohibit and ban AI use. If we simply accept that AI is here, has been for a while, and that we are all using it, and especially that students are using it, then maybe a ban doesn't make sense. Maybe we then find ourselves seeking to work with students and teach them about AI and about its ethical and safe use.
Elephant in the room

And as to the “cheating” narrative: is using pen and paper cheating compared with having to explain a concept in person? I would suggest that for an introvert a debate or discussion of a concept would put them at a disadvantage, whereas providing pen and paper shapes thinking and the output. It encourages slower, linear thinking and a type of structure not quite as present in a discussion or debate. Taking this idea further, what about students using a laptop or computer as part of their exam concessions: is this cheating? Isn't it just about reducing the divide between them and other students? So why is AI use cheating if it reduces divides? Maybe we need to start asking students why and how they used AI, what the benefits were, and so on. And definitely, let's not ask them to reference AI tools, as I don't see the point in this; they don't reference which search engine they used, yet this shaped the resources presented to them. AI is a tool, it is here, so let's get students using it, while teaching them about its use and getting them to use it safely and ethically. Yes, some students may try to use it to cheat, but let's treat them as the exception rather than the rule, and develop plans for how we deal with this. If we don't believe the work is the student's own, that it represents what they have learned, then let's just ask them to present or explain it.
Conclusion
AI is a tool, it is here, and it has the potential to narrow some divides, as well as the potential to widen others. I doubt there will be a perfect solution, so we are going to need to navigate our way through, weighing benefit and risk and making the best reasonable decisions possible. If we can narrow the key divides where schools have the ability to address them, while avoiding widening others, then this is likely the best we can achieve. Maybe this will require us to think carefully about the scope of education and schools, what they can reasonably be expected to impact, and to start there.