FutureShots 2024, Part 1

Early this month I had my second opportunity, since returning to the UK from the UAE, to contribute to an international conference event, this time the FutureShots event in Italy, not far outside Venice. Now, I have already posted on my gondola experience during this particular trip; however, I would now like to share some thoughts from the conference proper, and in particular the first day, which was focussed on AI in education.

The keynote session was delivered by my friend and colleague from the ISC Digital Advisory Group, Laura Knight, who delivered her usual flawless presentation with so many takeaways, so let me try to summarise the ones which particularly resonated with me.

Binaries: I have long been concerned by the binary arguments which seem to dominate so many education discussions. In the case of AI things are no different, with people either being doom and gloom, AI will end the world, or being evangelical about its ability to transform the world and education for the better. The reality, as I have often stated, lies somewhere in between, with positives balanced against negatives, challenges or drawbacks. AI isn't positive OR negative, but both positive and negative, and very dependent on the people using it, how they are using it and the task to which they are putting it, be it for good or for evil.

Trough of Disillusionment: Laura suggested that we may be passing the hype phase of AI and moving into the “trough of disillusionment”. There has certainly been a lot of singing and dancing about AI in education, and maybe this is wearing thin as the impact has generally been less than advertised, but I also note that the tech is improving and advancing quickly. Only in the last few weeks we have seen GPT-4o and similar advancements coming out of Google, so could it be that, just as we approach the trough of disillusionment with one iteration of generative AI, a new iteration with new functionality appears, throwing us back into awe and wonderment?

Now Laura delivered many more points which I took away from her session, including considering ownership of ideas, agency in the use of the tools, the importance of trust, integrity and truth, and much more. I will however save some of these for future blogs.

The final, and possibly biggest, point I took away from the session related to the term “resilience”, which is often stated as a characteristic we wish to foster in students. Laura raised concerns that although resilience is important, it is not a state we can live in for any length of time. This loosely aligns with my concerns regarding the “do more”, “be more efficient” narrative which we encounter all too often, both in education and beyond. This “do more” with the same resources pushes us increasingly into survival mode and “resilience”, and this is something which is unsustainable over time. Laura suggested an alternative in “equanimity”: being comfortable, and calmly coping with and managing change. Now, I am not 100% sure on this term yet, but I definitely agree with the sentiment that maybe we need to be a little more careful in overselling resilience as the solution to our challenges.

Next up was the panel session which I was involved in, chaired by Alessandro Bilotta, Content Director for EDUtech at Terrapinn, along with Carlos Garriga Gamarra, CIO, IE University, Donatella Solda, President, EdTech Italia, and Diego Pizzocaro, Head, H-FARM My School. Now, I must admit I didn’t take any notes during this one, having been a bit too busy being involved in it, but the session did pose some interesting questions, such as what it means to be human in a world of AI and generative AI. If the key thing for us humans to do is the things AI can’t do, what are those things? Now, I think the key thing is the social side of life, the human-to-human interaction including non-verbal cues, so not a Teams or Zoom call. I used the term “human flourishing” as I think that sounds about right in principle, although I will admit I haven’t quite bottomed out what human flourishing actually looks like; I suspect that’s a work in progress.

Another question related to GDPR and AI, and whether GDPR was a roadblock. For me it isn’t; we’ve been using satnav, Google and social media for years without too many GDPR-related questions. Data protection is important, but good practice in terms of data protection is independent of whether you are looking at an AI-based solution or a non-AI-based solution; it’s simply good data protection practice.

EdTech startups were the next session of the conference, with a number of startups each providing a short pitch of their product. I must admit to being impressed with some of the pitches, not just due to the ideas but due to the presenters delivering in English where their native language was generally Italian. Doing a short, time-bounded pitch is hard enough without having to give it in a second language. Now, the fact that H-FARM has these startups as part of their campus is such a great idea, as it encourages the co-creation of solutions rather than tech vendors creating what they think education wants and then spending lots of money convincing educationalists that their product is the one best solution.

We were not even through the morning at this point and I already had quite a few thoughts and ideas to take away and consider. My Surface battery was depleting fast, an issue which was to impact me later in the day, but the day was going well. Now, I have plenty more to share from the event; however, I am going to split things here for now and continue in a subsequent blog. If I were to take away one key thing from the morning, it was the need to put humans at the centre of AI use. It is about assisting humans, allowing humans to therefore focus on the things which humans do well, and supporting “human flourishing”.

Schools and Academies Show 2024

It was the Schools and Academies Show recently, and I was once again fortunate to be given the chance to speak at the event, as well as the opportunity to be involved in their hosted leaders events prior to the main show. It was a busy day or so, but equally very useful.

Now the event started with my usual travel woes, with my second train being cancelled, leaving me looking for a plan B to get to London. I am not sure why these things keep happening to me; is it bad luck, karma or simply a less than resilient rail network operating in the southwest of England? I suspect there is a bit of everything thrown in. Thankfully I managed to find an alternative route and made my way to London, and I will admit using the Lizzie line meant that I wasn’t too badly delayed when compared with previously using the tube and DLR.

The hosted leaders event before the main show was fun, giving me the chance to speak with a variety of different people outside of my usual EdTech crowd, including a head from a school abroad, a school SENCO, a school business manager and someone representing a teachers’ union. Additionally, there was some brilliant music from two students, showing off how important the creative arts are.

As to the event itself, I had a fair few meetings with vendors in the diary, with these being useful and giving me things to take away from the event. This included an impromptu discussion with a company providing a managed telephone service aimed at children, looking to help address the challenge of students with smart devices and keeping them safe. I also met with a well-known interactive whiteboard provider to discuss some of their recent developments. Additionally, as I walked the show floor I bumped into people, resulting in discussions which often also meant I was less than timely in my attendance at my scheduled meetings; to those whose meetings I turned up late to, I do apologise.

There were also the usual presentations and panels, although this time I didn’t manage to attend as many of these as I had hoped. I did manage to attend a great session towards the start of the day including both Gemma Gwilliam and Neelam Parmar, and one session I was annoyed to miss was the session on AI including Sir Anthony Seldon. Oh, and then there was the panel session I was involved in, and speaking at myself, looking at phones in schools and whether we should seek to ban them. I will write more on this session and my thoughts in the near future. It was interesting on reflection that as a panel we were generally in agreement as to the direction of travel, in the need to manage phones rather than ban them, and also on the importance of educating both students and parents. The fact the “ban phones” discussion keeps coming up is frustrating given how long it has been discussed; however, the panel, in my opinion, seemed to show that progress has been made and that many are adopting a more pragmatic and context-driven approach rather than a blanket ban. This for me is good news.

I will however note that I am not sure whether a panel who are all in agreement is a good thing, or whether we are an echo chamber. And maybe this is exactly the challenge facing our children, and more broadly society: the balance between binary arguments and polarisation on the one hand, and echo chambers and constant reinforcement of current viewpoints on the other. How do we reach balance? The chair did try to stir things up by asking what we would do if research did establish a negative causal effect of smartphones on learning; however, in the social sciences I think proving such a causal link is nearly impossible, as there are simply too many variables at play.

One thing that made this event stand out for me was the DFG (Digital Futures Group) and being involved in the Schools and Academies Show alongside such valued colleagues as Gemma Gwilliam, James Garnett, Darren White and Abid Patel. The Schools and Academies Show actually saw the DFG officially announce our launch, and I look forward to some exciting times ahead with the group, including in the lead-up to the EduTech Europe event later in the year.

Also, it was great to catch up with the team from the ANME, although I did not spend as much time with them and on the ANME stand as I would have liked. As a group they continue to offer the IT staff in schools and colleges a source of support, help and guidance, which is all the more important as technology use increases and given the challenges associated with IT roles, which often operate invisibly within a school except when things go wrong.

In terms of both the DFG and ANME, the quote I so often use, from David Weinberger, is “the smartest person in the room is the room”, and the DFG and the ANME mean I benefit from being part of a very big room, and hopefully am all the better for it.

And if I am looking at the bucket list, then this is the first conference I have ever been thrown out of! And no, it wasn’t due to poor behaviour, controversial views or similar, but due to the fact that myself and others were so engaged in discussion post-event that the security staff felt the need to (rudely) force us to leave. Apparently there were issues with public liability as we stood continuing our discussions.

Oh, and also I temporarily found myself drinking an orange substance that wasn’t the amber nectar, the Irn Bru. The photo of me during the panel session provides evidence of this, although the app for the event listed me twice, so maybe the non-Bru-drinking Gary was actually a doppelganger. It was a shock to the system, but I promise all that normal service was promptly resumed, and upon returning home I will drink many cans of the Bru to clear my system of the non-Bru liquid!

So, I write this on the train heading back to Somerset, a three-hour trip, all being well, which with me is far from guaranteed. (Additional note: on the return leg, my second train looked like it was going to be cancelled; however, it did turn up just a little late. That had me worried, as the following train wasn’t until an hour later and it was already 9.30pm.) So onwards to my next set of events, in what has so far been a very busy 2024. Better busy than bored, plus sharing and networking continue to enrich my professional development way beyond any CPD course I have ever attended.

Maybe the DFG and ANME are the model more people should adopt in forming groups, sharing, collaborating and growing together, across different educational sectors and contexts.

Esports event, Salford

I recently had the pleasure of presenting on esports at The Lowry Academy, alongside Kalam Neale from the British Esports Federation. I have long been a believer in the potential for esports to be a positive vehicle for supporting student engagement, but also for developing a lot of the soft skills that are important in life beyond school, including leadership, resilience and teamwork, to name but a few. It was therefore great to be able to share, but even better to hear what the staff and students at The Lowry Academy, alongside three other United Learning Salford schools, are all doing in relation to esports.

In terms of my presentation, I would like to share some of my thoughts and five pieces of advice in relation to esports, based on my experiences at Millfield and as shared at the event.

It is not all neon lights

When you think of esports, and when you look at professional events, it’s all neon lights and high-powered PCs, expensive gaming keyboards, mice and headsets. From the point of view of schools this is difficult to square away, especially where funding is often limited. Although creating such environments may have its advantages, it isn’t a requirement. When we launched esports at Millfield we had a couple of IT labs which needed to be updated, plus we were moving to standard desktops rather than the overpriced all-in-ones we had previously. We knew that the labs needed to be appropriate for Computing teaching, and we didn’t want to distinguish these rooms from our other IT labs which weren’t up for replacement. As such, in preparing to deliver some esports provision, we basically increased the spec of the PCs in terms of the graphics card, processor and memory, but opted to keep the same PC chassis we normally used. So, we had two labs with PCs capable of running Overwatch 2, League of Legends and other esports games, but the labs themselves didn’t look any different to our other IT labs. I note the higher-spec machines had other potential benefits beyond esports in terms of software they could run to support Computing, Art and other subjects. That said, later, when we started looking at esports, and Rocket League in particular, at our prep school, we simply used the i5, 8GB PCs we already had, and this worked fine.

Small is good

Now, our upgrade work involved two labs as these labs were up for refresh anyway, and therefore all we were doing was increasing the cost a little in line with the higher-spec machines; however, there is no need to go full lab. If looking at Rocket League, for example, it might be OK to have only three machines to run a team playing against other schools, or maybe six machines to allow two internal teams to play off against each other. You can scale the equipment based on your available financial resources combined with the anticipated interest in your planned esports provision.

Beware updates

One thing that has snagged me a few times, usually after a holiday period, has been game updates. The students and I have rocked up ready for a bit of Overwatch 2, for example, following the Easter break, to find each machine needs a 6 or 7GB update. Cue a wait before you can get a match started, and cue my network team asking what on earth is suddenly eating up all of our internet bandwidth. As such, it is well worth planning to check and update games towards the end of holiday periods to reduce the risk. The game vendors might still release an update, but hopefully by keeping on top of things it will be a smaller rather than a cumulative update, and therefore a lesser delay.

Consumables

We haven’t provided any fancy keyboards or mice, which may make us a little less competitive, but it means where there is wear and tear we can quickly replace them. That said, I haven’t seen significant issues with keyboards and mice; however, where we have used controllers, these do seem to suffer wear and tear, and therefore factoring in occasional replacement is well advised. In terms of headsets, the key is to avoid going too cheap, ideally spending a bit more on good headsets which, with careful treatment by students, are likely to last longer. I learned this lesson in relation to headsets as an IT teacher years ago: spending a bit more makes sense, and short-term savings on cheaper headsets often end up more expensive in the longer term.

Work across year groups

Initially, when I looked at esports, I focused very much on getting students into teams with their peers, in the same age group and year group. This, in hindsight, I believe was a mistake. I had some issues with low-level behaviour and with the engagement of some students. As soon as I put students together across year groups it worked much better, and I also think it required students to develop their communication and collaboration skills more, given they were having to work with students who might be younger or older than them, but towards the common aim of winning their match. I would therefore recommend any esports provision allows students to work across year groups, although within reason.

Conclusion

The FE colleges are doing some amazing things in relation to esports, often spurred on by offering esports BTECs as a programme of study. Schools lag behind, but the potential benefits are the same and the cost of getting involved is minor. You don’t have to have a room painted black, with neon strip lights, expensive gaming chairs and £2000+ PCs. All you need is a couple of PCs with the appropriate specification and you can get started. It was great to hear from The Lowry Academy and some of the other United Learning schools in relation to their recent pilot of esports and their Rocket League competition across four schools. The student enthusiasm was obvious for all to see. I can only hope that following this event more schools get involved, and I look forward to continuing to support the growth of esports in schools and seeing more schools pick up on the potential which esports has to offer.

Google Discovery Day

I was lucky, thanks to a kind invite from Gemma Gwilliam, a colleague from the Digital Futures Group (DFG), to join staff from several Portsmouth schools in a visit to the Google offices in London. Now, I note my school largely uses Microsoft; however, I have made use of Google as the primary platform in previous schools I have worked in. For me, the focus for all schools should be using the best tool for the job, and therefore this may involve using Google and Microsoft tools at different times and for different jobs. In this post, I would like to share just a couple of my key takeaways from the event.

Accessibility

This was definitely one of the key areas for the event, with discussion of the various gaps which exist within education, whether academic performance gaps or digital gaps. The gap related to disadvantaged students in particular was discussed, but gaps in relation to accessibility, connected to special educational needs and disabilities, were also raised, including during a visit to the Google Accessibility Discovery Centre (ADC). A key theme across the discussions and the various sessions was that technology, including Google technology, has such potential to help us narrow these gaps, but this in itself presents a bit of a paradox, as we would first need to address the gap in access to reliable infrastructure, devices, support, etc.

Artificial Intelligence (AI)

Unsurprisingly, AI was on the list of discussion points, and I was really happy to hear some of the same messages I have provided being reiterated. I particularly liked the example used of how a generative AI solution works: we as humans, when given a question, use the information we have absorbed to predict the answer, and a generative AI solution isn’t that much different. I also liked the comment that “hallucination” is a term we should avoid. My concern has always been about anthropomorphising generative AI solutions, whereas on this occasion the point raised was that the tool is simply providing an answer we didn’t expect or which was plainly wrong; would we want our students claiming they had simply hallucinated, or is a wrong answer just a wrong answer? The key here was definitely that AI will increasingly make its way into our daily workflows, and the suggestion was that for many of us it will simply appear in the products we already use, and therefore will be almost transparent to us. This rings a bell, as we have been using AI for a while in our spellcheckers, in the preference functionality of Amazon and Netflix, and in our search engines, yet have never really identified it as AI as opposed to simply being how the platform works.

Networks and sharing

One of the key takeaways from this event, as with so many other events I have attended, is the power of a group of people sharing. We might not all operate in the same context in terms of our schools, or have the same views, but together, sharing ideas, successes and failures, we are all collectively better for it. David Weinberger’s quote continues to be my go-to quote: “The smartest person in the room is the room”. The more we share, the more we come together and discuss, accepting disagreement as much as we accept agreement, being brave and encouraging diverse people and views, the better we all are.

Context is king

One of the other points which really stuck with me from the event was in a presentation which talked about educational research. The key thing which chimed with me was a warning regarding people who claim that “research says…”. I have heard this so often; however, the reality is that most research is too limited in scope to be more than suggestive in terms of context, impact, application, etc. That’s not to discount research, as educational research is very important, but we mustn’t lose sight of the importance of context and how something that succeeded or failed in one context may do the absolute opposite in a different context. Education is simply too complex, with too many moving parts, the students, the teachers, the parents, the school and many more variables, which means that research can be very helpful but will seldom provide definitive cause and effect. So it’s a great guide and provider of direction, but never an out-and-out proof of what will work across all schools, students, etc., in general.

Conclusion

I very much enjoyed the event and feel I took quite a bit from it. My day-to-day largely does involve Microsoft; however, I try to avoid referring to my school as a Microsoft school. We seek to use the tools which have the best impact, so it was great to see and hear what Google have to offer, and there definitely was a lot they can offer. And an opportunity to network with staff from other schools and contexts is always valuable. This, I suppose, is why I believe so strongly in the Digital Futures Group, which myself and Gemma are part of, and without which I am not sure this opportunity would have arisen for me. The more networks like this that exist the better, and hopefully the DFG will help show some of the potential impact and point the way for others looking to set up similar networks.

ISC Digital Conference 2024

I was once again privileged to speak at the ISC Digital Conference the other week, this time as the vice chair of the ISC Digital Advisory Group as opposed to a member. It was, as it was last year, a very useful and interesting conference, combined with an iconic location in Bletchley Park. I scribbled many notes from the various sessions and therefore wanted to distil those into a couple of key thoughts below.

Prof Miles Berry was his usual barrel of energy in his presentation, putting forward lots of interesting points for consideration. Following on from the Oxfordshire Academies Business Managers Group (OABMG) conference I attended the other week, Miles certainly was brave in his presentation, opting to actually do a live demonstration to illustrate the potential power of generative AI in helping with the challenges related to teacher workload. I have attended so many conferences which discuss AI, but it was so nice to actually see it in practice as Miles took a topic from the audience and worked through the creation of content for students, resources, lesson plans, etc., all in the space of minutes, while also highlighting that a teacher at their best could likely do better, but certainly not quicker. This clearly highlights the efficiency and workload benefits of generative AI, but also the importance of seeing genAI as an assistant to be paired with our own human strengths.

Neelam Parmar then presented on developing an AI curriculum, and there was one question which stuck very much with me: what is AI? Now, why this stuck with me is both the inconsistency in the use of the term and related terms (machine learning, deep learning, etc.), but also the broader question it hints at: what is intelligence? Can we accurately and consistently define what we mean by intelligence? And if we cannot, can we truly be confident in creating an intelligence, an artificially created intelligence or AI? It’s a bit deep, but maybe this is a question we need to consider, as it also hints towards considering the differences between human and artificial intelligence, and therefore the benefits and drawbacks of each. I do often wonder how different an AI is to human intelligence, in terms of how the human brain really works in processing the huge amount of “data” in the experiences and information we consume throughout our lives. Is the key difference that of emotion and the physical, embodied nature of our intelligence, in relation to our physical existence?

The AQA presentation was next up in terms of ideas which stuck with me, helping me feel a bit more positive about where we are in terms of exam board engagement with the use of AI in assessment and in schools. I will admit to being disappointed that the Polish and Italian trial has been pushed back further to 2027, which I think is too far away; however, I get that it takes everyone to be on board to move this forward, so there are hoops exam boards must go through. That said, there were definitely positive noises in relation to analytical data on outcomes, with school data being pulled in and resulting info pushed back. This goes to reducing the administrative burden, but also to more effective use of the vast amounts of data schools gather. It was also good to hear of AQA seeking to share a diagnostic tool for Maths; tools like this might just help us to find the best way forward in relation to adaptive, diagnostic and even summative testing.

I once again enjoyed hearing Tom Dore talk about esports and the potential benefits for schools adopting it. It aligned so well with the earlier presentation which highlighted some of the softer skills which the World Economic Forum have identified as important for the future. Esports is so much more than simply gaming; it involves communication, leadership, resilience, problem-solving and so much more, plus it often engages students who may otherwise be less engaged. It was also good to hear about Amy-Louise Cartwright’s approach in her school and how they, albeit in the early stages of development, have already made progress and have plans for the future. I loved the esports suite they have created, as although we have been involved in esports here for a while, we have been using our normal IT labs, albeit with upgraded PCs capable of supporting the relevant esports games.

Conclusion

The ISC Digital Conference, like so many other conferences, is about getting schools and school staff together and sharing. This year’s conference did exactly that, and it let me get my piece in as well, which was nice. It was also nice to be at Bletchley Park and its wonderful auditorium. Now, I will note my train ride to and from the venue was far from straightforward, but the trek was worth it, and I look forward to seeing where we stand in a year’s time, at the 2025 conference. Will we have progressed significantly, be asking the same questions, or will the challenges have changed or even been addressed? Only time will tell.

OABMG Conference

I was lucky enough to be invited to speak at the Oxfordshire Academies Business Managers Group (OABMG) annual conference earlier in the week, where I spoke on AI in education and the possible impact and implications for school business managers. It was a lovely event and I really enjoyed Sarah Furness, the keynote speaker; however, sadly I had to leave following my session in order to catch a train, one of a number of trains needed to get me to and from the event.

Be brave

Sarah was both insightful and entertaining and, to be honest, I could likely write a whole blog post just on the stories she shared; however, let me just summarise my key takeaways from her presentation. Her key message, which resonated with me, was the need to be brave, which aligns with the values of my school, and is also so very important where we have technology advancing at such a pace but with regulation lagging so far behind. We have no choice but to be brave, especially given both students and staff are already experimenting with the use of AI. We need to be brave in engaging, we need to be brave in experimenting, and we need to be brave in accepting where things don’t go quite as planned, but learning from these experiences. The need for sharing, asking difficult questions and accepting challenge also aligned with my thinking, and again, looking to AI in education, if we are to find our way with AI in schools I think this all rings very true indeed. We need to be sharing our thoughts, and both challenging others and accepting challenges from them, if we are to move forward. Sarah’s talk was about leadership, drawing on her context as a military leader and pilot; maybe this will be key in the use of AI in schools, the need for effective, brave leaders who value and encourage diversity, sharing and challenge.

AI in education

Going into my presentation, my key aim was to discuss AI in education and some possible uses for school business leaders. I don’t have all of the answers, and to be honest, I don’t feel anyone has all the answers when it comes to AI and education, as AI is advancing at a rapid pace while education has changed little and is under both funding and workload pressures. That said, as I shared in my presentation, “The smartest person in the room is the room”. This David Weinberger quote is one of my favourites and one I often use, as it highlights the need to discuss and share; in doing so we hopefully engage others to think about the issue, in this case AI in schools, and collectively our thinking, ideas and experience are enhanced.

Now you can view my presentation slides here if you are interested.   

At the end of my presentation, a couple of questions were raised which I would like to just pick up on, namely school engagement in AI in education, policy and also regulation.  

School Engagement in AI

I would like to draw attention to the article in the Express which highlighted that 54% of the students they surveyed were using AI in relation to their homework. The key thing here is that students are using AI independently of whether schools have considered or talked about it. And it isn’t just students; you will also likely have staff, both teaching and support staff, who are using AI. The AI genie is out of the bottle and attempts to block it will inevitably be futile, so, in my opinion, it is key that we engage with the use of AI, that we talk with students and staff about AI, and that schools experiment and share. But the fact AI is already here isn’t the only reason to use it in education. We talk about the need to support individual students, differentiation, English as a second language and also SEND barriers to learning; all of these can be addressed to some extent through the use of AI tools. Now, I will note here that the use of AI tools may also increase some challenges, such as that of digital divides, but that was a key part of my presentation in talking about the risks and challenges first, as we need to use AI, but only from a position of awareness of the risks and challenges.

Policies

Linked to the above, I think it is very important that schools put in place an AI policy if they haven’t already done so.   This allows the school to set out its guardrails in relation to the use of AI in the school.  There is a brilliant template for this, created by Mark Anderson and Laura Knight, which can be found here.   Looking to the future I suspect the AI policy might eventually be absorbed into the IT acceptable use and/or academic integrity policies; however for now, while AI use in schools is so new, I think having it as a standalone policy makes sense.

Regulation

There will need to be some form of regulation of AI tools, including their use in education; however we have already seen that the technology is developing very fast while regulation lags far behind and is slow to adapt.   As such I think we should hope for and support some form of regulation to protect people, including our staff and students, and their data, but I don’t believe we can wait for this to happen.    AI is already here and students and staff are likely using it.  We can’t stop this, so I think we need to run with it, to try and shape its use and, hopefully, in doing so shape the regulation which follows.  This will mean making risk-versus-benefit decisions, but seldom do we see anything beneficial without any risks.

Conclusion

The OABMG conference was enjoyable even though my visit was brief.   It was good to share some thoughts on AI in education and I hope those in attendance found the session useful.   My two key thoughts from the event are the need to be brave, remembering we learn most from our mistakes, and the need, in this ever-busy and complex world, to share, as collectively we are all better for it. These are two things I will try to do more actively in future.

AI and assessment (Part 1)

I recently spoke at an AI event for secondary schools in which one of the topics I covered related to AI and its impact on assessment.   As such I thought I would share some of my thoughts, with this being the first of two blogs on the first of the sessions I delivered.

Exams

Exams, in the form of terminal GCSE and A-Level exams, still form a fairly large part of our focus in schools.  We might talk about curriculum content and learning, but at the end of the day, for students in Years 10 and 11, lower sixth and upper sixth, the key thing is preparing them for their terminal exams, as the results will determine the options available to them at the next stage of their educational journey.   The issue though is that these terminal exams have changed little.   I showed a photo of an exam being taken by students in 1940 alongside a similar exam taken recently, and there is little difference between the photos other than one being black and white and the other colour.   The intervening period has seen the invention of DNA sequencing, the mobile phone, the internet and social media, and more recently public access to generative AI, but in terms of education and terminal exams little has changed.

One of the big challenges in terms of exams is scalability.  Any new solution needs to scale to exams taken in schools across the world.  Paper and pencil exams, sat by students across the world at the same time, accommodate this.  If we found life on Mars and wanted them to do a GCSE, we would simply need to translate the papers into Martian, stick the exams along with paper and pencils on a rocket and fire them to Mars.   But being the way we have always done things, and the most easily scalable solution, doesn’t make paper and pencil exams the best solution.   So what is the alternative?

I think we need to acknowledge that a technology solution has to be introduced at some point, and the key issue is scalability across schools with differing resources.   We need a solution which can be delivered in schools with only one or two IT labs, rather than requiring enough PCs to accommodate 200 students being examined at once, as is the case with paper-based exams.  So we need a solution which allows students to sit the exams in groups, but without compromising the academic integrity of the exams should students share the questions they were presented with.    The solution, in my view, is adaptive testing as used for ALIS and MIDYIS testing by CEM.   Here students complete the test online but are presented with different questions which adapt to their performance as they progress.   This means the testing experience is adapted to the student, rather than being one-size-fits-all as with paper exams.    This helps keep students motivated and within what CEM describe as the “learning zone”.   It also means that, as students receive different questions, they can sit the exam at different times, which solves the logistical issue of access to school devices.   Taken a step further, it might allow students to complete their exams when they are ready, rather than on a date and time set for all students irrespective of their readiness.
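To make the idea concrete, here is a minimal sketch of how adaptive item selection might work. To be clear, this is my own simplified illustration, not CEM’s actual algorithm: the difficulty scale, the scoring and the `answer_correctly` helper are all invented for the example.

```python
# A toy adaptive test: difficulty rises after a correct answer and falls
# after an incorrect one, keeping the student near the edge of their
# ability rather than presenting one-size-fits-all questions.
def run_adaptive_test(answer_correctly, num_items=10, levels=10):
    """answer_correctly(difficulty) -> bool simulates a student response."""
    difficulty = levels // 2  # start mid-range
    score = 0
    for _ in range(num_items):
        if answer_correctly(difficulty):
            score += difficulty                       # harder items score more
            difficulty = min(levels, difficulty + 1)  # step up
        else:
            difficulty = max(1, difficulty - 1)       # step down
    return score

# Two students of differing ability see different question sequences,
# so sharing questions between sittings is of limited use.
strong = run_adaptive_test(lambda d: d <= 8)  # copes up to difficulty 8
weak = run_adaptive_test(lambda d: d <= 3)    # copes up to difficulty 3
```

Real adaptive engines use far more sophisticated models (item response theory rather than a simple up/down step), but the principle of tailoring each next question to the previous responses is the same.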

AI also raises the question of our currently limited pathways through education, with students doing GCSEs and then A-Levels, BTECs or T-Levels, and then on to university.    I believe there are around 60 GCSE options available, yet most schools will offer only a fraction of these.    So what’s the alternative?    Well, Caltech may provide a possible solution.  They require calculus as an entry requirement, yet lots of US schools don’t offer calculus, possibly due to a lack of staff or for other reasons.   Caltech’s solution has been to allow students to evidence their mastery of calculus through completion of an online Khan Academy programme.   What if we were more accepting of online platforms as evidence of learning and subject mastery?   There is also the question of the size of courses:   GCSEs, A-Levels and BTEC qualifications are all two years long, but why couldn’t we recognise smaller qualifications and thereby support more flexibility and personalisation in learning programmes?   In working life we might complete a short online course to develop a skill or piece of knowledge on a “just-in-time” basis, so why couldn’t this work for schools and formal education?  The Open University already does this through micro-credentials, so there is evidence as to how it might work.   I suspect the main challenges here are logistical, in terms of managing a larger number of courses at exam board level, plus agreeing the equivalence between courses:   is introductory calculus the same as digital number systems, for example?

Coursework

Coursework is also a staple part of the current education system and summative assessment.    Ever since generative AI made its big entrance in terms of public accessibility, we have worried about students cheating in relation to homework and coursework.    I suspect the challenge runs deeper, as a key part of coursework is its originality, or the fact that it is the student’s own work, but what does that look like in a world of generative AI?    If a student has special educational needs and struggles to get started, so uses ChatGPT to help start, but then adjusts and modifies the work over a period of time based on their own learning and views, is this the student’s own work?   And what about the student who does the work independently but then, before submitting, asks ChatGPT for feedback and advice, before adjusting the work and submitting it?   Again, is this the student’s own work?

There is a significant challenge in relation to originality of work, and independent of AI this challenge has been growing.   As the speed of new content generation, in the form of blogs, YouTube videos, TikTok, etc., has increased year on year, and as world populations continue to grow, it becomes all the more difficult to be individual.  Consider being original in a room of 2 people compared with a room of 1,000:    the more people and the more content, the more difficult it is to create something original.   So what does it really mean for a piece of work to be truly original, or a student’s own work?

The challenge of originality and students’ own work relates to our choice of coursework as a proxy for learning.   It isn’t necessarily the best method of measuring learning, but it is convenient and scalable, allowing for easy standardisation and moderation to ensure equality across schools all over the world.   It is easy to look at ten pieces of work and ensure they have been marked fairly and in a similar fashion;  having been a moderator myself, part of my job was visiting schools and carrying out moderation of coursework in relation to IT qualifications.   If, however, generative AI means that submitted content is no longer suitable to show student learning, maybe we need to look at the process students go through in creating their coursework.    This, however, has its own challenges in terms of how we would record our assessment of process, and also how we would standardise or moderate this across schools.

Questions

I don’t have solutions to the concerns or challenges I have outlined; however, the purpose of my session was to stimulate some thought and to pose some questions to consider.    The key questions I posed during the first part of my session were:

  1. Do we need an annual series of terminal exams?
  2. Does there need to be [such] a limited number of routes through formal education?
  3. Why are courses 2+ years long?
  4. Should we assess the process rather than product [in relation to coursework]?
  5. How can we assess the process in an internationally scalable form?

These are all pretty broad questions, however as we start to explore the impact of AI in education I think we need to look broadly to the future.    In terms of technology, the future has a tendency to come upon us quickly due to rapid technological advancement and change, while education tends to be slow to adapt.    The sooner we seek to answer these broad questions, or at least think about them, the better.

TEISS London 2023: Reflections

During September I managed to find myself at two industry-level cyber/info security conferences, one of which I have already blogged about (see here).   This post focusses on the other event, TEISS London 2023, which concentrated more on incident management where the previous event was more generic.   So, what were my take-aways as relevant to education?

Incident Response

One of the key discussions across this particular event related to the inevitable cyber incident and therefore the need to prepare.    Discussions arose around desktop exercises, the development of incident response playbooks and disaster recovery plans.    The key take-away for me was the need to play through potential cyber incidents, and to do this regularly.   We are not talking about once every few years, but as often as can be managed, so that the relevant staff, both senior and IT technical, know how to respond when the inevitable issue arises.    The need to carry out these desktop exercises with different groups of individuals, in order to ensure that all are prepared, was also discussed.   Desktop exercising is definitely something I want to repeat in the coming months, building a process so that it doesn’t occur ad hoc but as part of a regular cycle, allowing for the review and improvement of the related processes with each test.

Concerning external factors

One of the presenters went into the risks associated with geopolitical issues, where issues in the geopolitical space often result in corresponding issues in the cyber arena.  From a school’s point of view it is easy to wonder why this matters:  why would a nation state or similar focus on education?    I think the issue here is not so much an attacker focussing on education, but the collateral damage which might impact education.   This collateral damage might be accidental; however, we also need to acknowledge the increasing use of cloud services, which often means data and services hosted in various countries across the world.  So what is the potential risk where countries have disagreements and aggressive activity online results?   It is easy to say your school exists in Europe or the UK so this is unlikely; however, the presenter demonstrated some aggressive cyber activity even within the UK and EU, so it isn’t unreasonable to predict that this may happen again in the future.    For schools this means, as far as I am concerned, that we need to continue to do the basics, plus prepare to manage an incident when it occurs.

Artificial Intelligence

AI once again featured in the discussion; however, at least one presenter suggested that where we are now is more akin to machine learning than AI.   I suspect this depends on your definition of both terms, with my definition having ML as a subset of AI.    The key message here was that the current instance of AI, generative AI, presents rather generic responses, but quickly.   Its benefit, whether used for defence or attack, is its speed and ability to ingest huge amounts of data; however, it is only in pairing with a human that it progresses beyond being “generic”.   In the future this may change as we approach the “singularity”, but for now and the near future AI is an assistant for us and for criminals, and doesn’t represent a significant innovative change in relation to cyber security:  good security with AI is little different to good security prior to generative AI.

Human Factors

The human factor and culture were a fair part of the discussion.    The cyber culture, “the way we do things around here” in relation to information security, is key.   We need to build safe and secure practices into all we do and at all levels;  easier to say than to do.    This also links to the fact that humans, and the wider user group, which in schools would include students, staff, parents, visitors and contractors among others, continue to be involved in around 74% of breaches.    This means cyber security awareness training needs to reach all of these users and be regular, rather than a once-a-year exercise.    Additionally, if we assume we will suffer a cyber incident, how do we protect our IT staff and the senior staff involved in incident response and management?   Stress levels will be very high, and as a result self-care may be lacking, but schools and other organisations have a duty of care for their staff, and during a cyber incident that duty of care may become all the more important.   This is why, in my team anyway, I am introducing a role of “chief wellbeing officer” as part of our incident response plans.

Conclusion

The organisations at this particular event, similar to the previous cyber event, were generally large corporate entities, yet for me the messaging may be all the more important for schools, given we hold student data and student futures in our hands, and given the targeting of educational institutions.  How do we get more schools to attend these events?    I suspect events like these fall into the important-but-not-urgent category, where fixing a server issue or a device issue in a classroom is urgent and important; but then how do we ensure that school IT staff are prepared, and preparing, for cyber incidents?   A chicken-or-egg issue, maybe?

Cyber incidents are inevitable and I have always said that “the smartest person in the room is the room” so if we can share with industry where I believe they have much more experience in this arena, then maybe we, as in schools, will be all the better for it.

EduTech Europe 2023

It’s been a while since I have had to fly out to present at a conference, the last time being almost 10 years ago, flying to Kuwait from the UAE to present, but recently I found myself in Amsterdam presenting a cyber session at the EduTech Europe event.    I suppose this means I can claim to be an international speaker, for all that might be worth!   In terms of the event itself, I found it to be very useful indeed, so I thought I would, as I have for other events, share my thoughts.

Education and Disruption

There was a fair contingent of UK EdTech experts and gurus at the event and it was great to catch up with many of them and to watch their various sessions.   This continues to be one of the big reasons for events like this, namely the networking and the opportunity to share thoughts and ideas; however, I think EduTech Europe did particularly well at this, as there seemed to be more time allowed for discussion.

Of particular note was some discussion with Gemma Gwilliam and Emma Darcy in relation to the education system as it currently exists.   Emma referred to “brave leadership” in her session, in response to a question from the floor relating to how the current curriculum doesn’t prepare students for the digital world which exists and which lies ahead.   This struck me as highlighting that those schools seeking to do the right thing for their students often have to break away from the established education system.   In Emma’s case, one aspect of this was re-imagining the school day and timetable to make time available for digital and the things that matter, even when these are not within the curriculum or something the current education system seeks to develop or assess.   Over lunch on day two, Emma, Gemma and I had a really interesting discussion as to how we as a group, along with some others, might seek to support the “breaking” of education through constructive disruption.    I left the event feeling energised and excited by the discussions and look forward to sharing the progress we make over the coming months, possibly ahead of EduTech Europe 2024.

Digital Strategy

Digital strategy in schools has been discussed often over the last 5 or 10 years, so isn’t something new.   The pandemic also brought the importance of digital solutions to the forefront, further stimulating the discussion; however, it was refreshing to hear discussion of an often forgotten aspect: wellbeing.   Technology allows us to do more, and to be more efficient and quicker, but does this “doing more” have a negative impact on wellbeing and on staff workload?   The wellbeing aspect of digital strategy is something we need to explore much more, as is the challenge of the “additive” approach to education, which has seen us forever seeking to get better, which is fair and admirable, but at the expense of increasing workload and challenges around wellbeing and mental health.

AI in education

Laura Knight delivered her usual high-quality and thought-provoking session, this time on AI in education.   She explored the benefits and challenges of AI, which is something I have explored in some of my own recent presentations.   The most interesting part of her presentation for me, though, was her discussion of agility and education.    This is something I see as a key challenge, given education has changed little over the last few centuries, albeit the whiteboard has been replaced by a projector or panel, against a backdrop of rapid technological development.    We have needed to re-evaluate our current education system for some time, including how we assess students and how students progress through formal education.   It may be that AI will now prove a catalyst to make this change happen.   It is also likely that, in the first instance, we will need “brave leadership” and positive disruption from a small number of schools to lead the way for schools and colleges in general.

Cyber Resilience

The panel session I was involved in looked at “cyber-proofing your school” and it was great to be on such a diverse panel chaired by Abid Patel. There was some really good discussion, with two particularly notable points for me. One was the fact that schools don’t, and won’t ever, have the finances and other resources to cyber-“proof” themselves, so we can only focus on doing the basics and preparing for an incident. The second take-away was that the session was the last of day 1, as is often the case for cyber security awareness sessions, given limited time, which tend to be placed at the end of an inset day. We need to realise that the potential impact of a cyber incident means we need to move cyber up the pecking order, treating it not as an IT issue but as an organisation-wide issue.

Travel issues

Any event I attend cannot be without its travel issues and this event was no different.   Firstly, the short-stay car park, a nice 2 or 3 minute walk from the departure terminal, was closed, leading to a longer, unplanned 15 to 20 minute walk, only to be met with a long, snaking queue for security.  Panic ensued; however, thankfully the Bristol security team were efficient and quickly saw us all through.   Once in Amsterdam, I couldn’t check into the hotel when I arrived, leading to me attending the conference a little dishevelled from travel and rushing around.

And on the homeward-bound leg it was plane delays, plus an amusing moment at the gate shown on my app: looking at a plane at the gate, but unable to get to it, as it slowly backed away.   Myself and another passenger wondered if that was our plane and we had missed it, but thankfully it wasn’t, and ours was still an hour from arriving.   And once back in the UK, exiting airport parking, my ticket wouldn’t open the barrier, so I pressed for help.  The helpful voice took some details and then went quiet.  After about 10 minutes sat waiting, I pressed for help once more and got the same helpful voice, who for reasons unknown had forgotten about me.  The barrier then raised and I exited the car park, promptly taking the wrong exit from a roundabout and nearly heading back into the same car park.   A little three-point turn and I was finally heading in the correct direction.

Conclusion

It was a very worthwhile trip, catching up with so many great people and watching and listening to a number of really useful and informative sessions.  It was also nice to listen to a broader range of speakers from across Europe, rather than just the UK as is common in UK-based events.   This made for a richer discussion, including discussion of education within a variety of different national, regional and stage-related contexts.   As always, the networking side of things was a key benefit, and I look forward to following up on some of the really interesting and exciting discussions and plans that were created.

AI risks and challenges continued

This is my second post following on from my session on AI in education at the Embracing AI event arranged by Elementary Technology and the ANME in Leeds last week.   Continuing from my previous post, I once again look at the risks and challenges of AI in education rather than the benefits, although I continue to be very positive about the potential for AI in schools and colleges, and the need for all schools to begin exploring and experimenting.

Homogeneity

The discussion of AI is a broad one; however, at the moment the available generative AI solutions are still rather narrow in their abilities.   The availability of multi-modal generative AI solutions is a step forward, but the solutions remain largely focussed on a statistical analysis of the training data to arrive at the most probable response, with a little randomness thrown in for good measure.     As such, although the responses to a repeated prompt may differ, taken holistically they tend towards an average response, and herein lies a challenge.   If the responses from generative AI tend towards an average, and we continue to make more and more use of generative AI, won’t this result in content, as produced by humans using AI, regressing to the mean?   And what might this mean for human diversity and creativity?    To cite an example, I remember seeing on social media an email chain where an individual replied asking the sender not to use AI in future, to which the sender replied: I didn’t use AI, I’m neuro-diverse. What might increasing AI use mean for those who diverge from the average, and what does it even mean to be “average”?
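As a toy illustration of this tendency towards the average (my own simplification: real models work over tokens and vastly larger distributions, and the word frequencies here are invented), consider a “model” which has learned word frequencies from its training data and samples in proportion to them, with a little randomness:

```python
import random
from collections import Counter

# Invented next-word frequencies standing in for training data.
learned_counts = {"good": 70, "great": 20, "fine": 8, "transcendent": 2}

def generate(counts):
    """Sample a word in proportion to how often it appeared in training."""
    words = list(counts)
    return random.choices(words, weights=list(counts.values()))[0]

random.seed(0)  # fixed seed so the illustration is repeatable
sample = Counter(generate(learned_counts) for _ in range(1000))
# Individual responses vary, but taken holistically the output clusters
# around the most statistically probable choice, while rare, divergent
# words almost never appear -- the "regression to the mean" concern.
```

The “little randomness” is real, in that the model samples rather than always taking the top word, but it does not stop the aggregate output converging on the common case.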

Originality

The issue of originality is a big one for education.   The JCQ guidelines in relation to A-Levels state that “All coursework submitted for assessment must be the candidate’s own work”, but what does this mean in a world of generative AI?   If a student has difficulty working out how to get started and therefore makes use of a generative AI solution to get them started, is the resultant work still their own?   What about a student who develops a piece of work but then, conscious of their special educational needs and difficulties with language processing, asks a generative AI solution to read over the content and correct any errors, or even improve its readability:  is this still the student’s own work?   Education in general will need to address this challenge.   The fact is that we have used coursework as a proxy for evidence of learning for some time; however, we may now need to rethink this given the many generative AI solutions which are now so easily accessible.    And before I move on, I need to briefly mention AI and plagiarism detection tools:   they simply don’t work with any reliability, so, in my view, shouldn’t be used.   I don’t think much more needs to be said about such tools.

Over-reliance

We humans love convenience; however, as in most, if not all, things there is a balance to be had, and for every advantage there is a risk or challenge.   As we come to use AI more and more due to its benefits, we may become over-reliant on it and therefore fail to consider the drawbacks.   Consider conventional library-based research.  When I was studying, pre-Google, you had to visit a library for resources, and in doing so you quite often found new sources which you hadn’t considered, through accidentally picking out a book, or through the reference list in one book leading to another, and onwards.   The world of Google removed some of this, as we could now conveniently get the right resources from our searches.   Google would return lists of sources, but how many of us went beyond the first page of results?    Now step in generative AI, which will not only provide references but can actually provide the answer to an assignment question.    The drawback is that Google (remember Google search uses AI) and now generative AI may result in a reduction in broader reading and an increasing reliance on the search or generative AI response.    Possibly, over time, we might become less able, through over-use, even to identify when AI provides incorrect or incomplete information.   There is a key need to find an appropriate balance in our use of AI, weighing its convenience against our reliance.

Transparency and ethics

Another issue which will likely grow in relation to AI is that of transparency and ethics.    In terms of transparency, do people need to know where AI is in use and to what extent?   Consider the earlier discussion of student coursework: it is clear that students should state where generative AI has been used, but what about a voice-based AI solution answering a helpline or school reception desk?   Does the caller need to know they are dealing with an AI rather than a human?   What about the AI in a learning management platform:  how can we explain the decisions made by the AI in relation to the learning path it provides a student?  And if we are unable to explain how the platform directs students, and therefore unable to evidence whether it may be positively or negatively impacting them, is it ethical to use the platform?   The ethical question itself may become a significant one, focusing not on how we can use AI, but on whether we should be using it for a given purpose.     The ethics of AI are likely to be a difficult issue to unpick, given the general black-box nature of such solutions, although some solution providers are looking at ways to surface the inner workings of their AI to provide more transparency and help answer the ethical question.   I suspect, however, that most vendors will be focussed on the how of using AI, as this drives their financial bottom line.   The question of whether they should provide certain solutions, or configure AI in certain ways, will likely be left to the future, and to the post-mortems which follow when things go wrong.

Conclusion

As I said at the outset, I am very positive about the potential for AI in education, and beyond, but I also believe we need to be aware of and consider the possible risks, so that we can innovate and explore safely and responsibly.