EdExec Live, Herts

I recently had the opportunity to contribute to the EdExec Live event in Hertfordshire. I have contributed to EdExec Live events in the past, but this is the first time I have done so in Hertfordshire. I must admit, as is all too common for me, the trip came complete with travel disasters: I got to and across London easily enough, but subsequent trains were cancelled or delayed, leading to an Uber and a total travel time of just over six hours. But enough of my usual travel woes.

I think the first thing of note is my belief that education, teaching and learning in schools, takes a village. It requires various people doing various roles: teachers in the classroom, teaching SLT members, IT staff supporting the IT setup, school business leaders and more. Now I am lucky, as a teacher of many years, to contribute to the teaching side of things, and as an ANME ambassador to contribute to the IT side of things, however the EdExec events allow me to contribute to the school business leader side of things. As I have said many times before, collaboration and sharing are so important, or as David Weinberger put it: “the smartest person in the room is the room”. As such it is so important that we share widely, including sharing beyond the silos associated with specific roles. I am therefore keen to share and to be involved in discussion with education professionals across the various roles which work towards ensuring schools operate and students succeed.

The conference was opened by Stephen Morales from the Institute of School Business Leadership (ISBL), and so much of what he said aligned with my own thinking. Firstly, he mentioned the implications and impact of geopolitics on education. This was something I had heard only a few weeks earlier at an information security conference, where it was clear that the information security and cyber security of organisations, including schools, is being impacted by geopolitical issues. Stephen also mentioned the privilege divide, which refers to socioeconomic divides and in turn has a direct impact on technology divides. We clearly need to reduce divides where possible, building equity, however sometimes the easy “solutions” have unintended consequences in this complex world, so we need to make sure our decisions are measured and considered.

Stephen referred to the need for collaboration and also to the need to consider technology. I believe strongly in both, and believe there is a relationship between the two. Given how quickly tech changes and advances, we cannot hope to stay up to date on our own, so the best solution we have continues to be collective action: sharing, discussing and using the wealth of experience, thought and skills of the education sector as a whole. He also referred to structures, processes, people and technology, and I think this is key, considering not just the technology but the people using it and the processes it is being used for. This immediately got me thinking about teaching and the TPACK model.

He also mentioned AI, which was the focus of the presentation I gave immediately following his keynote. You can access my slides here. Among my key points from the session were that AI is here now and students are definitely using it, as are many staff. We can’t put that genie back in the bottle. As such we need to look at how we can harness AI, and that’s not just generative AI, but the various other branches of AI too. We need to look at its use in teaching, helping teachers prepare content and mark work; in learning, putting AI in the hands of students; and in the administrative aspects of schools, both in the classroom and in the wider school. I made the point that this isn’t without risk, which was apt given the next session I attended, led brilliantly by Laura Williams, was specifically about risk management. If we want to benefit from the potential of AI, we will need to deal with the risks. If we don’t allow use of AI, if we ban it, we don’t need to deal with the risks of AI usage, although there are risks resulting from this too, from not teaching about and not allowing AI use. It’s the balance issue I often talk about.

My session talked about the need for an AI strategy which aligns with the technology strategy, which in turn aligns with the school strategy. They are inter-related. I also mentioned the need for appropriate foundations: we can’t look at AI without good infrastructure, devices, support and training. An AI strategy, and a tech strategy, as well as a school strategy, has to be built on solid foundations. Chasing the next shiny AI tool without the fundamentals in place just won’t work.

In terms of risks, I mentioned bias and inaccuracies, however I also noted that humans are not short of these challenges either, albeit we don’t always appreciate them. Data protection continues to be an issue, however data protection in the world of AI is often simply good data protection as it relates to any online or technology service. Obviously automated decision making needs a little more consideration, however how many of the online content platforms schools have been using for years, which recommend and direct students to learning content, aren’t fully transparent as to how their algorithms, their AI, make decisions?

Thinking back to Stephen’s presentation, he mentioned fears about AI replacing humans. For me, as for Stephen, it is about AI and humans working together, rather than one replacing the other.

The conference was yet another opportunity to share my thoughts and to engage with others on theirs, and some of the discussions I had over lunch were very interesting indeed. Schools are clearly at different points, and in different contexts, and this for me is fine, however if we wish to move forward I continue to believe in the need to work collaboratively and to share. I came away from the event with new thoughts and ideas, and I hope those who attended my session came away the same.

TEISS, Infosec summit

Last week saw me attend the TEISS European Information Security Summit down in London. This is one of my annual journeys outside of the education bubble to look at cyber security, resilience and health in the broader industry and enterprise context. I feel it is always important to seek diversity and to avoid the issues associated with existing purely within a silo, so stepping outside of my day-to-day on a fairly regular basis is a must.

More of the same, but greater volumes and speed.

If I were to summarise one of my main takeaways from the event, it would be that a lot of what I heard was similar to what I had heard a year before. Cybercrime continues to grow in terms of both threat and potential impact. The specific threats, such as ransomware or social engineering, haven’t really changed, but the frequency and speed of attacks have increased. One particular slide looked at nation state actors, showing how some countries are now down to a breakout time, from compromise to exfiltration, of under six minutes. Now it isn’t likely that schools will need to face nation state actors, albeit we could end up as collateral damage, however this increase in speed for nation state actors is likely mirrored for other threat actors, including those schools may actually face. Related to this, one presenter showed screenshots of AI-powered cybercrime tools which are now available, highlighting that AI, and in particular Large Language Models, not only have the potential to increase the productivity and efficiency of users, they also have the potential to increase the productivity and efficiency of criminals. I was aware of FraudGPT and WormGPT so this wasn’t new to me, however the subsequent slide showed an automation and orchestration platform which criminals could use. The combination of AI-powered creation tools alongside automation tools concerns me, as it would clearly give criminals the ability to launch convincing attacks broadly, where any compromise can be quickly leveraged before defenders have an opportunity to react. Think Power Automate for criminals: lots more, and better, phishing emails, where user errors are quickly capitalised on to deliver malware, extract data or propagate further attacks.

Geo-political instability

Discussion of the impact of geo-political instability on information security was very interesting, especially considering the room was full of cyber security professionals charged with protecting companies and data, including companies responsible for critical national infrastructure. From a school point of view this might seem to be outside of our wheelhouse, however on reflection I wonder about our need to educate students in relation to this. We have already seen that modern warfare now involves a cyber element, with the cyber element often preceding any physical engagement. Do students need to be aware of the implications of globally connected digital services in a world of increasing conflict along national and geographic borders? How might these issues directly impact us, and what about where we are indirectly impacted, or where the impact is subtle manipulation via social media? I suspect there is a whole post possible on this alone.

User awareness and training

I spent a significant part of the conference watching sessions within the Culture and Education stream. There was some good discussion in relation to culture and the testing of cyber resilience, particularly the use of phishing awareness testing. These tests are very good at giving us a snapshot, or even a longitudinal view, of our general cyber resilience, however they aren’t as useful at an individual user level. Presenting a staff member or student with additional training material to complete after they have fallen for a phishing test doesn’t find them at their best in terms of their potential to learn. One presenter offered an alternative view, suggesting that all users mean to do the right thing, so we should be asking what it is that makes them do the wrong thing, rather than focusing on how we change individuals’ behaviour. For me this very often comes down to being time poor, and therefore being in a rush or suffering workload issues, so I am not sure quite what we can do about this. In my view, the world and our roles only see us adding more tasks and activities, and very seldom do we take things away, therefore it is no wonder that we are time poor, and no wonder that in our hurry we fall for social engineering and for phishing emails. That said, it is definitely worth the conversation as to what the barriers to good cyber behaviours are, and then looking to see if there is any way to address them. I suspect we won’t solve the issue, but I bet there will be some possible quick wins.

Recovery over prevention

One presenter made a very interesting observation that we continue to spend too much time focussed on prevention rather than looking at how we might respond to and recover from an incident. I can immediately see why we might focus on prevention: if a cyber incident doesn’t happen, then things are all good. The reality, however, is that cyber incidents are almost guaranteed. And if we accept that an incident is definitely going to happen at some point in the future, then we are better spending a little less time focussed on prevention and a little more on considering what we will do when an incident does happen. This can easily be done through desktop exercises, and doing so is always preferable to having to work it out when the world is on fire in the midst of a real cyber incident. And to that end, I actually delivered a little exercise only the other day.

People, Processes and Technology

One of the biggest takeaways from the event was the mention of People, Processes and Technology (PPT for short, and not the Microsoft app). Sadly, all too often we focus on the technology. How can we technically keep data secure? How can IT deliver training to those clicking a phishing link? What we need to do more of is consider the people involved and their impact, as well as the processes. If we consider people, processes and technology, we will likely have the best opportunity of keeping things secure and safe. And I note that considering people, processes and technology isn’t just an infosec thing; it can equally be applied to school technology strategy, to the use of technology in classrooms, and much more.

I suspect that as we continue to make use of more technology, and as technology further pervades every aspect of our lives, we need to increasingly look to the human contribution and to human behaviour, rather than getting so focussed on the tech.

BETT 2025: reflections part 2

Continuing my reflections on the BETT conference from my previous post, I found Sir Stephen Fry’s discussion with Dr Anne-Marie Imafidon quite interesting in exploring “science reality” and how some things from science fiction have come to pass, plus in looking at how Artificial Intelligence (AI) has actually been around for some time. In terms of science reality, I did a presentation last year where I referenced an episode of Star Trek: The Next Generation in which Captain Picard appears to be using a device very much like an iPad or other tablet. It is notable that the episode aired in the 1980s and focused on events in the 24th century, when in fact the iPad made its appearance in 2010. For me this highlights that science fiction sometimes presents us with novel and interesting ideas that people then seek to make happen. It also highlights that we are pretty crap at predicting the “when” of things with any real accuracy.

In terms of the longevity of AI, the concept has been discussed since the 1950s, with periods of progress and then periods of quiet, one particular quiet period being known as the AI winter. The reality is that the current progress of AI, as discussed by Sir Stephen and Dr Imafidon, is likely the juncture between increasing computing power and increasing “oil fields of data”. I found the point regarding how we “sleepwalked into the internet age” interesting, highlighting that we cannot do the same with AI; but did we truly know what the impact of the internet was going to be, and so can we truly know what the impact of AI might be? I also found the discussion of how social media focussed on “maximising engagement” to hit the nail on the head, especially when this was expanded to include maximising bias, hatred and other negatives. The term socio-technical skills, as something we should possibly seek to develop, was a new one on me, but I can see the point.

The discussion then progressed to education and assessment, categorising the implications of ChatGPT for cheating as a minor issue brought about by the education system we currently have. This aligns with some of my views on the need to reform education. Education is not about tests or coursework, it is about learning. It isn’t about grades. I found the comment regarding our current system “testing for ignorance” and then pushing it to be a particularly telling and critical assessment of the world we consider to be education. In the roundtable on assessment I took part in, one of our discussions was how coursework and exams are simply scalable for use across different schools, regions and countries, so we use them for this scalability rather than because they are the best thing for education, for learning, or for our students.

As the discussion moved back towards AI, there was an interesting point about AI development: we often describe AI as currently being the worst it will ever be, constantly improving. This is fair to a point, but Sir Stephen referred to the internet as “filling with slop” and “contaminated”, and if we assume that AI continues to use AI-generated content in its training data then it too may become contaminated, so it may be possible to suggest that AI might be at its best now and will only get worse as it becomes more contaminated by its own “slop”. And who controls AI and its development? It was suggested that the three worst options might be the three groups most likely to lead the way on AI development: countries, corporations and criminals. In all three cases I can see the outcomes being far from positive, and we can already see the internet being used for political and national ends, for pure commercialisation, consumerisation or profit, or for crime.

I could likely write a whole series of blogs based on the session by Sir Stephen and Dr Imafidon, however rather than doing that I just want to share how they finished the discussion: on the need to find the “sweet spot”, the balance between pessimism and optimism. This aligns very much with my view of balance, in that most good things will have some balancing drawbacks or challenges. We need to try to find our way to the best middle ground, the “sweet spot”.

The next session I watched before hitting the BETT conference floor was titled “Education in the AI era”. Again, I could write a lot about what was said, as I found it very interesting indeed, but I am going to avoid doing that. One key comment mentioned 30% of teachers not using AI; my sense is this figure is lower than the reality. The data came from TeacherTapp, which I think is great, but I also think the subset of teachers using TeacherTapp are likely to be those who are a little more tech savvy and therefore more likely to use AI, and that a greater proportion of those who don’t use AI will also not be using TeacherTapp. The bigger and possibly more important question is why some teachers who know of AI aren’t using it. Is it that they don’t know they are using AI, but are? That they don’t have access, lack training, lack confidence, or something else? In terms of access, this session also mentioned access to technology and affordability, which to me links to the concept of digital divides.

I also liked the discussion on banning and blocking AI, where they compared it to knives in food tech. Why would we ban AI in some or all subjects when we know knives can be dangerous, yet don’t ban them? Now I know this is a very simplistic and flawed analogy, likely used for effect rather than accuracy, but I think the point is valid: how often has prohibition of anything ever been beneficial or effective? It just tends to make people do it more, but in secret.

This session finished on the big question, which had also been raised the previous night at the Edufuturists event: what is the purpose of education? In terms of what we measure, tests, coursework, grades, are these what truly matters? And if not, what does matter, and how might we measure it, assuming we need to?

Those are some pretty deep questions to end this post on, but that’s where I found myself, and I was still in the morning of day 1 of BETT. The afternoon would see me getting around the event and doing the networking side of things, which for me is one of the main benefits of BETT, but the sessions from the morning, and some of the other sessions I attended across the conference, were also very beneficial in stimulating thoughts and ideas, and in some places in confirming or challenging my thinking. Next BETT post to follow soon…

SAAS Birmingham 2024

The Schools and Academies Show and co-hosted EdTech Summit are marked in my calendar as an annual event to attend, so it was with this that I made my way over to the Birmingham NEC for this year’s event.

As always, the key benefit of the event is the networking it provides, with so many great people to meet up with and share ideas with.

The Digital Futures Group, including Gemma Gwilliam, Emma Darcey, James Garnett, Jonny Wathen and Abid Patel, were all in attendance, with Abid bringing a Bru, albeit the wrong one.

Additionally I met up with Osi Ejiofor, Georgina Dean and Nicola Pondsford. Some of the ISC Digital Advisory Group were around, including the chair, David Horton, and Neelam Parmar as well. And the event is never complete without the ANME’s presence, the stand complete with the ambassadors’ avatars, myself included, and a chance to chat with Rick Cowell, Jase Caul and Terry Dignam. I am so thankful for such a wonderful group of people who are so keen to share their thoughts and wisdom. Every time I meet up with these people I come away feeling more optimistic about EdTech in schools, plus more developed in my thinking. The smartest person in the room is the room, so I am grateful to be in the same room as all these great people.

So this year’s event had a fair amount of Gemma keeping me on the straight and narrow, including guiding me through the Birmingham Airport buildings to our chosen hotel, where I suspect, had I been doing this on my own, I would have got badly lost. My geographical and directional skills are not very reliable, and I am convinced the Google Maps app is bent on sending me in the wrong direction.

As I write this blog and seek to share some thoughts, I must note my Schools and Academies Show badge, as supplied by Rick and the ANME. The notable thing about it was that it didn’t list me as a visitor or a speaker (I wasn’t speaking), but as “Press”. Now, sadly, I suspect this was an administrative error rather than this blog now reaching sufficient people that I can be considered a member of the associated press. But who knows what the future holds 😉 Maybe next year I need to walk around with a microphone and see what reaction that gets?

So what were my main takeaways from the event? As mentioned above, I think a key takeaway of this and other conferences and events is simply the need to network and share. Technology moves fast, and all schools differ in context, so no-one in schools is in a position to resolve the resultant challenges on their own; our best chance is simply to approach the problems and challenges collectively, to share and work together. The ANME, the ISC Digital Advisory Group and the Digital Futures Group (DFG) are all key to me in this, plus events like the Schools and Academies Show provide opportunities to meet further people.

A second takeaway relates to AI. This continues to be a hot topic in educational circles, but I sometimes feel this is among a small number of individuals and schools rather than the masses. AI has such potential, but we need to be pragmatic about it and about how we encourage people to try and to experiment, but safely. To this end, I liked Chris Goodall’s presentation, where he talked about the low barrier to entry in relation to staff using AI; it is simply about getting people trying it, typing in the text box and submitting their prompt. He also pointed to the possibly unnecessary language which makes AI sound difficult, such as “prompt engineering”; if we look at generative AI as a chatbot involving a dialogue, then isn’t it something everyone can do, assuming we can simply convince them to give it a try? And as to remaining safe, he proposed the need for a DBS, or the need to consider Data privacy, Bias and Safety; such a simple acronym.

My third takeaway related to strategy, arising from a few discussions on AI. AI may be a great thing to consider, and may be a hot topic, but you can’t experiment with AI without infrastructure, bandwidth, devices, training and more. There needs to be a plan, a strategy, and only once you have this and seek to implement it can you then potentially look at using AI. Related to this, I also had a number of discussions on digital divides, where the existence of a strategy, the availability of reliable infrastructure and of client devices for staff and students, plus now engagement in AI, all represent different divides which can impact schools, teachers and students.

Now I was only going to list three takeaways, but as I sit on this train there is one other I need to share, and it came from the drinks reception which occurred prior to the SAAS event. Schools and teachers now have such a list of things that need to be done, so many compliance and other required tasks which are done because we have to. I have commented on this in the past. But what about doing the things we want, or even love, to do? Don’t we want teachers who love to teach, and students who as a result love to come to school and engage? Are the increasing requirements, tasks, compliance and workload eroding the love for teaching and the love for learning? I suspect they are, but how can we redress this balance? This isn’t an easy question, and it relates to some of my previous posts on focussing on what matters. I suspect this will continue to grow as something which schools, school leaders, governments, inspection regimes and other education-related organisations need to stop and think about, and hopefully sooner rather than later.

So that’s the Schools and Academies Show 2024, Birmingham, finished for another year. The next significant event on the calendar is BETT, and I must say, after last year I am so looking forward to it. I will note that I didn’t do a great job of planning my visit to the Schools and Academies Show, so this is something I will need to correct for BETT. And also, maybe this year I may finally heed the common advice regarding comfy shoes. Or maybe I’ll rock up in a suit as normal!

Unleashing AI

It was around a year ago that I had the opportunity to speak at a Keynote event alongside Laura Knight, Dr Miles Berry and Rachel Evans, my fellow ISC Digital Advisory Group colleagues, so it was with some anticipation that I looked forward to involvement in another Keynote event, again including Laura and Rachel, but this time also including my friend Bukky Yusuf as well as Dina Foster and Dale Bassett. As with 2023, the event focused on AI in education, and included an opportunity for me to speak on AI literacy for students as well as on the potential for AI to help with efficiency and workload.

The opening speaker was Bukky, delivering an introduction looking at what AI actually is and at some of the terminology and language which surrounds it. She highlighted that AI isn’t new and is something which was being discussed all the way back in the 1950s, plus that, even before ChatGPT burst onto the scene in late 2022, AI was already something we were using in our daily lives in the likes of Google Maps. It was interesting as she discussed narrow AI, which is where I think we are now, but also Artificial General Intelligence (AGI), which some predict will be achieved by 2040, and Artificial Super Intelligence (ASI), the advancement, and the scary situation, that would follow AGI. If AI achieves AGI, the issue is that it can iterate and evolve far quicker than we can as humans, so once AGI is reached its self-advancement quickly moves beyond human capacity and understanding towards ASI. We potentially become what ants are to human beings. Now I hold to the hope here that we are pretty poor at predicting the future, that this is still a couple of decades away, and that we will hopefully put some guard rails and mitigation measures in place to ensure we are prepared between now and then.

Next up was Laura, who, as always, delivered a thought-provoking session which stimulated such broad thought in relation to AI and education. I loved her discussion of technology strategy metaphors and the dangers of a hot air balloon, fireworks or jet fighter approach, each with its advantages and its drawbacks. I sense I try to balance the hot air balloon and the jet fighter, seeking to have an overview while also trying to keep a sense of momentum and direction. I think I am past my days of seeking the shiny new thing, the fireworks, although I will note that I certainly did fall into this trap in my early teaching and EdTech days. Laura also touched on the need to be creative and yet also be an engineer, which I think is an interesting challenge as it requires two different types of thinking.

My first session of the day related to developing AI literacy within students, though in fact much of what I said was equally applicable to staff. I outlined some of the knowledge which I feel is important, including knowing the benefits but also the risks and challenges as they relate to the use of AI. Next I moved on to the skills side of things, and how all the discussion of prompt engineering and the like paints the use of AI as complex and technical, when in fact my recent use of Copilot involved me simply talking to my laptop. The barrier to entry, to actually having a play with AI, is so low that anyone can do it.

In terms of skills, I highlighted the need for students, and staff, to be able to think critically and to review and assess content presented to them, to identify what is fake or real. Given the speed with which posts on social media go viral, and the potential for AI to be used to create or manipulate content, whether text, image, audio or video, the need for critical thinking has never been more key. I also pointed to the need to consider the ethics of AI tools, using Star Wars and the posthumous use of James Earl Jones’ voice and Peter Cushing’s likeness. Is this ethical? How do we seek consent or permission? Are there risks of misuse? Data literacy was my next focus: AI relies on data, and therefore we need to get better at understanding what data is gathered, how it is used, how data might be inferred, and more. One of the attendees also raised the issue of the environment, and on reflection I should have included a slide on this, on the need to consider the environmental impact of the use of genAI.

After lunch, the next session was another from Laura, this time looking at the safeguarding implications of AI. This session went into some of the murkier implications of AI, including the use of AI imagery, and maybe even chatbots, to support criminals engaged in sextortion. She talked about the shame people feel when they get caught up in technology-enabled safeguarding incidents, such as sextortion, and I think the emotional side of things is very important to remember and consider. She also raised the issue of some students possibly withdrawing and relying on AI as their friend and confidant, and the implications of this from a privacy point of view as well as from a safeguarding risk point of view, where an AI could guide a child towards inappropriate or even harmful behaviour. The challenge of privacy was also covered, acknowledging that we humans are pretty poor at this, often agreeing to app terms and conditions without any consideration of what we have actually agreed to, a challenge that is becoming more and more difficult in my view as we share more information with more apps and services.

My final session of the day focussed on AI and efficiency, and on the possibility that it can help address the current workload challenges in education. Now Bukky bigged this session up as the “unicorn” session, so my first step before starting was to use genAI to get a nice photo of a dog with a unicorn horn on its head; I simply don’t think anyone has the answers here, or the unicorn, it is just a case of prompting discussion and sharing ideas. My session was very much about getting attendees to collaborate and share their own ideas and experiences. I have long said the smartest person in the room is the room, and this session focussed on exactly that, getting the audience themselves to share their thoughts and ideas before I then went on to share some of mine. One of the highlights of the event as a whole was an attendee picking up on my comment regarding the need to build networks and communities, suggesting that the attendees were themselves now a network and therefore it would be worth finding a way to continue discussion beyond the event; I really hope this is something we can get off the ground, as I truly believe our best chance to realise the potential of AI, or maybe just to survive the fast-paced technical change, is to work together and to actively share and discuss issues and ideas.

The event then closed with a panel session involving myself, Laura, Rachel and Dale. And before you wonder whether I suffered my usual travel woes, let's just say I stupidly decided to climb the stairs at Russell Square tube station, clearly missing the warning sign. Approximately 170 spiral staircase steps later, I very nearly didn't make the conference the following day!

It was a long but very useful day with lots of things to go away and think on. I also made use of Otter to record my own presentation, with a hope to use this to improve my preparation and delivery for future events. I am also hopeful that the attendees will indeed engage with sharing and discussion beyond the event itself, as this is the most likely way of ensuring the discussions and sessions bring about the positive change myself and the other presenters would love to see.

InTec IT Innovation in Education

This week saw me taking a trip to Mercedes World to speak at the InTec IT Innovation in Education event in relation to esports, and also to host a little esports round table. Now, as usual, my travels weren't without their issues, which started from the outset with the car park at the station being full, and was promptly followed by a delayed train, meaning I missed my connection. I do sometimes wonder why I keep getting the train; however, I suspect that if I drove instead there would just be significant traffic jams, plus I wouldn't be able to work or have a beer while travelling. As it was, the already long journey took just over 5 hours to complete.

So, to the event itself. The first topic covered was AI in education, and in particular Microsoft's Copilot. Now this session focussed on the paid version of Copilot, where it exists in Word, PowerPoint, Outlook, etc., rather than the free version. The capabilities are impressive, as was evidenced by the demo video which was worked through; however, two challenges currently exist in schools. One is cost: at around £25 per user per month, the scalability of Copilot in its paid form across whole school staff bodies is rather limited. That said, it could be issued to key users. The other issue is that of data protection and data security, in relation to how Copilot may surface data which it shouldn't, but where permissions and labelling of data have historically been poor. Now an example I used here, and experienced recently, albeit not actually involving Copilot, involved a poorly configured MS Team with data pertaining to a trip. Permissions made the team available to all within the organisation, including students. In the past this wouldn't have been a problem, as students would either need to find the link or get very lucky in stumbling across the Team; however, in this case the AI in Office 365, which tries to predict what might be useful, surfaced some files from this team after a number of staff accessed said files. Office 365 was just presenting "this file might be of interest", yet it surfaced information which wasn't meant to be available to students. In a world of Copilot this is likely to happen all the more often and presents significant potential risk.
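As a practical aside, the "visible to everyone" misconfiguration behind this anecdote is something you can audit for. The sketch below is purely illustrative: in reality you would pull the group list from Microsoft Graph (e.g. `GET /groups?$select=displayName,visibility`, which needs a token with `Group.Read.All`), but here the helper simply filters a hypothetical sample payload of the same shape:

```python
# Flag Microsoft 365 groups (which back Teams) that are visible to the whole
# organisation. The group names below are made up; a real audit would fetch
# this list from Microsoft Graph rather than hard-coding it.

def flag_public_teams(groups):
    """Return the display names of groups whose visibility is 'Public'."""
    return [g["displayName"] for g in groups if g.get("visibility") == "Public"]

sample = [
    {"displayName": "Ski Trip 2024", "visibility": "Public"},   # the risky one
    {"displayName": "SLT Planning", "visibility": "Private"},
]

print(flag_public_teams(sample))  # ['Ski Trip 2024']
```

Running something like this periodically, before rolling out Copilot, at least gives you a list of Teams to review rather than waiting for the AI to surface a file it shouldn't.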

Next up was a discussion on cyber security and safeguarding. I liked the strong linking here between safeguarding, which is rightly viewed as critical, and cyber security, which is often shown lesser consideration. It may be that the best way forward for schools and cyber security is to view it as an aspect of safeguarding: keeping student and staff data safe and secure, and through this protecting them from potential harms. And isn't protecting students from harm exactly what safeguarding is about?

During the lunch break I got my hands on a very nice sim racing rig and got to do a bit of racing. To start with I didn't do too well, treating the pedals like an Xbox controller, with the brake and accelerator having an up and down position and nothing else. Cue spinning off the course and missing corners. I joked with one of the Mercedes staff that I was driving a lawnmower, given the amount of time I was spending on the grass. Later I started to get a better feel for things, and for being more careful with my acceleration and braking, at which point I started to make gradual improvements, eventually getting my lap time down below 1 minute and coming 5th on the leaderboard.

After lunch there were sessions on infrastructure and IT planning. I think the key messages were the importance of a modern infrastructure to support the increasing number and differing types of devices, including VR headsets and 3D printers among many other items, and the need to plan, and to plan early. This always makes me think of failing to plan as planning to fail; however, in this case it's not just about planning but about planning early, to allow time for those things we can't predict.

My session was largely on esports, talking about how easy it is for schools to get involved with esports, about the potential benefits in terms of soft skills development, and also about the potential career pathways which esports, and the soft skills it helps develop, might open for our students. I still sense that esports continues to be adopted more by Further Education colleges than by schools, and I feel this continues to be a shame, as the benefits are not limited to those aged 16 and over.

My session also had a second topic: the ISBA Technology Survey. Now I led on the development of the 2024 survey and resultant report, picking up from the work of Alan Hodgin and Ian Philips, who developed the 2018 survey. I continue to feel that technology changes so fast that no school, or staff in a single school, can effectively adapt, and therefore we need to seek collective solutions. To that end, the ISBA Technology Survey is about gathering data and presenting baseline information to schools on how technology is being used across schools, to help in comparison and in planning.

Conclusion

The event was very enjoyable, and the Mercedes World venue was perfect, especially given the opportunity to get some sim racing done before then presenting on esports. It was also a great opportunity, like so many other similar events, to network and share thoughts and ideas, including getting to catch up with a few colleagues from other schools whom I haven't seen in person for a number of years now.

AI does continue to be a common topic in education circles at the moment, and this event was no different; however, I am increasingly seeing discussions of esports. This is something I find very heartening and something which I hope continues. It would be great to see more and more schools get involved in esports, helping students develop the soft skills which esports supports, plus introducing them to the many career paths which esports links to.

TEISS 2024, Resilience, Recovery and Response

I try to take myself out of the educational bubble at least once per year. This has been a conscious decision for a number of years, as I realised the importance of diversity, and therefore the limitations of only looking at IT, cyber, data protection, etc. from the standpoint of people in similar educational contexts. As such, the TEISS event is one of those events I try to attend to broaden my experiences and get the views and thoughts of those who exist beyond the educational context of schools and colleges.

This year's TEISS event, which focuses on cyber security and cyber resilience, had some predictable topics of discussion. These obviously included artificial intelligence and also third-party or supply chain risks. So what were my big takeaways from the event?

The cyber context

I am reasonably well aware of the cyber context and the risks which impact organisations in general, including schools; however, the TEISS event presented a couple of key facts which I think are interesting. That in 2023 there was a cyber attack every 29 seconds says it all, with this only likely to grow once the 2024 figures have been calculated. This highlights the need for all organisations, including all schools and colleges, to consider cyber risks and their defensive and recovery methods. There is no excuse for not having done so.

Behaviourism

A number of presenters, and a number of those I had conversations with during the course of the conference, highlighted the need to consider human behaviour as part of cyber thinking. A cyber awareness programme isn't so much about the programme as about bringing about behavioural change, so although an annual training session or other training programme might meet compliance requirements, does it bring about the behavioural change we seek, and how do we know that this is the case? It is about encouraging people to report issues, and reinforcing such reports by making users aware of the impact where they do report concerns such as a phishing email. If we can reinforce this view of reporting having an impact, rather than it being just another thing staff are "asked" to do, then we might manage to build the cyber culture we want in organisations. In discussion, one event attendee raised a solution which would automatically remove a phishing email from mailboxes once it had been reported, and would then let the reporting user know of their positive impact. This seems like a great tool, but apparently what had been a cheap tool was bought up by a bigger company and now forms part of the value-added tools bundled with a bigger, more expensive product which needs to be purchased. For schools this brings us back to limited budgets, which mean that key tooling for cyber security continues to be outside the budgets of those in education.

It's about people

The old Richard Branson quote about looking after your staff so that they will look after your customers was raised, albeit with a cyber bent: look after your cyber security staff and they will look after your security, rather than focussing on the security itself. I have to strongly agree with this, and also with the need to look after those staff involved, from an IT point of view, in cyber incident response. Stress levels are high following the onset of an incident, and someone needs to make sure that those leading the technical response stop to eat, sleep and take time out. One interesting discussion which was raised, however, was how the CISO might do this for their team, but who might do this for the CISO. If the board and senior leaders push for updates and things to be "fixed", while the CISO supports the team of people doing this work, who looks after the CISO? Now in my team I feel lucky, in that I feel my team would be quick to question me and challenge me to take the necessary time if needed. This goes to organisational culture, and the culture to question at all levels. I feel fortunate that this would happen in my team, although I hope I never have cause to test it in a real incident, as we can only truly test these things in a real-life situation; desktop exercises are all well and good, but they pale when compared to the stress and challenges of a real incident.

Incomplete information, and it's inevitable

The inevitable nature of cyber risk is something I have talked about for some time. You can do all you want in terms of your defences, but the defenders need to get it right all of the time, while the attackers need only get it right, or get lucky, once, so the probability lies with the attackers. If we accept that defence can never be 100%, that attackers always have a chance and will keep trying until an organisation ceases to exist, and that no organisation seeks to cease to exist, then probability states with relative certainty that an incident will happen, just not when. And when it happens, we will see only bits of the picture initially, with increasing amounts of the picture as to the impact of the incident, the ingress route, etc. appearing as time progresses, yet the expectation will be to communicate quickly about the incident. In relation to comms, the key message seemed to be that the worst thing to do is to state something which is later proved to be untrue, which means it is all about saying little. Another point which came across was related to the cadence of information: although we may seek to say little, we should seek to be regular in our communications, even if this means saying that investigations are ongoing and that at this stage we know nothing more.

Cyber and AI… or not

Within a couple of presentations the issue of language was raised. AI is the current buzzword, used both by vendors singing about their products and in discussion of threats and AI-based threats. Maybe AI has become a bit of a buzzword which needs to be included in product pitches, in conferences, etc., and maybe this doesn't match the reality. Another presenter raised how we use the term cyber: cyber bullying, cyber threats, cyber security, etc. But isn't it just bullying, a threat or security, albeit enabled by technology? And does the use of the word cyber push us to think it's an IT issue, an issue for IT companies and vendors, rather than something which is the responsibility of the wider organisation, the school or the school community? Maybe we need to reduce our use of the word cyber and treat technology-enabled attacks as a subset of existing issues rather than as something unique and distinct.

Conclusion

I enjoy stepping outside of the education bubble and hearing about what cyber security looks like to those in the enterprise world, where they generally have far greater resources. It is heartening to hear that they suffer from the same problems and reach the same answers, despite their significantly greater resources. This continues to highlight for me that "not enough money" or "not enough staff" isn't the answer, as we need to be pragmatic about cyber. We could have infinite staff and budget and we would still face challenges. It continues to be about doing what we reasonably can, and preparing for the worst. It also continues to be about getting this message across to trustees and governors: no matter what we do, the risk will continue to exist, and most schools or colleges which have suffered an incident have moved past it and survived. In education we talk with students about FAIL as a First Attempt In Learning, and maybe that's what a cyber incident is? That said, it's not a learning exercise I would care to undertake!

ANME South West

Last week I ran the ANME's South West meeting, once again allowing me to take part in some excellent discussion with other IT staff from schools across the South West. It was also great to see fellow ANME ambassador Andrew White back at the meeting after his recent health concerns. As always, the event proved useful in allowing IT staff from schools in different contexts, and at different stages in their digital journey, to get together and share thoughts and ideas. It fits perfectly with the David Weinberger quote I so often use: "the smartest person in the room, is the room". In seeking to manage the increasing pace of technology and change, sharing and seeking collective solutions is likely to be our best chance to be successful and to thrive.

One of the sessions, delivered by Michael Bewis, focussed on wellbeing among school IT staff. Now I found this session very refreshing, as when looking at workload in schools the focus is often on teachers, with well-known research such as the Teacher Wellbeing Index seeking to assess the stress and workload of teaching staff over time. But what about the IT staff, who are often quietly working behind the scenes to ensure the technology works as it should, busy even when everything seems to be working well, never mind how hard they work when things aren't going so well? As such, it was good to see some feedback from a survey of IT staff, with SalamanderSoft and the ANME itself involved in gathering the data. That a quarter of IT staff feel their workload is unacceptable, and 40% feel undervalued, is concerning, although I do think this goes to a wider issue in education, including with teachers. Now budgets, and linked to this staffing, were mentioned; however, these are outwith our control, whereas our expectations in terms of what is done, when, and to what standard are within our control, and communication is also within our control. We can therefore focus on what is within our control, to hopefully reduce workload and increase job satisfaction. In the session I mentioned my current three key words of entropy, prioritisation and reasonableness: we need to accept that education, as a social endeavour, has so many variables that there will always be a bit of the unexpected and a bit of chaos, but in dealing with this we need to prioritise what matters and do what we reasonably can. I also note, in relation to workload, that I have concerns about the efficiency narrative, and about trying to solve the issue by being more efficient.
This invariably leads to simply doing more, but maybe we should be asking what matters most in schools, in teaching and learning and in IT, and then focussing on this rather than simply trying to do more. I also think a key part of workload isn't the tasks, but the culture of the team and of the organisation. Is working long hours, being first in and last out, being knackered, seen as the sign of a good employee? Or is a good employee the person who gets the job done but knows when to say no or "not now/yet", who sometimes works late but other times leaves early, and who clearly seeks to balance work with their wider life? It is the little things which build the culture, so do the little things in your school build a culture of wellbeing or not? And as to wellbeing groups and initiatives, I am not a fan, as all too often these are just things tagged on in the interests of being seen to do something, rather than the cultural change which is really needed; and for cultural change to work it needs to happen at all levels: teacher and support staff, middle leaders, senior leaders, and even governors and trustees.

The other session, delivered by Toby Ratcliffe, another ANME member, discussed building a resilient IT support team. I liked the acknowledgement that things are never simple and plans seldom progress as you planned; this aligns nicely with my concept of entropy, as mentioned earlier. Some of the other ideas presented matched very much with mine, such as the importance of gathering data on the performance of the IT team as a whole. I personally make use of data from our help desk ticketing system, as well as Office 365 usage and storage information, plus data gathered from an annual staff and student perceptions survey. This data allows me to highlight all the work my team do, plus the ongoing increase in work as we have more systems, more users and more data to support, helping others understand the nature of the work we do on a day-to-day basis, never mind when things go a bit wrong. The annual perceptions survey, as Toby noted, tends to be very subjective; however, this aside, having some data is surely better than having no data, as would be the case if you never ran a survey. The key thing about satisfaction surveys is that they allow you to make decisions based on data, or data-driven decision making as it may be referred to.
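For anyone wanting to start gathering this sort of evidence, even a few lines of Python over a help desk export can provide a baseline to compare year on year. The field names below are entirely hypothetical; substitute whatever your ticketing system actually exports:

```python
from collections import Counter

# Hypothetical help desk export: one dict per closed ticket.
tickets = [
    {"category": "AV", "hours_open": 2},
    {"category": "Accounts", "hours_open": 30},
    {"category": "AV", "hours_open": 5},
    {"category": "Wi-Fi", "hours_open": 1},
]

# Which categories generate the most work, and how long do tickets stay open?
by_category = Counter(t["category"] for t in tickets)
avg_hours = sum(t["hours_open"] for t in tickets) / len(tickets)

print(by_category.most_common(1))  # [('AV', 2)]
print(f"Average time open: {avg_hours:.1f} hours")  # 9.5 hours
```

Even a simple summary like this makes the team's growing workload visible to those outside IT, which is exactly the point of gathering the data in the first place.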

Overall, it was a very useful and interesting day, with lots of sharing and discussion above and beyond the two presentations mentioned above. Discussions dipped into cyber security and business continuity, esports (and I note this came up randomly and not of my doing… honestly 😉) and Windows 11 deployment, among other areas.

So that's the last of my ANME meetings for the 2023/24 academic year, but I look forward to further meetings in 2024/25. Through sharing and collaboration we can best meet the challenges of the future, especially where technology is moving at such an increasing pace.

EdExec Live – London

I recently spoke at the EdExec Live event, talking about school IT strategy, and I thought I would share some of my somewhat rambling thoughts from the event. One of my opening slides related to Star Trek and what appears to be an iPad-esque device in Captain Picard's hands, back in a 1992 episode of The Next Generation. Now Star Trek TNG is set in the 24th century, yet the iPad made its appearance in 2010, in the 21st century. This shows how poor we are at predicting the future, but also hints at the pace of technological change.

Tech is here and here to stay

We just need to look at our lives today to see that technology is a key part of them. On my way to London for the EdExec event I used digital train tickets, listened to music via Spotify, and worked on some blogs using my MS Surface while also engaging in social media discussion. I also used Google Maps to help me navigate my way to the event venue. Technology is now an essential part of our everyday lives. And looking at schools, it is no different. When I qualified as a teacher, back in the late 90s (and that does make me feel old!), you put your lesson content on a roller blackboard or on acetates for display via an OHP. You recorded student attendance manually in a register. Now all of these things involve technology: recording attendance on your school's Management Information System (MIS), putting digital content on your digital panel, smartboard or projector. You also use digital tools for safeguarding, for communication and for much more. All of our schools are digital, to some extent, already.

Strategy

And if schools are digital, there should be some sort of plan to manage the training needs of staff, sustainability into the future, renewal and updates, etc. Although the technology is already here, we need to ensure we have a plan to make this situation sustainable into the future. Beyond the basics, if you are looking at significant innovation, such as rolling out a learning platform or 1:1 devices for the first time, you need a detailed strategy and plan to ensure you get all the basics in place, such as infrastructure, training and support. After this, once technology is largely embedded and mature, such as at Millfield, where 1:1 devices have been in place since 2012, Office 365 has been phased in since 2019, and Teams/OneNote since 2022, there isn't the same need for a distinct technology plan, and technology now takes its lead from the broader school vision and strategy. So the need for a distinct technology strategy varies with the technology maturity of the school. I also note that as you go down the iPad route over Chromebooks or Windows laptops, or Office 365 rather than Google Workspace for Education, etc., and as these become embedded, it becomes increasingly difficult to change path.

A key issue in all the technology decision making is that it is not about the technology, the shiny new Chromebooks or Google Classroom, but about the why and what you hope to achieve. Is it about improving access for students with SEND, or for students with EAL? Is it about supporting the development of soft skills such as creativity, communication, collaboration and problem solving? Why are you seeking to use technology and what do you hope to achieve? Once you have this, you can then look at which technology or technologies are the best fit for your requirements.

Balance

I also highlighted the importance of balance during my session. Everything we do, even for good reasons, will have a negative implication. We ban phones, and students will still use them, plus we lose an opportunity to teach students about appropriate use of their devices. We buy 1:1 devices, and we increase the safeguarding risks as students now have their own personal devices, while also possibly having a wellbeing impact due to increasing screen time. There is a constant balance and very few, if any, binary situations where something is purely good or bad; the reality is that technology tends to be both good and bad. The key, therefore, is the need to consider the options and the good vs. bad continuum, and then to work out what works for your school and where on the continuum you will sit: your risk appetite.

Some of the future

I also spent a little time looking towards the future, while acknowledging that we are poor at predicting it, so I opted for some advancements which are almost here, or here but not yet fully implemented. Now this clearly had to include mention of generative AI (GenAI) and how education and schools need to adapt to this new technology, which both students and staff are already using. If GenAI gives all students the ability to create coursework, homework and other content with a broader vocabulary, independent of their primary language, of any special educational needs or disabilities, and of their creative thinking, isn't this a good thing? But if this is the case, how do we continue to grade student work and award them their GCSEs and A-Levels? Or maybe we no longer need to rank and order students in the same way we used to? There is the potential for a broad shift in education resulting from GenAI, but I am also concerned that there is potential to expand the digital divides which already exist.

Linked to the above is, hopefully, a shift towards digital exams rather than sitting students in an exam hall once a year with paper and pen. And I am not talking about the "paper under glass" exams which are planned for the coming years, where the paper exam is just made into an identical digital exam. I am thinking more about adaptive testing, allowing students to take exams as and when they are ready, and allowing schools to manage hundreds of students through a Maths exam, for example, even where they don't have that number of devices and therefore have to put students through in batches. It may even be that students don't sit these exams in the school at all, but can actually engage in them anywhere and anytime.

And in the way of balance, with GenAI, with a shift towards digital exams, and with more digital time generally, we need to consider the risks related to addictive social media content, the data protection implications of increasing volumes of data being shared, particularly where the data relates to young people, the risks associated with fake news, and the influence and manipulation of people via social media and other platforms.

A solution?

I finished my session with my favourite quote, which I have been using for years: David Weinberger's "the smartest person in the room, is the room". In a world where technology is moving so fast, and where education has a tendency to move much slower, our best chance to maximise the positive impact of technology, while minimising and controlling the negatives, is to focus on the power of the collective. Working collectively, sharing ideas, what works but also what doesn't, will allow us all to be better than any of us can be individually. Our biggest strength is in networks, in collaborating and in sharing. The bigger the room, the smarter we all are.

FutureShots 2024, Part 2

This is the second of two blog posts reflecting on the FutureShots event which I attended and spoke at earlier this month. You can read the first post here, which focussed on the first part of the morning, including the keynote from Laura Knight and the panel session which I myself was involved in. This post picks up midway through the morning and starts with another panel session where, once again, some of the early comments resonated with my thinking in relation to AI and education.

Should we consider whether AI means we should stop what we are currently doing? This early comment in the session goes, to me, to the fact that technology, including AI, is but a tool, and you need to use the right tool at the right time; there is therefore a lot which we currently do in schools and colleges that we should continue doing. I love a good post-it note in a lesson, posting different thoughts and ideas around the room. Some of what we do may change; equally, some things may not change, and this is fine. The potential for technology, and particularly AI, to act as an enabler and a leveller was also mentioned, highlighting how, if used appropriately, technology has the potential to have a profound impact on Bobby, a student I will introduce shortly, and on other individual students. Equally, during the panel the importance of putting humans at the centre of things, including of AI use, was stated, alongside human characteristics such as emotional intelligence.

Gemma Gwilliam, a fellow member of the Digital Futures Group, was up next as part of a panel session alongside Jordan King, Global Opportunity Scholar at Franklin University Switzerland, and Jean Wu, Director of Green Office Sustainability Programs at Franklin University. This was the first of Gemma's two contributions to the event. This session focussed on sustainability, and I very much liked the comment on the multi-faceted nature of sustainability. As a director of IT, when I consider sustainability I am often thinking about financial sustainability, in terms of ongoing replacement and refresh of devices and hardware, or about systemic sustainability, in whether a process will be repeatable and scalable. There is also environmental sustainability: does the solution allow us to thrive or merely to survive? Gender equality was also raised as a sustainability issue, which to me makes a lot of sense, but I had never previously heard it discussed in this way, in relation to sustainability. It was also very refreshing to hear how AI shouldn't be seen as a replacement for "flawed" humans, as AI is also flawed. I think this is very interesting, as it acknowledges our human flaws and therefore suggests we may need to re-evaluate quite how critical we are of AI when it comes to bias, inaccuracies, etc., where we humans, on careful analysis, don't do much better, despite the fact that we convince ourselves we do. On this panel Jordan, a young researcher, raised the issue of how some see Gen Z as lacking in resilience, and maybe even as being "soft" and in need of constant "trigger" warnings; however, she then pointed to all the social media and events in the press, and to conflicts such as those currently engulfing various parts of the world, which have bombarded her generation through technology more than any other generation in history. Maybe we need to cut Gen Z a bit of slack here.

Now in the afternoon I didn't take any real notes, mainly due to my Surface battery giving up and me having left my battery pack back at the room, which was some distance away. I do remember Gemma's second contribution of the day, although this might be due to arriving just as she was due to start, sitting in the front row and beginning to eat from a little tub of ice cream; who needs supportive colleagues when you have me? It was great to hear Gemma hit a particular theme which I believe is so important: the need to seek collective knowledge. She mentioned a great set of books, including Darren White's book (Darren is also a Digital Futures Group member), plus a variety of others. She also mentioned blogs, including mine, plus other online groups and individuals sharing ideas, thoughts and resources. For me, in a world where technology is moving so fast, the old methods of centralised reform and of waiting for CPD are no longer appropriate, as they are too slow. The key, and our best hope, is to network and to collectively share ideas, thoughts and resources, and this type of networking is the key tenet of the Digital Futures Group. "The smartest person in the room, is the room", as David Weinberger would say, so the bigger the room, or the more rooms you get involved in, the better you, and collectively everyone, will be. Towards the end of the session Gemma eloquently brought things back to the students, as that is what education and schools are all about, and her use of "Bobby", it all being about Bobby, and us needing to consider Bobby and the effect on Bobby, was so very impactful. This act of putting a name to, and making it about, a specific, visualisable student rather than the generic and nebulous "students" makes all the difference and really helps nail the need to consider the individual learners in all we do.

It was great to finish the day on top of what we dubbed "Teletubby hill", the grassed roof of the building in which the conference was held, looking off towards the setting sun. It was a very busy but also very useful and interesting day. And there was ice cream, so what more can you ask for?

How little did I know that attempted murder and Gondola related trauma awaited on day 2!