Review of 2025

As 2025 speeds to a close I, like many other staff in schools, find myself wondering where the time, and the year, have gone.  Working in schools seems to be a constant sprint from the start of term until half term, then onwards to the end of term, before a brief pause and then off we go again.

As the year draws to its close it is an important opportunity to stop and reflect, so I thought I would share some of my thoughts, the highlights and even some of the lowlights from the year with you.

For a number of years now I have been sharing pledges, setting out my plans and targets for the year ahead; however, at the start of 2025 I didn’t do this.  I did personally make some notes as to what I would like to achieve and do, but I never packaged this up into the usual blog piece as I would have done in the past.  I had been questioning whether having targets led to chasing the targets rather than more broadly experiencing and hopefully enjoying the year, and this may have led me not to put aside the time to create a blog post.  I simply deprioritised this particular task.  And I think this is something I am struggling with: having lists and targets, which allow easy measuring of progress and planning, versus being more freeform, having the opportunity to respond to things as they arise, and maybe getting my head up more to enjoy life, the world and everything.  I suspect, as always, it is about finding the appropriate balance, and recent years have moved me more towards lists, spreadsheets (who doesn’t like a to-do list with colour coding?) and targets, a little more than I am now comfortable with.

Fitness has been a running theme (did you see what I did there?) for me over the last few years.  My distance for the year was just over 250km, far less than the 500km and 1000km of previous years, but this year was marked by such inconsistency in my running that even achieving 250km was something to celebrate.  I got there in a couple of concerted efforts over a couple of months rather than more evenly across the year.  I suspect it was simply a case of other, more important things taking priority, and I am happy with that, but it also highlights that I need to see how I can better balance fitness with other demands, as this year the inconsistency caused me some stress and disappointment.  I also note that, from a weight point of view, I am heavier than I have been for several years, and weight was a big part of why I started running, so this is something I am going to have to reflect on ahead of 2026.

Reading has been another thing I have been trying to do more of; sadly, finding the time has been the challenge.  However, my increasing need to drive around has led me to listen to audiobooks more and more frequently, which has allowed me to get through quite a few books.  I don’t think I have engaged with the various titles quite as well as I would have if reading them normally, complete with post-it notes and annotations, but listening to a variety of titles is definitely far better than doing no reading or listening at all.

2025 has definitely seen me having more time for me, my partner, and our kids.  This has been absolutely amazing, but it has required me to reduce the time spent on other things.  Sadly, to have more time for one thing we need to have less time for something else; something I think those leading education at national level could do with remembering.  On reflection I wouldn’t have done it any differently.  I do, however, need to get this balance right, such that I challenge myself in terms of being busy, yet don’t have unrealistic expectations which lead to stress and disappointment.  On the positive side, of particular note is our nice new world map board, where 2025 has seen us put a number of pins in to mark places we have been and things we have done.  From this point of view 2025 has been an amazing year, with my relationship with my partner going from strength to strength, one amazing memory after another, and the year isn’t even done, with plans for what little remains including a very significant life event, which actually sees me sat here in highland dress, but enough of that for now.  We only recently found ourselves flicking through our photo library from events across the year, and we have done so much and had such fun.  This all makes me all the more positive and happier about the approaching 2026.

2025 has, however, seen some less positive moments in the form of health issues within my immediate family, causing some worry and stress.  As I write this, one of my elderly relatives is once again back in hospital, having had tests and then a minor surgery.  This adds an element of uncertainty, worry and stress to the immediate weeks and the festive period; however, I hope that the tests and the resultant care of the NHS will be able to address things and allow us all to enjoy the festive period and progress on a firm footing into the year ahead.  I do, however, see 2026 as involving some difficult discussions as to the future.  There is also some uncertainty in relation to my son, who, having finished college, is still looking for the first steps on the career ladder.  I compare his situation to mine when I was that age, and I think things have only got more difficult for the young.  I am sure something will present itself, but I recognise how difficult it is to receive rejection emails, or simply not hear back, following applications for junior positions.  As to apprenticeships, there are just so many young people chasing a limited number of opportunities, which is a shame, as I see apprenticeships as a great way of providing young adults with both additional educational opportunities and much-needed experience of the working world.

Looking to work and to my involvement in technology and education, it has been a solid year, albeit I don’t think it was as busy as 2024.  January saw my first opportunity to speak at the BETT conference, delivering a cyber security session to a packed room, despite a worried period of preparation and a fear that my session, last thing on the final day of the conference, would be poorly attended.  This year has also seen me awarded the EduFuturists Outstanding IT Professional award.  The highlight of this, other than the Uprising event itself and all of the wonderful people there, was the fact the award was given to me by Dave Leonard, an EdTech superstar and my friend and colleague.  Adding to the event was the fact I also collected an award on behalf of my Millfield colleague Kirsty Nunn.  Then there has been my work with the ISC Digital Advisory Group, leading on the 2025 conference before becoming the new chair of the group and beginning the planning for next year’s conference (save the date: June 19th, University of Roehampton, if you are interested!  It promises to be an amazing event).  And I can’t fail to mention the Digital Futures Group and a WhatsApp group which I often struggle to keep up with: such a brilliant group of people, doing so many amazing things in schools across the country, while also being open, friendly and supportive people that I feel blessed to consider close friends.  And there is also the ANME, with its several hundred WhatsApp groups (thanks Rick 😉), being able to lead the various Southwest regional events, and some amazing discussions in relation to technology in schools.

Looking forward to 2026, I do so positively, but I am also conscious that it will likely represent a year of some challenges and change; but when is this not the case?  It would be nice for things to be easy and convenient; however, I also note that we need struggle and challenge in order to feel like we have actually achieved something.  Some difficulty is actually desirable; it is just about trying to find the right balance.  2025 definitely had some challenges, but looking back it has, in a broad and general view, been successful and positive.  I have some amazing memories, particularly personally, and I suspect the day ahead of me will see more amazing memories created before 2025 is finished.  As I mentioned back at the end of 2024, maybe the journey is more important than the destination, and the journey through 2025 was filled with memories and a sense of progress.  What more can you ask for?

School social media checks in a world of AI

In the UK, it’s increasingly common for schools to carry out social media checks on candidates invited for interview. The intention behind this practice is clear: safeguarding students. These checks are designed to identify concerns that can be discussed openly during interview, rather than to eliminate candidates before they’ve had a chance to explain. But these checks rely on the AI algorithms inherent in social media tools, with all their flaws, and the increasing availability of AI tools introduces new risks and challenges.

The Promise and the Pitfalls

On the surface, social media searches look simple and straightforward.  You might ask candidates to provide their account details for the social media services they use, or you may simply search to find them.  You can then peruse or search the content (what they have posted, replied to, and so on) and from this identify questions or areas of discussion for interviews.

But online posts often lack context, meaning it is all too easy for them to be misinterpreted.  It makes me think of VAR in football: things often look worse when slowed down in a video review, just as posts can look worse when examined without the emotions and pace of the specific moment in which someone was responding to others’ posts and thoughts.  Also, do HR professionals truly understand how social media platforms and their search or display algorithms work?  It is AI algorithms which decide what information to present, meaning there is a risk of bias.  The algorithms might focus particularly on those posts most likely to spark a reaction, as they seek to keep people on their platform, not understanding the aim of the HR staff member carrying out the searches.  A harmless cultural reference or a joke taken out of context could be surfaced simply because the system has been trained to surface such comments, given the greater likelihood that they will provoke strong feeling, comments and engagement on the platform.  This may make things look worse, or better, than they are.  Equally, AI algorithms might surface different types of post based on gender, ethnicity, age or other data points related to a candidate, potentially introducing bias into the data the HR team have available to them.

The Rise of Synthetic Content

Then there’s the growing threat of fake content. Deepfakes and AI-generated images are no longer the stuff of science fiction; they’re here, and they’re convincing. Imagine a candidate being implicated by a fabricated photo circulating online, or even a fake video. Without robust verification processes, schools could make decisions based on lies. How many HR teams are prepared to spot a deepfake? How many even know what to look for?  Also, as growing numbers of people use wearable technologies, such as smart glasses, how are HR to react when footage was taken without the applicant’s knowledge before being posted online?  How would they even know it was without consent, and therefore illegal?  Would it be acceptable to use such a post within an interview process?  What if the applicant pointed out it was taken without consent and was therefore being processed illegally, both by the poster and now by the school?

Safeguarding vs. Fairness

The tension between safeguarding and fairness is real. While protecting students is paramount, recruitment must remain ethical and transparent. Social media checks should never become covert screening tools. Candidates deserve the chance to explain context, and decisions should be based on facts, not assumptions. Yet when AI enters the equation, the line between fact and fiction can blur alarmingly quickly.

There’s also the question of privacy. GDPR sets clear boundaries, but do all schools adhere to them when using AI-driven tools? Consent is critical, as is clarity about what these checks involve. Without transparency, trust in the recruitment process erodes, and that’s a risk no school can afford.

Bridging the Knowledge Gap

The truth is, many HR professionals in education are experts in safeguarding and compliance, but not in data science or AI ethics. This knowledge gap matters. If we don’t understand how these tools work, we can’t challenge their outputs. We can’t ask the right questions about transparency, fairness, or verification. And we certainly can’t protect candidates from the unintended consequences of flawed algorithms.  For me this is key: ensuring HR staff understand the tools they are using if undertaking social media checks, including the risks which may arise from the AI-powered search tools inherent in social media platforms.  Additionally, they need to understand the risks relating to fake content, including audio, images and video; what you hear or see may not be all that it appears to be.

A Call for Reflection

Ultimately, the goal is simple: to keep students safe without compromising the integrity of recruitment. Achieving that balance requires more than technology; it critically requires understanding, vigilance, and a willingness to challenge the systems we rely on and the content they may present.  If schools are carrying out social media checks, the widespread availability of generative AI tools, and our increasing awareness of the risks, particularly around bias, mean we may need to revisit the practice and ensure we have considered all of the implications.

Think fast, act wisely

I recently gave a presentation titled “Think fast, act wisely” at the Berkhamsted IT Conference, looking at the future of education as I see it.  Below are my key thoughts as presented at the conference.

The landscape of education has always been shaped by the tools at its disposal. In 1998, when I entered the teaching profession as a fresh-faced, and very young looking, newly qualified teacher, the classrooms I worked in were dominated by blackboards and chalk.  I still remember dusting myself off each day, having leant on the board while writing.  The late 1990s and early 2000s saw the first technological shifts, but the pace of technological change has only accelerated since then. As we stand on the brink of increasingly frequent breakthroughs in artificial intelligence (AI), quantum computing, AI use in medicine and even bioengineering, it’s clear that technology is transforming our world. Yet education often lags, moving at a much slower, incremental pace.

The Slow March of Educational Change

Despite the rapid evolution of technology, the structure of education remains stubbornly familiar. Promises of transformation have been plentiful, but delivery has been inconsistent. Interactive whiteboards, virtual learning environments (VLEs), flipped learning, MOOCs, and 1:1 device programmes have all brought benefits, but rarely have they redefined learning in a fundamental way.  The SAMR model presents a way of examining our tech integration in schools, yet time after time we have been stuck at substitution, never progressing to the potential of augmentation, modification or redefinition: the interactive whiteboard, for example, simply replacing the whiteboard which had previously replaced the blackboard, or the 1:1 device often being little more than a replacement for the textbook and workbook.

The barriers to the potential of tech are not just technical but human, including insufficient training, resistance to change, or processes that fail to adapt. Sometimes the technology itself falls short of the sales promises, particularly when compared with pragmatic tech use in real-life classrooms.  If there has been some meaningful progress, as I believe there has been in the last five years, it has been the result of significant external forces, like the COVID-19 pandemic.  I personally saw the potential here, with one student in particular who struggled in the classroom but thrived engaging remotely via Teams.  The issue was that this progress was not sustained; there was a “rubber band effect,” snapping back to old habits once in-person learning resumed.

Artificial Intelligence: Promise and Peril

The arrival of AI in education marks a new chapter, and another external factor impacting on education. Tools like ChatGPT have democratized creativity and output, allowing students of all abilities to produce impressive work. AI can also assist teachers with content creation, marking, and summarizing, freeing up time for deeper engagement. However, these advances come with significant risks: ethical concerns, bias, accuracy, and the potential for disinformation. The binary debate of block or allow misses the nuance required to navigate these challenges.

If education continues to move slowly, it risks falling behind. And as this occurs, some users will simply act on their own, using whatever tools they feel are of benefit.  Shadow IT, where users, both staff and students, unofficially use technology solutions, may then introduce safeguarding and cybersecurity risks. Also, the divide between those who embrace new tools and those who don’t will widen, exacerbating inequalities. Moving too fast, however, brings its own dangers.  I worry, for example, about those chasing “shiny new things” without proper embedding, straining resources, and overlooking critical issues like data protection and AI ethics.

Striking the Right Balance

The key is balance. Every new technology brings a mix of risks and benefits, and schools must develop and understand their own risk appetite. Regulation and compliance are essential, and many look to centralised guidance for support; however, this is slow to arrive, often being published only after the technology has already changed or moved on.  It is therefore about considering the risks and going much deeper than the block-or-allow narratives that often prevail.  It is possible to find a middle ground: blocking some things while allowing others, accepting some risks, accepting the weakness of technical controls, but mitigating through education, and using the mistakes people may make as opportunities for learning rather than as situations that must be eradicated.  We must think deeper.

For example, exam specifications warn against copying material and stress the importance of “own work.” But what does that mean in an age where AI tools are embedded in the learning process? If I use a spell checker or grammar checker, or if I get AI to help start my document or to review the content and offer feedback, is it still my work?  Thinking about this further, if my teacher teaches me something, and I then use this knowledge to write a piece of coursework, is this my work?  And why wouldn’t I want to use AI to review and help me improve my work?  Don’t we want students to achieve their full potential, using the technologies which are available, and will be available, in their lives beyond school?  The phrase “must be the student’s own work” seems pretty straightforward; however, in a world where AI tools are embedded in the very productivity tools we want students to use, it may not be that simple.  We must think deeper.

Rethinking the Purpose of School

This is a moment to reflect on the fundamental purpose of school. Do we need physical buildings? What do we want our schools to achieve, and how should technology help us get there?  Values and progress are crucial, as is accepting that change is inevitable.  As such, we need to become more agile and flexible in the face of such change.  But we are not alone: through collaboration, discussion, and sharing we can collectively approach the challenges.

Big Questions for the Future

As we look ahead, a few big questions emerge:

  • What does human flourishing look like, and how do we support students to thrive now and in the future?
  • What is the purpose of education? How do we assess learning in a world shaped by AI, and what are we actually measuring?
  • How can AI best support students, teachers and school leaders, freeing them to think deeply and creatively?

The answers require courage, pragmatism, and a willingness to adapt. The education sector has an opportunity to be brave, but this means learning to move faster while also acting wisely, navigating the balance between these competing requirements. It is for this reason that we need to get better at managing risk, and that includes actually establishing what the risk of harm is and taking crucial decisions in relation to allocated resources.  We can’t address every risk to the same extent.

Above all, we must remember that we are on a shared journey.  In a world of AI, synthetic identities and AI assistants, it is collaboration, and human connection, that may be our greatest asset.

Citizenship Education in an AI-Driven World

A recent meeting with colleagues looking at Digital Citizenship Education encouraged me to scribble down some thoughts, which form the post below. I believe, now more than ever, we need to re-examine our education system, especially as it relates to digital citizenship and preparing our students for the digital world we now live in, and for the digital world of the future, whatever that might look like.

For years, educators have spoken about “digital” citizenship as if it were a distinct concept, something separate from the so-called “real” world and real-world citizenship. But that separation no longer exists. Today, digital systems underpin almost every aspect of daily life: banking, healthcare, travel, shopping, and even social interaction. When these systems fail, life grinds to a halt; you just need to look at the CrowdStrike incident and the more recent AWS issues to see this.  Any separation between the digital and the real worlds which may have existed in the past no longer exists.

This raises an important question: does the term digital citizenship still add value? I don’t think it does. Instead, we need to think about citizenship in a connected world; a world where artificial intelligence (AI), automation, and global networks shape how we live, work, and relate to one another. The challenge is not simply about teaching children how to behave online; it is about preparing them for a society where technology mediates almost every interaction. It’s also about preparing them for a future world where the technologies of today will have been replaced by new technologies, some we can predict and others that may not currently be as evident.

But this digital and technological change isn’t new to discussions of citizenship.  Citizenship education has never been static; it has always evolved in response to societal forces. In early modern Europe, it sought to counter superstition and establish rational norms. In the twentieth century, it became a bulwark against fascism and communism, promoting democratic values and civic responsibility. Today, the forces shaping citizenship education are different but no less profound. We face questions about identity and belonging in a globalised world, the ethical implications of AI, and the fragility of truth in an era of misinformation and disinformation.  Privacy is also an issue, or construct, now under question as we face a world where people wander the streets wearing AI-powered smart glasses and other wearables, constantly recording, cataloguing, summarising and recommending our every action.  These are not abstract concerns; they affect how societies cohere and how individuals navigate their rights and responsibilities.

As I think about this, I wonder about a fundamental tension: do we teach for a global society, or at a national level, reinforcing national norms? Should education prepare young people to embrace diversity and shared human values, or should it prioritise national identity and social cohesion? This is not a trivial question. It touches on debates about migration, climate change, and international governance. It also exposes the political nature of citizenship education. What we choose to teach, and what we choose to omit, reflects the kind of society we want to build.  These decisions are made at national level, albeit with some reference to globalisation.

Layered on top of these questions is the reality of AI and algorithmic decision-making. Increasingly, decisions that affect our lives, including credit approvals, job applications, even court outcomes, are mediated by algorithms. The news, even on trusted news platforms, is influenced by the swathes of data and content generated on social media.  AI tools further complicate things by enabling the generation of fake content, making it increasingly difficult to discern whether something presented as true is actually true, or whether something presented as false is actually false.  Understanding how these systems work, and their potential biases, is essential for informed citizenship. Without this knowledge, individuals risk becoming passive subjects of technology rather than active participants in shaping its use.

Then there is the problem of information disorder: of misinformation and disinformation. Deepfakes, misinformation, and polarised media ecosystems challenge the very notion of truth. If citizenship education once focused on teaching civic literacy, it must now teach epistemic resilience: the ability to question sources, verify facts, and resist manipulation. In a world where AI can generate convincing falsehoods at scale, this skill is not optional; it is foundational.

So, what should citizenship education look like in this context? It cannot be reduced to a checklist of technical skills. It must cultivate critical thinking: not just the ability to analyse arguments, but to interrogate algorithms, question data, and understand the socio-technical systems shaping our lives. More than ever we need to question not just what we see and hear, but also the why: why an algorithm has chosen to present this content over other content, and what this might mean.  It must emphasise human skills such as empathy, collaboration, and adaptability, qualities that machines cannot replicate but which are vital for social cohesion and ethical decision-making. It must foster ethical literacy, enabling students to grapple with questions of fairness, privacy, and accountability in AI systems. And it must build resilience, preparing young people to cope with uncertainty and change in a world where disruption is the norm.

Citizenship education in the age of AI is not about adding a few lessons on online safety or digital etiquette. It is about rethinking what it means to live responsibly and ethically in a world where technology mediates almost every interaction. Educators, policymakers, and communities must ask hard questions: what values do we want to uphold in a connected world? How do we balance global responsibilities with local identities? How do we ensure that technology serves humanity, rather than the other way around?  What does it mean to flourish in a technological world and how do we support our students to flourish?

The answers will shape not just curricula, but the future of democracy itself. Citizenship education has always been about preparing young people for the world they will inherit. Today, that world is algorithmic, interconnected, and uncertain. Our task is to ensure they enter it not as passive users, but as active, ethical citizens.   And this all requires that we start thinking deeper, asking more probing questions and supporting and encouraging our students to do the same.

Education and Football: Is It All About the Result?

I’m a football fan. I love the game, the drama, the unpredictability (although don’t get me started on VAR).  I’ll admit, my team is having a bit of a rough patch right now. We’re not playing well and the performances are shaky, but I still find myself hoping for a win, no matter how it comes. A scrappy goal in the 89th minute? I’ll take it. A lucky deflection? Fine by me. Three points are three points.

That got me thinking about education.

Because in many ways, education today feels a lot like football. It’s all about the result. For students, parents, and even universities, the final grade is what matters. An A* is an A*, and a D is a D, regardless of how you got there. Whether you coasted through with natural talent, worked tirelessly every night, or crammed in the final weeks, the grade on the certificate is the same. Just like in football, where a win is a win, no matter how ugly the game was.

But should it be that way?

If education is truly about learning, about growth, development, and preparing young people for the future, then surely the journey matters just as much, if not more, than the destination.

The Scoreline Obsession

Grades are the currency of education. They open doors to continued education, to universities, apprenticeships, and jobs. They’re used to measure school performance, teacher effectiveness, and student potential. In a system so heavily focused on outcomes, it’s no wonder that the process of learning often takes a back seat.

This results-driven culture can be seen everywhere: revision guides that promise top marks with minimal effort, tutoring services that focus solely on exam technique, and students who ask, “Will this be on the test?” rather than, “Why does this matter?”

It’s understandable. Just like football fans want to see their team win, students and parents want to see results. But when we focus only on the final score, we risk missing the bigger picture.

Learning as the Journey

Imagine a football team that wins every game but never improves. They scrape by with luck, individual brilliance, or opposition mistakes. They don’t train hard, they don’t develop tactics, and they don’t build team chemistry. Eventually, that luck runs out.

The same is true in education. A student might achieve a top grade through memorisation or last-minute cramming, but what happens when they face a university course that demands critical thinking, independent research, or long-term project work? What happens when they enter the workplace and need to collaborate, adapt, and solve real-world problems?  Collaboration, critical thinking and other so-called “soft skills” are going to be all the more important in a world of AI, robotics and other tech tools.

Learning is the training ground. It’s where students build the skills, habits, and mindset they’ll need for life beyond the classroom. It’s where they learn to fail, to reflect, to try again. It’s where they discover what they’re passionate about, what they’re good at, and what they need to work on.

If we reduce education to a scoreboard, we risk turning it into a game of short-term wins rather than long-term growth.  What we measure, in tests, coursework and other easily measurable things, becomes what matters, rather than focusing on and measuring what really matters: learning.

The Pressure to Perform

There’s another side to this too. When results are all that matter, the pressure on students can be immense. Just like footballers who fear making a mistake in front of a hostile crowd, students can become anxious, disengaged, or even burnt out.

We see this in rising levels of exam stress, in students who feel like failures because they didn’t get the grade they hoped for, and in those who give up entirely because they believe the system isn’t built for them.

But if we shift the focus to learning, to progress, effort, and resilience, we might create a more inclusive, supportive environment. One where students are encouraged to take risks, to ask questions, and to grow at their own pace.

Rethinking Success

So how do we balance the need for results with the value of the journey?

We can start by redefining what success looks like. Yes, grades matter. But so do curiosity, creativity, collaboration, and character. We need to celebrate improvement and engagement as much as achievement. We can value the process of learning, not just the product.

Teachers can design lessons that encourage exploration and reflection. Parents can ask about what their children learned, not just what they scored. Universities and employers can look beyond grades to see the whole person.

And students? They can be reminded that their worth isn’t defined by a single letter on a piece of paper.

Final Whistle

Don’t get me wrong, I still want my football team to win. And I understand why students want top grades. But just like in football, where the best teams are those that grow, adapt, and play with purpose, the best education systems are those that value the journey as much as the result.

Because in the end, it’s not just about what you achieve. It’s about who you become along the way.

Creating original work?

What Does It Mean to Present Original Work?

In an age of abundant information and powerful tools, the idea of “original work” is increasingly complex. I often find it philosophically difficult when exam boards state that work in schools and colleges should be the “student’s own”. What does that actually mean in today’s world? If you write or create something based on a lesson, a book, or with the help of AI, is it still your own?

Learning vs. Creating: Where Does Originality Begin?

Originality doesn’t mean creating something in a vacuum. In fact, most original work is built on what we’ve learned from others. If a teacher explains a concept and you write about it in your own words, that is your work. You’ve processed the information, interpreted it, and expressed it through your own understanding. The same applies when you read a book and then write about it. Your voice, your synthesis, and your framing make it original.

The Role of AI and Other Tools

Using AI to help structure your thoughts, refine your language, or even co-write parts of your work doesn’t automatically make it unoriginal. Tools have always been part of the creative process and we don’t want to remove them. Think of spellcheckers, grammar guides, tools which make writing easier such as the word processor or PowerPoint for creating a slide deck, or even just brainstorming with a friend. Consider even the humble pen and paper. Without the tool the output would be different, so the tool shapes the output, yet we still consider the output to be our own. The key question is: are you using the tool to express your own understanding, or are you outsourcing the thinking entirely?

If AI helps you articulate or present what you’ve learned, it’s still your work. But if you rely on it to generate output which you haven’t engaged with or understood, then I think it is fair to say the final product isn’t your own.  So, engagement and intent are key here.   If AI tools are being used for the right reason, to learn, and you engage with the AI in the production, co-production if you will, then it’s your own work.

Originality Is About Ownership of Thought

At its core, originality is about intellectual ownership. It’s not just about where the information came from, but how you’ve made it your own. Did you wrestle with the ideas? Did you connect them to other things you know? Did you form a perspective? If the answer is yes, then your work is original even if it’s inspired by others or supported by tools.

The Product vs. the Process

One of the most important insights is that the final product doesn’t always show the depth of learning behind it. A polished essay might look effortless, but it could be the result of hours of reflection, revision, and growth. Equally, it could have come easily to an individual. Conversely, a well-written piece generated mostly by AI might lack the personal journey that makes learning meaningful. That said, it could also represent the output of hours of effort, iteration, exploration and revision with the support of AI tools.

Consider the classroom, and art for example: who achieves more, the student with strong artistic skills who produces a good graphic with little effort, or the student with poorer skills who also produces a good graphic, but with the support of AI, where they engaged with the process, contributing their ideas and identity but relying on AI tools for the realisation of the work? Is it the product we value, in which case does it matter? Or is it the process, in which case one of the students was clearly more engaged in effortful learning? I suppose it depends on what you are actually assessing. And maybe that’s part of the issue. Have we become too focussed on the product, the good graphic, rather than looking at the process or, more importantly, the effortful learning?

So perhaps, in looking at whether work is the student’s own, a better question is: does this work reflect what they’ve learned? If it does, then it’s a valuable and original contribution.

Note: this piece was written with the help of AI. It comes from my ideas and initial prompting, was refined through further prompting, and I made the final edits to the text. I post it as it reflects my views on originality, while benefiting from AI’s broader vocabulary and structure. Is it still my work? I think it is.

Technology in Schools: Innovating While Staying Safe

The technology in education landscape is evolving at a pace that often feels dizzying. One look at Artificial Intelligence (AI) development over the last few years alone makes this pretty evident, never mind the pair of smart glasses I am currently experimenting with and what they might mean for students with English as a second language or with special educational needs, not to mention the challenges around academic integrity. New tools, both hardware and software, emerge almost daily. For schools these new tools offer such potential, but adoption is complex and takes time, and relies on teachers having space to trial, experiment, and build confidence before these tools become part of everyday practice.

The Challenge of Adoption

Introducing new tools often means rethinking lesson plans, learning new interfaces, and managing technical hiccups, all while maintaining the core responsibility of delivering quality education. This all takes time. With time and workload being such an issue, as successive teacher workload surveys have identified, it often comes down to weighing the possible benefit against the cost of exploring and testing new tools, where such testing may identify tools that aren’t fit for purpose. When this happens the time appears wasted, leading some to be reluctant to invest the time in the first place. For those who do invest the time, it is either at the expense of other things or at the expense of themselves, exploring in their own time. I have seen many a teacher present at webinars and conferences on how they have used one amazing tool after another, which is great at an individual level and built on their personal investment in exploring and experimenting, but the question is how we scale this up to become the norm across all teachers.

Some years ago I presented at a conference in Dubai on the need for teachers to build confidence if they wish to fully embed technology tools in their practice, yet building such confidence is difficult when there isn’t the time and when platforms and solutions move on so quickly. So, what is the solution?

Democratising Creativity: Let Students Lead

Perhaps the solution lies in shifting the focus. Instead of expecting teachers to master every new tool, why not empower students to experiment? In the words of David Weinberger, “the smartest person in the room is the room”, so what if we count the students in our classrooms? There is only so much experimenting a single teacher can do, but a class full of students can do much more experimenting and sharing, facilitated and directed by the teacher. The teacher doesn’t need to know every app or tool as long as they understand the possible risks and pitfalls.

Coursework and project-based learning offer ideal opportunities for pupils to explore emerging technologies, whether that’s creating multimedia presentations, using AI to generate ideas, or leveraging coding platforms to build prototypes.  AI tools in particular mean that really impressive products, be these images, presentations, videos, or more, can be put together by any student able to type prompts into an AI, and then able to review and refine their prompts and resulting output.    We can truly allow students to be expressive and exploratory in their learning and in how they evidence their learning.   It is very much the “democratising creativity” which I have heard Dan Fitzpatrick refer to on many occasions.   

Safety First: Privacy and Data Protection

Of course, this freedom must come with guardrails. Schools have a duty to protect students’ welfare, their data and their privacy, so need to ensure that experimentation happens within safe, ethical frameworks. This means clear policies on what tools can be used, vetting and risk-based consideration of platforms, and ongoing education about digital responsibility.

Data protection isn’t just a legal requirement; it’s a trust issue. Parents, staff, and students need confidence that personal information is secure and that applications and tools which teachers introduce or recommend have this security at their heart.   I note that nothing is without risk but that doesn’t absolve us of responsibility to do what is reasonably possible, to do due diligence, put mitigation measures in place and protect our students where possible. 

Cybersecurity incidents, such as phishing attempts or data breaches, underscore the importance of vigilance. Even seemingly minor lapses can erode trust and expose vulnerabilities. I myself have seen how data incidents can have an impact years after the initial incident and where the initial incident was deemed to have been low risk.   We therefore need to ensure we have processes in place to manage and minimise such risk.

But things going wrong, error and failure, are part of the learning experience, and allowing students to experience them, supporting them on the road beyond, and building their resilience are important parts of preparing for the world beyond education. Supporting students to develop the skills to evaluate what they have done, refine and improve is likely to become more and more important in a world where information and knowledge are so widely available.

Balancing Innovation and Responsibility

The path forward requires balance. Schools must embrace innovation to remain relevant and prepare students for a digital future, but they cannot do so at the expense of privacy, security, safety or equity. In considering new tools and technologies, this gets me thinking about the following:

  • Data Protection: Does the technology introduce data protection risks in terms of data sharing, including with AI tools for training, or with advertisers or other third parties?    How long is data held for?  Is the use of data limited to a clearly stated purpose?  Where is the data geographically stored?
  • Cyber Security: Does the technology vendor put in place basic measures for security such as MFA and requirements for breach and vulnerability notification?
  • Ethics and transparency:  Is the tool ethical and “right” in its planned use?  
  • Age limitations and T&Cs: Does the planned use align with the terms and conditions, including any terms which limit the age of users?  Also is the tool designed for and appropriate to the age of planned student users?
  • Intellectual Property: Who owns the product of the given technology?   Is it the user, be they student or staff, or is it the technology vendor?
  • Sustainability: Is the solution financially viable into the future, if there is a cost, as well as environmentally sustainable?

I note the above aren’t all the possible considerations but they are a good starting point.  I also note that, for older students, it may be appropriate to get the students themselves thinking and investigating the above before seeking to use or explore a given tool.

Looking Ahead

Technology will continue to reshape education, but its success depends on thoughtful integration. By empowering students to lead experimentation, schools can harness creativity while giving teachers the time and support they need. Teachers could be the tech “guide on the side”, although there may be work required to help them build the confidence to act as such. Coupled with strong privacy safeguards, this approach ensures that innovation enhances learning without compromising safety.

He was efficient!

We live in a world obsessed with outcomes. Achievements, milestones, KPIs, medals, checkboxes; the language of success is often framed around what gets done, how fast, and how well. Life, it seems, is about doing stuff. And doing it efficiently, with Artificial Intelligence being the latest shiny tool to help us be more efficient, as if efficiency were the ultimate outcome. I’m not sure I want “Here lies Gary, he was efficient” on my tombstone.

As someone who works in IT, I’m no stranger to the allure of systems, processes, and measurable results. There’s a certain satisfaction in ticking off tasks, optimising workflows, and seeing progress in neat little graphs. I love a spreadsheet for keeping records of tasks, complete with conditional formatting and an abundance of complex formulae. Or maybe a little Power Automate to gather approvals and ping emails to the relevant people while also updating a master SharePoint list. And let’s not even discuss having SMART KPIs and a Power BI dashboard or two. But lately, I’ve been thinking: maybe we’ve got it backwards. Maybe the real value isn’t in the doing itself, or what we get done or achieve, but in how we feel while we’re doing it, in the experience.

Take my recent attempt to restart morning running. It’s been… messy. I used to be consistent(ish). Up early, shoes on, out the door, often before I realised what I was doing. Now? I snooze the alarm. I negotiate with myself. It looks a bit dark, or a bit too cold, or I don’t quite feel 100%, and then I climb back into bed. Some (read: none recently) mornings I make it out, some I don’t. And when I do, it’s not elegant. It’s slow, it’s uncomfortable, and it’s definitely not social media worthy, and I do sometimes feel for the poor souls who have the pleasure of encountering me on my morning run.

But here’s the thing: those runs, the imperfect, reluctant, sweaty ones, are teaching me more than the polished ones ever did, although I’m not sure any of my runs could ever have been described as polished. They remind me that the journey matters. That showing up, even when it’s hard, is a kind of quiet victory. That the struggle itself is part of the story. Things are messy and sometimes, rather than getting annoyed or angry with myself, I need to accept I am human and that this is par for the course. I need to step back and take a broader view of my progress over the longer term.

We often treat mistakes, detours, and off-days as things to be minimized or hidden. But what if they’re actually the most important parts? What if the real richness of life is found not in the clean lines of achievement, but in the jagged edges of experience?

Think about it. The best stories aren’t about flawless execution. They’re about perseverance, vulnerability, growth. They’re about the days we didn’t want to, but did anyway. Or didn’t, and learned something in the process.

Efficiency has its place, of course. We need structure. We need goals. We even need the odd spreadsheet to help organise our finances or our todo list, and there isn’t anything wrong with the occasional Power BI dashboard. But if we focus only on the measurable, we risk missing the immeasurable: the joy of a sunrise on a run, the frustration of a setback, the quiet pride of trying again. These moments don’t fit neatly into a spreadsheet, but they shape us in ways that matter.

So maybe life isn’t just about doing stuff. Maybe it’s about being in the stuff. Feeling it. Living it. Letting it be messy and real.

As I try to get back into running, I’m learning to be gentler with myself. I need to spend less time beating myself up when I don’t manage to get out the door, and more time focusing on how I might do better next time. I also need to remember that improvement isn’t linear, so setbacks, hurdles and struggles are to be expected. I need to celebrate the effort, not just the outcome, even where the effort was hard and the outcome sub-optimal. To notice how the air feels, how my body responds, how my mind resists and then relents. It’s not about pace or distance anymore. It’s about presence.

And maybe that’s the lesson we all need. That the journey, with its stumbles and surprises, is not a distraction from the goal. It is the goal.

So, here’s to the messy mornings. The half-finished plans. The imperfect efforts. They’re not signs of failure, they’re signs of life. And that, in the end, might be the most important achievement of all.

TEISS 2025 Cyber conference

Cyber resilience is an issue impacting all organisations, be they schools, charity groups or enterprise organisations. It is with that in mind that I attended the TEISS London 2025 event, an event focusing on cyber resilience and specifically, as per the conference’s title, on resilience, response and recovery. It is always interesting listening to industry speakers talking of cyber security, plus hearing the thoughts of the NCSC among others. So, what were my takeaways?

Third party risk

Third party risk has been something I have been concerned with for a while. This includes companies which a school might use to provide online services such as learning platforms or even MIS services, but also companies which provide software or hardware we may use, such as for network monitoring, Wi-Fi or other functionality. The reality is that we use far more third parties than we tend to be immediately aware of, with each vendor representing some form of risk: the effect if their service isn’t available, or if their service is breached, resulting in the loss of school data to the dark web. The TEISS event, however, widened my view to include the overall supply chain. Consider a cloud hosted MIS system, for example. The vendor might use an external identity provider, a third party backup solution, an MDR solution and a separate bulk email solution, and we, as the user of the MIS, may not be aware of which solutions they are using. Maybe the contract with the MIS provider requires that they inform us of a breach, but do they need to notify us of a breach of a third party they use? And never mind breach situations: do they need to notify us of vulnerabilities identified in the third parties they use? Remember, the school as the data controller is responsible for data even when it is being processed by a third party. How can we discharge our duties when we don’t actually know the risks? I feel we sometimes treat the use of cloud services as a way to shift the risk to the vendor rather than the school, however all we can really shift is the operational responsibility; the accountability and liability generally remain with the school.

Recovering identity services

In discussion of the recovery phase of an incident, the issue of identity services was raised. We might have all our data and our systems nicely backed up, and we may have tested these backups to make sure we can recover them, but what happens when our identity services are compromised? How can we recover from backups when the identity services needed to authenticate are compromised and unavailable? I must admit I have spent a fair amount of time considering backups and disaster recovery, but this raised a new angle. Having break glass accounts is all well and good, but if our identity services have been compromised, or even destroyed, then these accounts are unavailable, meaning we can’t get to our backups. Are we sufficiently prepared to rebuild our identity environment, including rebuilding all the trust connections across different systems and services, as required to mount a recovery and get things back working, even if only to a minimum viable environment?

Not if, but when?

We do tend to spend a lot of time talking about our protective and defensive measures, our AV, EDR, XDR, MDR, SOC and other two or three letter acronyms, but these defences only give us a less than 100% probability of a successful defence. Looking at the probability of a breach or incident, if we look far enough into the future we approach a 100% chance that it will happen. So maybe this means we need to spend more than 50% of our time talking about how we manage the “when” it happens. And maybe the key here is not to treat a cyber incident as an IT or cyber issue, but as an organisational or school issue. It’s a school operations incident, albeit one of the cyber variety. So, the question then becomes how we manage the different critical aspects of a school if there is no IT available. Thinking on this, I believe teaching and learning can easily continue in the short term without IT; teaching worked for many years before we started adding tech to classrooms. The issues are therefore in other areas of a school: do the door locks work so staff and students can get in and out? Will payroll operate to pay teachers and other staff, and will we be able to pay suppliers? How will we manage safeguarding without our safeguarding platform or the ability to easily ping emails back and forth? What about contacting parents, if there are issues or even in general, if we don’t have access to our MIS? It’s about the key business or school processes and how they might work in a minimum viable environment without our usual IT systems.

Conclusion

So, it was a very enjoyable conference once again, although I will note it wasn’t without my usual train related issues, which saw me arriving home two hours later than originally planned due to delays. Why do I expect anything different?

Thinking back, one of the key discussions in a group session focussed on culture, and I think this is key. Cyber resilience, a term which has never fully sat well with me, is an ongoing battle against changing technology, evolving threats and evolving threat actors, as well as increasing internal risks as technology becomes more integrated and critical, and as people are asked to become more efficient and do more, something which has a tendency to result in occasional errors. As such, it is all about building a culture where the risk is addressed by all, and not just by IT or cyber teams. This is the ideal. The question therefore becomes: how can we engage our school communities in cyber resilience in the same way we engage them in safeguarding? Is it time we stopped talking about cyber resilience in schools and started referring to it as digital safeguarding, maybe?

Minimum viable edtech product?

In the IT world the phrase “minimum viable product” is often used to indicate the point at which a new product or service is good enough to go to market, but by no means the finished and polished product. The driver in enterprise environments is the need to be first to market, complete with the competitive advantage this brings. But would we use the same phrase in education, and in particular in relation to teaching with technology? Wouldn’t we want the best for all of our students?

This question came to me following a number of webinar events focussed on AI in education, where I saw some really great stuff being demonstrated. I saw ways for teachers and students to bring their work to life: animating, creating video, or imagining what a given piece of textual content might look or sound like. And that was just the tip of the iceberg; it was inspiring stuff. My initial reaction was one of “how do I achieve that?” or “where can I use that?”.

But then the day to day tasks and activities got in the way. I didn’t have the time to visit and sign up to the various apps, and to learn how to use them to do all of these amazing things. I didn’t have time to consider the pedagogical and content side of things, in terms of how someone might teach through and with these tools. The demos were amazing, but the people doing the demonstrations didn’t really share how long they spent finding and learning the tools, or how long it took them to build up the confidence that was so apparent in their presentations. I think building confidence in tools is a key issue, and one which is often not fully considered. As is experimentation, but this takes time. I would also suggest that where people do present great examples of tech tools, they are often eager, motivated individuals with an interest in technology and in exploring ideas and solutions.

When we seek to scale all of this up we therefore hit challenges. Not all people have the interest and motivation to explore technology tools, including AI tools, and therefore either they are unaware of what is possible or they simply don’t have the time to experiment and gain confidence. And with quite so many different tools being demonstrated, I also think staff can find it difficult knowing where to start. Which tool is going to represent the best value, giving the greatest impact with the least effort? So, selecting a small subset of tools and training all staff in them tends to be what happens, the minimum viable product, but this misses out on some of the jaw-dropping opportunities which AI advancements can now provide. There isn’t anything wrong with this, but is it the best we can achieve?

Maybe the issue here is my point of reference, which is that of the teacher. What if, rather than the teacher seeking to learn all these apps and tools, we empower the students to explore? This fits better, as the students are exploring tools which relate to their work and are doing so for their own benefit. A teacher would, however, need to be comfortable supporting students with tools they don’t necessarily fully understand. This would be a bit of a shift from the traditional view of “sage on the stage” teaching, towards coaching and mentoring, asking the right questions rather than showing students the facts or what to do. This feels right to me, albeit possibly uncomfortable, although it is through challenge and a bit of discomfort that we often achieve the most.

AI tools are providing such great opportunities at the moment, and it is great to see the art of what is possible, but I can’t help feeling we should be supporting the students in this exploration. This means that the role of the teacher needs to be different, focussing on asking questions about age requirements, privacy, intellectual property, ethics and the like in relation to the tools, rather than being the one using the tools or teaching the students how to use specific tools. We need to explore why students are seeking to use certain apps and what they hope to produce. Maybe this is the new minimum viable product in education. Maybe this is what we need if we are seeking to develop creativity in students, as well as the critical thinking to help them stay safe and secure.