Who wants a child to fail?

FAIL: First Attempt In Learning. This has always been a great concept for me: we often learn the most when things go wrong. However, I am increasingly conscious that the world we now live in is becoming increasingly risk averse, meaning that fails are not seen as opportunities to learn, and that we are actually reducing the number of opportunities for students to learn from difficulties, challenges and even failure.

But why would we want a child to fail?

I suppose this is the key question: why would anyone want a child to fail? I think this highlights one of the key challenges, in that a fail is seen as a negative conclusion and something we don't want children to suffer. But what if a fail isn't a conclusion but a step within a larger journey? If our fails aren't terminal or final but are more a road bump along the way, a chance to re-channel efforts, to change paths or approaches, or simply to learn from error, maybe there isn't an issue with a child failing.

Desirable difficulty

So, if failing isn't negative, might it be positive? The concept of desirable difficulty refers to the positive benefits of being challenged rather than finding things easy. Something not working or going as we intended, a fail, is definitely a challenge, and therefore could represent a desirable difficulty if an eventual positive outcome results. From a fail we have the opportunity to review our practice and identify how we might change to overcome this road bump; in doing so we learn, and may also grow more resilient. That clearly sounds like a desirable outcome, albeit I will acknowledge it may not be easy, but I suppose the term "desirable difficulty" already says this.

Risk aversion

The challenge with all this is that, I feel, as a society we are becoming more risk averse. We look at GCSE pass rates and want more students to pass each year, with pass rates now in the high 90%s. This meets our need for all students, or at least most, to achieve, but does it rob students of the opportunity to experience and learn from failure? As teachers we add scaffolding, we differentiate, we provide additional support where needed, and much more to help students succeed, but again, are we depriving students of the benefits which result when things go wrong? In relation to AI in education we worry about AI errors, about bias, etc., when I don't think we can get rid of these things. Shouldn't we embrace the technologies, teach students to be critical, and accept that sometimes there will be a fail, but that students will then learn from it?

Monitoring and supervision

And looking more broadly, we now monitor our children more than ever before, wanting to know their every move and making sure they have a mobile phone on them so they can be easily contactable. We take them to football games and to other events, often being the ones who arrange the events, where once upon a time kids sorted their own entertainment, returning only once the street lights came on. I look at my own childhood and the experiences I had when out with friends, sometimes just playing football or having fun, and sometimes maybe up to things my parents may not have approved of. But in all of this I learned from my experiences; I made mistakes, picked myself up and moved on, eventually better for it.

Compliance

And then there's compliance and the world of health and safety, among other areas. We increasingly mandate things or require checks to be carried out, meaning activities we once did now take more time and effort due to the need to deal with compliance requirements. As we add all this extra work and effort, the risk assessments, checks and balances, we become less likely to try new things and to experiment. The potential gains of a project, of a new technology for use in the classroom, or of many other things may not have changed, but the overhead in terms of checks and balances is now greater than it used to be, so the perceived differential between the gain and the effort has reduced. This increases the likelihood that we will simply evaluate the technology, project or other activity, conclude that the benefit is not sufficient to outweigh the effort needed, and therefore the status quo remains.

Conclusion

I came across a quote recently: "life begins at the edge of your comfort zone". The challenge, however, is that we increasingly don't want to allow students to experience the edge of their comfort zone for fear of fails or discomfort. So what kind of life, and what kind of learning, will result?

TEISS 2024, Resilience, Recovery and Response

I try to take myself out of the educational bubble at least once per year. This has been a conscious decision for a number of years, as I realised the importance of diversity and therefore the limitations of only looking at IT, cyber, data protection, etc. from the standpoint of people in similar educational contexts. As such, TEISS is one of those events I try to attend to broaden my experiences and get the views and thoughts of those who exist beyond the educational context of schools and colleges.

This year's TEISS event, which as always focused on cyber security and cyber resilience, had some predictable topics of discussion. These obviously included artificial intelligence and also third-party or supply-chain risks. So what were my big takeaways from the event?

The cyber context

I am reasonably well aware of the cyber context and the risks which impact organisations in general, including schools; however, the TEISS event presented a couple of key facts which I think are interesting. That there was a cyber attack every 29 seconds in 2023 says it all, with this only likely to grow once the 2024 figures have been calculated. This highlights the need for all organisations, including all schools and colleges, to consider cyber risks and their defensive and recovery methods. There is no excuse for not having done so.

Behaviourism

A number of presenters, and a number of those I had conversations with during the course of the conference, highlighted the need to consider human behaviour as part of cyber thinking. A cyber awareness programme isn't so much about the programme as about bringing about behavioural change, so although having an annual training or other training programme might meet compliance requirements, does it bring about the behavioural change we seek, and how do we know that this is the case? It is about encouraging people to report issues and reinforcing such reports by making users aware of the impact where they do report concerns such as a phishing email. If we can reinforce this view of reporting having an impact, rather than it just being another thing staff are "asked" to do, then we might manage to build the cyber culture we want in our organisations. In discussion, one event attendee raised a solution which would automatically remove a phishing email from mailboxes once it had been reported, and would then let the reporting user know about their positive impact. This seems like a great tool, but apparently what had been a cheap tool was bought up by a bigger company and now forms part of the value-added tools bundled with a bigger, more expensive product which needs to be purchased. For schools this brings us back to limited budgets, which mean that key tooling for cyber security continues to be outside the reach of those in education.

It's about people

The old Richard Branson quote about looking after your staff so they will look after your customers was raised, albeit with a cyber bent: look after your cyber security staff and they will look after your security. I have to strongly agree with this, and also with the need to look after those staff involved, from an IT point of view, in cyber incident response. Stress levels are high following the onset of an incident, and someone needs to make sure that those leading the technical response stop to eat, sleep and take time out. One interesting discussion raised, however, was that while the CISO might do this for their team, who does it for the CISO? If the board and senior leaders push for updates and for things to be "fixed", while the CISO supports the team of people doing this work, who looks after the CISO? Now, in my team I feel lucky in that I feel my team would be quick to question me and challenge me to take the necessary time if needed. This goes to organisational culture and the culture to question at all levels. I feel lucky to believe this would happen in my team, although I hope I never have cause to test it in a real incident, as we can only truly test these things in a real-life situation; desktop exercises are all well and good but they pale when compared to the stress and challenges of a real incident.

Incomplete information, and it's inevitable

The inevitable nature of cyber risk is something I have talked about for some time. You can do all you want in terms of your defences, but the defenders need to get it right all of the time, while the attackers need only get it right, or get lucky, once, so the probability lies with the attackers. If we accept that defence can never be 100%, that attackers therefore always have a chance and will keep trying for as long as an organisation exists, and that no organisation seeks to cease existing, then probability states with relative certainty that an incident will happen, just not when. And when it happens we will see only bits of the picture initially, with increasing amounts of the picture as to the impact of the incident, the ingress route, etc., appearing as time progresses, yet the expectation will be to communicate quickly about the incident. In relation to comms, the key message seemed to be that the worst thing to do is to state something which is later proved to be untrue, so it is all about saying little. Another point which came across related to the cadence of information: although we may seek to say little, we should seek to be regular in our communications, even if this means saying that investigations are ongoing and that at this stage we know nothing more.

Cyber and AI… or not

Within a couple of presentations the issue of language was raised. AI is the current buzzword, used both by vendors singing about their products and in discussions of threats and AI-based attacks. Maybe AI has become a buzzword which needs to be included in product pitches, in conferences, etc., and maybe this doesn't match the reality. Another presenter raised how we use the term cyber: cyber bullying, cyber threats, cyber security, etc. But isn't it just bullying, a threat or security, albeit enabled by technology? And does the use of the word cyber push us to think it's an IT issue, an issue for IT companies and vendors, rather than something which is the responsibility of the wider organisation, school or school community? Maybe we need to reduce our use of the word cyber and treat technology-enabled attacks as a subset of existing issues rather than as something unique and distinct.

Conclusion

I enjoy stepping outside of the education bubble and hearing what cyber security looks like to those in the enterprise world, where they generally have far greater resources. It is heartening to hear that they suffer from the same problems and have the same answers, despite their significantly greater resources. This continues to highlight for me that "not enough money" or "not enough staff" isn't the answer, as we need to be pragmatic about cyber. We could have infinite staff and budget and we would still face challenges. It continues to be about doing what we reasonably can, and preparing for the worst. It also continues to be about getting this message across to trustees and governors: that no matter what we do the risk will continue to exist, and also that most schools or colleges which have suffered an incident have moved past it and survived. In education we talk with students about FAIL as a first attempt in learning, and maybe that's what a cyber incident is? That said, it's not a learning exercise I would care to undertake!

Focus and distraction

I recently read Stolen Focus by Johann Hari, which looks at how we are increasingly less able to focus and hold our attention on a particular task or activity, with this particularly impacting our children and young adults. Now, one of the predominant views in this area is that this is the result of technology, and in particular the smartphone and social media; however, Hari goes on to point to a few other things which could contribute to our increasing inability to focus.

Environment

We live in a more polluted world, and are increasingly subject to environmental changes resulting from global warming. Now, in some ways we are making progress, with lead no longer included in petrol, the move to electric vehicles, and the reduction of smoking in the UK; however, smog is still an issue in some cities. There are also chemicals used in the production of modern goods, or created as by-products of modern processes, which all end up in the environment and eventually in our bodies. How do we focus when our bodies are subject to pollutants over a prolonged period? The answer, according to Hari, is simply that we cannot focus as well as we might have been able to in the past.

Reduced free play

Freedom to explore, to play, to have fun and to make mistakes is a key part of the human learning process. We have evolved as a species over millions of years using this approach; however, more recently we have reduced the opportunities for freedom and exploration. We increasingly supervise or even track our children, to the extent that they don't develop the social and resilience skills they may once have developed through play. I feel we do this to keep children safe, based on a perception of a more dangerous world, but this is a perception rather than a reality, resulting from the ease with which we receive news of things going wrong and our human tendency to overweigh the importance of what comes easily to mind. As such, we restrict our children from playing unobserved and freely. Yet the reality in terms of safety is that we are likely safer than we have ever been before. Hari makes the point that the likelihood of a child kidnapping, something we worry about and which drives our need to supervise and limit children's freedom, is less than the likelihood of being hit by lightning. I don't think we keep children indoors and monitored to protect them from a lightning strike!

Food

Another issue impacting on our ability to focus is our changing diet.  Gone, largely, are the healthy home cooked meals involving fresh ingredients, which might be served around the family dinner table complete with family discussion.   The modern diet increasingly involves microwave or other convenience foods, foods loaded with preservatives and other additives, or high in salt or sugar content, often consumed on the go or while distracted by TV or social media content.   These aren’t the ideal ingredients to develop our ability to focus and in fact negatively impact on this human capability.

Sleep

Sleep, and in fact the reducing amount of time spent in quality sleep, is another issue which Hari identifies. Now, some of this is certainly the result of technology and our on-demand access to TV and movies, plus addictive social media apps which encourage doom scrolling; however, I would suggest part of it also relates to the increasingly fast pace of life and the need to squeeze every second of every day for the maximum we can get out of it. This means we might get less than eight hours of sleep, maybe even only five or six hours, or less. This points to the increasing focus on the need to be more efficient, to be faster, to do more, and to focus on growth and improvement.

Focus on growth

And this is the one which I think drives some of the other issues, particularly the environment and our change in diet: the focus on growth, on doing better and doing more. The world focusses on growth, so the world gets more frantic, faster and busier; we have less time to do tasks and need to move on quickly, and this builds a habit. We also have the economic focus, meaning tech vendors prioritise profit over societal good in the name of growth, of being a more profitable or bigger company this year compared to last year. This all drives a focus on doing whatever we can to drive growth, even where it isn't positive for society as a whole. Now, I was always a fan of the educational concept of continual professional development; however, on reflection my worry is that this is unsustainable: we cannot constantly develop and get better over an infinite timescale. In fact this driver, this need for growth, may make things worse and mean we are less focussed on identifying what really matters and what we need to do to achieve it.

Conclusion

I really enjoyed Hari's book, as it clearly established that technology and social media are part of the problem we now face in relation to our ability to concentrate and to focus, but they are not the whole, or the root, of the problem. There are a number of factors feeding this problem, including the environment we live in, our increasingly risk-averse nature which leads to a reduction in opportunities for children to play and experiment, the poorer nature of the modern diet, reduced periods spent in quality sleep, and also the driving force focussed on growth above all else.

This brings us neatly to where Hari begins in trying to solve the problem: cutting out technology and social media use for a period of time, or banning mobile phones as a school might decide to do. This addresses part of the problem, but it doesn't cover the other factors. Maybe we need to have a broader discussion in schools in relation to focus and the things that might affect it? Maybe the problem is bigger than schools can address, and needs a more community or societal approach?

2023-24 in photos

2023-24 was such a busy year, with so many great opportunities and so many great people to meet and share thoughts and ideas with. From the outset, with attendance at the ISMG Cyber Security Summit and then presenting on AI at the VWV conference, it always looked like it was going to be a packed academic year, but little did I know quite how packed it was going to be. The year would see me speaking in Amsterdam, Venice, Birmingham, London, Cardiff, Bristol, Leeds and a fair few other locations, and never mind the locations, it was the brilliant people that I had the pleasure to meet up with and talk all things technology and education with along the way that made it so very worthwhile. The academic year also saw me become vice chair of the ISC Digital Advisory Group and one of the founding members of the amazing Digital Futures Group (DFG).

This collage of photos is just the tip of the iceberg which was 2023-24, including some amazing memories from BETT, the Schools and Academies Show, The Edufuturists Uprising, EdTech Europe and also an attempted murder at FutureShots in Venice. I can only hope that 2024-25 sees similar opportunities arise and further chances to share and collaborate with such great people, albeit hopefully without the attempted murder!

Bias (AI and human)

I was thinking a little further about bias in artificial intelligence systems. It is so easy to demonstrate, and I myself have shown it in image generation tools with a simple request for a doctor or a nurse in a medical setting; the various biases in the resulting images are very clear indeed. But I think it is also important that we acknowledge the inherent bias in us as humans. We get by despite this inherent bias, yet seem unwilling to accept similar bias in AI solutions, so why is this the case? What is the difference?
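As a rough illustration, not a rigorous audit, one way to quantify this kind of skew is to run the same neutral prompt many times and tally the demographic attributes of the outputs. The captions below are hypothetical stand-ins for what an image tool might return; the tallying logic is the only real part.

```python
from collections import Counter

# Hypothetical outputs from repeated runs of a neutral prompt such as
# "a doctor and a nurse in a medical setting" (illustrative data only).
captions = [
    "male doctor examining patient, female nurse assisting",
    "male doctor at desk, female nurse with clipboard",
    "female doctor reviewing chart, female nurse nearby",
    "male doctor in theatre, female nurse handing instruments",
]

def tally_roles(captions):
    """Count the gender attributed to each role across all captions."""
    counts = Counter()
    for caption in captions:
        for phrase in caption.split(","):
            words = phrase.split()
            for role in ("doctor", "nurse"):
                if role in words:
                    gender = "male" if "male" in words else "female"
                    counts[(role, gender)] += 1
    return counts

counts = tally_roles(captions)
print(counts)
# An unbiased tool would produce roughly even gender counts per role;
# a heavy skew (doctors mostly "male", nurses all "female") is exactly
# the bias being described above.
```

The same counting approach works against any generator you can sample repeatedly, which makes the skew measurable rather than anecdotal.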

I thought a good place to start was to ask ChatGPT for its thoughts regarding the differences in bias between AI systems and humans.    It created a list of sources of bias for AI systems and a corresponding list for humans as below:

AI and Bias

  1. Training Data: AI systems learn from data, and if the training data is biased, the model can inherit and perpetuate those biases.
  2. Algorithmic design: Bias can also be introduced through the design of the algorithms themselves, including the choice of features, learning objectives, and optimization processes.
  3. Unintentional amplification: AI systems may unintentionally amplify and propagate existing biases present in the training data, leading to unfair or discriminatory outcomes.
  4. Lack of consciousness: AI systems lack consciousness and awareness. They don’t possess intentions or beliefs, so any bias they exhibit is a result of patterns learned from data without an understanding of the social context.
  5. Automated decision making: Bias in AI can lead to automated decision-making that affects individuals in various domains, such as hiring, lending, and criminal justice.

Reading the above, items 1 and 3 feel like a repetition, while item 4 raises the question of what consciousness is and how we might evidence it in a future Artificial General Intelligence (AGI) solution. That is a question I will leave for now, or leave for brighter people than me. Item 5 is less about the source of bias and more about the impact it can have on humans.

Human Bias

In terms of humans ChatGPT listed the below:

  1. Human bias is often rooted in subjective beliefs, attitudes, and cultural influences. Humans can be consciously biased based on personal experiences, opinions, or societal conditioning.
  2. Humans can exhibit bias with intent, driven by personal beliefs, prejudices, or discriminatory motivations. This may not be the case for AI, which lacks intentionality.
  3. Unlike AI systems that learn from data in a structured manner, humans can adapt, unlearn biases, and consciously strive to be more objective through education and self-awareness.
  4. Humans possess a deep understanding of social, cultural, and contextual nuances, allowing for more sophisticated decision-making that takes into account the complexity of real-world situations. AI systems, on the other hand, may struggle with nuanced or context-dependent scenarios.

Considering item 1 and subjective beliefs, attitudes and cultural influences, aren't these the training data we as humans are provided with, which shapes our neural pathways and our actions? This is your upbringing, parenting, friends, local and national culture and values, etc. We are exposed to this experiential training data throughout our lives, whereas an AI can be provided with similar training data in a far shorter period of time. Item 2 then follows from item 1, in the same way as an AI's bias might come from its training data or algorithmic design. And I note the design of human beings, as influenced and evolved over time, has resulted in some design features which are sub-optimal in the modern world. Take for example the fight-or-flight response kicking in during a heated discussion; in the past all the relevant hormones released by fight or flight would be used up in the resultant fight, or in running away from the teeth and claws of a predator, whereas in the boardroom these hormones have nowhere to go. Does the boardroom really merit an increase in heart rate and respiration? And that's before I dip into the availability bias, the halo effect and a number of heuristic shortcuts we subconsciously use.

Items 3 and 4, in my opinion, provide an overly positive view of us humans and our ability to unlearn bias and show a "deep" understanding. Yes, this may be possible, however it isn't easy, as humans may be unaware of their bias, or bias might play into their perception of their understanding. Take for example confirmation bias, where we might simply pick the facts or information which align with our view, discarding or undervaluing counter facts or information.

It was at this point, considering AI and humans, that I found myself noting the plural: humans. Maybe this is the key. Humans work together, whereas an AI solution is a single entity, and maybe this is where bias diverges in its impact between humans and AI. If we can gather a diverse group of human individuals, this diversity can actively work towards identifying and removing bias. An AI solution, as a single entity, doesn't benefit from access to others; it simply takes the prompt and kicks out a response.

But maybe we could look to multiple AI solutions working together? Maybe it is a number of AIs working together, or working alongside humans? I have frequently talked about IA, AI as an intelligent assistant, and maybe this is where the answer lies: an AI, with its bias, and a human, with their bias, working together and hopefully cancelling out each other's biases.
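The idea that a diverse panel can soften any one member's bias can be sketched in a toy example. The three "reviewers" below, their rules and the candidate record are all invented for illustration; the point is simply that a majority vote across differently-biased judges can overrule one judge's blind spot.

```python
# Toy illustration: three "reviewers" (human or AI), each with a
# different systematic bias, vote on a candidate; the panel's
# majority vote is less skewed than any single reviewer alone.

def reviewer_a(candidate):
    # Biased against newcomers: rejects anyone with under 3 years' experience.
    return candidate["experience"] >= 3

def reviewer_b(candidate):
    # Biased towards a particular qualification.
    return candidate["has_degree"]

def reviewer_c(candidate):
    # Considers only the assessment score.
    return candidate["score"] >= 60

def majority_vote(candidate, reviewers):
    """Accept the candidate if more than half the reviewers accept them."""
    votes = sum(1 for review in reviewers if review(candidate))
    return votes > len(reviewers) // 2

# A strong candidate who happens to trip reviewer_a's bias.
candidate = {"experience": 2, "has_degree": True, "score": 75}
reviewers = [reviewer_a, reviewer_b, reviewer_c]

# reviewer_a alone would reject this candidate, but the diverse
# panel accepts them two votes to one.
decision = majority_vote(candidate, reviewers)
print(decision)  # True
```

Of course this only works if the biases genuinely differ; three reviewers sharing the same blind spot, like three AIs trained on the same data, vote it straight through.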

Conclusion

I think it's important that anyone seeking to use generative AI is aware of the inherent bias that may exist within such tools. That said, I think the narrative on AI bias is rather shallow and limited, focusing on pointing out the shortcomings of AI in relation to bias without considering the bias which exists in ourselves as humans. I think we need to get more nuanced in our discussions here, and look towards how we might address bias in general, whether it be AI or human related.