Compliance and training

In schools we have a fair number of “compliance” matters to deal with, especially around training: safeguarding (including online safety), data protection, and health and safety, to name but a few.   But sometimes I wonder whether we can’t see the wood for the trees.   I wonder if we become so focussed on the compliance measure that we lose sight of the reason the compliance requirements are there in the first place.

Let us take data protection as an example.   We must comply with the legislation, and as part of this we need to make sure that staff have received the relevant training.   The compliance measure, then, is to ensure that staff have had training.   We attack this requirement with an annual training session, likely a short session within inset at the start of the new academic year.   If we do this correctly, we will have a nice register of all the attendees at the session, which will match the list of staff, thereby showing all staff have been “trained” and we are compliant.

But why do we have the training?   The training is not the outcome or objective; it is the vehicle.   What we are really trying to achieve is that staff understand data protection, understand what they should and should not be doing with data, and know what to do when things go wrong.   So, taking my compliance to the next level (is there such a thing?   Do I comply or not comply?   Maybe a separate question!) I now want to check understanding.   I send all staff a quiz based on some of the content of the training.   If most staff, let’s say 80%, get the answers correct, I am happy that the training has developed the understanding I seek.   If the score is lower, then I need to review the training materials and amend accordingly.
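To make that threshold concrete, the check itself is trivial. A rough sketch (with made-up names, scores and a per-person pass mark, none of which come from a real system) might look like this:

```python
# Illustrative sketch of the 80% rule discussed above.
# All names, scores and the per-person pass mark are assumptions.

TOTAL_QUESTIONS = 10
PASS_MARK = 0.8  # both the per-person pass mark and the whole-staff threshold

def review_needed(results: dict[str, int], total: int = TOTAL_QUESTIONS) -> bool:
    """Return True if the training materials need reviewing, i.e. fewer
    than 80% of staff passed the quiz."""
    passed = sum(1 for score in results.values() if score / total >= PASS_MARK)
    return passed / len(results) < PASS_MARK

results = {"staff_a": 9, "staff_b": 6, "staff_c": 10, "staff_d": 8}
print(review_needed(results))  # 3 of 4 staff (75%) passed, so True: review needed
```

Of course, this only automates the arithmetic; deciding what counts as a pass, and what the quiz actually measures, is the harder part.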

But again, this has flaws.   Is a quiz sent out straight after the training a good measure of understanding?   Or is it just a test of short-term memory, where staff who score well immediately after the training will have forgotten most of the information two or three weeks down the line?   At this point I may revise my compliance measure again, releasing different short quizzes at the end of each term to get a better view of how understanding is retained over time.

At this stage I may decide that the single training session isn’t enough, as I want to go beyond basic understanding and look to change the culture in relation to data protection.   I am now looking at a multi-modal approach: a big training session, maybe weekly short facts, termly quizzes, smaller training sessions with targeted departments, on-demand video training, etc.   In looking to change the culture I am clear that this will involve lots of little changes and activities, now and over a longer period, although the prevailing culture is unlikely to show any signs of change until some time in the future.   I am accepting that changing culture is the long game.

Conclusion

The issue with all of the above is that I feel we often get stuck at the first step, with the simple measure of the number of people who attended an annual training session.   I don’t feel we always question and explore our compliance.   I don’t feel we go beyond the simple and easy to measure.

What if, in accepting the need to comply with training, we accept that our complex approach of briefings, training sessions, standing agenda items, etc. forms the training, and that this is better than a one-off training session, even one with a quiz?   In doing so we can demonstrate training through details of our approach and evidence to support each part of it; we can provide copies of briefings, PowerPoints from presentations, meeting minutes, etc.   Isn’t this enough to prove compliance should a relevant authority ask for such proof?   Yes, we won’t be able to provide a nice simple list of all staff having signed attendance at an annual training session, but was that ever the point?

IT Service: To help or to develop self-sufficient users?

One of the key roles of IT services or IT support teams is to resolve issues, to fix things.   But if this is the sum of expectations, it represents a short-sighted view, as all you may get is repeated calls related to the same issue.   IT services teams therefore also need to try to develop users so that they are more able to resolve their own issues, only needing to seek IT services’ help for specific technical issues.   So how do we navigate between these two options?

Statistics: Calls logged, resolved, time taken

We often need to identify methods by which we measure our efforts.   In schools, for our students, this is the exam system and terminal exams at the end of the year.   For IT teams one easy measure is to look at the number of issues reported, issues resolved and the time elapsed.   These are easy pieces of information to gather using a help desk software solution.   The danger here is that what is easy to measure becomes what matters, rather than us choosing to measure what matters.   A repeated call from the same member of staff about the same issue can then be viewed positively, as it will be simple to resolve and close quickly, reflecting well on the statistics.   But is this use of IT staff time, repeatedly resolving the same issue for the same person, achieving value?
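If we wanted to spot those repeated calls rather than celebrate them, the data is usually already there in the help desk export. A rough sketch, using made-up ticket data and an assumed (requester, category) layout rather than any particular help desk product’s schema:

```python
from collections import Counter

# Illustrative sketch: spotting repeat calls in exported help desk data.
# Each ticket is a (requester, category) pair; the names and issues
# below are invented for the example.
tickets = [
    ("j.smith", "printer offline"),
    ("j.smith", "printer offline"),
    ("a.jones", "forgotten password"),
    ("j.smith", "printer offline"),
    ("a.jones", "projector input"),
]

# Count how often the same person raises the same issue; anything more
# than once suggests a fix that isn't "sticking" and a training opportunity.
repeats = {pair: n for pair, n in Counter(tickets).items() if n > 1}
print(repeats)  # {('j.smith', 'printer offline'): 3}
```

The same few lines of aggregation shift the measure from “calls closed quickly” towards “issues that keep coming back”, which is closer to what actually matters.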

Learned Helplessness

Another issue with repeatedly and quickly fixing issues for staff is learned helplessness.   Although staff will be happy to have their issues quickly and easily resolved, it doesn’t encourage them to be self-sufficient.   In fact it encourages them to call IT in future for all problems, as this is likely to be easier and less effort than trying to find a solution for themselves.   When working with primary school teachers, I remember some teachers approaching this issue with their students by using “C3B4ME”: students shouldn’t approach the teacher for help unless they have first tried three other sources, such as books, their fellow students, the internet, relatives, etc.   I have had this poster placed on our IT Services noticeboard at the entrance to our offices, as I think it is as valid for staff and senior school students as it is for primary school students.

Training

So, from the above it might seem clear that we need to train staff to be self-sufficient.   If it was that simple we would all be doing it.   Sadly, the challenge here is often time and intrinsic motivation.   On the time front, staff in schools are already busy, and there is a dearth of free time available for training, so something else has to give to free up time.   Also, when staff members approach IT teams with an issue, they largely need it resolved immediately, as it might be impacting the current class or a class due to be taken later in the day.   Linked to this, their motivation is about removing the issue so the teaching or admin task can progress; there is little motivation, at the point of contact with IT teams, towards learning a bit more about IT or developing additional technology skills.

Maybe a future

I suspect part of the future may include the greater use of AI and chatbots.   More and more schools require staff to log their issues via an online reporting tool rather than supporting direct phone calls.   This makes sense given the time a phone call takes and the resulting resource usage where direct calls are supported.   Augmenting this with AI that can directly point users to fixes for common issues, or to user guides, would free up IT staff time to focus on those issues which aren’t as easy to fix.   This obviously relies on the ability of the AI to accurately interpret and categorise the user’s input.   A challenge I believe will occur here is simply the lack of detail sometimes entered within support calls from users.   I am not sure we can do much about this; however, a chatbot might simply deal with it by asking for further information.
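To illustrate how simple such triage could be, even without any AI at all, here is a deliberately basic sketch; the keywords, canned fixes and the “too short” rule are all my own illustrative assumptions rather than any real product’s behaviour:

```python
# A deliberately simple triage sketch: match the user's text against
# known issues and reply with a canned fix or guide pointer, or ask for
# more detail when the report is too vague. All keywords, fixes and the
# minimum-length rule are illustrative assumptions.

KNOWN_FIXES = {
    "password": "You can reset your password yourself at the self-service portal.",
    "printer": "Try the steps in the 'Printing problems' guide on the intranet.",
    "wifi": "Forget the network and reconnect; see the Wi-Fi guide for screenshots.",
}

def triage(report: str) -> str:
    # The common problem noted above: too little detail to act on.
    if len(report.split()) < 4:
        return "Could you give a little more detail about what you were doing and what you saw?"
    text = report.lower()
    for keyword, fix in KNOWN_FIXES.items():
        if keyword in text:
            return fix
    return "Logged for the IT team to pick up."

print(triage("broken"))
print(triage("I cannot connect to the wifi in room 12"))
```

A real system would use far more robust language understanding than keyword matching, but the shape is the same: deflect the common, self-serviceable issues and escalate the rest.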

Conclusion

If IT teams focus on fixing issues, staff skills will likely never improve and we will simply repeat the same guides and instructions as solutions to the same problems.   This doesn’t feel like a productive use of time.   Alternatively, we could focus fully on training with each call; however, this is likely to result in user frustration and take too much time.   As with so many things, the answer likely lies between the two.   We should seek to fix issues as efficiently as possible while also seeking to inform and educate.   We should also use the data we gather to identify the common issues and seek ways to share and train users to resolve these issues for themselves.

I feel it is the role of IT Services teams both to help resolve issues and to develop user self-sufficiency, such that users can increasingly solve their own problems; a difficult balance to achieve.

Building user awareness

When thinking about cyber security, the area I always put first is developing user awareness of the risks and of what to do should they make an error.   Given that most data breaches tend to have user involvement at some point in the incident, often at the beginning, it seems logical to focus first on user awareness.   But how do you build user awareness in a busy school?

The old inset model (Compliance)

This is the model by which the training is put on once per year, likely at the start of the year, with everyone in the school required to attend.   For me this approach is more about compliance than about improving awareness or understanding.   It makes it easy to prove that all users have been “trained”, as you can point to an attendance sheet, for example; however, in the busy world of schools it is likely a fair part of your audience will be focussing on other tasks rather than the content being presented.   It doesn’t necessarily result in users being more informed and aware of cyber risks than they were prior to the session.   This approach also fails to take into account the constant evolution of cyber threats and the threat landscape.   As such, the once-per-year training event is no longer sufficient on its own, although it remains useful when combined with other approaches.

Regular communications and updates

My favoured model of cyber awareness development can be summarised as “little and often”.   I make use of the school’s regular bulletin to share examples of phishing emails received in the school, plus tips on how to identify them.   I am increasingly making use of video to share short presentations, 3 or 4 minutes long, outlining emerging risks or trends.   The key for me is to make cyber security awareness content something that all users come into contact with on a weekly basis.   Hopefully by doing so they will be more conscious of the risks.   Basically, I am using the availability bias to develop user awareness.

One important thing to note here is to vary the content, as content that is always the same may eventually become ineffective.   As such I use a mix of my own video content, video content from the NCSC and other cyber organisations, written content with annotated screenshots, and even the odd cyber security sea shanty (see here for the cyber sea shanty if you are interested).

Testing

One of the big things about awareness development is being able to test that it is working.   If your training is about compliance, the only test you need is to check that your attendance list has everyone’s name on it; but if you are truly after awareness development, you need to check that users’ awareness has actually developed.   An easy approach might be a simple short quiz included alongside new awareness content, with a focus on helping users identify what they don’t know rather than centrally recording scores.   A centralised focus on these scores is, once again, more about compliance than about users and user development.   An alternative approach might be regular phishing awareness tests, to see whether users fall for a phishing email or report it.   Reducing numbers of users falling for such tests, and increasing numbers of users reporting the emails to IT teams, both represent improvements in user cyber awareness.
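To show what tracking those two numbers might look like over time, here is a small sketch with entirely made-up campaign figures:

```python
# Sketch of tracking phishing-test results across termly campaigns.
# Each campaign records how many users received, clicked, and reported
# the test email; all figures below are invented for illustration.
campaigns = [
    {"term": "Autumn", "sent": 200, "clicked": 40, "reported": 20},
    {"term": "Spring", "sent": 200, "clicked": 28, "reported": 44},
    {"term": "Summer", "sent": 200, "clicked": 16, "reported": 70},
]

for c in campaigns:
    fall = round(100 * c["clicked"] / c["sent"])     # falling for the test: lower is better
    report = round(100 * c["reported"] / c["sent"])  # reporting the test: higher is better
    print(f"{c['term']}: fell for it {fall}%, reported it {report}%")
```

Watching both trends together matters: a falling click rate alone could just mean the test emails got easier to spot, whereas a rising report rate shows users are actively doing the right thing.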

Fear of reporting

Another big challenge is trying to ensure users understand the importance of their vigilance and care in relation to cyber security, and the size of the risk to them, to the wider staff and students, and to the school/college as a whole.   We need to balance this against creating fear in users to the point that they are reluctant to use technology or to report concerns or issues.

For me encouraging people to report is critical both in terms of quickly identifying any issues, but equally importantly in terms of identifying misunderstandings or near misses.   From this information we can refine training and awareness development approaches.    We can basically seek to use the ongoing reports to continually learn and develop as an organisation, in relation to cyber security.

Conclusion: Building a culture (The long road)

It still worries me that some organisations continue to treat cyber security, and also data protection, as a compliance issue; for me this is a shallow approach.   The true challenge should be to develop user awareness such that we shouldn’t need to be too concerned about compliance.

Awareness development, in my view, isn’t a single training session or even a number of training events, tests, etc. over the course of a term or academic year.   It’s a longer-term project.   It’s about building a cyber security culture, which isn’t a matter of days or months but is best measured in years.   As such, the sooner we all get started the better.

Confidence

When looking at teachers using technology in lessons, one of the key indicators of successful use is confidence.   Those who, in my experience, have had the most success have been confident about how they plan to use technology and the impact it will have.   That is not to say it always goes well or as they expected, but they are confident in outlook, and when things don’t go as intended, they deal with it as a road bump rather than an obstacle, before moving on.

The challenge therefore is how we build this confidence, with “training” being key: training in relation to the technology itself and how it works, and training in relation to how to use it for teaching and learning.

One of the limitations in relation to training, to sharing and building confidence, is time.   Time to train has historically been limited to specific inset days where the schedule is often prescriptive.   To counter this limitation, we have increasingly been referring to continual professional development (CPD) or continual professional learning (CPL); I prefer the latter, as the former suggests something is lacking and in need of development.   The emphasis here, in both versions, is on the “continual” nature of the learning and sharing.   It isn’t a once-a-year or once-a-term event, but something ongoing, continual and day to day.   It should be part of the culture of our schools.

The challenge with CPL (or is CPLS better, where the S refers to sharing?   Education has more acronyms than a series of Line of Duty!) is supporting it to occur, and I think the last year of lockdown has given us a bit of a window into what we need to be doing.

The last year has seen massive amounts of fast-paced change, as teachers across the world have had to shift from face-to-face classroom teaching, the type every teacher would have been used to since their training, to online teaching and learning.   What I have seen as a result of this forced change is a need to find support and help.   This need has been met through online platforms, EdTech tools and social media, including solutions such as Microsoft’s Educator Centre; through groups of proficient users such as Microsoft Innovative Educator Experts, Google Certified Educators and Apple Innovative Teachers; and through more local groups, including groups of schools which have come together to support each other.   I have also seen support groups form in individual schools, using platforms such as Microsoft Teams to allow staff to share their successes and issues, and to learn from and support each other as and when required.   This is something I feel has worked well in my own school.

The last year has seen various support groups pop up, and I suspect will have seen greater engagement in such groups as teachers everywhere sought to adapt to the forced change brought about by the pandemic.   Teachers have been sharing their issues, their techniques, what worked and what didn’t, supporting each other through the challenges the pandemic has brought.   For me the key going forward is for these groups to continue to support teachers, providing a place to share techniques, ideas and thoughts, and for teachers to continue to engage.   These groups also need to exist at different levels, from the large corporate-sponsored groups offered by Microsoft and Google through to the support groups operating inside our schools, made up of colleagues helping one another.

One of my favourite phrases continues to be “the smartest person in the room is the room”.   I think this is key to “training” or CPL.   The days of the expert trainer and the one-off training session are gone, especially in relation to EdTech, where technologies change, disappear or are introduced on a daily basis.   As such it is critical we embrace a more open, just-in-time model, sharing not just what works but openly discussing what hasn’t worked, so that we can all benefit.   This needs to be available throughout the year for teachers to engage with as and when appropriate for them, dipping in and out as needed.

I do wonder whether one of the challenges we currently have is that the sharing of ideas, resources, etc. is spread across different platforms.   I have seen resources on specific websites belonging to companies or groups, on social media via Twitter, on YouTube, on MS Teams, etc.   As such it can be difficult or time-consuming to find things, and on each particular platform you can only access a subset of the teaching expertise available rather than all of it.   I suspect this fractured nature of sharing is unlikely to change, as people tend towards their preferred platform or the platform used within their school; however, I suspect as we move forward there will be greater curation of the available resources.

Building confidence is key to the successful use of EdTech in schools.   We therefore need to consider how we support and enable confidence to be built.   It is also worth noting that the above refers to the confidence of staff; it is equally important that we build confidence in our students, however I will leave that for another post.