
In schools we have a fair number of “compliance” matters to deal with, especially around training: safeguarding (including online safety), data protection, and health and safety, to name but a few. But sometimes I wonder whether we can’t see the wood for the trees. I wonder if we become so focused on the compliance measure that we lose sight of the reason the compliance requirements are there in the first place.
Let us take data protection as an example. We must comply with the legislation, and as part of this we need to make sure that staff have received the relevant training. The compliance measure, then, is to ensure that staff have had training. So we attack this requirement with an annual training session, likely a short slot within INSET at the start of the new academic year. If we do this correctly, we will have a nice register of all the attendees at the session, which will match the list of staff, thereby showing that all staff have been “trained” and that we are compliant.
But why do we have the training? The training is not the outcome or objective; it is the vehicle. What we are really trying to achieve is that staff understand data protection: what they should and should not be doing with data, and what to do when things go wrong. So, taking my compliance to the next level (is there such a thing? Surely I either comply or I don’t? Maybe a separate question!), I now want to check understanding. So I send all staff a quiz based on some of the content of the training. If most staff, let’s say 80%, get the answers correct, I am happy that the training has developed the understanding I seek. If the score is lower, then I need to review the training materials and amend accordingly.
But again, this has flaws. Is a quiz sent out after the training a good measure of understanding? Or is it just a test of short-term memory, where staff who score well immediately after the training will have forgotten most of the information two or three weeks down the line? At this point I may revise my compliance measure again, releasing different short quizzes at the end of each term to get a better view of how understanding is retained over time.
At this stage I may decide that the single training session isn’t enough, as I want to go beyond basic understanding and look to change the culture around data protection. I am now looking at a multi-modal approach: a big training session, maybe weekly short facts, termly quizzes, smaller training sessions with targeted departments, on-demand video training, and so on. In looking to change the culture I am clear that this will involve lots of little changes and activities, both now and over a longer period, although the prevailing culture is unlikely to show any signs of change until some time in the future. I am accepting that changing culture is the long game.
Conclusion
The issue with all of the above is that I feel we often get stuck at the top: the simple measure of the number of people who attended an annual training session. I don’t feel we always question and explore our compliance, or go beyond what is simple and easy to measure.
What if, in accepting the need to comply with training, we accept that our complex approach of briefings, training sessions, standing agenda items and so on forms the training, and that this is better than a one-off training session, even one with a quiz? In doing so we can demonstrate training through details of our approach and evidence to support each part of it: copies of briefings, PowerPoints from presentations, meeting minutes, etc. Isn’t this enough to prove compliance should a relevant authority ask for such proof? Yes, we won’t be able to provide a nice simple list of all staff having signed attendance at an annual training session, but was that ever the point?