Underreporting and the Implications for Serious Injury Management

Jun 15, 2022


Incident underreporting affects the rate of serious injuries and fatalities within organisations. A recent analysis of over 2,000 incidents and near misses within mining and engineering services shows that one in five incidents has the potential for a serious injury or fatality outcome. Yet many organisations struggle to classify serious incidents, and as a result, about 45% of serious incidents fly under the radar. This indicates there are opportunities to better understand, analyse and learn from incident data.

We sat down with Head of Psychology at Sentis, Dr Amy Hawkes, and Founder of Incident Analytics, Warren Smith, to discuss why underreporting is so prevalent and what we can do to lead meaningful safety change.

What is underreporting?

Amy: Underreporting occurs when people are aware of an incident, hazard or near miss but fail to report it within their organisation. Whilst underreporting is the focus of today’s discussion, it’s important to remember that the same drivers affect both underreporting and inaccurate reporting. Inaccurate reporting refers to when people submit a report but leave out key information or downplay the severity of the incident – intentionally or not.

Sentis conducted a review of safety climate survey data collected from a stratified sample of over 12,000 individuals. We identified an average 25% underreporting rate. This means one in four incidents – a near miss or an actual incident – hadn’t been reported.

Warren: Often, organisations are driven to react to incidents when something goes wrong, particularly serious events. Organisations can find it difficult to look back and understand what drove the incident, why it occurred, or why it might keep occurring. In fact, to get ahead of incidents, organisations need to understand the weak signals that drive negative events well before they occur.

What drives underreporting?

Amy: A whole range of reasons, really. One common reason is that the worker will fix the problem themselves and not bother reporting it. Another reason is a lack of awareness or confidence in the systems that exist. This means employees aren’t aware of the official reporting system, or they find the system frustrating, complicated, difficult to use or time-consuming. Then there are more serious reasons that drive underreporting. The first is apathy, based on past negative experiences: “I reported in the past. Nothing changed. So, I’m not going to bother reporting today.” And lastly, people can feel a sense of fear around reporting. They might feel like they’ll be singled out, dragged through the mud, or that their work will attract scrutiny. These last two drivers (apathy and fear of blame) are more serious and more challenging to change.

Warren: In my experience, workers tend to report things when they know they have to. And in organisations with an immature safety culture, it’ll be the bare minimum. This is often driven by a limited understanding of the positive purpose of reporting, because either:

A) the systems aren’t in place to actually leverage that information and do something with it, or

B) leaders haven’t communicated how reporting is valuable to the frontline workforce. Workers believe it’s just a compliance activity, rather than understanding why it’s good for them and the organisation.

What role does the board play in setting the tone around metrics and data collection?

Warren: Board members don’t generally realise the impact their interest has. The information they ask for can drive what people throughout the organisation consider to be important. As such, it’s crucial for boards to become better educated about critical drivers that truly indicate the level of risk management in the business. Often, we can overreact to a single incident without truly understanding what led up to that event – there might have been many, many incidents signalling that a bad event was going to happen. That’s why boards are far better off understanding lead indicators.

Why do organisations tend to have such a strong focus on lag indicators?

Amy: It’s historical. Lead indicators and weak signals are often harder to feel a solid connection to. But as Warren explained, it’s about using that information in the right way to prevent those lag indicators coming through down the track.

Warren: I’ve often heard board members privately discuss that their gut is telling them, “There’s a big one around the corner.” To me, it’s fantastic that they’re listening to their gut. But it’s even better to have data to help form a proper opinion on the likelihood of these events occurring.

There are very specific lead metrics that you want to extract from your general operation on a day-to-day basis. One key metric is monitoring and reporting on high-risk activities – those that can have serious consequences if something goes wrong – especially when we know that the worker is effectively taking risks if they don’t precisely follow procedure. Collecting this data helps organisations get a true sense of the nature of the risk, and how much risk is being worn by frontline workers on an everyday basis.

Most data is collected from actual events. As a result, do you find that actual minor incidents steal focus from potential ones?

Warren: Absolutely. No matter how well constructed your risk categorisation is, it’s a challenge for most organisations to get a calibrated response over time. You’ve got people from all levels of the organisation making judgement calls based on an outcome, rather than the potential for an event.

For example, you receive two separate reports of a twisted ankle in the workplace. One might have involved slipping when stepping awkwardly from a truck cabin, the other might have resulted from having a foot run over by a forklift. Same outcome, very different potential, so rather than focus on the worker’s outcome, the report needs to consider the potential of the event. For example, did the worker fall from a fair height? This is critical data related to the event that needs to be recorded so the business can intelligently manage risk.
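To make that distinction concrete, here is a minimal sketch in Python (the field names and severity scale are illustrative assumptions, not an actual Sentis system) of a report record that captures the credible worst-case outcome alongside the actual outcome, so events with SIF potential can be flagged regardless of how minor the injury was:

```python
from dataclasses import dataclass

# Illustrative severity scale; a real system would use its own risk matrix.
SEVERITY = {"first_aid": 1, "medical_treatment": 2, "lost_time": 3, "fatality": 4}

@dataclass
class IncidentReport:
    description: str
    actual_outcome: str      # what actually happened, e.g. "first_aid"
    potential_outcome: str   # credible worst case, e.g. "fatality"

    @property
    def sif_potential(self) -> bool:
        # Flag events whose credible worst case is serious,
        # regardless of the actual outcome.
        return SEVERITY[self.potential_outcome] >= SEVERITY["lost_time"]

# Same actual outcome (a twisted ankle treated with first aid), very different potential.
slip = IncidentReport("Slipped stepping down from truck cabin", "first_aid", "medical_treatment")
forklift = IncidentReport("Foot run over by forklift", "first_aid", "fatality")

print(slip.sif_potential)      # False
print(forklift.sif_potential)  # True
```

Here, both reports share the same minor actual outcome, but only the forklift event is flagged as having SIF potential.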

Amy: Additionally, if the business’s attention is not going in the right direction, it will influence whether workers report in the future.

What we know is that 45% of serious incidents go under the radar, and 29% of them attract unnecessary attention. What does the “Understanding SIF Potential” diagram tell us?

Warren: Several study groups have consistently shown that regardless of your industry, about one in five incidents is likely to have SIF potential, no matter what the outcome was.

In the past, we’ve focused on injuries, medical treatment and first aid events. Now, the things we want to extract more knowledge about are near misses, hazards and control failures. These three categories are the ones affected by the average 25% underreporting rate that Amy mentioned. It’s these categories that are our lead indicators; they help us avoid the more visible and damaging events at the bottom of the model.

When an organisation ramps up focus on hazard and near-miss reporting, it’ll get real benefits over time. When workers understand why that’s useful and they see there’s an organisational response to the incident data – in other words, things get fixed – you then start fostering ongoing reporting that enables better overall risk management.

What are the features of an organisation with a strong reporting culture?

Amy: Being quick to learn and quick to act. Adopting that learning approach – being curious, seeking information, seeking to understand, seeking to improve – builds trust and psychological safety with your people. And that, I think, in turn feeds back into your system and employee willingness to report in the future. When you have a mature safety culture, reporting is higher because workers appreciate that the incident data is useful and valuable, and that they’ll hear feedback or see action taken. It makes them feel it’s worthwhile submitting the information.

Warren: And that’s why board education and executive education are so critical, because as your safety culture changes, you’re going to get more reporting. So, it might look like the organisation is ‘worse at safety’ than ever before. But of course, that’s not true at all. In fact, you’re probably better. During culture change, you’ve got to look through other lenses to see there are positive things happening.

Warren: This dashboard example really captures what we believe are some of the essential indicators of risk, and how well risk is being managed in the field. You need to be looking at which of these high-risk categories tend to produce more incidents, particularly incidents with serious potential as distinct from serious actual. You want to be looking at what kind of controls are breaking down – and sometimes your incident investigations will show that information clearly. A dashboard like this enables the business to intelligently interpret how well high-risk tasks are being managed in the field. And if they’re not well-managed, ask questions and investigate how they can be improved to prevent serious injuries and fatalities.
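As a rough sketch of the kind of roll-up behind a dashboard like this – the categories, fields and records below are illustrative assumptions, not the actual dashboard – you could aggregate incident reports by high-risk category, counting serious-potential versus serious-actual events and tallying which controls broke down:

```python
from collections import defaultdict

# Illustrative records: (high-risk category, serious actual?, serious potential?, failed control or None)
incidents = [
    ("working_at_height", False, True,  "harness_anchor_point"),
    ("mobile_plant",      False, True,  "exclusion_zone"),
    ("mobile_plant",      True,  True,  "exclusion_zone"),
    ("manual_handling",   False, False, None),
]

summary = defaultdict(lambda: {"total": 0, "serious_actual": 0,
                               "serious_potential": 0, "control_failures": defaultdict(int)})

for category, serious_actual, serious_potential, failed_control in incidents:
    row = summary[category]
    row["total"] += 1
    row["serious_actual"] += serious_actual        # booleans count as 0/1
    row["serious_potential"] += serious_potential
    if failed_control:
        row["control_failures"][failed_control] += 1

for category, row in summary.items():
    print(f"{category}: {row['total']} incidents, "
          f"{row['serious_actual']} serious actual, {row['serious_potential']} serious potential, "
          f"control failures: {dict(row['control_failures'])}")
```

A summary like this shows at a glance which high-risk categories are producing serious-potential incidents and which controls keep failing, so the business knows where to direct its questions.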

What’s one recommendation you’d make to help organisations better learn from incidents?

Amy: I think if you’re looking at building a sense of psychological safety and trust, it’s best to approach everything with a sense of curiosity. Don’t approach it with the aim of identifying a cause – like someone to blame, or something to fix. Instead, approach data collection with curiosity. Later, once the learning is done, you can seek to fix and make improvements that will build confidence in the process.

Warren: I have two. Firstly, I recommend shifting from a reactive to a proactive approach to safety risk management. Imagine you’re responsible for managing critical risks embedded in your business’s operation. You need good, accurate data in real time to inform you. When an organisation takes a more scientific and systemic approach to safety data, frontline workers have greater faith in the whole process of incident management, which in turn helps build a healthier safety culture for all.

Secondly, frontline and mid-level leaders need to be in the field as often as possible. This on-the-job exposure enables leaders to observe and ask questions as high-risk tasks are performed. They can find out if there are issues that frontline workers are solving ‘creatively’ to get the job done. These observations inform total risk management, which makes the workplace safer for everyone.
