Related Content
Explore more comprehensive articles, specialized guides, and insightful interviews offering fresh insights, data-driven analysis, and expert perspectives.
Episode Transcript:
Welcome to The Safety Meeting by Novara. A lot of safety decisions still get made on experience and instinct, and honestly, that expertise matters. But today, we’re exploring what happens when you pair that instinct with real data discipline. My guest has spent her career helping organizations in high-risk industries build reporting frameworks and visibility systems. Caroline Miller is a Solutions Engineer at Novara who specializes in turning complex data into clear, actionable intelligence. Caroline, really great to have you on the show today.
Hey, thanks for having me.
Let’s jump into those questions. Safety has always been a field where experience counts for a lot. When you were working in data analytics on the operational side, what did you notice about how safety decisions were actually being made day-to-day?
That is a great question. There are a lot of decisions being made on a day-to-day basis. Some are more in the short term, someone’s hurt or we see some unsafe events that need to be addressed. Then there’s more long-term, making sure training is getting completed, hitting certain metrics, and making sure for our entire year there aren’t any serious injuries. There are a lot of different metrics that go into a safety program.
In terms of how decisions were being made, a lot of it without any true guidance can either be done ahead of time or just from a gut feel where you may not have any sort of empirical evidence to back up that decision. It’s more, “I’ve been in this industry for years, I know how things work, therefore this is how I’m going to do things.” There isn’t really any data-driven decision being made to say, “Is this the right thing?” And even if you are making decisions, do we have a way to track what’s working and what’s not?
I can see how experience in a field may not exactly correlate with the data you’re collecting. I feel like there are a lot of instances where managers don’t have the data and are having to go off gut feel alone. I think that can create a tension in a seasoned safety manager’s world between that gut instinct and what the data, or lack of data, is saying. Do you ever see these two things conflict? And how do teams that do this well navigate this issue?
There are a lot of instances where there have been tried-and-true approaches that have worked for years and years, and we know it works. It’s not something we necessarily need to track. However, there are situations where we know something is happening but we don’t have a formulaic way of tracking how something is working.
A great example: let’s say we have recurring incidents with a specific body part. Maybe people are constantly hurting their feet or their hands. We know this is happening, we have the anecdotes, we’re seeing our workers’ comp claims come in, but we don’t have any additional data telling us exactly what to do with that information. Do we need to look into what training people have had? Do we have visibility into training? Our ergonomic program? Do we need to order new gloves? It goes into all these other different discussions. We know what’s happening and usually we do XYZ, but maybe XYZ is not the correct solution for what needs to happen here to keep us moving forward.
We’ve heard a lot of stories where people are either having to dig up this data or figure out a way to actually look at it, because they have all that anecdotal experience but the data isn’t captured anywhere else. When people hear “data-driven safety program,” they may picture dashboards and reports. Tell us a little bit about how that actually looks in practice when a team starts making these decisions differently around this data.
Not to product-drop here, but Novara Flex, one really powerful thing about this platform is there are two approaches that are pretty sustainable on how to collect data. You can either export directly out of the system and get everything in a tabular format that would work with a BI tool like Power BI or Tableau, or there’s a REST API. You could set up something in an automated way where every single day you can just constantly have real-time safety data going into your business intelligence tools. You’re able then to have this very quick speed-to-insight in a sense of knowing exactly what’s going on with your safety program at all times.
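To make the tabular-export idea concrete, here is a minimal Python sketch of flattening JSON-style records (the kind a REST API might return) into the CSV format a BI tool like Power BI or Tableau can ingest. The record shape and field names are illustrative assumptions, not Novara Flex’s actual API schema.

```python
import csv
import io

# Hypothetical shape of records an EHS platform's REST API might return.
# Field names here are assumptions for illustration only.
sample_records = [
    {"id": 101, "site": "Plant A", "type": "near_miss", "body_part": None},
    {"id": 102, "site": "Plant B", "type": "injury", "body_part": "hand"},
]

def records_to_csv(records, fields):
    """Flatten API records into the tabular format BI tools expect."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    for rec in records:
        # Missing keys become empty cells rather than breaking the export.
        writer.writerow({f: rec.get(f, "") for f in fields})
    return buf.getvalue()

print(records_to_csv(sample_records, ["id", "site", "type", "body_part"]))
```

In practice a scheduled job would call the API daily and hand this CSV (or the raw tabular export) to the BI tool, which is what gives you that always-fresh speed-to-insight.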
I was a previous client of yours at a different organization. When I first started there, we had an EHS system but we weren’t really leveraging any sort of BI tool. It was more in the sense of compliance, making sure people were doing their safety observations. When they were, we would then manually enter it into a spreadsheet. It was a lot of just collecting straws to figure out what’s going on. It was not very scalable, it was hard to repeat, and obviously it was super prone to human error.
Then we took this approach where we were like, “Okay, let’s get Power BI involved. Let’s make the data collection a little more repeatable and very structured.” So we started to have a stronger foundation. When you have a strong foundation, you can use that to springboard. It’s really that “crawl, walk, and run” approach. For a long time, it really was that crawl and walk.
What was the thing that really clicked?
What really clicked was when we started to be very strict on: these are the 10 metrics that matter. This is how we’re going to measure them and compare different locations, different field offices, how they’re performing with those 10 metrics. If you hit it, you’re green. If you don’t, you’re red. If you’re kind of close, you get an orange or a yellow. Then you get a total score.
It was really then where people saw the “rack and stack” and being able to drill down to the person level of what’s going on in their program that people started to get really engaged. It was also so people-driven at the same time. If a location didn’t hit their training metric for Q1, they can go into Power BI, drill down into that exact dashboard that shows who’s doing what. “Okay, Joe Schmo did not complete quarterly training. We need to make sure this person gets trained up, then we’ll hit our mark, we won’t see that red anymore.”
It really got people engaged because at that point it wasn’t just us looking at graphs and moving on. It was us working towards a very specific goal and that element of, “Oh, I want to make sure I meet my goal because I want to be better than I was.” Having that data available where it was easy to collect and track, it would refresh every single day, we knew exactly where we were.
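The red/yellow/green “rack and stack” scoring described above could be sketched roughly like this. The 10% tolerance band and the “score = number of greens” rule are assumptions for illustration, not the actual formula Caroline’s organization used.

```python
# Hypothetical "rack and stack" scoring: tolerance band and total-score
# rule are illustrative assumptions, not a real organization's formula.
def rag_status(actual, target, tolerance=0.10):
    """Green if the metric hits target, yellow if close, red otherwise."""
    if actual >= target:
        return "green"
    if actual >= target * (1 - tolerance):
        return "yellow"
    return "red"

def score_location(metrics):
    """metrics is a list of (actual, target) pairs for one location."""
    statuses = [rag_status(actual, target) for actual, target in metrics]
    return statuses, statuses.count("green")

# Example: a field office measured on three metrics (say training completion,
# observation counts, incident-free days), each normalized against its target.
statuses, score = score_location([(1.00, 1.00), (0.95, 1.00), (0.70, 1.00)])
print(statuses, score)  # ['green', 'yellow', 'red'] 1
```

Ranking locations by that total score is what produces the drill-down leaderboard that got people engaged.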
That is such a great example because if you’re a longtime listener of The Safety Meeting, you know we’ve talked many times about gamification and how having data available everywhere makes such a difference in your safety programs when it comes to not only getting people to participate, but understanding where things are going wrong.
A lot of safety programs are still tracking incidents in spreadsheets and relying on lagging indicators, things that happen after the fact, rather than leading indicators. What’s the shift in thinking that has to happen before data can start really driving those decisions?
If you’re deep in the trenches of just getting data, you can’t have that more strategic conversation of what actually is going to move the needle forward. You’re instead just thinking about, “We just need to scramble this information together just so we can stay afloat.” Truly having a very streamlined approach of collecting and storing your data is step one.
Once you have that, you’re not pigeonholed into just tracking incidents. You start to think more strategically. “Okay, let’s see what kind of unsafe events we’re capturing. Are they mostly near misses? Are they unsafe conditions? What does that look like?”
There’s one concept that I really like in analytics, the “so what” and the “what now.” Anytime you’re looking at a trend, you’ll see the “what.” The “what” is the easy part. But then you also need that additional context. If you’re working in safety, you have other background knowledge that can add context to that visualization. Then you can start to formulate those “so whats.” “Okay, we’re seeing more near misses. Actually, in these locations, we’re not seeing any near misses. Is it because they’re under-reporting or are they truly not having any near misses happen?” Once you have the “so what,” you can then start to formulate your “what nows,” which are going to be your decisions.
And when you’re coming from a place where data hasn’t had this impact on you, I think it can sometimes be difficult to shift your mindset around it. For a lot of people moving to something more data-driven, there’s a concern that leaning too hard into the data can make running the safety program feel like a cold, bureaucratic thing, like they’re managing metrics instead of people.
So, how would you hold on to that human element while still bringing these real analytics and the discipline around it into this process?
It’s definitely a balancing act. But you can build a data model to be a little more flexible. We had that situation in my last organization. There would be people that wouldn’t hit their mark, maybe they were on leave, or they joined the company halfway through the quarter, or they got promoted. You can build different tools into your data models to account for that human element. Life happens, people aren’t going to be perfect.
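As a sketch of building that flexibility into a data model, the snippet below computes training compliance only over people who could realistically have completed the training, excluding anyone on leave or hired mid-quarter rather than counting them as red. The roster fields and eligibility rules are hypothetical.

```python
from datetime import date

# Illustrative roster; field names are assumptions, not a real HR schema.
roster = [
    {"name": "Ana",  "on_leave": False, "hired": date(2023, 1, 5),  "trained": True},
    {"name": "Ben",  "on_leave": True,  "hired": date(2022, 6, 1),  "trained": False},
    {"name": "Cara", "on_leave": False, "hired": date(2024, 2, 20), "trained": False},
]

def training_compliance(roster, quarter_start):
    """Compliance over people who could realistically complete the training:
    people on leave and mid-quarter hires are excluded, not counted as misses."""
    eligible = [p for p in roster
                if not p["on_leave"] and p["hired"] <= quarter_start]
    if not eligible:
        return 1.0  # nobody was expected to train, so nobody missed
    return sum(p["trained"] for p in eligible) / len(eligible)

print(training_compliance(roster, date(2024, 1, 1)))  # only Ana is eligible -> 1.0
```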
Really, at the end of the day, safety is so human. All of these things are going into place just so people can get home safely at the end of the day. That’s really all it is. Sometimes it can feel like a drag, but it’s leading towards a better safety culture, which therefore means that people are going to get home to their families every single day.
A great example: we had a metric all around driving. Ever since that certain metric went live, we saw that car accidents went down by 40% in a year. That’s a human element right there. That’s someone not getting hurt in their vehicle on the way to work.
Absolutely. And I think that when you come from these older processes and are starting to move into something that’s more software-based, it can feel that way. But I think we’ve seen it time and time again, like in your example. Even though you’re moving to something that is more data-driven, it’s going to have outcomes that are going to be good for your people. And when it comes to people, I know sometimes it can be difficult to get people on board with a change. As difficult as it may be to continue to manage things manually and through paper, it’s what people know. So when it comes to data-driven approaches stalling, because either leadership isn’t asking for it or frontline workers don’t trust it yet, what does it take to get real buy-in at both of those levels?
Change management is a huge lift. It’s definitely a team sport. Buy-in is especially tricky with data, because data is very hard to get people to trust. They’ll find one single metric that doesn’t align with what they think, and all of a sudden the whole thing is discredited.
So first and foremost, making sure the data is as clean as you can possibly make it. Step two: train your people, your main points of contact. Make sure they are enabled to be those experts and really have them own it. The person who is spearheading it should not be the main point of contact. It should be those “champions” within the organization who truly understand it, can speak to it, and from there starting to show the value. It can take a long time. It’s also a continuous effort.
Yeah, getting people on board when they already have a way they’ve done something is, like you said, a huge lift. Making sure they understand the value, or maybe see the fun of it in the gamification, can make a huge difference.
And when we talk about moving to these different programs and making more decisions driven by data, I think something that comes along with that is leading versus lagging indicators. Almost every modern safety program is tracking those lagging indicators: injury rates, incident costs, things like that. Those are easy because you can’t ignore them. But what does it look like when a program starts to build out leading indicators, and what’s hard about that shift from lagging to leading?
There’s not a lot you can do with lagging indicators. “Okay, it happened, we move on.” It’s not that proactive safety culture. A great example of a leading indicator is training. Making sure people are up to date. There was a region at my last organization that hit 100% training compliance for the whole year. We’d never seen it before. And yeah, their lagging indicators looked pretty good too, they had very few incidents and zero serious incidents. Trust the process.
Unsafe event reporting is another thing that’s within your control. It takes five seconds: pull up the form on your mobile phone and fill it out. Just by nature, you’re creating a safety culture.
Another one that I don’t think gets talked about enough is recognition. Safety can get into this negative space of “do this better,” “wear this.” We also need to encourage the positive side. We want people to feel empowered if they are being an example of how to be really safe. Making sure people feel recognized goes such a long way with intrinsic motivation.
And I think that goes along with the seasoned safety manager thing, because someone who is running a good safety program, even if they don’t have these data pieces together, understands who their team is, what motivates them, and when they’ll feel safe coming to them about something. You can’t really have that gut feel we’re talking about without a good understanding of your team and trust that they will come to you when those things happen. And that ties into decisions overall, whether it’s the gut feel or the data: you can’t make good decisions if you don’t have the right data.
So, what habits or systems did you see make the biggest difference when it comes to data quality, the stuff that actually makes these analytics trustworthy?
I think a big thing is just really enabling people. Put as much as you can into the system. It’s “garbage in, garbage out.” If you’re not getting good data, you can’t make good decisions. You really need not only that good data, you need that context as well to make the data make sense.
In general, you want your form approach to be intuitive and user-friendly. Leverage dropdowns. Stay away from multi-select fields (I have beef with multi-select fields); they can make data very messy very quickly. Single-select fields are just a lot easier to make decisions on in general. Make forms accessible and mobile-friendly, and ensure they work offline. Avoid hurdles that prevent people from reporting on things.
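To see why multi-select fields get messy, compare counting responses from a single-select field against a multi-select field stored as delimited strings: the latter has to be split (“exploded”) before any analysis. A small stdlib-only Python illustration, with made-up PPE answers:

```python
from collections import Counter

# Single-select: each response is one clean category, so counting is trivial.
single = ["gloves", "boots", "gloves"]
print(Counter(single))

# Multi-select: the same question stored as delimited strings.
# Every cell must be exploded into individual values before counting,
# and every downstream report has to repeat that cleanup step.
multi = ["gloves;boots", "gloves", "boots;helmet"]
exploded = [item for cell in multi for item in cell.split(";")]
print(Counter(exploded))
```

The delimiter choice, blank cells, and inconsistent orderings all compound in real exports, which is the mess single-select fields avoid.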
And that’s what’s so difficult about this: there can be a lot of manual work that goes into it if you don’t have a system that helps you put that stuff together.
So for any safety manager listening who knows their program is still running mostly on instinct, and who has heard this episode and decided, “This makes sense; I want to start moving toward something more data-disciplined,” where do they begin?
Start to figure out what resources you have available. Look into where to store your data. It does not have to be this full-blown, beautiful dashboard. Truly take it as a crawl, a walk, and a run. Start with something, tackle one quick win, get a feel for it, build a strong foundation, start to get some early buy-in, and then see how else you can evolve with it.
This does not have to happen overnight. This could take you six months or a year. But start somewhere. Maybe there’s just one specific metric that you’ve always wanted more visibility into. Throw it into a BI tool, maybe you have a pivot chart. Just stay curious. It doesn’t have to be this extremely sophisticated thing in the beginning.
Caroline, thank you so much for all your insight on this. Really appreciate you sharing your perspective with us today.
Thank you so much for having me on.

Caroline discusses how to strengthen safety programs by pairing seasoned expertise with data collection and analysis. She explains common gaps in instinctive decision-making and how digital tools, good forms, and team buy-in can give safety managers new visibility. Drawing on her past experience, she explains how using a defined set of key metrics drove engagement, action, and measurable improvements in safety compliance. She unpacks the mindset shift from lagging to leading indicators, highlights data quality practices, and shares tips for navigating tech adoption and trust-building during change.