
Building Safety Culture in the Manufacturing Sector with Haleyanne Freedman

Toby Graham


In this episode of The Safety Meeting from Novara, we talk with Haleyanne Freedman, Novara’s own manufacturing sector expert, about why near misses in manufacturing may go unreported, and how to fix that.


Haleyanne explains that production pressure, a lack of immediate consequences, the paperwork burden, and fear of blame or discipline may suppress reporting. She emphasizes how supervisor behavior, no-blame investigations, visible feedback loops, and leadership accountability can help. She even outlines a 90-day plan: first, diagnose cultural friction, then simplify and broaden access to reporting tools, and lastly, publicly communicate corrective actions and safety trends.


Episode Transcript:

Our guest today is Haleyanne Freedman, Novara’s own subject matter expert in the manufacturing space. She has spent her career on the floors of industrial manufacturing environments, building additive manufacturing divisions from the ground up, launching products in the startup world, and speaking at industry events around the globe. She’s been recognized as one of the most influential women in manufacturing and received the Women in Engineering Innovation Award. She also currently serves as the North American Chair for Women in 3D Printing.

When you work that close to industrial operations, you see near-miss moments happen in real time, and you understand exactly why workers hesitate to report them. That’s what we’re digging into today. Thanks for being with us, Haleyanne.

 

Yeah, thank you! I’m really excited to dive into this.

 

Absolutely. So, I think I want to start with the basics. What exactly counts as a near miss and why do you think so many people in manufacturing or other hazardous industries struggle to recognize one when it happens?

 

Yeah, so a near miss is an unplanned, accidental event that could have caused an injury, damage, or loss, but didn’t. So, like a forklift almost hitting somebody, a load shifting but not actually falling, or a guard being bypassed, but nobody actually gets hurt.

So, the reason that a lot of companies and people struggle to actually recognize one is that nothing bad actually happened, so there’s no quantifiable consequence, no actual injury triggered. Or, in the case of manufacturing especially, there’s no actual downtime. But in manufacturing we tend to react to the actual outcomes of an event like that. So if there’s no blood on the floor and production didn’t stop, people tend to just move on. That can be a reason it’s not as easily identified.

 

Something almost happens and you go, “Whew, good thing that didn’t actually turn into anything,” and you get right back on the floor. So, when you’re thinking about those near misses that never get reported, what are the most common reasons that workers are giving for not reporting them? And do you think that these are real reasons, or is there something deeper going on?

 

Yeah, so the most incredible part about near misses is that they’re the lowest-consequence data you can get for actually preventing incidents and injuries. Every actual injury that happens was likely a near miss first. So it’s a pretty big deal when they’re not being reported.

The reasons being, you know, if something almost happened and it didn’t actually happen, it’s like, “Oh, that’s not really a big deal”. Or they think that they fixed it, like somebody trips over a mat and they just quickly fix it. But one of the biggest reasons is because they actually fear being punished, or they fear consequence themselves as an individual for reporting a near miss, or they don’t want to fill out the paperwork.

But one of the biggest reasons is just the fear of blame or the fear of discipline, and then also the fear of slowing production. Manufacturing facilities are so focused on high output and how do you keep machines up and running, how do you keep operations going, because that is a quantifiable KPI that they’re evaluated on. So if you slow production by reporting a near miss, that can be kind of a scary thing as an individual.

And then I think the last reason is the fear that nothing will actually change. So sometimes if your safety culture is not amazing, workers will believe, “Okay, if I report this, what’s the point? Nothing’s going to be done about it anyway”. And there’s no actual incentive for the employees to report. They just know that if they report a near miss, they might get blamed for it, or they’re going to have to fill out a bunch of paperwork. There’s nothing from their perspective that feels very positive that would come from reporting that.

 

Right. I can see how, if you’re in an environment where fear is at the forefront of that thought process rather than, “This will be good for the team overall,” that could be a huge blocker. And that loops into the next question I had for you, about the talk around psychological safety in the workplace. It’s been a really big topic here on The Safety Meeting; if you’re a regular listener, you’ve heard us talk about psychological safety really recently. So, can we talk about what that actually means in practice on the manufacturing floor, and how it connects with whether or not those workers are willing to report?

 

Yeah, so an environment where a worker would feel very safe reporting, and that’s really at the core of it, is: Do they feel safe bringing up concerns? Even sharing their opinions about the general manufacturing process or about how safety is run. In a very psychologically safe environment, that means that there’s not this deeply rooted fear of speaking just in general, and they feel safe enough to express their concerns and their opinions.

And that’s such an important component when it comes to manufacturing, because all the different trends we’ve explored over the last decade have pointed the same way: you want to include as many employees as possible who are on the actual factory floor in evaluating your processes, even ones not related to safety, because they’re the ones doing the work. They have incredible ideas and perspectives that should absolutely be considered, as opposed to something coming from the very top and being passed down.

In a very safe environment, you’re not going to be labeled as careless if you report. It’s not about the person who’s reporting it; it’s more about the actual process. And you’re not going to get written up for reporting a safety issue, and you’re also not going to be viewed negatively by your peers for speaking up. So it very, very much starts with supervisor behavior; like, that’s one of the absolute most important pieces. But it’s also just about your general safety culture, because if you’re on a team of five and the other four people are going to be frustrated with you for reporting that because it’s going to impact them in some kind of negative way, that can really impact your willingness to speak up about those things.

So what that really looks like in practice is supervisors saying, “Thank you for reporting that”, first. And the investigation has to really focus on the process and not the individual person, regardless of what that is. And leadership also being able to call their own fouls or the fouls in the process and be able to openly admit mistakes publicly within the group. So when you have that kind of accountability that’s very publicly shared and exemplified by your leadership, that makes a huge difference in the overall safety culture.

And the last piece is the production pressure, so making sure that that doesn’t override safety conversations. Because production is so important; your actual output is so, so, so important. But if that overrides your safety conversations and throughput always wins, then psychological safety does not exist no matter what. You can’t just say you have it; you have to actually have it in practice.

 

Right. I can see how coming into a team with a culture where people would get upset by that, or where leadership is not making it clear that it is safe to report, can really disrupt trying to get people to report these things.

So, you kind of touched on this a little bit already in the last answer, but let’s talk more about leadership. What does it look like when they get it right versus when they’re getting it wrong?

 

Yeah, so at the core of it, as much as we may not want to believe this, supervisors are your safety culture. Your workers don’t really report to corporate values when it comes to safety; they report to their supervisor. Those are the metrics they’re held to; that’s the actual culture and environment they’re reporting into.

So, supervisors can get it wrong by leading with questions like “Who did this?”, or they get frustrated with reports because that’s more paperwork for them, and they treat near misses like paperwork instead of opportunities to improve. Or they’ll delay fixes; they won’t be as responsive. And they prioritize output over actually correcting some of those things.

And why that’s so wrong is everything that we’ve already talked about. If your workers are not incentivized or safe enough culturally to report those types of things, they don’t actually get reported. Companies that I’ve personally seen get it really right, they don’t just provide that like safe environment of “if you see something, say something”; it’s beyond that. They actually incentivize and recognize in a very positive reinforcement way when people do bring these concerns up and report these things.

Caterpillar is actually an excellent example. They have an incredible safety program and they very much pride themselves on how well they execute safety. It’s a very, very public thing. So they very much incentivize individual floor workers and operators to bring up safety concerns. Anytime that they make a change based on a safety report that was made, like “Hey, I think that we should change X, Y, and Z to make something more safe,” and there was an actual output. “This prevented us from doing this,” or “We actually saw production slow significantly less because we had less actual safety concerns.” They even had that as an award category for their end-of-year awards for the people who contributed safety tips or safety reports that helped them change their operations.

And I think that that’s an incredible way to do it. You make quick, small fixes wherever you can so that you can show that you’re responsive. Whenever somebody brings something up, you have a calm response, you have immediate gratitude for them contributing that, and focusing way more on the actual system breakdowns and then publicly closing the loop. Sharing with everybody, “Hey, we made this change. This is why. This is how it’s going to impact things,” and really having that perspective and that viewpoint of not just “Oh my gosh, we had this near-miss report and now we have to stop production, that’s going to slow things down.”

You know what’s going to actually slow production down way more? An actual incident. So companies that really focus on how safety contributes to the bottom line publicly recognize people for contributing those things.

That production pressure can be the silent safety killer. You don’t really have a reporting problem at the end of the day; you often have a supervisor behavior problem.

 

I could see how that would be, and hearing about companies like Caterpillar who are awarding those people making those reports at the forefront of their safety culture can really change how people see the system overall.

Knowing how safety has evolved in the last couple of decades, have you ever seen a company do a complete turnaround on their near-miss reporting culture? And if you have, talk about what’s changed and how long did it take before workers started actually trusting that process?

 

Yeah, I think that there’s a pretty consistent pattern and, you know, the unfortunate truth of what turns safety programs around in general is usually when something really bad happens. You know, they’ll have a significant incident that they end up getting audited on, or they have a severe incident that forces them to suddenly take safety very seriously.

And as manufacturing companies go from being really, really small to being medium- to large-sized companies, you have to see that shift because if you’re a three-person manufacturing company and suddenly you’re 100 people, that’s a lot more variables that can possibly go wrong, that’s a lot more machines, there’s just a lot of things that end up having to be tracked and organized.

And so this big incident happens, you take a really deep look at, “Okay, what is our safety culture? We have to establish objectives and how do we want to really structure this from the ground up?” And then you start to get deep in the, “Okay, we have the safety program, why are we not getting the reports?” or “Why are we still having incidents?”

And I think that that’s something that people end up seeing within Flex, too, is suddenly you have this reporting and you see, “Okay, we’ve had the same safety incident happen over and over and over and over again,” and that’s really easy to miss if you’re only looking at, you know, pencil and paper. And now that you have this data, you have to react to it.

So getting down to the psychological safety component of it, you know, you separate that near miss from discipline and all those things that we talked about where the leadership publicly celebrates early reports and different things like that. It really starts from incorporating that into your leadership philosophies and even providing training on response language. You know, not every supervisor on a manufacturing floor is particularly skilled in how to deal with some of those personal conversations and things like that.

So making those types of changes, or even just providing that kind of training and support for your supervisors, is something that I’ve seen make a big difference. I’ve also seen companies make their feedback loops visible: they had a bulletin board, or everybody has access to the reporting dashboards. Those are the types of things I’ve seen incorporated that end up changing the safety culture so that everybody feels like they’re a part of it, that they have influence, that they have access, and that their opinions matter.

And so I absolutely have seen those things, but what you do have to be prepared for is that reporting is going to spike, right? So you have to be prepared for that in a positive way and not freak out. You don’t want to get executives nervous, like, “Okay, we had no incidents being reported and then we’re making this safety culture change and then suddenly you get a spike in reporting”. You can’t let that make you nervous; it’s what you wanted. Right?

And then 6 to 12 months later after that, you’re going to see that injury rates drop or actual incidents end up dropping even though the actual near-miss report spiked. But it takes time to build that trust with workers; they’re not going to just, you know, the next day be like, “Okay, now I feel super safe and I’m going to report everything and everything’s going to be perfect.” Like, that’s just not going to happen. You have to continuously reinforce that this is a positive thing, this is a really good thing, and then over time, sixish months later, you’re going to suddenly see that everybody feels comfortable bringing concerns up and having more open conversations about it, and you’re going to have a much more cohesive safety environment.

 

Yeah, changes like that definitely don’t happen overnight. It’s a process to get from a place where you don’t have that culture to a place where it is fully established. And like you mentioned, seeing those reports spike can sometimes be scary for the higher-ups thinking, “Oh, where did this come from?”

But the systems that you have for those reports also matter, because if you don’t have a lot of reports and then suddenly you’re getting all these paper reports on your desk, it can kind of be tough to parse those and actually turn that into actionable data where it does reduce those incidents. So, if that’s a big piece of it and those processes are clunky or intimidating, I know it can have an adverse effect as well because workers won’t want to work with that process if it’s clunky.

So, what does a good near-miss reporting system actually look like, and what gets in the way?

 

Yeah, so I think one of the things that gets in the way is access. I’ve seen a lot of companies say, “We only want to make this reporting system accessible to supervisors or managers.” The problem with that is access, and also everything we’ve talked about when it comes to involving a supervisor in that conversation. For some of these workers, that makes it a lot more difficult to feel comfortable coming up and saying, “Hey, I have this issue.” If you always have to go to your supervisor, that doesn’t really incentivize reporting or make it a comfortable process.

And so a really good system is something that is quick to report. In my opinion, it should be accessible to everybody within a facility, so that everybody is part of the safety culture. Our system, for example, is mobile-friendly and mobile-accessible, so if everybody has the app, it can take you two minutes to report something on the phone that’s in your pocket.

Something that also allows for anonymous reporting can be helpful in some cases, again contributing to that culture where maybe not everybody feels the most comfortable bringing something forward. But you’re focusing on the actual hazard itself and the description, not whose fault it is, not who contributed to this. Especially if it’s a supervisor-run process, imagine having to go to your boss and directly report a hazard that was created by that one person. That’s not really creating a look at the process; that’s “Well, who made this? Whose fault is this?” And that culture in general is just a toxic workplace environment when it comes to safety. So it’s really not helpful.

But then when those things are reported, the other piece is the transparent status updates. So, “Hey, we got this safety report, this is something that we’re looking into.” Like if you put in a safety report no matter how accessible it is, if you never heard about it again, you’re probably not going to want to keep reporting things, right? Like you want to know how was this responded to, was this useful. So if somebody makes a report and you have transparent status updates like, “Hey, we got four safety reports this month, here’s what they were, here’s what we’re doing to respond to them,” that very, very much makes people want to participate more, want to contribute to those things more, and then also contribute to the data because then you can track things like, “Okay, where is that in the corrective action process? Where is that in being tracked and how far along are we in implementing a change?”
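The pieces Haleyanne describes here, a quick hazard-focused report plus a transparent status update everyone can see, can be sketched as a toy data model. Everything below (field names, statuses, example hazards) is hypothetical for illustration and is not meant to represent Novara’s actual Flex product:

```python
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    RECEIVED = "received"
    INVESTIGATING = "investigating"
    CORRECTIVE_ACTION = "corrective action in progress"
    CLOSED = "closed"

@dataclass
class NearMissReport:
    # Describe the hazard, not the person: there is deliberately
    # no "who caused it" field, only an optional anonymity flag.
    hazard: str
    anonymous: bool = True
    status: Status = Status.RECEIVED
    resolution: str = ""

def status_board(reports):
    """Render the kind of transparent status update everybody can see."""
    lines = [f"{len(reports)} near-miss reports this month:"]
    for r in reports:
        note = f" -> {r.resolution}" if r.resolution else ""
        lines.append(f"- {r.hazard}: {r.status.value}{note}")
    return "\n".join(lines)

reports = [
    NearMissReport("Load shifted on rack 4", status=Status.CLOSED,
                   resolution="Rack anchors replaced"),
    NearMissReport("Guard bypassed on press 2",
                   status=Status.INVESTIGATING),
]
print(status_board(reports))
```

The point of the `status_board` shape is the feedback loop she mentions: every report surfaces with where it sits in the corrective-action process, so a report never silently disappears.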

And those things also matter too for regulatory bodies like OSHA and for ISO especially. You know, all those things you should really get in practice of really adequately keeping track of those things and having a not just “Oh, somebody wasn’t trained properly,” but getting down to the root cause: “How are we going to fix this permanently, not just this one time?” All those different things are very important.

 

I think you’ve given some great insight on a framework for how you get started, what it should look like, and what happens when things go wrong. You touched a little in your last answer on what happens after the report and how workers can trust that. So, let’s talk about that. What should happen next? A lot of organizations collect the data, and maybe nothing changes. Thinking back to a clunky process, if you’re taking all of that in on paper, it might be difficult to make those changes. So, how do you close the loop in a way that actually reinforces the behavior that you want?

 

Yeah, I think one of the things that I hear from manufacturing facilities the most is “We don’t have time to…” Because you have so many other things that you have to be concerned about; manufacturing at its core is just a fast-moving industry, everything’s fast, everything’s dangerous, everything’s always moving at a million miles a minute every single day.

And when you neglect to take these types of things into consideration, like we talked about before, you end up creating much larger, much more impactful issues for yourself later on. So I think that there’s really like three things that can make the biggest difference. And one is to acknowledge the actual safety concern immediately. You know, like it’s really easy to be like, “Okay, I’ll look at that later, I’ll look at that later, I’ll look at that later,” but later becomes never. Like you really have to acknowledge that within like 24 hours. It should be an immediate acknowledgment of like, “Hey, I’ve seen this, I’ve reviewed it.” If it’s a two-minute report, it can take you two minutes to review and acknowledge it.

And then to actually initiate an investigation within days and not weeks. So same exact thing: You don’t want to find out that two months later we’re going to take a look at it. No, this is an incident that deserves to be investigated immediately because what if somebody gets injured tomorrow because of something that was reported right now? And the worst possible scenario is you set up this reporting infrastructure and everything like that, somebody reports something, it takes you so long to review it that it ends up actually turning into an incident within that time. And that’s obviously very preventable.

So once you actually investigate it, then figuring out a resolution no matter how small or how big, and communicating that resolution publicly. So safety teams or companies that I’ve seen that do this correctly will have these safety meetings and they’ll review every single thing that was reported that week and say, “This is what we were in process of investigating this thing, this is where we’re at with it,” or like we talked about before on the dashboards, you can actually show in Flex, like “This is an incident that we have open” or “This is a near miss that we’re currently investigating and this is where it’s at in the process.” So that people can see that real-time feedback loop of “Okay, we’ve actually created a resolution to this and we’re implementing it.”

So I think the most powerful sentence that you can say, especially in these types of things, is “Because of this report, we changed X, Y, and Z.” If the reports kind of disappear into a black hole, then your trust with your employees collapses.

 

Yeah. I think knowing that and hearing the specific struggles that manufacturing has with the fast-paced environment, maybe setting up that culture, it may be tougher in those environments to get this reporting really solid. So, I’m sure there are other industries and organizations where near-miss reporting is done really well. Can you give some insights on what manufacturing settings can borrow from those other industries that are doing it well?

 

Yeah, I think the industries that tend to do it well are the very high-consequence industries, or some of the higher-maturity sectors, like aerospace manufacturing. That’s a very, very high-consequence industry. Even just getting a new material spec’d in can be a seven-year process; it’s a very, very long process because it has to go through the most intense checks and balances. But they have very strong corrective action systems, they have formal review boards, they have stop-work authority that’s very real, and safety is really treated like quality in aerospace.

And so the same thing goes for industries like automotive manufacturing, also very, very high consequence. You have daily safety walks, you have visual boards that everybody is a part of it, you have tiered accountability meetings and very rapid hazard correction because it all comes back to a very core liability when you’re talking about something that’s going to end up going to an end customer base that’s a consumer. You know, think of the consequences if millions of vehicles were sold to people that had something that was overlooked and all these people started getting in car accidents.

I think we’ve seen the repercussions of that. My dad had one of those Ford trucks where the wheel wells fell off, and it happened while he was driving and he was very lucky that he was going not very fast. And that was such a big deal and such a big recall and obviously incredibly expensive for Ford to remediate that, and that’s one of those things—it’s like it likely could have been prevented. But then they have to be able to go back and see exactly where that breakdown was to correct that, so they have to have perfect reporting.

So I think what other manufacturing companies and industries can borrow from that is: if you treat near misses more like actual defect data, they’re so much more useful to you, and you track the severity potential as opposed to just the outcome. If somebody could have had their hand cut off but didn’t, that severity potential is just as valuable to you as an actual outcome, if not more, because with an actual outcome you’re remediating after the fact.

So reviewing that and tracking it by severity potential is really important, because that helps remind you why it was such a big contributing factor. And then reviewing trends weekly; that’s something they do all the time in those other industries. They’re always reviewing, “Okay, what are our trends? Are our safety incidents or near misses going up or down? What does that look like and why?” And making that available to everybody and reviewing it is very, very important. So I think those are things you can borrow from those industries to build a more effective program.
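Treating near misses like defect data, scored by severity potential and reviewed as a weekly trend, can be sketched in a few lines. The records, dates, and scores below are entirely made up for illustration:

```python
from collections import Counter
from datetime import date

# Hypothetical records: (date reported, severity potential 1-5, outcome).
# Severity potential scores what *could* have happened, not what did.
near_misses = [
    (date(2024, 3, 4), 5, "no injury"),   # hand nearly caught in press
    (date(2024, 3, 6), 2, "no injury"),   # tripped on loose mat
    (date(2024, 3, 12), 5, "no injury"),  # forklift near-collision
    (date(2024, 3, 13), 5, "no injury"),  # same aisle, again: a trend
]

# Weekly trend review: count reports per ISO week, like a defect chart.
weekly = Counter(d.isocalendar()[1] for d, _, _ in near_misses)

# Rank by severity potential so the "almost lost a hand" events surface,
# even though their outcome was identical to a stumble: nothing happened.
high_potential = [m for m in near_misses if m[1] >= 4]

print(dict(weekly))          # reports per ISO week number
print(len(high_potential))   # events worth a root-cause look first
```

Note that every outcome above is “no injury”; sorting by outcome alone would rank all four events the same, which is exactly the trap the severity-potential field avoids.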

 

It sounds like a lot of that could come from reframing, when you were talking about the aerospace industry seeing near misses and incidents as like a quality issue. So, I think that there are many ways that you could look at that and try to turn it around, but if you’re the only one looking at it that way, it can be difficult. So, what would you say to a safety manager who feels like their workers are just never going to speak up? Do you think that’s a cultural thing that just cannot be changed? And what would be your response to that kind of fatalism?

 

I mean, culture isn’t fixed in general. Culture can always be shaped, and it can always be changed. But how that gets shaped is what gets rewarded, what actually gets punished, and what leaders end up tolerating.

If workers aren’t speaking up, that in itself is a piece of data, and it usually points toward how your supervisors are behaving, how discipline is linked to reporting, or a lack of follow-through. So if your safety culture feels fixed or problematic, like it’s never going to change, you really have to take a very honest, hard look at your supervisors themselves. But it’s not set in stone; it’s not unfixable. It really does start there, and you have to take an honest look at that and see how you can influence your supervisors and shape that culture with them, so that it carries from the top down.

 

And I think if you’re in that position, it’s definitely a tough transition to make, but I think you’re right, it is totally doable.

So, as we wrap up, if a safety manager is listening right now and they’ve taken away everything that you’ve said and gone, “Yes, this is something we need to do,” how can they make some meaningful progress on near-miss reporting in the next 90 days? Where should they start?

 

Yeah, so I think, you know, the first 30 days you really have to take a look at your reporting friction. Like, what is that looking like with your employees? Is there an actual cultural issue between responses from supervisors? Is your discipline policy connecting safety issues to actual discipline? Are people getting written up as a result of reporting those different things? You have to take a really core look at that kind of baseline piece, and then also look at your supervisor’s responses.

So shadow that: work with supervisors and communicate with employees. I’ve even seen people do skip-level meetings, where if you’re a high-level executive you go directly to floor workers and give them an anonymous way to tell you what they’re experiencing. And do an anonymous survey: “Hey, why don’t you report?” or “Do you see things?” You might be able to identify issues or breakdowns in your management hierarchy or management responses. You really have to understand, at the core, what the problem is, and look at all of those different things: the reporting, the discipline policy, your supervisor responses, and working directly with those employees. I do personally feel like a truly, truly anonymous survey that gives people a platform to communicate freely and honestly is a huge step, and very helpful if you really want to get to the core quickly.

And then after that, the next 30 days is simplify how those things are reported. So if you don’t currently utilize mobile options or reporting, or you don’t currently allow everybody to be involved in the safety process, start now. Enable that mobile reporting option, train the supervisors on how to respond to those different things, but make it very accessible to everyone. It’s not going to be a successful system if you only have, you know, five to ten people in your plant who are managers who are able to report things. Like yes, they’re on the floor, but your actual workers are the ones who are operating the machinery, who are doing most everything, and so you want them to be involved and you want them to have access to reporting.

And then start actually publicly closing some near misses. So show people immediately, not over an extended period of time but immediately. If they start reporting near misses, even if they’re really, really small, start publicly closing those. Show the feedback loop, show that you’re actually doing that so that you can start to change that culture piece.

And then start reporting on the trend. So start reporting things like, “Hey, we’ve seen a trend down in this area,” or even a trend up; it’s okay if your safety program isn’t quite working yet. I think that’s something you should share with your team: “Hey, we noticed an uptick in these reports and we’re not seeing a downtick yet. How do you think we should start to resolve this? How can we include you in this decision-making process?”

Or maybe share the near miss of the month. Start recognizing those things as a very, very positive thing to really start changing and seeing that shift in the safety culture, and just really reinforce the, you know, no-blame language, no-blame behavior on individual employees. You can’t really fix safety culture broadly; you have to really focus on fixing the actual behaviors at their core within your facility a lot more tightly. So, that’s my opinion, I think that’s the best way to do it and the ways that I’ve seen safety programs kind of turn around in this way more successfully, but that’s the best place to start.

 

Those are some great actionable insights. These sorts of topics are complex issues, with lots of pieces that have to come together to make a change. So, I think you’ve done an incredible job covering how to shift the culture, how to actually get that reporting going knowing that there’s going to be an uptick in reporting, and understanding what happens once that data is actually being used to improve not only the culture on the team but the safety of the company overall.

So, thank you so much for sharing that with us today, it was so great to have you, your insights have been invaluable and I’m sure our listeners are glad to have them.

 

Fantastic, I’m so glad I was able to share this with everyone!

Subscribe to the Safety Meeting

If you like what you’re hearing, please consider subscribing and leaving us a rating or review; it helps other listeners like you find us.




Kat McConnell

Kat McConnell supports Novara's communications team and, during university, spearheaded the creation of the student radio station, fostering a passion for podcasts. Apart from her role, she dabbles in portrait photography, culinary pursuits, and is known for her trivia prowess, earning her the senior superlative of "most likely to be a Jeopardy contestant." Kat is your go-to for Ina Garten recommendations, podcast suggestions, or any un-Googleable questions.

