Podcast: Henry David Thoreau of Process Safety – Trevor Kletz

Trevor Kletz revolutionized process safety through HAZOP advocacy, inherent safety principles, learning from accidents, and emphasizing design simplification over complex add-ons.
Oct. 14, 2025
19 min read

Key Takeaways

1. Inherent Safety Over Add-Ons: Design safety into the process from the start by eliminating, minimizing, or substituting hazards rather than relying on complex alarms and shutdown systems.

2. Systems, Not Blame: Human error is inevitable; the solution is engineering resilient systems that prevent mistakes from causing catastrophic outcomes, not punishing operators.

3. Learn from History: Organizations must actively embed lessons from past incidents into future generations, as "what went wrong" will happen again if knowledge isn't preserved and applied.

Transcript

Edited for clarity.

Welcome to Process Safety with Trish and Traci, the award-winning podcast that shares insights from past incidents to help avoid future events. Please subscribe to this free podcast on your favorite platform to continue learning with Trish and me. I'm Traci Purdum, editor-in-chief of Chemical Processing, and joining me as always is Trish Kerin, director of Lead Like Kerin. Hey Trish, what has inspired you this week?

Trish: I volunteered yesterday with an organization, spending about half a day packing food for people with food insecurity. It was a really affirming activity and highlighted how much privilege I have in my life. It made me feel quite humble, but it was a wonderful experience and great team building.

Traci: It's always good to give back for sure.

Trish: And congratulations are in order for you winning an Eddie Award for one of your articles. Multi-award winner you are!

Traci: Thank you. It was the column I wrote for Earth Day called "Lax Regulations Burn Rivers." I was pleased with it because I got to weave in a little of my hometown lore from Cleveland, Ohio, and talk about regulations and recent news.

Speaking of inspiration, today's episode is dedicated to Trevor Kletz, who sparked the whole process safety movement. Trevor was a pioneering British chemical engineer who revolutionized process safety in the chemical industry. His influential writings — including over 20 books, numerous papers and even articles in Chemical Processing — introduced concepts like HAZOP studies and promoted a culture of safety through shared knowledge. His work fundamentally transformed how industries approach risk management, making chemical plants worldwide significantly safer through his practical, experience-based approach. First question: Did you ever meet Trevor?

Trish: Sadly, no. I'd only been at IChemE about a year and a half when Trevor passed, so I never got the chance to meet him. Though I did meet his son after IChemE named their plenary lecture at the Hazards conference after Trevor. His son came to that first Trevor Kletz Memorial Lecture.

Traci: I didn't realize his son was involved. That must have been interesting.

Trish: It was. As with most of our parents, we probably don't always realize how much they actually did in their working lives because we see them as our parents. I think it was quite surprising for his son to realize the influence his father had, because we generally don't connect that with our parents. They're our parents, right? They're not other people. But certainly, they do have other lives.

HAZOP Advocacy

Traci: He had quite a life. Let's talk about HAZOP. What made his approach to HAZOP studies revolutionary, and how did it change the way industries identify potential hazards?

Trish: Sometimes people assume Trevor actually developed HAZOP. But in his own words, he didn't. It was developed by another engineer called Lawley at ICI, who decided to apply a detailed critical examination technique — which they applied to management decisions — to plant design. He spent three days a week, every week for four months studying the design of a new plant. Trevor didn't get involved in that particular activity because he was busy being a plant manager. But later, when he took a full-time safety job, he became an enthusiastic advocate for HAZOP.

The key part about HAZOP is that it's such a structured methodology that it forces you to consider all sorts of conditions that, if you were just brainstorming, you would not necessarily consider. In fact, you might even discount them and say, "I know that just can't happen." Doing the HAZOP forces you to examine every line in the design and ask, "OK, what happens if there's no flow, too much flow, too little flow?" You go through a series of guide words and focus on every permutation.
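To make that permutation-by-permutation structure concrete, here is a minimal sketch in Python. It is not any real HAZOP tool, and the node names, parameters and guide words are illustrative assumptions; the point is simply that the method, not the team's intuition, decides which deviations get asked about.

```python
# Minimal sketch of the guide-word discipline Trish describes: every line (node)
# in the design is examined against every parameter and guide word, so deviations
# you might dismiss in a brainstorm still get asked about. The nodes, parameters
# and guide words below are illustrative assumptions, not a real study template.

GUIDE_WORDS = ["no", "more", "less", "reverse", "as well as", "part of", "other than"]
PARAMETERS = ["flow", "pressure", "temperature", "level"]
NODES = ["feed line to reactor", "reactor", "overhead condenser"]  # hypothetical plant sections

def hazop_prompts(nodes, parameters, guide_words):
    """Yield one deviation prompt per node/parameter/guide-word permutation."""
    for node in nodes:
        for parameter in parameters:
            for guide_word in guide_words:
                yield (f"Node '{node}', deviation '{guide_word} {parameter}': "
                       "causes? consequences? safeguards? actions?")

if __name__ == "__main__":
    prompts = list(hazop_prompts(NODES, PARAMETERS, GUIDE_WORDS))
    print(f"{len(prompts)} deviations to work through, for example:")
    for prompt in prompts[:3]:
        print(" -", prompt)
```

Even this toy version generates 84 deviation prompts for three nodes, which is exactly why the technique surfaces conditions a free-form brainstorm would skip.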

It's now been expanded to so many different applications. We have CHAZOP, which is a control system HAZOP. We do procedural HAZOPs. There's a whole range of different HAZOPs now, and I think thanks to Trevor's enthusiastic advocacy, we have this broad application of a tool that was developed in the 1960s. The core of it hasn't really changed, which suggests it was a pretty good tool to start with. All these years later we're still using it, and it's still delivering great safety discovery and understanding of our designs. Trevor's advocacy, his use of it and his promotion of it have been fantastic for industries all over the world.

Inherent Safety

Traci: His enthusiastic advocacy of everything process safety really was revolutionary in being able to see the forest for the trees and understand what really mattered. Every day I learn more about Trevor Kletz. I did not realize that about HAZOP — I thought he created it. Let's talk about inherent safety. Can you explain his concept and give some examples of how it differs from traditional add-on safety measures?

Trish: There are many Trevor quotes over the years. When you're working in process safety, you can't go a single day or at least a week without a Trevor Kletz quote popping up somewhere. One of his famous quotes was: "What you don't have can't leak."

Think about that for a moment. That is the fundamental basis of inherently safer design principles. It's all about how we deal with that hazard in the design so we don't create the potential risk. There's a range of different elements in inherently safer design.

The first one is eliminate. Can we eliminate the hazard? If you don't have it, it can't cause you a problem. This is about going right back to the concept design and saying, "Can we actually take the hazard away?" If we can take the hazard away, then we don't have to manage it. That's the real difference between add-on safety measures and inherently safer design. If we can deal with the hazard in its design, then we don't have to manage it because it doesn't have the same impact.

The next one is minimization or intensification. Can we minimize the amount of the hazard that we have? Again, what you don't have can't leak. If we only have 100 liters and not 1,000 liters, we can only spill 100 liters. This is about how we can minimize the amount of hazard we're using, and we may do that through process intensification. We may increase pressure or temperature in one way, but it reduces the need for something in another way. We do trade-offs in inherently safer design. There's no one thing that's absolutely safe. We have to make some trade-offs.

We also talk about substitute. Can we take that hazard and replace it with something less hazardous? Another option is moderate or reduce the severity of the hazard. Can we change the design parameters? Can we change the way we're doing it so we reduce the severity of the hazard? All of these are fundamentally built into the design. They're not add-ons. Once we start adding on, we're beyond the realm of inherently safer. It's in the title: inherent. It has to be built into the design.

The last one that is sometimes forgotten — and we'll get to it a little later because you can't have a discussion about process safety and Trevor Kletz without talking about simplification — is: How do we simplify the plant, make it as easy as possible to use so people don't make a mistake? We want to remove the potential for the mistake or for the mistake to drive a significant risk outcome.

All of those elements are things we build into the design before we even construct it. We're not adding on alarms and shutdown systems and all these things. They're not inherently safer. They're add-ons. They're not part of the actual management of the hazard intrinsic to itself. That's the real difference.

Human Error

Traci: Now, talking about building safety into the design, there's a factor — the human factor — that isn't in that design. Let's talk about how Trevor transformed the way industries view human error, from blaming operators to examining system design.

Trish: This podcast is going to be full of Trevor Kletz quotes. The next one I'd like to share is: "Saying an accident is due to human failing is about as helpful as saying a fall is due to gravity. It is true, but it does not lead to constructive action. Instead, it merely tempts us to tell someone to be more careful."

When you think about it, yes, we are humans. We all make mistakes every day. I lost my cellphone this morning and had to hunt around the house finding it. I made a mistake. I put it somewhere unusual and forgot where. That was my mistake.

We all make mistakes every day. When the outcome of those mistakes can lead to a significant process safety event, that's a problem that says our system is not designed well enough. Our system is not resilient. We're in a situation where we're at risk of something going wrong when we need a human to get it right 100% of the time, because sadly, we never will.

He really changed this focus and said it's not about the human making mistakes — they're going to make mistakes. What we need to do is engineer our systems to be resilient enough that the mistake doesn't matter. This is about focusing on how we can design our systems to be inherently safer so when someone does make the inevitable mistake, it doesn't cause the incident. It might be a minor plant upset, it might be an inconvenience, but it's not going to cause a catastrophic outcome. That's what this is all about.

It's really about accepting that humans make mistakes, and it was an adaptation of the human factors field of study, which was first really developed in the 1930s in the aviation sector. Aviation, rail and now healthcare have been doing human factors for a very long time. The processing industries have been doing it now as well for quite some time. I still think we're a little bit behind some other industries. Aviation's been doing human factors since the 1930s and '40s — they've got a bit of a head start. But I think we can do more in the process industries to really get beyond this idea that the human made a mistake, therefore we punish the human. That is not going to solve the problem. As Trevor said, it merely tempts us to tell someone to be more careful, and that never solved a problem.

Learning From Incidents

Traci: What are some of the most important lessons from his case studies, and why did he believe learning from accidents was so crucial? That's something you and I talk about all the time, and it stems from his experiences.

Trish: It's really down to: We need to learn from what has happened before because if we don't, it's going to happen again. This is where I'm going to bring in another two Trevor quotes. The first one is that organizations have no memory — only people have memory. This is one of the challenges we have in learning. People remember something they've been involved in, something they've experienced that had a deep impact on them. When they leave an organization, the learning leaves with them.

That's one of our challenges: How do we take this valuable learning that people have and embed it into the next generation in the organization? We can have knowledge management systems, and it's fine to have a lot of data and information available to people. But if they're not looking for it, then it's just the digital equivalent of a dusty book on a shelf. How do we take these valuable lessons and embed them in the next generation and the next generation? Because it's not a knowledge management system that makes decisions — it's our humans that make decisions. We need them to have this background, this information. That is why learning from case studies is so important, because it takes what might just be a theoretical idea to someone and explains it in such a way that we see how it could eventuate by talking about an example of when it did eventuate. I think that's a really critical part.

One of Trevor's books was called "What Went Wrong," and then he wrote another one called "Still Going Wrong." He tells all of these case study stories. He also talks about incidents that'll happen today and tomorrow, and he actually predicts them in some of his books. You'd say, "Well, how could he possibly predict them?" Because they've already happened somewhere else before, and we are still repeating them over and over again.

I remember as a young engineer, the one that stuck with me the most was when he talked about the number of atmospheric storage tanks that get sucked in every year, because it doesn't take much to suck in an atmospheric storage tank. You look at these big steel tanks and think, "How is it possible that a simple plastic bag taped over a vent — left there because we were painting the outside of the tank and forgot to remove it — can cause a tank to be completely sucked in on itself?" Because logically you go, "That just doesn't make any sense."

Until you sit down and do the calculation. The vacuum pressure an atmospheric storage tank is designed to withstand is not that great a number. It's quite a small differential, and Trevor in his book compares it to a cup of tea or coffee. It's only a couple of inches of water pressure — enough to suck in an atmospheric storage tank. He had this amazing ability to take a complex topic and simplify it in a way that more people could understand the message. He was never simplistic, and that's a really important thing. It's not about being simplistic, and I hate the term "dumbing it down." No, it's not dumbing it down. It's just giving you a simile or a metaphor, something you can relate to, to understand it.

I look at my cup of coffee and think, "Gee, it's only that amount of pressure that it takes to suck in a storage tank." That is pretty intense. He created his stories and case studies this way so you could relate to what was there and actually then start to apply it and remember it. I sat down as a young engineer and went, "I'm going to do that calculation myself." Gee, he was right. What a surprise. Trevor was right. I had to go and prove it mathematically. I was a young engineer. But these are really important lessons, and that one stuck with me for decades because it's just so incredible, yet it's true.
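For readers who, like Trish, want to check the number themselves, here is a rough back-of-the-envelope sketch. The tank dimensions and design vacuum below are assumed, order-of-magnitude values for illustration, not figures from Trevor's books or from this conversation.

```python
# Back-of-the-envelope check of why a small vacuum can buckle an atmospheric tank.
# Tank size and design vacuum are assumed, order-of-magnitude values for illustration.

import math

INCH_H2O_TO_PA = 249.1                      # 1 inch of water column is about 249 Pa
design_vacuum_pa = 2.5 * INCH_H2O_TO_PA     # "a couple of inches of water" of differential pressure

tank_diameter_m = 20.0                      # hypothetical large atmospheric storage tank
tank_height_m = 15.0

roof_area_m2 = math.pi * (tank_diameter_m / 2) ** 2
shell_area_m2 = math.pi * tank_diameter_m * tank_height_m

# Net inward load if the vent is blocked and the internal pressure drops by the
# design vacuum (for example, liquid pumped out or vapor condensing overnight).
roof_load_n = design_vacuum_pa * roof_area_m2
shell_load_n = design_vacuum_pa * shell_area_m2

print(f"Design vacuum: {design_vacuum_pa:.0f} Pa (~{design_vacuum_pa / 1000:.2f} kPa)")
print(f"Inward load on the roof: {roof_load_n / 1000:.0f} kN "
      f"(~{roof_load_n / 9.81 / 1000:.0f} tonnes-force)")
print(f"Inward load on the shell: {shell_load_n / 1000:.0f} kN")
```

A few hundred pascals sounds like nothing, but spread across a 20-meter roof it becomes roughly 20 tonnes of inward load, which is why a forgotten plastic bag over a vent can fold a tank in on itself.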

The other quote I was going to give you — which is also one I love because it does come into this learning and these case studies — is when someone says, "Oh, well, you know, that's so unlikely. It's just too expensive to fix." His next quote is: "If you think safety is expensive, try an accident."

That's a really important one. Sometimes we might have to invest in safety, but the loss we're investing to prevent is much, much greater. You might invest $50,000, but the incident you're preventing might cost $10 million. So $50,000 doesn't sound like a lot when you think about $10 million. But we do get lulled into this false sense of, "Oh, but we won't have that $10 million loss. We'll be OK. We can skip the $50,000."

You and I have spoken about this before, and you know I often say that when we invest in process safety, we're actually investing in reliability. Reliability gives us productivity, and productivity lets us make money. Process safety can be expensive, but it can also make you money.

The Friendly Plant

Traci: And as you point out, the simple message is the one that sticks with you. I think you are very talented at that as well, and maybe it's because of examples like the cup of coffee — because it stuck with you — that you're able to help us with similar stories. I appreciate that. Let's talk about what Trevor meant by a "friendly plant." How can that philosophy be applied to modern industrial design?

Trish: The friendly plant concept was really coming back to one of those inherently safer design elements I talked about, which is simplify. If we make the process as simple as possible, then there are fewer steps for it to go wrong and less opportunity for a human error to cause something that could lead to a catastrophic event.

This whole idea of the friendly plant is: How can we take a really critical look at our designs? Does it need to be that complex? Does it need to have that many intermediate steps? One of the incidents we often talk about in process safety is Bhopal. Bhopal had methyl isocyanate stored in tanks, but it was only an intermediate product. You would say that perhaps wasn't a friendly plant by Trevor's definition. They didn't need to manufacture it, keep it in storage and then use it at a later date. They could have run continuous operations and consumed it as it was produced because, remember, what you don't have can't leak.

Taking a look at these things: Do we actually need that intermediate storage, or can we produce and consume? Can we change the formula and the process of how we're producing to eliminate steps altogether? Sometimes it comes back to looking at the base chemistry — can we formulate this in a different way? Sometimes it comes back to looking at other elements of inherently safer design. If we need a solvent as part of our process, does it have to be a flammable solvent, or can it be an inert solvent potentially? We look at these different elements we can put in place.

When we think about how we apply this to modern industrial design, we say: Do we need to have all of the complexity? Just because we can doesn't mean we should. When we talk about all this amazing technology — automated shutdowns, alarm systems, all these things — they're not inherently safer. They're the add-ons afterwards, and we can as engineers get quite excited about adding on more and more controls. But that's not necessarily making us any safer. Sometimes we need to strip back and say, "OK, let's go back and deal with the core hazard inherently." We don't need to just keep adding on all of these additional bells and whistles or belts and braces. We need to deal with our hazard at its core. Then there will be some add-ons we put in. Without doubt, we need some of them. But sometimes in our modern plant designs, we get far too focused on all of these different aspects, all of these different technologies. As I said, just because we can put a technology into place doesn't necessarily mean we should. Let's think about it from that simplification and friendly plant perspective.

Complex Safety Systems

Traci: He was very critical of overly complex safety systems. Were there other alternatives that come to mind?

Trish: He very much brought everything back to: Let's manage that hazard. Remember, it all came back to what you don't have can't leak. As engineers, that means we sometimes need to get a little more creative. As chemists designing processes and formulations, how can we apply what we now call the concepts of green chemistry? Green chemistry is very similar to the whole idea of inherently safer design, but it's all around the formulation. It's about focusing on how we can go through that process and manage what we're doing in a different way to deal with that hazard.

Simplification, removing the hazard wherever we can — if we can't remove it, then how do we go for a less hazardous alternative? That's really what he was a significant advocate of. I think our designs are better for his input over the many decades that he contributed to process safety as we know it today.

Traci: He truly was the Henry David Thoreau of process safety with simplify, simplify, simplify. Trish, is there anything you want to add about Trevor?

Trish: If you've never heard of Trevor Kletz or if you've never read any of his books, go and get them. Go and find any of Trevor's books and just have a read. He was a prolific author. He wrote many books. As I said, "What Went Wrong" and "Still Going Wrong" — there are so many out there. You will learn so much. They are easy to read. They are not deeply complex. He takes the complexity of incidents and explains them in a beautiful way that we can understand. Go get yourself some Trevor Kletz books and read them. As a process safety engineer, or indeed as a leader or manager in a field that deals with process safety hazards, you will learn so much. It will make you a better person in your role, so please go and read his books.

Traci: Well, Trish, you obviously learned a lot from Trevor, and you help us with that. You help us embed the valuable lessons from these catastrophic events. Unfortunate events happen all over the world, and we will be here to discuss and learn from them. Subscribe to this free, award-winning podcast so you can stay on top of best practices. You can also visit us at ChemicalProcessing.com for more tools and resources aimed at helping you run efficient and safe facilities. On behalf of Trish, I'm Traci, and this is Process Safety with Trish and Traci. Thanks, Trish.

Trish: Stay safe.

Editor's Note: Trish has written books on process safety, including "The Platypus Philosophy" and "Let's Talk About Your Leadership — Learning through the art of storytelling." You can find both here.


About the Author

Traci Purdum

Editor-in-Chief

Traci Purdum, an award-winning business journalist with extensive experience covering manufacturing and management issues, is a graduate of the Kent State University School of Journalism and Mass Communication, Kent, Ohio, and an alumnus of the Wharton Seminar for Business Journalists, Wharton School of Business, University of Pennsylvania, Philadelphia.
