Avoid Common Pitfalls of Risk Perception Via Statistics, Storytelling

Dec. 19, 2023
This episode discusses the challenges of intuitively understanding uncertainty and likelihoods in risk assessments. It also stresses the need for storytelling and stakeholder engagement.

Transcript:

Welcome to Process Safety with Trish and Traci, the podcast that aims to share insights from past incidents to help avoid future events. This podcast and its transcript can be found at chemicalprocessing.com. I'm Traci Purdum, Editor-in-Chief of Chemical Processing, and as always, I'm joined by Trish Kerin, the director of the IChemE Safety Centre.

We also have a guest today. Dr. Melissa Humphreys is a statistician. She creates and applies analytical tools that make sense of our world, working in areas like forensic science, defense, and psychology. The goal is always to increase efficiency, accuracy, and transparency. Building bridges between machines and experts, Melissa's work aims to support experts in making decisions in an explainable way that is useful, easy to work with, and unbiased.

Melissa is an ex-chef who completed her PhD in statistics and mathematical psychology from the University of Tasmania in 2017 with an 11-month-old son in tow. She is currently a lecturer in statistics at the University of Adelaide, and splits her time between teaching, research, and fighting for changes that will make academia more accessible to everyone. Sounds super easy, Melissa. Welcome and thanks for joining us.

Melissa: Thanks so much for having me.

Traci: Well, tell me, how does a chef transition to statistics?

Melissa: Yeah, so being a chef is very cool and very fun, but it takes a lot of work and a lot of time. Long hours, long days, Friday nights, weekends, and I figured it was probably time to get a real job where I could just work nine to five and be a little bit more chilled. But kind of the joke's on me because I wasn't expecting to go where I currently am, so you know. The plan wasn't to do statistics; it was actually to do psychology or social work. The head of the English department enrolled me in mathematics instead. So here we are.

Traci: Well, an interesting transition for sure.

In part of the background that Melissa sent over, she noted that you can build a house without a hammer. In a world overflowing with data, we need reliable tools to build our understanding of what it all means. Trish, I know you and Melissa met via Superstars of STEM, and you struck up a conversation about how bad we are at intuitively understanding uncertainty and likelihoods. You want to talk a little bit about the reasoning behind asking Melissa to join us today?

Trish: Yeah, sure Traci.

So I think for me, when I have conversations with people about risk, we often get caught up in dismissing something because it's got a really low likelihood. We figure it's never going to happen anyway, so we just ignore it and don't take the appropriate action, because we don't really understand what the likelihood means. We often hear people talk about a flood event, say, being a one-in-a-hundred-year event, and so people think that it won't happen for a hundred years, but that's not what it actually means; it means the event has roughly a 1% chance of occurring in any given year.
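The "one in a hundred years" misconception Trish describes can be checked with a little arithmetic. A minimal sketch in Python, assuming each year is independent with the same 1% annual probability (a simplifying assumption; real flood frequencies are more complicated):

```python
# Probability that a "1-in-100-year" event happens at least once
# within a horizon of n years, assuming independent years with the
# same annual probability p.
def prob_at_least_one(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

annual_p = 1 / 100  # the "1-in-100-year" annual probability

# Over the next 100 years the event is more likely than not (~63%),
# not something that arrives on schedule at year one hundred.
print(prob_at_least_one(annual_p, 100))

# And there is a real chance of seeing it within just the next decade.
print(prob_at_least_one(annual_p, 10))
```

The point of the sketch is that "1-in-100-year" is a rate, not a countdown: the event can happen next year, and over a century it is more likely to happen than not.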

On the other side, human nature's a really interesting thing. We go and buy lotto tickets with a chance of trying to win money, and the chances are really, really, really low, but we think that's a good deal and we think that's a good risk to take because there's an upside and an opportunity to it. So I think part of our lack of understanding of risk is really about, we're blinded by the opportunity. We want that so we seek it, and we avoid the negative consequence, but we often avoid it by just trying to pretend it's not going to happen.

And so when I was chatting to Melissa, it was like, "Okay, talk to me about the nature of statistics and uncertainty, because I think this is an area that as a society we need to learn a lot more about."

Traci: And Melissa, why do you think it's so hard to communicate these concepts, even to a highly trained audience such as ours?

Melissa: It's hard to communicate even among statisticians. The problem is that it's not intuitive, and that makes everything really difficult. It's not something that you can just hear and run with. It's something that you have to really think deeply about and understand with depth and complexity to be able to get impactful meaning out of the discussions that you're having.

And I know that sounds maybe a bit vague, but it's kind of context-based as well, right? So, in the examples that Trish was just talking about, when I teach my first years, I always ask them, "Put a number on 'likely' for me. What number do you think 'likely' means?" And often they will say things like eight out of 10, 90%, something like that. But if you were pregnant and I told you that there was a one in five chance that your baby was going to have a rare defect that meant it wasn't going to live, would you feel like that was likely?

So, even the language that we use, whether we're talking about probability, whether we're talking about risk, uncertainty, likelihood, all of those things are kind of tainted by the situation that we're talking about, the expectations, the outcomes associated with them, how positive they are, how negative they are, and that can be really individual as well. So that makes communication of this stuff really difficult.

Traci:  Can you explain the difference between risk and uncertainty in statistical terms?

Melissa: Sure. So, you can look that up really easily, right? And a lot of the definitions that you'll see say that risk is something that you can measure, risk is something that you can predict and you can know, whereas uncertainty is something that we don't know. So, we can't necessarily measure it, we can't necessarily predict it.

Statistically, it's more complex than that. There are risks that we can predict with great certainty, and those are usually risks that are predicted using physical models where we understand the processes that actually underpin the risks that are coming out. So, when you've got risks associated with mechanical failure, for example, you can build a mathematical model that perfectly represents the piece of machinery that you're looking at and test it under various scenarios and see when it's going to fail.

A lot of the time, though, risk is something that we're estimating, given maybe sometimes a physical model, but sometimes a statistical model and a statistical model is something that has been produced based off of data that we've seen rather than an understanding of the actual physical characteristics of the thing we're measuring.

So, when we build a statistical model... You can think about this in finance for example, you want to predict a default on a home loan. I've got tens of thousands of people that have had home loans before. I look at all of the data that I happen to have on those people, and then I build some sort of model that uses that data to predict whether new people coming into the system are going to default on their home loan or not.

That is not a physical model; that's a statistical model. So, we're using the data that we have to make a prediction, and that in and of itself has uncertainty associated with it. So, risk and uncertainty often cannot be disentangled. Uncertainty is this thing that means that although we might have a prediction, we're not sure exactly how precise that is.

And we can capture uncertainty. We can have estimates of how uncertain our models are, or how much uncertainty there is in our models, and those things often need to be communicated alongside our predictions. So, when you hear a prediction of risk, eight in 10, one in a hundred years, there will often be bounds around that that try and capture the uncertainty that we have, or how reliable we think that estimate is.
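Melissa's point that a prediction should travel with its bounds can be sketched with a simple interval estimate. The numbers below are hypothetical, purely for illustration; the code computes a standard 95% normal-approximation (Wald) interval around an estimated failure rate:

```python
import math

# A point estimate should travel with its uncertainty.
# Hypothetical data: 12 failures observed in 1,500 trials.
failures, trials = 12, 1500
p_hat = failures / trials  # point estimate: 0.008

# 95% normal-approximation (Wald) interval for the rate.
z = 1.96
half_width = z * math.sqrt(p_hat * (1 - p_hat) / trials)
lower, upper = p_hat - half_width, p_hat + half_width

print(f"estimated rate: {p_hat:.4f} "
      f"(95% CI roughly {lower:.4f} to {upper:.4f})")
```

Reporting "about 0.008, but plausibly anywhere from about 0.003 to 0.013" communicates something quite different from the bare point estimate, which is exactly the distinction being made here.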

Traci: Trish, what are your thoughts? 

Trish: Yeah, I think one of the challenges for me is we often make all these assumptions when we do our risk calculations as well. And the quality of those assumptions impacts the quality of the model that we build, the statistical model, which is obviously going to impact the outcome at the end. And so that can have a really big impact on the result.

But the other challenge I see is that once we start reducing things down to numbers for people and we put those numbers in front of someone to make a decision, one, typically people don't understand what the numbers are saying, but two, they assume it must be really accurate, particularly if it's a number with a lot of decimal places to it, because you've obviously done a lot of calculations to get that many decimal places.

But the fact is that when we are making these models, we're often doing things like multiplying decimals by decimals, and all you get is a smaller decimal. You're not getting more accurate at that point in time, you're getting less accurate. And, so, I think that's also a big challenge we have, that people think that just because we can state a number, that we must be stating that with this degree of certainty, but we're actually not.
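Trish's observation about multiplying decimals can be made concrete with first-order error propagation: for a product of independent factors, the relative uncertainties combine in quadrature, so the result is less certain than either input even though it carries more decimal places. A small sketch with made-up inputs:

```python
import math

# Multiplying point estimates produces a number with more decimal
# places, but the relative uncertainty grows rather than shrinks.
# Made-up inputs: two independent factors, each known to within +/-10%.
a, rel_a = 0.3, 0.10
b, rel_b = 0.2, 0.10

product = a * b  # 0.06 -- looks precise, but isn't

# First-order propagation for a product of independent factors:
# relative uncertainties add in quadrature.
rel_product = math.sqrt(rel_a**2 + rel_b**2)  # ~14%, worse than either input

print(f"{product:.4f} +/- {rel_product:.0%}")
```

So a risk figure quoted to four decimal places may still be uncertain to 14% or more, which is the trap of reading decimal places as accuracy.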

Melissa: No. And this gets down to that argument about whether you ever present a single number or whether you present a range of numbers. In forensic sciences, they don't even present numbers when they go to court, they present verbal summaries of what that number might mean to try and help jurors understand the impact of the number, rather than attributing importance to the number itself.

And you're exactly right that the assumptions that we build in have a really big impact on the things that come out the other end. Similarly, things change over time. Sometimes an organization will have someone create a model of risk and keep using it moving forward, and three or four or five or 10 years later they're still using the same model, even though their machinery has changed, their software has changed, their personnel have changed. And that means that the model is not actually capturing the things that they need it to in the same way that it was previously.

And, so, making sure that you're understanding that these estimates have assumptions that may or may not be fair, that they may have been trained on things or created in a time that is a little different to now. All of that adds to this uncertainty around the number that you get.

Trish: I love that idea from the forensic space of actually using words to tell the story, as opposed to using the numbers, because we understand stories a lot better. And that's certainly one of the things I talk to people about: how do you communicate the results of your risk assessment? You need to use words to do it, not just the numbers.

Because we've described why the numbers are problematic. Using the words can actually help people comprehend what it's about. They can figure out what it is you're trying to tell them. They can understand how bad the consequence might be, and that might be really necessary to get them to agree to taking action, to making sure that consequence never happens because of its magnitude.

So it comes into my old perennial hobbyhorse, Traci, of we need to use storytelling. We need to use our words to tell our stories.

Traci: Indeed. And you know I like words.

Melissa: The process of storytelling, though, is one that takes thought and time to craft. I was talking to our chief forensic scientist just yesterday, and he was saying that often when you're in court and you're trying to use these words and stories, they've found that the jury will side with the most charismatic presenter rather than with the weight of the facts. And so understanding that the way these stories are told, and the way that you craft them, has a real impact on the way people will pick up the information that you're giving is a really important part of that storytelling, I think.

Trish: Yeah, definitely.

Traci: How can statistical methods help identify the influential factors affecting risk?

Melissa: Statistical models are sensational for finding things that we don't know exist or quantifying things that we think might exist. And so many of us will have worked in a situation before where we've got maybe a very experienced person that says, "This is the way it is." And you think that maybe it's not like that in your experience. In what you've seen, it doesn't seem to be like that, but they're so adamant.

And when you have decision-makers who are making decisions based on their lived experiences and their beliefs about what's going on, you can end up with decisions that actually don't reflect systems as well as they should. Sometimes they're fantastic, but sometimes they're not. So, statistics gives you a way of being able to quantify whether those things are real or not, but it also gives you a way of being able to dive deeper into the systems that you have and find the things that you didn't know about that are producing outcomes that you do or don't want to see.

There are really powerful techniques that can look in really big data to try and find patterns that exist. There's also this beautiful thing where we can look at the way that variables interact with one another. So instead of just looking at a single thing and saying, "When this thing happens, this outcome happens," we can then look at the event and dig deeper into, what more subtle things came before that event that might have been the triggers for that event occurring in the first place? So it gives you kind of an unbiased insight into the systems that we have, which can be very powerful.

Trish: Yeah, I think those are going to be some of the really powerful techniques and tools we need to grasp more as we go forward into process safety. So not only being able to use things like big data to uncover these trends or these precursors that we just don't have the capacity to see as humans, because of the magnitude of the dataset, but also to really help get people to understand what it is that we are looking at, and to understand the multiple failures that occur.

Traditionally, one of the other things we get stuck on in risk assessments is we never consider what we call double jeopardy in a risk assessment. We never consider that, "What happens if this fails and then this fails, and then that fails?" We go, "No, that's double jeopardy. That's triple jeopardy. No, we've just got to go one line of failure at a time."

Well, wouldn't the world be a wonderful place if we only ever had one line of failure at a time? But every incident I can ever think of had multiple failure sources that led to it occurring. I think there's space here for us to get a lot better in the AI space and the big data space in really starting to understand some of those things. I think that's an exciting future for us there, Mel.

Melissa: And you know, with the creation of digital twins, have you heard of digital twins before? You build a representation of your system, so you could build a whole factory or a plane or whatever it is that you're interested in, and you build in all of your physical systems and you have them in this computer world that then you can run a whole bunch of scenarios through.

And you can automate them, right, so they can test hundreds of thousands of scenarios one after the other, so that you can find the combinations of events that might lead to catastrophic failure, or optimize your performance, or all of those kinds of things. And, so, with the advances in ML and AI and the things that we know about the systems that we have, we have these really powerful insights, and very safe ways of being able to test those double and triple jeopardy scenarios without having to run physical experiments, which is very cool.
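The automated scenario-sweeping Melissa describes can be sketched in miniature as a Monte Carlo simulation. The failure probabilities below are made up purely for illustration; the code counts how often two or more independent safeguards fail in the same simulated scenario, which is the "double jeopardy" case a one-failure-at-a-time risk assessment would never examine:

```python
import random

# Toy version of sweeping scenarios through a system model:
# three independent safeguards with hypothetical per-scenario
# failure probabilities.
FAILURE_PROBS = [0.05, 0.03, 0.02]

def run_scenario(rng: random.Random) -> int:
    """Return how many safeguards fail in one simulated scenario."""
    return sum(rng.random() < p for p in FAILURE_PROBS)

def estimate_double_jeopardy(trials: int, seed: int = 0) -> float:
    """Fraction of scenarios in which at least two safeguards fail together."""
    rng = random.Random(seed)
    hits = sum(run_scenario(rng) >= 2 for _ in range(trials))
    return hits / trials

# Rare individually, but over many scenarios the combination
# shows up often enough to matter (analytically about 0.3%).
print(estimate_double_jeopardy(100_000))
```

A real digital twin would replace the coin flips with a physical model of the plant, but the principle is the same: run enough scenarios and the rare combinations of failures become visible and countable.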

Traci: Now, you both have touched on this in talking about the statistical findings and being able to communicate to stakeholders, whether they're on the plant floor, in upper management or in the C-suite. How can these findings be effectively communicated?

Melissa: I think there is one very key thing that has to happen to make sure that that is effective, and actually, our Indigenous Australians are pioneering this at the moment with their work in data governance and data sovereignty. And this is the idea that you provide training and insights to the people you are providing your results to, so that they better understand the results that you have.

So, that is one very key component, that if you need to be able to communicate these results, you need to be able to provide some training, some insights, some support to the people who you're presenting the numbers or the words to so that they can better understand what you're trying to say. The second part of that is making sure that they are involved in the process of deciding what that's going to look like. And, so, having conversations with your stakeholders so that you can say, "What do you want as a result? Do you want a number? Do you want me to give you a range? Do you need an interactive graph that you can poke and prod?"

We recently built a system for our emergency services to help decide whether they need to send a plane. As part of the support to the experts, we give them a distribution of how long it took in past cases: in our top 100 cases that looked like this, how long did the ambulance take, how long did the helicopter take? We give them the best choice that the model suggests, but we also give them that information so that when you have an expert who really understands the system in a deeper way than we do, they can also see some of those outliers and strange cases and bring their expertise to the table.

And so being flexible in the way that you communicate to try and match the needs of the people who you're communicating with, I think is absolutely essential to getting it right.

Trish: Yeah, I absolutely agree with that, the idea of engaging with the stakeholder. The people that you need to make the decisions, they need to understand what we're talking about. So partly we need to talk in their language, but partly we also need to be willing to help them understand the key components of our language that we can't necessarily simplify any further for them. And I think that's a really important part too.

Melissa: Yeah. Gets back to the storytelling that you're talking about, right?

Trish: Absolutely, yep.

Traci: Is there anything you'd like to add to this topic, Trish? I'm going to toss it out to you first, because you know I always toss this question out, and then we'll give Melissa a chance to formulate anything she wants to add to the conversation.

Trish: Yeah, I'd just like to actually thank Melissa for making maths and statistics a little bit more understandable for the rest of us, and making it sound fun and interesting as well. Because it actually really is, and it's also really important, and so many people just shrug it off as, "Oh no, I don't do maths." Well actually, everybody needs to do maths in this world. It's a key part of how we need to be and how we function. So, to actually have a conversation with someone that can make it so relatable for a general public, I think is fantastic.

Melissa: I agree with you, I think maths is such an essential part of our life, and people don't realize just how much they need it in so many places. And when it comes to business and industry, and even research across the board, people spend a lot of time and money collecting data, collecting resources, putting people on projects. It's a lot of time, it's a lot of effort, and maths can help us get so much more out of that. So find your friendly mathematician or your friendly statistician and get them on board; we love problems and we love helping. So yeah, give us a chance to try and help you get a bit more out of what you've got, understand things a little better.

Traci: Well, the challenge is finding a very engaging and energetic mathematician and statistician to help you. And breaking it down, as Trish said, into understandable terms is not always easy. So I appreciate you sitting in with us today and giving us some insight on that.

Unfortunate events happen all over the world, and we will be here to discuss and learn from them. Subscribe to this free podcast so you can stay on top of best practices. You can also visit us at chemicalprocessing.com for more tools and resources aimed at helping you run efficient and safe facilities. On behalf of Trish and Melissa, I'm Traci, and this is Process Safety with Trish and Traci.

Melissa: Thanks so much.

Trish: Stay safe.

About the Author

Traci Purdum | Editor-in-Chief

Traci Purdum, an award-winning business journalist with extensive experience covering manufacturing and management issues, is a graduate of the Kent State University School of Journalism and Mass Communication, Kent, Ohio, and an alumnus of the Wharton Seminar for Business Journalists, Wharton School of Business, University of Pennsylvania, Philadelphia.
