Convergence of Safety Culture and Lean: Lessons from the Leaders. Featuring Sidney Dekker, Steven Spear, and Richard Cook.
Dr. Sidney Dekker, Professor, Griffith University
Dr. Steve Spear, Principal, HVE LLC
Dr. Richard Cook, Research Scientist, The Ohio State University
Gene Kim, Founder and Author, IT Revolution
Good afternoon, everybody. Why don't we bring our panel out: Dr. Cook, Dr. Dekker, and Dr. Spear. While you're getting seated, let me share with you what we did today. We spent four hours together this afternoon to really explore safety culture and lean: where they intersect, and where there might be disagreements. I have to tell you, it was one of the most professionally rewarding things I've done in a very long time. It was a magical and, if I may be honest, very stressful experience, and to be able to do this with John Willis as a copilot was just fantastic. I learned so much. I think it lives up to the notion I had set out once, which was that I think it will be historic, and I certainly believe it will be important. We went way over time.
And one of the things that I found so meaningful was that they actually gave me a lot of concrete advice on how we can accelerate the DevOps movement, how we can take advantage of this opportunity that we have, and how we can increase the likelihood of this movement succeeding. We will make the video of it available to share with you and a much broader audience sometime in the future. So I have prepared three questions, and hopefully you'll get a taste of why this was so rewarding to me. Let me start with Sidney Dekker and Richard Cook. Both of you have had time to spend in the DevOps community. What are the key places where safety practices can help us? Sidney? Richard? For four hours we couldn't keep them from talking, and now...
I'm not saying anything. It's called stage fright. Oh my God. I think a qualification is in place here: perhaps there are none, and I think Richard might speak to that. But if there were, well, we've learned through lots of damage and dead bodies that a safety culture is a culture that allows the boss to hear bad news. And there are two problems with that. One is to get news to the boss to begin with, and for the boss to be open to that news, and for the boss herself or himself to be able to share bad news. But then the question is, what is bad news? What constitutes interesting feedback that you really think you need to be concerned about? It's that discussion that I would like to impart. Other than that, and this is a discussion that we certainly had, Gene: don't see people as the problem to control. Don't think that your people need controlling, more procedures, more training, and more guidance. No, they are your problem-solvers. They know what to do. Don't tell them what to do; ask them what they need to be better at it. And then don't sweat counting negatives; understand how success is created.
And you said something that I found very significant, which is that it's not just leaders. It involves leaders and people on the front line.
There's a symmetry there, that's right. Particularly in some other cultures it's not as problematic as in North America, actually. But it's easy to think that we just always need to legitimize the voice from below, from those who don't have status, and make them speak up and make them feel safe, which is incredibly difficult to do. And by the way, I think none of us sits on a good example of an organization that does this perfectly at all. But it is also the leader. She or he needs to be able to say: I don't know, I need help. I actually am concerned about this. I need you, my people, to help me do this and figure this out. And sometimes it's quite difficult for a leader to be seen to be losing face, to not know the answer. So that culture of psychological safety, if that's the term you want to use, goes both ways.
Awesome. Richard?
I think one of the problems that we face, particularly in high-velocity production environments, is that we are better at forgetting than we are at learning. Or rather, we're pretty good at learning, but we also forget very fast. We have a forgetting problem that's probably more significant than our learning problem. And our forgetting is so quick because we're moving on to other things so fast that we can essentially lose sight of the lessons that we've learned. I think we need to really work on trying to sustain the lessons learned, and the experience, over time. And that means not simply dispensing with the problem and considering it done. John Allspaw talked to this a little bit yesterday when he mentioned that people in their post-mortems were saying: we all know what happened, let's just fix it and move on. But I think that we really need to be able to remember these events and, if you understand the Heinlein phrase, grok them in fullness over time, because I think that's really crucial.
Actually, before I ask the question for the lean community: you said something that was very memorable to me about the role of the leader, often the stunning humility of leaders. Could you repeat that, so that everyone can benefit from that observation?
You said I said a profound thing? I don't recall this at all. It's on video, so dig it out somewhere.
Yeah, yeah. So it picks up what Sidney was saying, which
is that the adversary for all human activity is ignorance. I mean, if we had perfect knowledge, nothing would ever go wrong. The reason things go wrong is because we didn't know what we really should have been doing, or really how we should have been doing it. So if we view ignorance as the adversary for all our well-intentioned undertakings, then what we have to do is make sure we behave in such a way that we recognize we're ignorant sooner rather than later, which requires humility, right? It requires the humility to say: I'm in a situation, and there are probably some fairly important, profound things where I'm actually quite moronic. And to Sidney's point, I think he's right on it. It's become a very convenient cliche to say: we'll create an environment of psychological or emotional safety, we'll give voice to the person on the deck plate, the shop floor, the bedside, we'll empower that person. But then the senior leader still acts like a pompous jackass. So really, how well are they coaching and modeling exactly the behavior they're espousing?
Great. Any comments on that? Otherwise I'll ask Steve the second question. So Steve, the corresponding question to you would be: what would you consider the important lean practices for DevOps? Is lean good for us, and what are the right things to really learn from the whole lean movement?
Yeah. So we had a longer conversation about the source of the lean movement, and it's important to understand what the headwaters were. That was Toyota. And what was the start point for Toyota? It was incompetence in manufacturing cars that would be competitive in the world market, and the Toyota Production System orbited around the notion that what we do, we're really, really poor at, and if anyone catches on that we're really poor, we'll quickly become irrelevant. So you had Toyota leadership from the very beginning managing their systems with this idea that what we have to do is elevate and reveal what's wrong, so we can come to a better understanding of what we're doing, why we're doing it, and how we're doing it. Anyway, what happens with lean, and this is a longer story, is that Toyota's approach toward managing to identify ignorance and convert it into useful knowledge becomes a set of tools by which you can have stability over a system which would otherwise be chaotic.
But of course, all stability is just temporary, because the things you're doing are constantly changing, in an environment which is constantly changing. So, a direct answer to your question of what's the most important thing we can extract from, and I'll say Toyota, not necessarily lean, because that term now carries so much meaning: the most important principle is the andon cord. If you think about what the andon cord represents, it's the coupling of standard work, which is the agreement you and I have that this is, in the moment, our best known approach, not the best approach, but our best known approach for accomplishing something. But it's also coupled with this humility, this recognition that our best known approach will be inadequate. We don't know when; otherwise we'd have a better known approach. But it will be inadequate, and because it will be inadequate, we have to have a mechanism for calling out its inadequacy the moment that first inadequacy is recognized. So, the andon cord.
And you feel like that's an embodiment... Actually, just to think out loud for a second: my first reaction is, oh, this is uncharted territory, we didn't actually quite get here. We thought this might be a trigger word in the safety community, because the andon cord means work stoppage and ceasing all operations, which could be considered an anti-pattern.
Well, can I just elaborate on it? I think the literature which attempted to explain Toyota, or at least describe Toyota, said: oh, the andon cord, if someone has a problem, they pull this cord and everything comes to a screeching halt. When in fact that's logically undesirable, and also not what happens. What the andon cord does is declare that there's a situation occurring which was neither expected nor desirable, and it has to be swarmed and investigated in the moment. Now, that doesn't mean that all these other situations which are actually working are stopped; that would actually not make sense. It's just that that one right there deserves attention. And just a quick reference to the medical community: if you think about all the instrumentation that goes on a patient, particularly one who is in critical condition, the reason it's there is to indicate, oh, there's something going on in the moment which requires attention. You don't shut down the hospital just because this person has an irregular heartbeat. You deal with that one and let everything else continue, until you discover an adjacent abnormality.
Sidney, Richard, can you react to that?
Yeah. Well, one of the things that I think has made other industries better, and I don't want to be imperialist in any way about that, because no one has got this figured out perfectly, is to then tell stories about that irregular heartbeat, right? How could we not have foreseen that happening? And so you have to be willing and honest and open enough to tell those stories and share those stories. And this is certainly something that we talked about and I think is critical: people love to tell stories, and we do this naturally, right? We're geared to remembering stories and telling stories. We don't inhabit a universe of concepts and numbers. Well, he might, because he's MIT...
How many times did MIT come up in the four hours? But that
makes it so. Other people inhabit a world of stories, and this is what we remember, right? And one of the things that Richard threw out there, that Steve threw out there as a challenge, was: how do we keep sensitizing the DevOps community to the risk of getting it wrong? Because the stuff you guys are playing with actually matters. It doesn't seem to matter when you're sipping your Coke and you're coding along at 23 hours, you know, I'm still good, let me bang out some other code, if that's what you call it. But there are consequences to getting it wrong. Now, you cannot generically sensitize people to the consequences of getting it wrong, as in a little poster on the wall saying don't forget that this is the risky stuff, or whatever. That doesn't work.
However, telling stories about how things went wrong, or almost went wrong, that is remembered. We were having dinner last night, Richard was present as well, and he and I have plenty of war stories to share, right? Because something is either good or a good story. But the war stories that were told came from the DevOps community, and that does something else. It not only builds this memory trace of: oh, hang on, I've seen this, this recognition that primes action, where you go, I need to do this, because the other guy had this in that story, I remember that. There is more than that. Out of it arises this ethic of what it means to do right and to do wrong, of what it means to be a good professional, to be a good practitioner. And it's that trace that doesn't exist yet, as Richard has beautifully identified, because you are such a young profession.
And in fact, Richard, I had the opportunity to ask each one of these scholars for advice: what advice would they give us as a community to help us achieve our goals? Could you expound upon that advice, or do you feel Sidney covered it just fine? Because I found it very meaningful, and I realized that I have actually made a mistake in terms of how we... well, I'll share it with you. Do you want to add on to that?
The work that you do has lots of circumstances where you can't simply say: I'm just going to close up the laptop and go home. You have to solve the problem you're confronting. The situation is getting worse, you're in a situation of escalating consequences, and you have only undesirable options: all of the choices are bad ones, and you're choosing between bad options. There's an old surgical saying: if there are more than two operations for any condition, none of them are any good. And I'm afraid that you very often are in that situation, where you have to make decisions under those circumstances. We would like you to be as empowered and knowledgeable and supported in those circumstances as possible, so that you can make those decisions. We understand that those decisions may not always turn out well, but you're the one, very often, who has that burden and that responsibility, like the pilot in the aircraft or the surgeon in the operating room. Whatever the long-term value of learning from the experience might be,
we expect you to act in that moment in the most judicious, thoughtful, and prudent way that you can. When we look at you, when we watch you and what you do, especially those of you who've got a few gray hairs, or who have a little bit of experience as it were, we notice that you're very careful about trying to help other people who don't have that level of experience learn how to approach these problems thoughtfully, taking into account the ways that things can turn out. We think that's probably pretty much crucial to what DevOps is. DevOps is not simply the practice of fixing problems or generating velocity. DevOps is also the practice of building a community of people who do DevOps. You are the only sources of information about how to do this stuff that are available to the people who are learning how. And so we recognize that you have a certain responsibility, by virtue of your knowledge, your position, your status, to take on the care and feeding of those young people as they are learning how to do this sort of work. And we think that would be really important.
And in fact, I misattributed where the advice came from, because it actually started with Steve saying the opportunity may not be in the reward, but in the consequence. And my reaction was: wait a minute, aren't you going to freeze everybody into paralysis, because we scare the crap out of everybody with what could go wrong? Could you expound on that?
Yeah, echoing what these guys have been saying: what you all do has consequence. Whether you're writing code for physical systems which are far removed from you, where people depend on them for their wellbeing and their safety, or you're writing code for communication or financial systems, where people are dependent on those to function well, it's worth thinking about what the consequence of getting it wrong is. If you get it wrong, there's someone downrange who's going to suffer for it. And I want to tie into Sidney's point about telling stories. We worked a number of years ago with healthcare providers in Pittsburgh, and this one fellow, Rick Shannon, who was head of critical care there, was trying to motivate people to worry about complications: central line infections, ventilator pneumonia, that kind of thing. He kept showing them data.
And everyone had an excuse for the data: our patients are sicker, they always are, these are sick people to begin with, we're above average, et cetera, et cetera. He made no persuasive progress that way. Then one day people came in, and in the break room were posters of patients. Normally when you go into a hospital and you see posters, it's the patients you dealt with well: the mother who gave birth to a healthy baby, the person who regained mobility. When people started asking who these people were, Rick started explaining: this is the guy who, because of a complication, will never have a catch with his son on a Sunday morning when the Steelers are playing in the afternoon. This is a woman whose daughter will never know her, because she passed too early because of a complication. So it was the stories of what the risks look like that proved, in the end, to be highly motivating.
And my claim was: God, Steven, all you do is scare the crap out of people. You're going to paralyze people. And you said something somewhat startling.
Look, fear doesn't necessarily have to lead to cowering. Cowering only occurs when your brain almost literally short-circuits and gets into a loop of: what do I do? What do I do? I don't know, I'll just sit on the floor behind a chair and hope the problem goes away. But that's not what we're talking about. What we're talking about is, with discipline, rigor, energy, enthusiasm, and optimism, recognizing moments where we're not understanding what we should be doing and how we should be doing it. And then: yeah, but I know how to address that. I know how to address that, because I'm trained and practiced and part of a profession of problem-solvers. So it need not lead necessarily toward paralysis.
Just to reflect a little bit: when this distinguished panel was giving this advice to John Willis and myself, I realized I had made a mistake, right? In the experience reports that are shared, I think we go out of our way to pick the very successful ones. These are the triumphant people who created a coalition, mobilized, and triumphed over powerful, entrenched systems. And it made me realize that there's equal value in covering the other side. Can you help me make that more concrete: what is the value of leaders capturing these stories, not just of obvious failures, but of near misses?
Yeah. You know, it won't happen in here, because of the social constraints on the way that we talk about things. But afterwards, when you go to the hotel and you have a couple of beers and you sit down and talk with other people, the stories are not about the triumphs. It's not people standing up and talking about how we were able to cut our costs in this way, or do this other thing. The stories are all about the catastrophes, or the near misses, or near catastrophes. There's another old surgical saying, which is: good results come from experience, and experience comes from bad results. And there's a lot of truth in that. You all know that sharing those stories is actually very helpful.
Being able to talk about what that experience is like, and what kinds of dilemmas and problems you've been in, is actually very helpful. It's what John Allspaw and a bunch of other people have been trying to look at in more detail, so that we can understand something about how those things evolve. But we have to be honest about the extent to which we find ourselves in circumstances where we don't know what to do. That's what problem solving is all about. Problem solving is what you do when you don't know what to do, and that occurs much more frequently than most people are willing to admit. I think we should be raising that up, because your skill is partly in being able to steer the system and direct it to where you want it to go, but your skill is also in being able to look at a system that's not performing well, or that's having some difficulty, and figure out why, and make those sorts of corrections. We need to understand both of those kinds of activities, and we'll get to them by sharing both success stories and stories about the catastrophes or near catastrophes that have occurred.
And it occurs to me that this also helps reinforce the notion of the humility of the leader, right? The leader is the only one who can influence the system as a whole, more than the person on the front line. Any other thoughts on that?
Yeah. I want to tie back to Sidney's point about the leader, and the modeling, the coaching, and the social rewards. As many of you know, the Navy had a series of mishaps in the Pacific this last year: three collisions and a grounding. You start thinking through all the reasons they happened; they led to loss of life, injury, and tremendous material harm. But let's think about the grounding. What happens there is a crew finds themselves in circumstances which are beyond their control. The tide has changed, there's wind that they hadn't anticipated, the weather has come up, they're on a schedule which they hadn't exactly planned for, and they're driving a vessel that is made to not be seen by others. That's right: one destroyer captain described it to me. He said, just to understand the circumstance, imagine going to Walmart the day after Thanksgiving, in the parking lot when it's still dark out, in a black car with your headlights off.
But when I started thinking about the issue of stories and leadership: let's back off to the ship that didn't actually run aground but encountered circumstances like that. What's the story that crew tells? Well, the story that crew tells is how they mastered the situation. Hey, Gene, man, I tell you, the waves were up to here, and the wind was going like this, and on and on and on, but we conquered it. Well, that drives in the direction that success is through heroism. But they may have conquered it only because they got dumb luck; the wind was just slightly different than for the crew that actually ran aground. The question is, do they come in and tell the white-knuckle story, which is: oh man, Gene, that was damn close. We did our best, but we were clearly not in control of the situation. There but for the grace of God go I; we might have been the ones that ran aground.
So now, what drives whether the same story is told as "I was a hero" versus "I just escaped by luck"? When you come in and tell the story, how do people react to you? In particular, how do your leaders react? Is it: hey, great heroism over there, that was fantastic? Or is it: come here and tell us how you almost went over the cliff, excuse the mixed metaphors. Tell us how you almost did, so the rest of us can figure out better what to do in such circumstances.
I think Allspaw has said it beautifully, right? An incident is an unplanned investment. And if you don't see it that way as a leader, you are not getting a return on the investment that was already made on your behalf.
So if I can just pause for a moment: you see here a very collegial interaction between three very distinguished scholars.
That's all veneer.
Three and a half hours ago, it was like the Worldwide Wrestling Federation. There were chairs in the air, real yelling. It's amazing to see this. When I said it was sort of white-knuckle and stressful at the time, it was...
John Willis was actually fighting on one side.
He's easily fooled though.
What other advice, any other advice that you shared with me, do you want to share with everybody else, in terms of unsolicited advice that you would give to us as a community? Don't be so easily fooled. Don't be so easily fooled, yeah. Thanks, Doctor. Otherwise,
I'll go to Dr. .
I recognize that there's a lot of expertise in this audience, and I think you have a kind of moral responsibility to share that expertise with the people around you, to help them become better prepared to deal with the kinds of problems that they will deal with, even though you aren't going to be there at their site. I think that responsibility needs to become part of what DevOps is. I think DevOps needs to go beyond toolchains and beyond repositories, and become a kind of practice that involves people, what John has called above the line. To do this may require that we identify ourselves as people who practice DevOps, rather than people who work for company X. That is, the practice of being a DevOps person has to be actually a kind of profession, a kind of skill and expertise that exists apart from the particular employment that you're engaged in right now. You've chosen to work where, essentially, the rubber meets the road, at the sharp end of the stick, at the cutting edge of things.
And because you're working there, you're going to encounter a lot of stressful situations, demanding circumstances, places where there's potential for real loss, and the kinds of things that will keep you up all night. I think you need to try to help the people around you who are learning about this understand what they are getting into and how to cope with it. And I don't know exactly how to accomplish this, except to say that the answer lies here in this room, not someplace else. It is not someone else's job to do this. It is not some other agency's job to do this. It is not your company's job to do this. It is your job to do this, here. And I think unless you find a way to band together, to find those common threads and the kinds of responsibility and moral activities that you engage in, you run the risk of becoming regarded as just another group of technicians. And I think that would be a very great loss. I mean, like anesthesiology...
Oh, I got to hear scholar jokes, like how PhDs and doctors make fun of each other. I got to hear that. It was funny in the moment. We love you, actually. Why don't we end on that joke? Yeah, your foot is in your mouth already.
We are very grateful, the three of us here, to Gene for inviting us. He's taking some risk in doing that. We are, as you might have recognized, people who produce reliable performance under all circumstances. Gene has taken some risk in bringing us together and building the kind of powder keg that exists with the three of us together, but he's reaching for something. And I hope that you can identify what it is that he's trying to reach for, because I think that you have to embrace that and reach for it as well. I think that, more than any other technical aspect of these kinds of conferences, is what this is all about. And I encourage you to think about that and to pursue it over the year to come.
And then I will have the closing words, and then I have an ask for all of you. I had made the claim to this distinguished group on this panel that this is such an interesting and powerful community because you have all self-selected to be here. The fact that you're here signals that you see something bigger that we're aspiring to, often at great personal risk, and that you're building that coalition to overcome very powerful, entrenched systems. So here would be my ask to you: if you are willing to share some of these stories of uncertainty, of making decisions when you don't know all the answers, I would love to hear from you. Just email me. And an obvious action item will be to create a section for that in DevOps Enterprise 2018. So with that, a round of applause for Dr. Richard Cook, who made sure that everyone knew that he doesn't have a PhD, he's only a medical doctor; Dr. Steven Spear, who spoke at DevOps Enterprise 2015, and who made sure that all of us knew that he's from MIT, not Harvard, not U Penn; and Sidney Dekker, who reminded us of all the times that he is a pilot and has two PhDs. So thank you so much.