Ron Westrum is Emeritus Professor of Sociology at Eastern Michigan University. He holds a B.A. (honors) from Harvard University and a Ph.D. in Sociology from the University of Chicago. Dr. Westrum is a specialist in the sociology of science and technology and in complex organizations. He has written three books: Complex Organizations: Growth, Development and Change; Technologies and Society: The Shaping of People and Things; and Sidewinder: Creative Missile Design at China Lake. He has also written about 50 articles and book chapters. His work on organizational culture has been valuable to the aviation industry and to medical safety, as well as to other areas of endeavor. He has been a consultant to NASA, the National Research Council, and the Resilience Core Group. He is currently at work on a book on information flow cultures.
Dr. Ron Westrum
Eastern Michigan University, Emeritus Professor of Sociology
Okay. I am so honored and delighted about who's speaking next: Dr. Ron Westrum, Professor Emeritus of Sociology at Eastern Michigan University. His name will be familiar to anyone who has read the State of DevOps reports that I had the privilege of working on for six years, from 2013 to 2019, with Dr. Nicole Forsgren and Jez Humble. It is the cross-population study, spanning over 36,000 respondents, that allowed us to better understand what high-performing technology organizations look like. We looked at architectural practices, technical practices, and cultural norms. And without a doubt, our understanding of what those cultural norms might look like was made possible by the work of Dr. Ron Westrum, who received his PhD in sociology from the University of Chicago and has spent decades studying complex organizations, including healthcare, aviation, and the nuclear industry. One of the models he created, the Westrum organizational typology model, which brilliantly categorizes organizations into pathological, bureaucratic, and generative, is featured prominently in the State of DevOps research, The DevOps Handbook, and the Accelerate book. When Dr. Forsgren CC'd me on some correspondence she had with Dr. Westrum, I almost fell out of my chair. I was able to interview him for four hours on my podcast, The Idealcast. And I'm so delighted that he'll be teaching us today about information flow in organizations and providing a case study of an organization that you have probably heard of. Here's Dr. Westrum.
Today, we're going to talk about information flow cultures, and we're going to talk about the cultures of organizations. So what the devil is organizational culture? Well, organizational culture is a complicated thing. For instance, it has the following characteristics: organizational culture is practices, organizational culture is thoughts, organizational culture is feelings, and it has symbols. While all of these are important, we're going to use another index: the flow of information. Why is information flow the right index to use? The reason is basically that information is the lifeblood of organizations. If the organization has a good flow of information, the organization will do well. If it has a bad flow, it's not going to do very well. Information flow is also a powerful index of how an organization functions, and an information flow culture in fact reflects how managers shape values and behavior. We're going to describe three different information flow types.
One of them is generative, where you have a high flow of information, the best. Then there's bureaucratic, which has a medium flow of information, and pathological, which has a low flow. So let's look at pathological flow. In pathological organizations, you get low cooperation, very high conflict, an emphasis on taking care of the leaders, and strict boundaries; messengers get shot, and you have low creativity. So you have a toxic environment. In a bureaucratic situation, you get modest cooperation. The emphasis is on rules and regulations. You have problems with silos. Messengers are tolerated, not necessarily encouraged. Conflicts are tamped down, and creativity is allowed. And here is my slide, which I think reflects the flow of bureaucratic information, which is that it's slow. Now, what we'd really like to have is a generative flow of information, where we have high cooperation, we have an emphasis on the mission, and we have a boundaryless organization where things move quickly over the boundaries.
Speaking up is encouraged, and in fact people have psychological safety and high creativity. So here is my example of how a highly creative organization is supposed to function; I think Star Trek is a perfect model. Now let me emphasize one of the features that goes with generative information flow. At Google, they had a project called Project Aristotle, which studied what made for an effective team. The number one feature of an effective team was psychological safety: the ability to speak your mind without fear of punishment. When communication is easy, there is more of it, but it's also the right kind of communication. I like to say that a high flow of communication has these three characteristics. Number one, it's timely. Number two, it's easy to understand; it comes in a form that's easy to make sense of. Number three, it meets the receiver's needs. Now there's a classic example of this from the famous Redstone rocket program, which was one of NASA's first. A prototype went off course and crashed. Wernher von Braun, head of the project, tried to figure out through many analyses what had happened.
The analyses did not suggest a cause, so now they were going to have to start from scratch to redesign the missile. But then an engineer came to von Braun. He said, I think I did it. But how, von Braun wanted to know. Well, the engineer said, I touched a part of the circuit with a screwdriver and got a spark. I checked and the circuit seemed to be fine, but maybe that was the problem. Well, it turned out that was the problem. Okay, so the problem got solved. And then von Braun sent the engineer a bottle of champagne. So take a moment to think about your organization. What would happen when an engineer admits to making such a big mistake? Does he get a bottle of champagne? Generative cultures are often found in high-performance organizations. They are common in high-reliability systems that require great cooperation for success.
They are typical of elite military units whose cooperation is legendary, for instance the Navy SEALs, and they are often seen in consumer and service industries where exceptional customer satisfaction is the goal. And they are often led by technological maestros. So what is a technological maestro? Well, this word was coined by Arthur Squires in his book The Tender Ship, about leadership and technology in World War II. It meant the top leaders had these characteristics: number one, technical virtuosity; number two, a high energy level; number three, an ability to grasp the key questions; number four, the ability to grasp the key details; plus high standards and a hands-on attitude. Now here's another example of a maestro. In June 1978, an engineering student called the structural engineer William LeMessurier, who had designed key parts of the Citicorp building in New York. The 57-floor building had an unusual footprint.
The student wanted to know whether the building was stable or not. Was it going to be stable in a high wind? LeMessurier assured the student that it would be stable, and that he personally had designed a special mass damper on the top floor to steady it. But then he had a second thought. That thought was that if the building had been built according to specifications, there would be no problems. But had it actually been built that way? So LeMessurier called the builder. Well, the builder said they had pretty much followed the plans that they'd been given, but there was one detail that was different: they had used rivets instead of welds to hold the building together. On a short building, this would not matter, but on a 57-story building, a quartering wind strong enough could bring down the building. How often would such a wind show up?
The answer was about every 16 years. So they had to fix it, and they did fix it. They told the newspapers about it but asked them to hold the story. So for several months, after the secretaries had gone home at night, contractors pulled off the wall panels and welded the girders together. After they fixed the structural problem, the newspapers published what had happened. Oh, by the way, what is requisite imagination? It's the fine art of anticipating what might go wrong. So here is a prime example of requisite imagination. And remember, mastering the key details is one trait of a technological maestro. So maestros build a generative information flow, and this creates the complex web that allows the organization to build things. For instance, this is how you build airliners. So we're going to look at how Boeing created airliners. Building airliners is big business, and I have a law: the higher the stakes, the rougher the play.
So when Boeing builds airliners, this is rough play. It involves very high stakes and high risk. And Boeing did it well for many decades. For instance, we have examples like the Stratoliner, the Stratocruiser, the 707, the 727, the 747, and finally the 777 airliner. So how did Boeing do this? Well, Boeing had a lot of money, a lot of people, and a lot of machines. But Boeing also had a secret weapon, and that secret weapon was a culture that held all of those assets together, a culture like a family, in spite of crises like business downturns and strikes and so forth. Culture is actually a form of capital. Any company that manufactures something as large and complicated as a jet airliner forms a complex web of knowledge. So if we take the cultural capital and put it together with the technological maestro, we get planes like the Boeing 777, a marvel of precise engineering. Understand that this human web of knowledge and competence is fragile and may degrade under rough handling.
So if you interfere with this culture of human competence, bad things can happen. And at Boeing, this seems to be what happened after Boeing merged with McDonnell Douglas. The merger caused damage that undercut the web of manufacturing know-how. And here is one of those pictures that is better than a thousand words: we have Phil Condit of Boeing listening to Harry Stonecipher of McDonnell Douglas, and you can tell this was not a happy marriage. So as Boeing's culture went out the door, its aircraft maestro, Alan Mulally, went to Detroit, where, by the way, he took over Ford and did great. Harry Stonecipher of McDonnell Douglas soon became the new CEO of Boeing, and under him the culture rapidly declined. Stonecipher wanted a new culture, what he described as going from a family to a team. And this is a very important set of words, because even though those words might not seem to make a great deal of difference to the ordinary person, at Boeing they made a huge difference.
One employee told Harry Stonecipher, my God, Harry, don't you know you're changing the culture of Boeing? Stonecipher looked at him and said, my God, that's exactly what we want to do. That's what Stonecipher did. But was it a good idea? What culture was being replaced, and what would take its place? Suppose that Boeing's great accomplishments had only been possible thanks to its culture. What was this culture? Boeing's employees described it as being like a family, but this culture was actually a high-cooperation, generative culture. Yet Stonecipher was not happy with the Boeing culture focused on making planes; he wanted a culture focused on making money. So the generative culture got replaced by a bureaucratic culture. But the former culture, the generative culture, had been the key to Boeing's success. So as the price of Boeing stock went up, the quality of its technical product fell.
So the next airliner that came down the pipe, the Dreamliner, was beautifully designed but messed up on batteries and other manufacturing issues, and I understand it's still messed up. Stonecipher, meanwhile, had left Boeing in 2005. Other CEOs followed, but success did not return. Then Boeing made a more serious mistake: it put fatal flaws in a new airliner. The new 737 MAX had major defects. This airliner had to work to beat Airbus, but it didn't. The 737 MAX had new MCAS software installed that caused unexpected motions. This is a perfect example of a latent pathogen, to use the term of James Reason. Pilots should have been trained for the new software, but they were not; the full toolkit of knowledge to operate this plane was not supplied. One US pilot, after suffering from MCAS problems, said, I am left to wonder what else I don't know. The flight manual is inadequate and almost criminally insufficient.
So if culture breaks down, things get missed. With no maestro and a messed-up culture, you could be flying without a parachute. The flaws in the 737 MAX soon led to two crashes, killing a total of 346 people. A broken culture had led to a broken airliner project and a huge reputational loss. So what are the lessons we learn from this story? The most obvious one is that if you have a working culture, don't mess with it. And if your culture is not working, you had better find out how you can fix it. And finally, if you don't know whether your culture is working or not, shouldn't you find out? I'm looking for dialogue, useful critique, case studies, potential consulting, and any questions about the concepts. I want to thank you for listening. Have a good day.
Thank you so much, Dr. Westrum. It has been such a privilege to have interviewed you for four hours. One of the big surprises to me in the interviews was that you revealed that when you were introduced to the field of aviation safety, you were one of the few sociologists, surrounded primarily by psychologists. Can you talk about why so many of the important insights came from sociology as opposed to psychology?
Well, I remember the first time I went to one of these meetings, and basically the people who were the hosts were aviation psychologists. And I remember somebody came up to me and said, I don't understand why you're here. Well, I found that fairly threatening. Okay, I've just joined the field, here I am, and he's questioning why I'm even at the meeting. So the next morning, I remember, I got up, and it was one of those days where you're going to give a lecture but you're sort of dead and can't really get started because it's a different time zone. So after my talk, I thought, oh my God, he's going to cream me now. But he came up to me after the lecture and said, thank God you're here. He said, you've got some stuff to give us. And I was really pleased by that. And I think the truth is that they really hadn't thought about the fact that an airplane flight deck is basically a group. And not only that, but it's part of a larger organization, and you need to understand the organization to understand what the pressures on the flight deck are likely to be.
And what year was that? And could you even give us a brief tutorial on what cockpit resource management, or crew resource management, is, and how it impacted the industry?
Right. Well, back in the 1980s, basically, and I think this started out with United, NASA and the airlines got together and developed something that would prevent some of what they called crew-caused accidents. And that became known as cockpit resource management; I forget what the original name United had for this. But that basically opened up the entire field of aviation to look at the group dynamics involved in keeping an airliner safe. And I admit I was sucked in like everybody else. I thought, oh, this is really cool. And I cannot tell you how exciting working with that group of people was. It was really fantastic.
And for those of us who aren't familiar with CRM, could you describe what it is and what made it so different from how crews worked in a cockpit before?
Well, basically after World War II there were a lot of pilots, and there was still a stream of them coming out of Vietnam and so forth, who were military pilots, basically fighter jocks. And their idea was, I'm the boss, I'm going to do what I want to do, and the others can take a leap. And pilots would actually say things like that to their co-pilots: shut up, I'm in charge here. Well, this was obviously a very dangerous condition. So a lot of what crew resource management was about was getting the pilots to listen to their co-pilots, because in many cases there were actual conversations where the pilot said, shut up and don't tell me about what's going on with the altimeter, and so on, and the next thing you hear is the plane hitting the ground, because they've augured in. So the whole thing about crew resource management is learning to use the intellectual resources you've got on the plane and take full advantage of what everybody has heard or seen or thought. And so we've come a long way. I knew Bob Helmreich, who was the great advocate of CRM and went all over the world basically selling his ideas. And that has made a huge difference to aviation safety.
Amazing. And so you introduced me to the term the technological maestro: high energy, high standards, great in the large and in the small, and loves walking the floor. And you had also mentioned something else that gave me a similar sort of searing aha moment, and that was Rabinow's law, I think number 23. Can you describe what that is and why you think it's so important?
Well, Rabinow was an inventor. He had 230 US patents and had basically seen and done everything in R&D. And he had a set of laws that came out of that. One of them was number 23: if the boss is a dope, everybody under him is a dope or soon will be. Okay. Well, I don't know why this was a revelation to you, but it was certainly a revelation to a lot of people, because that's the problem: lousy leaders tend to recruit other lousy leaders, because they're not going to be threatened by people who are smarter than they are.
I love that, because for me it simultaneously describes those situations I've been in where everything was going great, fully enabled by that leader at the top, or things were going horribly wrong, and fully enabled by that leader at the top. Am I capturing accurately what the implications of Rabinow's law are?
Yes. I would be in jeopardy here talking about my former employments and some of the experiences I've had, so I'm not going to tell you about that. But all I can tell you is that that's so true.
You had also mentioned something that I found astounding. You know, I think when we think about the greatest experiences that we have working, where we have fully unleashed human creativity, we have a sense of enormous satisfaction; we feel engaged in our work. But then you described the opposite conditions, and the Whitehall study. Could you educate us all on what the Whitehall study is and what it found?
Well, the Whitehall study was basically about the relationship between health and status in the organization. Whitehall is of course where the British government does its day-to-day work, and there was a big study there, which is why it's called the Whitehall study, which basically showed that the higher up in status you were, the less likely you were to have a heart attack. Everybody in Britain has the same health system in principle, and yet the people at the top are less likely to have heart attacks than the people at the bottom. So the fact is that status is a very important indicator of your psychological and physiological health.
And then you had mentioned that one of the implications is that in certain organizations, pathological ones, life is horrible for everyone except the leaders; the whole organization is actually optimized for the leader. Is that a correct interpretation?
That's a correct interpretation. I think the thing that cued me into it was many years ago, I had a student whose father used to work at Fisher Body. And he said that every day when his dad went to work, he got sick going to work. And Fisher Body was apparently a place where there were a lot of problems like alcoholism and suicide and so forth. It was a very punitive environment. This is probably where I originally got the idea of a pathological environment. But it wasn't the only one. I also taught a class right around the area where the Hydra-Matic plant was. This was a plant that GM had that my students talked about constantly after working there. And so I began to realize that there are some kinds of environments that make people sick. And I think, by contrast, generative environments tend to make people more psychologically together, and physically healthier as well.
So we talked about pathological organizations and the origins of your study of generative organizations. I found it genuinely surprising that you sought out specifically organizations like the Sidewinder project at the China Lake naval research laboratory. Could you talk about what it is about an R&D organization that you thought was especially suitable for the study of generative organizations?
Well, the reason I studied China Lake is because the story of Sidewinder is basically the little engine that could. So here is this small group of people, literally out in the middle of the Mojave Desert, and they put together this missile that was better than everybody else's missile. Well, I found that a really appealing story. So I asked, what's the larger context of this? And I realized after a while that this entire laboratory, and it was big, actually, the laboratory itself was about 5,000 people, was a huge skunk works. And I realized after talking to them that they routinely created miracles. In fact, one guy said, you know, we always thought we could do one of anything over a weekend. Can you imagine how threatening an organization like that would be to the ordinary R&D organization? They would take other people's systems and fix them; the Sparrow missile is a good example, which had a lot of problems, and China Lake fixed it. Well, do you think the others who got their missile fixed were happy about that? They were not. And so, like Cinderella, the high performers tend to get punished.
And what was it about the way the groups worked on the Sidewinder project that marked them as so unusual and so worthy of study?
Well, first of all, they accomplished something that nobody else could do, and in fact something that they had been told not to do: they were ordered not to create new missiles. And so they did it all on the sly, and they literally, essentially, rolled this thing out. And at the end of this process, here was this thing that was better. And so the admirals would come out, and this thing would do amazing things, and they'd say, we have to have this missile. Despite the fact that we told you not to do it, we have to have this, because it is so good. So you have to ask yourself: what kind of culture creates that sense of engagement, that sense of excitement, that sense of the ability to do anything? That's exactly what China Lake had.
That is awesome. So when I re-read your information flow paper from 2013, it was actually startling to me just how powerful I found the last couple of sentences you wrote in summing up: culture is no longer neglected; information flow is of course only one issue among many in safety culture, but I feel it is a royal road to understanding much else. Can you talk about what that royal road is, and where you think that royal road takes us?
Well, I basically spent essentially my adult life developing this theory, or whatever you want to call it. And I think the thing is that at the end, when I realized how many different things information flow was likely to predict, I said, oh my God, we have something here which is really quite amazing. And how did I know that it was amazing? Well, because what I would do is put this chart in front of people, and they'd say, oh my God, I know where we are on this chart. So it had a lot of surface credibility. And as you know, in DevOps, people did a big study, and that's what you found: it does correlate with creativity as you go from a pathological to a bureaucratic to a generative organization. It is amazing. And did I know that it would do all that stuff? No, I didn't. It's just been delightful to see. In fact, as you do further studies, it seems stronger and stronger.
Awesome. Well, let me end by just stating that in every interaction I've had with you, I've learned so much, and I cannot overstate just how much impact you've had on the DevOps community, with that initial finding where we found that the Westrum organizational typology model was one of the top predictors of performance. So, from all of us, thank you so much, Dr. Westrum, and I look forward to catching you later in the conference.