Information Flow Cultures

Ron Westrum is Emeritus Professor of Sociology at Eastern Michigan University. He holds a B.A. (honors) from Harvard University and a Ph.D. in Sociology from the University of Chicago.

Dr. Westrum is a specialist in the sociology of science and technology and in complex organizations. He has written three books: Complex Organizations: Growth, Development and Change; Technologies and Society: The Shaping of People and Things; and Sidewinder: Creative Missile Design at China Lake. He has also written about 50 articles and book chapters. His work on organizational culture has been valuable to the aviation industry, to medical safety, and to other areas of endeavor. He has been a consultant to NASA, the National Research Council, and the Resilience Core Group. He is currently at work on a book on information flow cultures.


Dr. Ron Westrum

Eastern Michigan University, Emeritus Professor of Sociology



Okay, I am so honored and delighted about who is speaking next: Dr. Ron Westrum, Professor Emeritus of Sociology at Eastern Michigan University. His name will be familiar to anyone who has read the State of DevOps reports that I had the privilege of working on for six years with Dr. Nicole Forsgren and Jez Humble from 2013 to 2019. It is the cross-population study that spanned over 36,000 respondents that allowed us to better understand what high-performing technology organizations looked like. We looked at the architectural practices, technical practices, and cultural norms. And without a doubt, understanding what those cultural norms might look like was made possible by the work of Dr. Ron Westrum. He received his PhD in sociology from the University of Chicago and has spent decades studying complex organizations, including healthcare, aviation, and the nuclear industry. And one of the models he created was the famous Westrum organizational typology model that brilliantly categorized organizations into pathological, bureaucratic, and generative. This featured prominently in the State of DevOps research, the DevOps Handbook, and the Accelerate book <laugh>. And when Dr. Forsgren CC'd me on some correspondence that she had with Dr. Westrum, I almost fell out of my chair. I was able to interview him for four hours on my podcast, The Idealcast. And I'm so delighted that he'll be teaching us today about information flow in organizations and providing a case study of an organization that you have probably heard of. Here's Dr. Westrum.


Today we're gonna talk about information flow cultures, and we're gonna talk about the cultures of organizations. So what the devil is organizational culture? Well, organizational culture is a complicated thing. For instance, it has the following characteristics: organizational culture is practices, organizational culture is thoughts, organizational culture is feelings, and it is symbols. While all of these are important, we're going to use another index: the flow of information. Why is information flow the right index? The reason is basically that information is the lifeblood of organizations. If the organization has a good flow of information, the organization will do well. If it has a bad flow, it's not gonna do very well. Information flow is also a powerful index of how an organization functions. An information flow culture, in fact, reflects how managers shape values and behavior. And we're gonna describe three different information flow types. One of them is generative, where you have a high flow of information, which is the best.


Then there's bureaucratic, which has a medium flow of information, and pathological, which has a low flow. So let's look at pathological flow. In pathological organizations, you get low cooperation, very high conflict, an emphasis on taking care of the leaders, strict boundaries, and messengers get shot. You have low creativity, so you have a toxic environment. In a bureaucratic situation, you get modest cooperation. The emphasis is on rules and regulations. You have problems with silos. Messengers are tolerated, not necessarily encouraged. Conflicts are tamped down, and creativity is allowed. And here is my slide, which I think reflects the flow of bureaucratic information, which is that it's slow. Now, what we'd really like to have is a generative flow of information, where we have high cooperation, we have emphasis on the mission, and we have a boundaryless organization where things move quickly over the boundaries. Speaking up is encouraged, and in fact, people have psychological safety and high creativity.


So here is my example of how a highly creative organization is supposed to function: I think Star Trek is the perfect model. Now let me emphasize one of the features that goes with generative information flow. At Google, they had a project called Project Aristotle, which studied what made for an effective team. The number one feature of an effective team was psychological safety, the ability to speak your mind without fear of punishment. When communication is easy, there is more of it, but it's also the right kind of communication. I like to say that a high flow of communication has these three characteristics. Number one, it's timely. Number two, it's easy to understand and comes in a form that's easy to make sense of. Number three, it meets the receiver's needs. Now there's a classic example of this. During the famous Redstone rocket program, which was one of NASA's first, a prototype went off course and crashed.


Wernher von Braun, head of the project, tried to figure out, through many analyses, what had happened. The analyses did not suggest a cause. Now they were gonna have to start from scratch to redesign the missile. But then an engineer came to von Braun and said, I think I did it. But how? von Braun wanted to know. Well, the engineer said, I touched a part of the circuit with a screwdriver and got a spark. I checked and the circuit seemed to be fine, but maybe that was the problem. Well, it turned out that was the problem. <laugh>. Okay, so the problem got solved, and then von Braun sent the engineer a bottle of champagne. So take a moment to think about your organization. What would happen when an engineer admits to making such a big mistake? Does he get a bottle of champagne? Generative cultures are often found in high-performance organizations.


They're common in high-reliability systems that require greater cooperation for success. They're typical of elite military units whose cooperation is legendary, for instance, the Navy SEALs. And they're often seen in consumer and service industries where exceptional customer satisfaction is the goal. And they're often led by technological maestros. So what is a technological maestro? Well, this word was coined by Arthur Squires in his book The Tender Ship, about leadership and technology in World War II. And it meant the top leaders had these characteristics: number one, technical virtuosity; number two, a high energy level; number three, an ability to grasp the key questions; number four, the ability to grasp the key details; plus high standards and a hands-on attitude. Now here's another example of a maestro. In June 1978, an engineering student called an architect named William LeMessurier, who had designed key parts of the Citicorp building in New York.


The 57-story building had an unusual footprint. The student wanted to know whether the building was stable or not. Was it going to be stable in a high wind? LeMessurier assured the student that it would be stable, and that he had personally designed a special tuned mass damper on the top floor to steady it. But then he had a second thought, and that thought was that if the building had been built according to specifications, there would be no problems. But had it actually been built that way? So LeMessurier called up the builder. Well, the builder said they had pretty much followed the plans they'd been given, but there was one detail that was different: they had used bolts instead of welds to hold the building together. On a short building, this would not matter. But on a 57-story building, a quartering wind strong enough would bring down the building.


How often would such a wind show up? The answer was about every 16 years. So they had to fix it, and they did fix it. They told the newspapers about it, but asked them to hold the story. So for several months, after the secretaries had gone home at night, contractors pulled off the wall panels and welded the girders together. After they had fixed the structural problem, the newspapers published what had happened. Oh, by the way, what is requisite imagination? It's the fine art of anticipating what might go wrong. So here is a prime example of requisite imagination. And remember, mastering the key details is one trait of a technological maestro. So maestros build a generative information flow, and this creates the complex web that allows the organization to build things. For instance, this is how you build airliners. So we're gonna look at how Boeing created airliners.


So building airliners is big business, and I have a law <laugh> about this: the higher the stakes, the rougher the play. So when Boeing builds airliners, this is rough play. It involves very high stakes and high risk. Yet Boeing did it, well, for many decades. For instance, we have examples like the Stratoliner, the Stratocruiser, the 707, the 727, the 747, and finally the 777 airliner. So how did Boeing do this? Well, Boeing had a lot of money, a lot of people, and a lot of machines. But Boeing also had a secret weapon. And the secret weapon was a culture that held all those assets together, a culture like a family, in spite of crises like business downturns and strikes and so forth. Culture is actually a form of capital. Any company that manufactures something as large and complicated as a jet airliner forms a complex web of knowledge.


So if we take the cultural capital and put it together with the technological maestro, we get planes like the Boeing 777, a marvel of precise engineering. Understand that this human web of knowledge and competence is fragile and may degrade under rough handling. So if you interfere with this culture of human competence, bad things can happen. And at Boeing, this seems to be what happened. After Boeing merged with McDonnell Douglas, the merger caused damage that undercut the web of manufacturing know-how. And here is one of those pictures that is better than a thousand words: we have Phil Condit of Boeing listening to Harry Stonecipher of McDonnell Douglas, and you can tell this was not <laugh> a happy marriage. So as Boeing's culture went out the door, its aircraft maestro, Alan Mulally, went to Detroit, where, by the way, he took over Ford and did great. Harry Stonecipher from McDonnell Douglas soon became the new CEO of Boeing.


And under him, the culture rapidly declined. Stonecipher wanted a new culture, what he described as going from family to team. And this is a very important set of words, because even though those words don't seem to make a great deal of difference to the ordinary person, at Boeing it made a huge difference. One employee told Harry Stonecipher, my God, Harry, don't you know you're changing the culture of Boeing? Stonecipher leaped into the air and said, my God, that's what we wanna do. That's what Stonecipher did. But was it a good idea to do it? What culture was being replaced, and what would take its place? Suppose that Boeing's great accomplishments had only been possible thanks to its culture. What was this culture? Boeing's employees described it as being like a family, but this culture was actually a high-cooperation generative culture. Yet Stonecipher was not happy with the Boeing culture for making planes.


He wanted the culture focused on making money. So the generative culture got replaced by a bureaucratic culture. But the former culture, the generative culture, had been the key to Boeing's success. So as the price of Boeing stock went up, the quality of its technical product fell. The next airliner that came down the pike, the Dreamliner, was beautifully designed, but messed up on batteries and other manufacturing issues. And I understand it is still messed up. Stonecipher, meanwhile, had left Boeing in 2005; other CEOs followed. But success did not return. Then Boeing made a more serious mistake: it put fatal flaws in a new airliner. The new 737 MAX had major defects. This airliner had to work to beat Airbus, but it didn't. The 737 MAX had new MCAS software installed that caused unexpected motions. This is a perfect example of a latent pathogen, to use James Reason's term. Pilots should have been trained for the new software, but they were not; the full toolkit of knowledge to operate this plane was not supplied. One US pilot, after suffering from MCAS problems, said, I am left to wonder, what else don't I know? The flight manual is inadequate and almost criminally insufficient.


So if culture breaks down, things get missed. With no maestro and a messed-up culture, you could be flying without a parachute. The flaws in the 737 MAX soon led to two crashes, killing a total of 346 people. A broken culture had led to a broken airliner project and a huge reputational loss. So what are the lessons we learn from this story? The most obvious one is that if you have a working culture, don't mess with it. And if your culture is not working, you'd better find out how you can fix it. And finally, if you don't know whether your culture is working or not, shouldn't you find out? I'm looking for dialogue, useful critique, case studies, potential consulting, and any questions about the concepts. I want to thank you for listening. Have a good day.


Thank you so much, Dr. Westrum. It has been such a privilege to have interviewed you for four hours, and one of the big surprises to me in the interviews was that you revealed that when you were introduced to the field of aviation safety, you were one of the few sociologists, surrounded primarily by psychologists. Can you talk about why so many of the important insights came from sociology as opposed to psychology?


Well, I remember the first time I went to one of these meetings, and basically the people who were the hosts were aviation psychologists. And I remember somebody came up to me and he said, I don't understand why you're here. Well, I found that fairly threatening. Okay, I had sort of just joined the field, and here I am, and he's questioning why I'm even at the meeting. So the next morning, I remember I got up, and it was one of those days where you're gonna give a lecture, but you're sort of like dead and you can't really get started 'cause it's a different time zone. So after my talk I thought, oh my God, he's gonna cream me now. But he came up to me after the lecture and he said, thank God you're here. He said, you've got some stuff to give us <laugh>. And I was really pleased by that. And I think the truth is that they really hadn't thought about the fact that basically an airplane flight deck is a group <laugh>. Okay? And not only that, but it's part of a larger organization, and you need to understand the organization to understand what the pressures on the flight deck are likely to be.


And what year was that? And could you give us even the briefest tutorial on what cockpit resource management, or crew resource management, is, and how did it impact the industry?


Right. Well, back in the 1980s, basically, and I think this started out with United, NASA and the airlines got together and they developed something that would prevent some of what they called crew-caused accidents. Okay. And that became known as cockpit resource management. I forget what the original name was that United had for this, but that basically opened up the entire field of aviation to look at the group dynamics involved in keeping an airliner safe. And I admit I was sucked in like everybody else, and I thought, oh, this is really cool. And I cannot tell you how exciting working with that group of people was. It was really fantastic.


And for those of us who aren't familiar with CRM, could you describe what that is, and what made it so different from how crews worked in a cockpit before?


Well, basically after World War II, there were a lot of pilots, and there was still a stream of them coming out of Vietnam and so forth, who were military pilots, who were basically fighter jocks. And their idea was, I'm the boss, I'm gonna do what I wanna do, and you know, <laugh>, the others can take a leap or something. And pilots would actually say things like that to their co-pilots: shut up, I'm in charge here. Okay, well, this was obviously a very dangerous condition. And so a lot of what crew resource management was about was getting the pilots to listen to their co-pilots, because in many cases there were actually conversations where the pilot had said, shut up and don't tell me about what's going on with the altimeter and so forth, and the next thing you hear is the plane hitting the ground, because they augered in. Well, so the whole thing about crew resource management is learning to use the intellectual resources you've got on the plane, and take full advantage of what everybody has heard or seen or thought. And so we've come a long way. I knew Bob Helmreich, who was the great advocate of CRM and went all over the world basically selling his ideas. And that has made a huge difference to aviation safety.


Amazing. And so you introduced me to the term technological maestro: high energy, high standards, great in the large, great in the small, and loves walking the floor. And you had also mentioned something else that gave me a similar sort of searing aha moment, and that was Rabinow's Law, I think number 23. Can you describe what that is and why you think it's so important?


Well, Rabinow was an inventor. He had 230 US patents and had basically seen and done everything in R&D. And he had a set of laws that came out of that, and one of them was number 23: if the boss is a dope, everybody under him is a dope, or soon will be. Okay, well, I don't know why this was a revelation to you, but it was certainly <laugh> a revelation to a lot of people, because that's the problem: lousy leaders tend to recruit other lousy leaders, because they're not gonna be threatened by people who are smarter than they are.


I love that, because for me it simultaneously describes those situations I've been in where everything was going great, fully enabled by that leader at the top, or things were going horribly wrong, fully enabled by that leader at the top <laugh>. Am I capturing accurately the implications of Rabinow's Law?


Yes. I <laugh> would be in jeopardy here talking about my former employments and some of the experiences I've had, so I'm not gonna tell you that. But all I can tell you is that that's so true.


You had also mentioned something that I found astounding. You know, I think when we think about the greatest experiences that we have working, where we have fully unleashed human creativity, we have a sense of enormous satisfaction; we feel engaged in our work. But then you described the opposite conditions in the Whitehall Study. Could you educate us all on what the Whitehall Study is and what it found?


Well, the Whitehall Study was basically about the relationship between health and status in the organization. Whitehall is of course where the British government does its day-to-day work, and there was a big study, that's why it's called the Whitehall Study, which basically showed that the higher up in status you were, the less likely you were to have a heart attack. Everybody in Britain has the same health system in principle, and yet the people on the top are less likely to have heart attacks than the people on the bottom. So the fact is that status is a very important indicator of your psychological and physiological health.


<crosstalk> And then you had mentioned that one of the implications is that in certain organizations, pathological ones, life is horrible for everyone except the leaders <laugh>. Is that it, that the whole organization is actually optimized for the leader? Is that a correct interpretation?


That's a correct interpretation. I think the thing that clued me into it was many years ago, I had a student whose father used to work at Fisher Body, and he said every day when his dad went to work, he got sick going to work. And Fisher Body was apparently a place where there were a lot of problems like alcoholism and suicide and so forth. It was a very punitive environment. This is probably where I originally got the idea of a pathological environment, but it wasn't the only one. I also <laugh>, I also taught a class, basically, which was right around the area where the Hydra-Matic plant was. This was a plant that GM had, and basically my students required counseling after working there <laugh>. Okay. And so I began to realize that there are some kinds of environments that make people sick. And I think, by contrast, generative environments tend to make people more psychologically together and physically healthy as well.


So we talked about the pathological organizations. As for the origins of your study of generative organizations, I found it genuinely surprising that you sought out specifically organizations like the Sidewinder project at the China Lake naval research laboratory. Could you talk about what it is about an R&D organization that you thought was especially suitable for the study of generative organizations?


<laugh> Well, the reason that I studied China Lake is because the story of Sidewinder is basically the little engine that could. Okay, so here is this small group of people, basically out in the middle of the Mojave Desert, literally, and they put together this missile that was better than everybody else's missile. Okay, well, I found that a really appealing story. So I said, what's the larger context of this? And I realized after a while that this entire laboratory, and it was big actually, the laboratory itself was about 5,000 people, was a huge skunk works. Okay. And I realized after talking to them that they routinely created miracles. And in fact, one guy said, you know, we always thought we could do one of anything over a weekend.




Can you imagine how threatening an organization like that would be to the ordinary R&D organization? They would take other people's systems and fix them. The Sparrow missile is a good example; it had a lot of problems <laugh>, and China Lake fixed it. Well, do you think the others who got their missile fixed were happy about that? They were not. And so, like Cinderella, you know, the high performers tend to get punished.


And what was it about the way that the groups worked in the Sidewinder project that marked them as so unusual and so worthy of study?


Well, first of all, they accomplished something that nobody else could do, and in fact, something that they had been told not to do. They were ordered not to create new missiles. And so they did it all on the sly, and they literally, essentially rolled this thing out at the end of this process, having disguised it as a targeting project and some of the other things that they used. And here was this thing that was better. And so the admirals would come out, and this thing would do amazing things, and they'd say, we have to have this missile, in spite of the fact that we told you not to do it. We have to have this missile 'cause it is so good. So you have to ask yourself, okay, what kind of culture creates that sense of engagement, that sense of excitement, that sense of the ability to do anything? And that's exactly what China Lake had.


That is awesome. So when I reread your information flow paper from 2013, it was actually startling to me just how powerful I found the last couple of sentences you wrote in summing up: culture is no longer neglected; information flow is of course only one issue among many in safety culture, but I feel it is a royal road to understanding much else. Can you talk about what that royal road is and where you think that royal road takes us?


Well, I've basically spent essentially my adult life developing this theory, or whatever you wanna call it. And I think the thing is that at the end, when I realized how many different things information flow was likely to predict, I said, oh my God, we have something here which is really quite amazing. And how did I know that it was amazing? Well, because what I would do is put this chart in front of other people, and they'd say, oh my God, I know where we are on this chart. Okay. So it had a lot of surface credibility. And, as you know, in DevOps you people did a big study, and what you found is basically that it does correlate with creativity as you go from a pathological or bureaucratic to a generative organization. It is amazing. And did I know that it would do all that stuff? No, I didn't. <laugh> <laugh>. It's just been delightful to see, in fact, as you do further studies, that it seems stronger and stronger.


Awesome. In fact, let me end by just stating that in every interaction I've had with you, I've learned so much, and I cannot overstate just how much impact you've had on the DevOps community, including that initial finding where we found that the Westrum organizational typology model was one of the top predictors of performance. So, from all of us, thank you so much, Dr. Westrum <laugh>, and I look forward to catching you later in the conference.