Virtual US 2022

Out of the Cyber Crisis - What Would Deming Do?

In 1982 Dr. W. Edwards Deming published a book called "Out of the Crisis," born of his frustration over the prior decade. He was 82 years old; he wasn't writing it as a get-rich management-consulting book. Dr. Deming wrote it as a stark warning for everyone: manufacturing, healthcare, government, and education alike.



Fast forward forty years, and imagine what he would say today if he were alive. My guess is he would be intensely interested in cybersecurity. More than just information technology, but cyber across all aspects of human infrastructure, for example the base of Maslow's Hierarchy of Needs: water, warmth, and food.



Over the past ten years, we have seen cyberattacks on water treatment plants, power grids, oil and gas, and food supply chains. The latest is a cream cheese manufacturer. I have studied Dr. Deming for over a decade, and I believe there is an exciting story to be told around his 14 Points and System of Profound Knowledge. These concepts are also the fundamental ideas behind Lean, Agile, DevOps, and DevSecOps.


John Willis

Distinguished Researcher, Kosli

Transcript

00:00:05

Hi, I'm John Willis. This presentation is called What Would Deming Do in a Cyber World, or Out of the Cyber Crisis. I'm known as Botchagalupe by most people. Most of the images in this presentation were actually generated by OpenAI, something called DALL-E, so pretty cool stuff. I've done a lot of things, and we won't waste a whole lot of time going through my background, but I'm probably most known for co-authoring The DevOps Handbook with Gene Kim, Jez Humble, and Patrick Debois. I also did a great book with Gene called Beyond the Phoenix Project. The green book in the middle is something I've done with a group on automated governance, which led us to the book that came out in September this year called Investments Unlimited. It's a great story, and I'll talk a little about that.

00:00:55

And this presentation is really based around a book I've been working on for ten years, and by Gene's encouragement I think I finally have it done. It's on its fifth draft and will hopefully be ready for publishing early next year. I've worked for a lot of companies. I just left Red Hat, and I'm now working for a company that focuses on DevOps automated governance called Kosli. I just wanted to talk briefly about the Investments Unlimited book. It's really exciting; it was written with nine authors, so a lot of industry expertise went into it. It's a novel about DevOps, security, and audit compliance in the digital age. I think you'll really enjoy it, and I've done a fair amount of presentations on that topic, but not today. So, even if you haven't heard of Dr.

00:01:46

Deming, you've probably heard Dr. Deming. Most notably, I think it's mandatory that all DevOps presentations have at least one Deming quote. I'm obviously kidding, but not by much. Here's a couple: "Learning is not compulsory; neither is survival." "In God we trust; all others must bring data." My personal favorite: "A bad system will beat a good person every time." And for you baseball fans, sort of the Yogi Berra one: "Every system is perfectly designed to get the results that it does." But if you dive into Deming's life, there are some really interesting things. He started out as a mathematical physicist. Imagine, this is 1923, the early twentieth century: he's learning physics during the second scientific revolution, quantum physics, and it has a lot to do with the way he thinks differently once he ultimately becomes a management consultant.

00:02:50

But the other thing that makes him so unique is that he was a boundary spanner. He found an early interest in epistemology, particularly pragmatism, an American school of philosophy, which really helped him think about knowledge. His weapon of choice was statistics, specifically something we'll talk about called analytical statistics. He was a professor at prestigious schools: NYU, Columbia, Washington University. He was also very interested in psychology, in how people are motivated, and we'll talk more about that. And he was probably one of the earliest systems thinkers. In my book I cover how he worked for the government during World War II on what were essentially dark projects with Norbert Wiener, some of the original cybernetics work. He was best known as an industrialist, probably because he's most famous for being sent over to Japan and being one of the people who helped create what they called the miracle in Japan.

00:04:01

And then for me, even though he died in 1993, if you read Deming's 14 Points he's probably one of the greatest DevOps thinkers. And as part of this presentation, he's a futurist, because the things he laid down over his roughly 70-year career are relevant, and probably more relevant, today when we talk about cyber. So, to set the stage for where we're at: Marc Andreessen, the founder of Netscape and now a famous VC at Andreessen Horowitz, said that software is eating the world. That was in 2011, and I think you'd be hard pressed to disagree with it. Our good friend Josh Corman, part of our DevOps tribe, says that software is infecting the world. It's not a negative; it just comes part and parcel with Andreessen's statement, right?

00:04:53

And he said that in 2015. I would say that software is actually now a little worse. It's worse than exposing credit cards or Equifax credit records, which is not good, but now it's messing with our basic hierarchy of needs. I would say it's screwing with Maslow: our health, food, water, and shelter. A point I make in my book is that Deming had this idea of the aim, creating a purpose. And if we think about just healthcare, research says that 5,600 hospitals in the US have zero cyber people. Not even a director of security: no CISO, no VP, no director. That's pretty scary. And what's even more scary is when we look at the devices, again, things that are affecting our lives, right?

00:05:45

Take pacemakers. There are about a million a year, an uncounted number out there, and a lot of them still have three-character hard-coded passwords. There's been a whole TV series, Homeland, based on the insulin infusion pumps, and Bluetooth: even Dick Cheney's doctors had to turn the Bluetooth off, because people can die. It's been proven that hackers can hack into these devices. Even worse are the infusion pumps in hospitals, where it's been proven in hackathons that you can turn what should be a dose delivered over 30 minutes into something like a one-minute dose and kill somebody. And then a lot of the radiation oncology systems moved to the cloud because they believed it was safer.

00:06:46

There's been some really interesting ransomware where attackers shut down entire radiation oncology systems. What that means for chemotherapy is that they literally have to shut down chemo for something like two weeks, and people's treatments are basically rescheduled. Scary stuff. And there's a sort of ransomware revolution going on. Schreiber Foods: at the end of 2021 you literally couldn't get cream cheese on the East Coast. Now for me, bagels and lox without cream cheese, that's a tragedy. But all kidding aside, nobody dies from not getting cream cheese. The point was that one ransomware attack on this one company, Schreiber Foods, basically shut down the East Coast supply during the worst time, November, the holiday season. And most people probably know about the JBS meatpacking attack.

00:07:39

That one was probably not as well known because they paid the ransom immediately. This next one is really scary: hackers tried to poison the drinking water at a Tampa Bay area water treatment plant. Here's the really interesting thing: it was about a week before the Super Bowl in Tampa, and the plant was less than ten miles away. They were literally trying to increase the sodium hydroxide, lye, basically to lethal levels. Fortunately, an admin caught them using remote-viewer software. Then there was the Colonial Pipeline breach. And this one's pretty scary too: earlier this year there was the first ransomware-related litigated death. A woman came into a hospital, another scenario where ransomware had shut down radiation oncology. They couldn't do any X-rays, and they didn't inform the woman. She had a complicated birth; had they had X-ray, they probably would have taken her to a different hospital.

00:08:31

The baby died six months later, and it's the first litigated ransomware death. So what does Deming have to do with all this? I think we have to take a look at what Deming is all about. Deming spent literally his whole life building up what he called a System of Profound Knowledge, which he finally documented in his last book, The New Economics, published in 1993, the year he died. He considered its four elements a lens for understanding complexity, and he said you had to apply all four together to really understand problems, opportunities, and improvements. They were as follows. First, theory of knowledge. This comes from epistemology: how do we know what we know?

00:09:17

How do we really know what we believe we know? It's the scientific method. That's coupled with the theory of variation, which is how we understand what we measure; it's a form of measurement, but more importantly a form of understanding, and I'll give you some good examples. And even once we have those two, we actually need to understand how people are motivated. If we want people to change or make an improvement, and humans are always involved, we have to understand the psychology: the cognitive biases, the intrinsic nature of how people work. And again, he was a systems thinker, so almost like the fifth discipline, the fourth element of Profound Knowledge is appreciation for a system, or systems thinking, where you integrate all four together. And like I said earlier, Deming's weapon of choice was statistics, specifically analytical statistics.

00:10:14

One of the things he says in a great book called The Essential Deming is that the job of the statistician is to work with experts in the subject matter to help them solve their problems, and it is the responsibility of the substantive expert to decide, with or without that help. I won't read the whole thing, but in other words, it's the statistician's job to provide the data to the subject matter expert; the subject matter expert can then use that statistical data to go discover where the opportunity for improvement is. And I'll give you some good examples here. So let's take the System of Profound Knowledge for IT risk, something I've been working on heavily. Let's look at three types of controls. One is something we might do in our pipeline called container scanning: image scanning to make sure there are no vulnerabilities in a container image.

00:11:07

Then let's look at another area where we want certain applications to have a certain level of unit test coverage; maybe application A requires 70% unit test coverage and application B requires 75%. And third, information leakage. These are just three of the many controls you might see in a software supply chain or pipeline. Let's look at the container scanning control first. One of the things Deming talked a lot about is misunderstanding the difference between enumerative statistics and analytic statistics. This is really important, because I will contend that in IT we over-rotate on the enumerative. Enumerative is the how many; analytic is the why. So here I took a 17-week series, week by week, of the number of container scan failures: cases where the scan was gated and the failure showed up in a record.

00:12:10

If we look at the left-hand side, that's a distribution chart, which is a classic enumeration of how many, and it gives us some interesting information in terms of the count in each week. But we don't get what we see on the right in what's called a control chart, otherwise known as statistical process control, which is a more analytic approach. It may look the same here, but you have to know what you're looking for. At first glance we're asking: does it have a normal distribution, or more importantly, what Deming would call common cause variation? What you're looking for is randomness.

00:12:58

You notice the average there, the mean, and we're seeing points above and below it, because there's variation in all processes, right? So this is basically the first 17 weeks. But here's where it gets interesting. If I take all the data for 25 weeks and again look at the enumerative side, it's mildly interesting, because I see I might have some outliers there on the right, the 28, 29, and 30. And the thing is, this is the same data for both the distribution chart and the control chart; same exact data, no change. If I look at the control chart, up to week 17 I see what looks random, which is what we want to see. But then, starting at week 17 to 18, there's this pattern.

00:13:44

And we don't want to see patterns in analytic statistics. The pattern here is that it just keeps increasing; the errors are increasing. And remember, Deming said it's not the statistician's job to solve the problem. The statistician says: here's your data. The enumerative view didn't really tell us anything, but on the analytic side we look and see a couple of interesting things. We've got three points above the upper control limit, meaning more than three standard deviations out, basically three sigma, which is interesting on its own because that's roughly a 0.3 percent probability. And then there's this increasing trend. So we hand that to the subject matter expert, and in this case it turns out a new development team had been added, and they weren't following any of the rules: they were going out to Docker Hub.
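The two signals just described, points beyond three sigma and a sustained increasing run, can be sketched in a few lines of Python. The weekly counts below are invented for illustration (the talk's actual data isn't shown), and a production XmR chart would estimate sigma from the average moving range rather than the plain sample standard deviation:

```python
# Minimal control-chart sketch: flag out-of-control weekly scan-failure counts.
# Data is invented for illustration.
from statistics import mean, stdev

weekly_failures = [12, 9, 14, 11, 10, 13, 8, 12, 11, 10,
                   13, 9, 12, 11, 14, 10, 12,          # weeks 1-17: stable
                   16, 19, 22, 25, 27, 30, 33, 36]     # weeks 18-25: trending up

baseline = weekly_failures[:17]            # establish limits on the stable period
center = mean(baseline)
sigma = stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

# Rule 1: any point beyond three sigma is a special-cause signal.
beyond_limits = [wk + 1 for wk, x in enumerate(weekly_failures)
                 if x > ucl or x < lcl]

# Rule 2 (a common run rule): six or more consecutive increasing points is a trend.
def trend_weeks(series, run=6):
    signals = []
    for i in range(run - 1, len(series)):
        window = series[i - run + 1 : i + 1]
        if all(a < b for a, b in zip(window, window[1:])):
            signals.append(i + 1)  # 1-based week number ending the run
    return signals

print(f"UCL: {ucl:.1f}")
print("weeks beyond limits:", beyond_limits)
print("weeks ending an increasing run:", trend_weeks(weekly_failures))
```

With this made-up data, the first 17 weeks stay inside the limits, and both rules fire once the trend starts, which is exactly the "go look for the why" signal handed to the subject matter expert.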

00:14:33

They had their own repositories of container images and their own way of building containers, and whatever is in the base image gets included. And so there was this big increase in vulnerabilities found. But here's the point: it took analytic statistics to let us know where we needed to go look for the why. On the unit test side, I have another example, and this one has to do more with the real core of Deming's message. You've probably heard of Plan-Do-Study-Act, right? That's a form of theory of knowledge; in fact, it's the scientific method. Mike Rother calls it Toyota Kata. It's all really the same thing. So let's take a look at this. In this case we were taking the percentage, per week, of unit test coverage across all the applications.

00:15:26

And we were running at a mean of 50%, so not great. We really wanted to increase the percentage of unit test coverage, so we decided we were going to make a change. Let's say a vendor comes in and says: hey, the TDD training you have is probably not increasing your efficiency as much as you want; why don't you try our tool? Instead of just buying the software and assuming everything worked better, we use analytic statistics. What we do here is run another 25 weeks, and in weeks 16 and 17 we execute our plan, our Plan-Do-Study-Act, and make sure we can see the results.

00:16:21

So we do our Plan-Do-Study-Act. Our plan is to take an improvement sprint: two weeks where we try out this new TDD package. Then we study the results, and it looks like we've possibly increased the coverage percentage, because now we see a trend starting around week 21 and going to week 25. But here's the thing: on the Act side, we're not quite sure yet, so we don't just tell everybody. We've done this small test, so we onboard a couple more teams, sort of a canary for the new testing approach, and indeed we find that the mean is no longer 50; it's 64.
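The Study step of that PDSA cycle can be sketched with made-up weekly coverage numbers. The 50% and 64% means come from the talk; the individual data points and the crude three-sigma comparison are assumptions for illustration:

```python
# PDSA "Study" step sketch: did the new TDD practice shift mean unit-test
# coverage? Weekly coverage percentages are invented for illustration.
from statistics import mean, stdev

before = [48, 52, 50, 47, 53, 49, 51, 50,
          46, 52, 49, 51, 50, 48, 53, 49]   # weeks 1-16, mean ~50%
after  = [57, 60, 63, 65, 66, 67, 66, 65, 67]  # weeks 17-25, after the change

shift = mean(after) - mean(before)

# A crude signal test: is the shift bigger than the three-sigma band of the
# baseline? (A real study would put the new points on the control chart.)
band = 3 * stdev(before)
significant = shift > band

print(f"before: {mean(before):.1f}%  after: {mean(after):.1f}%  shift: {shift:+.1f}")
print("treat as a real improvement?", significant)
```

If the shift stayed inside the baseline's natural variation, the Act step would be "try another experiment" rather than "roll it out", which is exactly the point of studying before acting.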

00:17:17

So we actually have increased it, and in this way we can keep rolling it out to more and more teams and see if the approach holds. If it didn't work, we'd know from the data, and then we could try another experiment. So it is a form of experimentation. One of the things I'm fascinated by is that these techniques, analytical statistics, are a hundred years old. They started with Dr. Shewhart at Bell Labs and the Western Electric Hawthorne plant. They've been used to make toasters and cars and to run nuclear power plants, and we just don't really use them in IT. So I think there are a number of opportunities to use them in modern governance practices like the ones you just saw, SRE practices, a DevOps data lake, all the things we get from DORA, and even adaptive skills liquidity, right?

00:18:07

That TDD and unit test case was a good example of figuring out that the status quo was not good enough; even though it was all in control and random, like we wanted it to be, we still wanted to improve. You can think of that even as a skills question: maybe we need more training, or maybe that team is not the right team for the job. And a couple of things when we start talking about the cyber crisis. If I look at the famous Knight Capital incident under the lens of Profound Knowledge (most people listening probably know about this): in 2012 they put in a new program for high-frequency algorithmic trading on the NYSE, called their Retail Liquidity Program.

00:18:55

They deployed some new code that was supposed to go out to an eight-node cluster, and the sysadmin, it sounds like, deployed it manually and only hit seven of the eight servers. On the eighth server there was some old code that had been used for stress testing, called Power Peg. Power Peg was designed, get this, to buy high and sell low. It was never supposed to run in production, but through a mismatch of command flags it got turned on on that eighth server. And there were literally millions of trades; that buy-high, sell-low Power Peg code executed over $21 billion of bad trades. It lost $440 million in less than 45 minutes.

00:19:46

They were literally out of business in 45 minutes. But here's the interesting thing: the SEC sent a cease-and-desist to Knight Capital, and I put it under the lens of the four elements again. Look at what's in quotes there, "a second technician to review the code installation." This again is how do we know what we know, theory of knowledge. Did they really know? Were they thinking about Plan-Do-Study-Act? And the theory of variation: did they have sufficient controls to orderly install new code? Here, a pairing session or a review on a pull request would have been something to better understand the why, or what the code was going to do.

00:20:33

With baselining, they might have seen that there were anomalies. Then psychology: did they have a reasonably designed guide for employee response? This is psychological safety 101. Was somebody able to raise their hand and say, hey, maybe we ought to push this delivery date? This was not too dissimilar to the Columbia shuttle disaster, where there was back pressure to deliver on a certain timescale. There were a lot of people who didn't feel empowered to stop, to pull the andon cord. And then again, the SEC said there weren't adequate descriptions of risk management controls; their thesis was that there really wasn't a systems approach to any of this.

00:21:19

So here again, under the lens; I do a much more detailed version of this in my book. Earlier I said that if you needed to be convinced that Deming's 14 Points are DevOps and DevSecOps, you don't have to look much further than the points themselves. I'm not going to go through all 14, though I do go through all 14 with examples from modern technology, modern infrastructure, DevSecOps, and cyber, but let's take a look at a couple as time permits. I talked about Schreiber Foods. I think this is interesting too: in A Seat at the Table, Mark Schwartz asks whether a company is serious about information technology if the CIO doesn't have a direct seat at the table. Well, in the case of Schreiber Foods, they didn't have a CISO and they didn't have a director of security.

00:22:05

I mean, they didn't have a VP. They had maybe a director-level person, who was really a hardware person. Periodically I go back and check, and even after the breach they still don't have more than that hardware director. You're just not taking it seriously; it's like the 5,600 hospitals. Next point: never stop improving. Shannon Lietz, who's been an incredible mentor for me, does something called adversary analysis. There's always this never-ending cat-and-mouse game with adversaries, and a lot of the reactive-mode policies are basically: we scan, we look for vulnerabilities. That's table stakes. What she does instead is adversary analysis: how often do they come? How long do they stay? She really uses a theory-of-knowledge approach, in that she'll experiment on things and then see what the data says.

00:23:00

Did the adversaries stay for less time after she made this change or added that thing? Next: don't manage by the numbers alone. I think this is where Deming would be really upset with all of us: the way we manage incidents. We basically categorize incidents as P1s and P2s, and the truth is, folks, we do it based on napkin math. That's a P1, or, okay, it involved a critical service, but was it really critical just because a critical service was involved? Did it really start at 9:05, or is that just when somebody wrote it down on a napkin? This is a classic case of enumerative statistics: we just count these up. Here are a couple of points to consider about what Deming would love.

00:23:46

Deming would love John Allspaw, by the way, because Allspaw says incidents are unplanned investments. And look, most organizations don't even have enough bandwidth or time to process all the P1s; they ignore P2s and P3s, and there's just a wealth of opportunity there. Deming would say: stop using enumerative statistics for this data. Use analytic statistics. Stop arbitrarily creating abstraction layers for P1s versus P2s, and just let the patterns show up in the analytic statistics. There's a wealth of opportunity here. Next: no silos. There's the famous Equifax breach. The point there is that there was such a Conway's Law problem: the CISO reported to the chief legal officer. And when the House of Representatives did a review, they asked under testimony: did you think it was odd that you as a CISO would be reporting to the chief legal officer?

00:24:44

And the response of the CISO was: I figured they knew what they were doing. Later, when asked why she didn't report the breach to the CIO, it was: I didn't think of it. Of course she didn't think of it; she reported to the chief legal officer. The organizational chart mandated how they were going to react to that breach. This next one is probably Deming's best-known position: Deming hated quotas, hated MBOs and MBRs. And by the way, folks, put your seatbelt on for a second: he would have hated OKRs. Basically, Deming was a proponent of the method. He cared about the results, but figured that the only way you could truly get results in a repeatable fashion is if you understood the method it took to get them.

00:25:36

He would say: by what method? There must be a method to achieve the aim. He said the problem with the Western style of management is that it requires managers and leaders to focus on outcomes: management by numbers, MBOs, work standards, specifications, zero defects, performance appraisals. They must be abolished. And again, to me, OKRs are really no different. The thing Deming would really focus on, and it's on a following slide, is what Lloyd Nelson, a student of Deming, said: if you can accomplish a goal without a method, then why didn't you do it last year? And I would add: there's no guarantee you'll do it next year either. Goals for goals' sake, without understanding how you achieve them, don't work; what we want to do is teach people how to achieve. The other thing I would say is that when we set up goal mode, or results mode, or objective mode, we force people to react in a few predictable ways.

00:26:42

In all truth, in my travels, and I travel to a lot of companies and do a lot of qualitative analysis with a lot of people in big organizations, there are a few recurring results. One is people lie: it's the watermelon effect, red all over in the middle and green on the outside. Or people tell the truth, and their status to leadership is always yellow and red. Or they find workarounds. I recently talked to somebody who uses a risk analysis tool for a risk scorecard; one team figured out how the scoring works, and they just feed the scorecard application exactly what it wants to hear, so they're always basically A's and B-pluses. In other words, the method is the old proverb: give a person a fish and you feed them for a day; teach a person to fish and you feed them for a lifetime.

00:27:36

This is really important. I never heard Deming himself use the phrase, but there's a concept called management by means; you can Google it, and it really is pure Deming. It's about the means, the method, of how you get there. So, Deming wouldn't have hated everything we do today. I actually think Deming would have loved SRE, and not to go into full SRE madness, but SRE done right Deming would have loved for a number of reasons. If we think about an SLA, that's really a systems thinking approach. An SLO really is our measurement, our why, our understanding, our theory of variation. And our SLIs are our theory of knowledge, right?
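As a concrete sketch of that SLI, SLO, SLA layering, here's the standard error-budget arithmetic with invented numbers (a 99.9% SLO over a million-request window); real SLIs would come from the system's own telemetry, not hand-entered constants, which is what makes them hard to game:

```python
# Sketch of the SLI -> SLO -> error-budget chain.
# Numbers are invented for illustration.
total_requests = 1_000_000   # SLI denominator for the measurement window
good_requests  = 999_100     # requests meeting the latency/success criteria

slo_target = 0.999           # SLO: 99.9% of requests should be "good"

sli = good_requests / total_requests           # the measured indicator
error_budget = 1 - slo_target                  # allowed failure fraction
budget_spent = (1 - sli) / error_budget        # fraction of budget consumed

print(f"SLI: {sli:.4%}")
print(f"error budget spent: {budget_spent:.0%}")
```

Here the system itself reports that 90% of the error budget is gone, and that signal, not a manager's napkin math, drives the conversation about slowing down or improving.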

00:28:22

The indicators. And here's the other thing about this, the psychology: it's hard to game SRE if it's done right, because it's decoupled from the person or the team. It's the system that generates these performance indicators; the SLO encapsulates the SLIs, and the SLA encapsulates the SLO. So we remove the MBOs and OKRs from teams and people, and we let the system do the work. We take out the human factor of people having to game the system, work around the system, or tell the truth or a lie. And then KLOCs, right? I don't need to say much about this, but it's another way of forcing people into workarounds. If you measure thousands of lines of code, I'm either not going to clean up my code, because I don't want to affect my KLOC count, or I might not want to write new code.

00:29:13

So there are a lot of problems with the KLOC system; you'll have these slides, and you can read them. Like I said, I have a book coming out early next year. I've gotten some good advice to rename it Why Deming Matters, but right now the prototype name is Profound. If you grab that QR code right now, and I hate to be salesy, but I really want the first 200 people to sign up for the waiting list. You don't have to pay to sign up; you'll have to pay when the book becomes available, but I will send a signed, special copy to the first 200 who go. If you hit that code, it'll take you to the link. Anyway, I know I went kind of fast, and I think there was a lot of information here. I'm John Willis. Most of my blogs and everything are at profound-deming.com. You can also go to kosli.com and see what I'm up to there. That's Botchagalupe, there's my LinkedIn, and john.willis@kosli.com. Thank you very much.