San Francisco 2015

Pain Is Over, If You Want It

Technology is always the easiest part of any problem.


This was true of Google in 2005, when Mike Bland joined the Testing Grouplet’s effort to drive adoption of automated testing throughout a highly successful company as its organization and systems increased in complexity at an alarming and unstoppable rate. This was true in late 2013, when the Healthcare.gov crisis led to a stunningly successful recovery after private industry experts were given clearance to fix the technical issues. It is also true of the U.S. federal government today, as Mike has joined 18F as part of the effort to modernize how software is developed and procured, and to steer the culture towards maximum transparency, autonomy, and collaboration.


This talk will outline Mike’s experiences at Google that shaped his outlook and honed his organizational skills, and describe his efforts to capitalize on the opportunity produced by the Healthcare.gov recovery to effect broad cultural change throughout the federal government.

MB

Mike Bland

Practice Director, 18F

Transcript

00:00:04

<silence>

00:00:09

Thanks.

00:00:12

Thanks everybody. Let's get right to it. You might remember in October 2013, healthcare.gov was being crushed under the load, which on the one hand is good news, because there's actual demand for this service, but healthcare reform was about to fall apart because of a website. Seems like a ridiculous proposition. But fortunately, a month later, they opened the doors so that folks from industry could come in, do what they do, and lead the recovery effort. And so by the time enrollment closed at the end of April 2014, they had not only enrolled more than the 6 million people they were shooting for as a revised target, and more than the 7 million they had as an original target; they ultimately accumulated 8 million people enrolled in the program. So this website recovery was a dramatic success, thanks to a very dramatic change in the tech culture of healthcare.gov, because once the administration decided that it wanted to fix the problem, rather than just adhere to decades of conventional wisdom around contracting and procurement, it fixed the problem.

00:01:19

But the big question is, what happens next? How does it stay fixed? And is there a way to build on that momentum to try and reform how IT in government is done across the board? And can they learn lessons from experiences such as the, uh, the drive to, um, promote automated testing adoption within Google? And how did I get mixed up in all this stuff? Because at the time, I had actually quit the tech industry. I had left Google and the industry in, uh, September 2011. I had gone to Boston, enrolled in Berklee College of Music. I was completely off the radar, but I ended up, uh, getting a call from an old friend, or rather an email. Um, my buddy Jason Huggins found me on LinkedIn and he sent me an email, and I'll read part of it to you. He said: as you may or may not be aware, I helped late last year with the healthcare.gov website rescue. It's gonna take a while before the government gets good at software development and testing, and they're gonna need a culture change. But the White House gets it now, that fundamental change is needed in how they create and test software, and more importantly, that change imposed top down is likely to fail. Again, no pressure. I respect your desire to focus on other things, but there's this whole <laugh> your country needs you thing, right? <laugh>

00:02:39

<laugh>.

00:02:42

Yeah. So I grew up in a military town, so that was like, uh, you know, that was the nuclear option. I was done with music school and I joined 18F, my current team, uh, in November 2014. But to understand why he recruited me, let's go back to Google in '05. So obviously it was an extremely successful company already; everybody wanted to work there. We must have been doing something right, and in a lot of ways, we were doing a lot of great things right. But then there was another side that, um, you know, most people didn't really see: the threat was we were gonna be crushed under the weight of our own success, because obviously we were hiring more people, writing more code, building more products, and everything became increasingly more complex. And even though we tried to hire the best developers we could and give them as much support as we could, there comes a point where all the brainy heroics in the world cannot overcome sheer scale.

00:03:38

And so, you know, we were running the risk of reaching a point where we would eventually hold ourselves back, slow ourselves down, when fear of change and all the things that might go wrong would stifle our courage and innovation, and it would lead us to miss opportunities, to become bureaucratic and petty and mediocre and irrelevant. And there was no service, no product, ah, come on iPad, uh, that was more visibly susceptible to these forces than the google.com homepage. Uh, and that was managed by the Google Web Server team, or GWS for short. You're gonna hear me say GWS a lot. Um, and I was not a member of this team, but I was very close to the folks who were on that team, as will become evident. And so GWS, back in the day, was not the glamorous place to be inside Google, 'cause it was basically a dumping ground for all the unrelated changes from all these other teams developing search features, you know, who had to dive into it all the time.

00:04:42

So just imagine if you break google.com, or you're on the team that allows google.com to break. So we're talking about, you know, thousands of queries per second that are slow or return bad results or, you know, who knows what. And those thousands of queries per second quickly build up into billions of broken promises. And we're not just talking about lost revenue, but damaged trust. And that's very hard to recover. But things seemed to be mostly working. And a lot of people thought, you know, what do we need to fix? Like, look at our stock price. Um, and you know, some people, they kind of lived the Google life and, you know, they believed in themselves. And then there were latecomers like me who were just living in mortal fear that we were gonna be revealed as the frauds that we actually were.

00:05:33

And so, you know, there was an intense pressure to, like, deliver something and prove yourself. And bear in mind also that, where do I point, uh, testing back then wasn't the common experience and the common-sense thing to do that it seems to be to many of us now. Uh, a lot of people just had no experience with unit testing or writing testable code, and they really, you know, they didn't have the capacity to really think about it. And the development tools we were using at the time were really starting to get crushed under the load. And there was some truth to the proposition that people didn't have time to test. The big one, though, that we were up against was trying to basically prove the value of a negative: that there's value in preventing a problem rather than fixing it after it hits.

00:06:26

And, well, sorry, <laugh> forgot my cue. Uh, now of course after the fact, when things like, say, goto fail and Heartbleed happen, we can make a case saying, you know, it's conceivable automated testing could have prevented a case like this. But people still tend to think that it can't happen to them. And this is especially true when people's value system relies almost entirely on objective measurements. And I use value system and priority structure interchangeably with the concept of, um, like a cultural norm of what's most important to an organization, or what I sometimes call a corporate religion. And I don't want to condemn Google's data-driven management style and decision-making process. It's actually really, really good. It's obviously served the company well, and it served, by extension, society very well. But it did make it extremely difficult for us to make the case inside the company for the value of an investment in automated testing. And we couldn't really blame them. Uh, 'cause like I said, they didn't have the experience outside of the status quo at Google, where things were very slow and brittle. Uh, and they were under constant delivery pressure on top of that. And because we couldn't communicate using the language of data, people just didn't understand why we were so passionate about it. And when people barely had the time to build their code, they didn't have the time to learn about testing. So we had to find other ways to achieve our goal.

00:07:59

But first, you know, let's step back a little bit and think about culture change more broadly. And, uh, you know, one thing I will say is that I don't believe culture change happens like this, 'cause there is nothing worse than Cartman with authority <laugh>. Like, no matter how good the idea, how pure the intention, if somebody high up in the organization starts issuing orders, it can backfire. It can actually hinder adoption. And fortunately, uh, many of the Google execs had already learned this lesson at Microsoft and Bell Labs and Digital and Sun. And it also doesn't happen like this, with some rockstar guru ninja savior type coming in to, you know, save the day. And I want to emphasize the point that it's wasteful at best and dangerous at worst to think that change is only possible through the magic and charisma of a select few.

00:08:49

Now, power and mythology are not bad things, but they require cultivation and care so that they produce something that becomes repeatable, basically a model; not because there's any way to exactly repeat the steps other people have taken in the past, but you can at least frame it and use those lessons to inspire your present course. And despite the problems, the limitations that we had, uh, inside the company, we actually had a lot going for us by virtue of the existing environment at Google. We had access to information and tools that supported that: we could see who was working where, anywhere in the company, and what they were working on; we had this documentation culture. Uh, we were empowered to try new ideas and to try to, you know, prove their value to the company, most notoriously through the concept of 20% time, where everybody had up to a day a week to experiment.

00:09:44

And we also had a very, you know, startupy culture for a company that big, and something that we called grouplets, which were basically, uh, people pooling their 20% time to collaborate on an issue that affected the entire company. And we also, without realizing it at the time, kind of slid into the crossing-the-chasm model, which you might be familiar with from Geoffrey Moore's book. And so, uh, the idea here is that getting the right message to the right people, the right way, in the right order is key. And, you know, all the way on the extreme left, you have the innovators and early adopters, or who I like to call the instigators. And so it's up to them to kind of build that bridge across the chasm to reach the majority, so that the rest of the population can eventually, you know, adopt the initiative.

00:10:35

And I'm gonna return, you'll see this little bridge and what it's made of in a few minutes. But before I do that, let's get back to GWS. So, uh, the tech lead of the GWS team, Bharat Mediratta, believed that automated testing could cure a lot of the problems with GWS related to complexity and fragility. And so he had the team take a hard line: they were not gonna accept any more changes to GWS without accompanying tests, from anybody. And they set up a continuous build, of course, and, you know, were religious about keeping it passing. They set up coverage monitoring, they made sure their numbers were, you know, up and to the right. And they actually established a written policy and guidelines for writing tests for GWS that they insisted everybody abide by, both inside the team and outside.
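For a sense of what a hard line like that can look like mechanically, here is a minimal sketch of a pre-submit check that rejects code changes with no accompanying tests. It is only an analogy: GWS enforced its policy through code review and Google's internal tooling, and the base branch name and the "_test.py" naming convention below are assumptions for illustration.

    # check_tests_accompany_change.py -- an illustrative sketch of mechanically
    # enforcing a "no changes without tests" policy in a pre-submit check.
    # The "origin/main" base branch and the "_test.py" naming convention are
    # assumptions; GWS's real policy was enforced via review and internal tools.
    import subprocess
    import sys


    def changed_files(base="origin/main"):
        """List files changed relative to the base branch."""
        out = subprocess.run(
            ["git", "diff", "--name-only", base, "HEAD"],
            check=True, capture_output=True, text=True,
        )
        return [line for line in out.stdout.splitlines() if line]


    def main():
        files = changed_files()
        touches_code = any(f.endswith(".py") and not f.endswith("_test.py") for f in files)
        touches_tests = any(f.endswith("_test.py") for f in files)
        if touches_code and not touches_tests:
            print("Rejected: code changes must be accompanied by tests.")
            return 1
        return 0


    if __name__ == "__main__":
        sys.exit(main())

Wired into a continuous build, a check along these lines turns the written policy into something every change has to pass, which is the spirit of what the GWS team insisted on.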

00:11:22

And this was not the most popular policy in the company at the time. Um, but in the end, it helped: GWS was able to turn a corner. They were able to get to a point where they could confidently integrate a large number of unrelated changes from many different projects coming to them and maintain a brisk release cycle. And when new members joined the team, they were very quick to actually make productive contributions despite the complexity of what they were working on, because of the confidence that came from this high degree of coverage, and because the code was in such good shape in the process.

00:11:57

So ultimately, this policy, this radical policy, enabled google.com to expand its capabilities very rapidly in the middle of a very fast-moving competitive landscape. And it goes without saying that, obviously, GWS is the model team: like, you wanna learn how to do automated testing, look at GWS. But the problem was, it was still just one relatively small team within a large and growing company. And so we had to find a way to amplify its voice and its message and build that bridge across the chasm. So what happened was Bharat teamed up with another Googler named Nick Lesiecki, and they started the Testing Grouplet. Uh, and I eventually became one of the leaders of this grouplet after they handed it off. Um, and we had very little budget, zero authority, but we had all of the creativity at our disposal to attack the problems that were hindering automated testing adoption from, you know, fresh angles.

00:12:56

And we had the GWS experience that we could rely upon as a model. So we did a whole bunch of stuff that I'm gonna run through quickly. We did a lot of partnering with EngEDU, which was our internal, I still say our, um, which you know, is Google's internal training organization that maintains things like the, uh, the new-hire lectures. And so we had a lecture and a lab so that at some point during everybody's first two weeks, they would be introduced to unit testing. They helped us cultivate these self-guided training materials called Codelabs so people could work through examples and get a feel for the tools and the techniques. And of course, they helped us organize internal tech talks and helped us bring in speakers from outside the company as well. And we worked very, very closely with our internal build tools and testing tech teams to try to find a way to reduce the friction that created that whole 'I don't have time to test' excuse. But of course the biggest thing we're known for is Testing on the Toilet.

00:14:00

So if you're not aware, this was a publication that we just took the initiative to start putting up in every bathroom in the company every week. And we were able to just kind of incrementally increase the degree of knowledge and sophistication when it came to automated testing all through the company. Um, and I chose this episode to post here not just because I happened to write this one, but because it also encapsulates two other major initiatives of ours. The Test Certified program was a roadmap, inspired by GWS, uh, to do two things. First, we hacked the culture a little bit. We said, do these tasks, you'll achieve these levels, and we will put your name on the ladder, right? So they had something to measure themselves against, and they thought it was great. Um, but secondly, it also gave people a path to overcome that big scary obstacle of, like, where do I get started?

00:14:51

So we told them: level one, set up measurements, continuous builds, coverage bundles, um, and label any tests that you know to be flaky so that they can be, you know, singled out. Uh, level two was establish a written policy for your team, no changes without tests, et cetera, and set some low-end goals. And then once the team was really bought in and feeling the rhythm of this, level three was where you set some, you know, far-out high-end goals that really stretched your capabilities. And we recruited a lot of volunteers who were passionate about testing across the company, especially our software engineers in test. And these, uh, these people would act as mentors to the teams participating in the program, and they would provide advice, and they would also be the ones to validate that team X reached level Y on the Test Certified ladder.
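As a present-day illustration of those level one tasks (a minimal sketch using pytest and the pytest-cov plugin, not Google's internal tooling and not part of Test Certified itself), here is one way a team might label known-flaky tests so they can be singled out. The "flaky" marker name and the quarantine behavior are assumptions for illustration.

    # conftest.py -- a minimal, illustrative sketch of "level one"-style hygiene
    # with pytest; the "flaky" marker name and quarantine behavior are assumptions,
    # not Google's internal tooling or part of Test Certified itself.
    import pytest


    def pytest_configure(config):
        # Register a marker so known-flaky tests can be labeled explicitly.
        config.addinivalue_line(
            "markers", "flaky: known-flaky test, quarantined from the default run"
        )


    def pytest_collection_modifyitems(config, items):
        # If a marker expression was given (e.g. `pytest -m flaky` to exercise
        # only the quarantined tests), leave the selection alone.
        if config.getoption("markexpr", default=""):
            return
        quarantine = pytest.mark.skip(reason="quarantined: labeled flaky")
        for item in items:
            if item.get_closest_marker("flaky"):
                item.add_marker(quarantine)

A test then gets singled out just by tagging it with @pytest.mark.flaky, and a coverage floor can come from running something like pytest --cov --cov-fail-under=80 via pytest-cov, so the build fails whenever measured coverage slips below whatever number the team agreed to.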

00:15:38

And eventually, with this framework, we realized we had a strategy. We could take this framework and say, we want to get every project in the company to Test Certified level three; even if they're not actually in the program, we want them to operate as though they're at that level. So for the projects that needed a little more hands-on help, uh, we created the Test Mercenaries, which was another group that I was a member of. And we were like internal consultants. We would go into a project, spend a few months working on their code, uh, using our tools and techniques to show them, you know, how to do these things. And we used Test Certified both as a guide to what we were doing and a goal for the team to achieve. And again, working closely with the tools team, it was a very tight and productive feedback loop, where we would try the latest and greatest, get real-world experience in some challenging projects, and feed it back in.

00:16:31

And that drove the innovation that produced the toolchain that, uh, you know, Google has today, that, you know, enables enormous amounts of builds and tests to be executed every day. And so we also did this other thing called Fixits, which were like, you know, these kind of informal 'hey everybody, let's do this thing today' events, because we really needed to, um, address these important-but-not-urgent issues. They were completely grassroots; they were not mandated from on high. And, um, you know, we could give people little tasks like, write more tests for your project, fix the tests that are broken or flaky, and then eventually, we'll do this to climb up the Test Certified ladder. And then when we got to the point where the tools were really in shape, it was a great way to get them out to everybody in a very short period of time.

00:17:20

And the power of this came from the fact that we had, like, these very concrete goals. They were time-boxed. It created a sense of urgency that produced a critical mass of activity. And so every time we did one of these things, we just ratcheted up the state of the art in terms of the tools and the techniques, and our entire culture-change mission would reach a new plateau. And plus, they were fun. There was all this energy, and we gave out free stuff. And if you know anything about Googlers, just know that they love free stuff.

00:17:49

So this whole thing took about five years, and I won't go into the whole chronology, but let's see how the different pieces that we came up with fit together into this bridge that we built across the chasm. So this model, uh, I borrowed it from, uh, a fellow ex-Googler, uh, named Albert Wong. He did a, uh, two-week sprint with, uh, Citizenship and Immigration Services in July 2014, and produced a talk about his experiences. And he introduced this model, and I thought it was great; I changed the name. Um, I wanted something a little more, you know, like a splinter that would stick in your head. But, um, I just thought it was a brilliant way to delineate the different functions that you have to cover in order to get that initiative from one side of the chasm to the other. And it's phrased in terms of the needs of the majority that you're trying to reach.

00:18:45

And it also has this really nice linear quality to it, where you can clearly see that some activities are more dependent on you doing the work, but eventually the goal is to get people to be able to do it for themselves. So, you know, obviously things like the Mercenaries and the tools teams were hands-on. Test Certified served many functions, but it especially locked into the validation need. Uh, we produced all kinds of material and saturated the environment with Testing on the Toilet and everything within EngEDU. Uh, Fixits were these big high-energy things. And we'd also give out, like, these build orbs that would sit there and glow green or red, and the whole team could see if the build was broken or not. Um, and I think people started forming interesting attachments to these things. But, uh, <laugh>. And then of course, you know, at our core we were a community; you know, we were trying to work with people to help them.

00:19:35

And then when we enlisted the help of the mentors, that really scaled our, you know, our scope and our reach. And then finally, we had two big Fixits. In January 2008, the Revolution Fixit was the first time we put the modern, uh, toolchain in everybody's hands, and that took away the 'I don't have time to test' excuse. And two years later, we rolled out the Test Automation Platform in March of 2010, built upon that toolchain. And it was so fast and so accurate that most build breakages that affected multiple projects would be reported and fixed before most people even noticed.

00:20:14

So what did this enable? I'm gonna use some actually old-by-now numbers here, from a fellow named Aaron who, you know, I actually did not know, but: 15,000 devs and 4,000 projects working in one big pile of code, making 5,500 changes a day. And I did the math that, uh, 75 million test cases a day turns out to be about 868 tests a second. Um, and I know Rachel Potvin from the build tools team recently gave another talk that makes these numbers look like child's play. And so that's the foundation that we laid. And, you know, what does that equate to tangibly? Once people had the power and knew what to do with it, doing the right thing with regard to automated testing just became what you did. It wasn't even a question anymore. The only question was how to do it. And so, you know, there was no more of that fear of making a change, that things are gonna break.

00:21:08

People could stay in that state of flow and focus on the future and exciting new features. And it brought the joy back to programming. And so I'm proud to say that after five years of grassroots teamwork, we'd done the impossible, and that made us mighty. And I love using Caravaggio's David and Goliath here, because it's actually a self-portrait of Caravaggio as Goliath. Because the point here is we didn't have any external forces working against us, and the technical side of the problem, we eventually solved it. It was not easy, it was a challenge, but it was solvable. But what we had to figure out was how to give people the kind of power and knowledge they needed to change their perceptions, and have the kind of experiences they needed to have to be persuaded of the value of testing. Because oftentimes the biggest obstacle to the change we wanna see in the world is how we as individuals, teams, or organizations already see it.

00:22:07

And on that note, let's come back to the government. So I'm gonna mention just a little bit of background about 18F and blow through it very quickly. It was founded in the wake of the healthcare.gov recovery as part of the General Services Administration. And the goal is to try to reform the way government builds and buys software. And so we do everything, we do everything open source; we're very deeply steeped in agile methodologies. But the goal is not to build all the things and replace the vendors. The goal is just to establish kind of a beachhead within government that proves that this model can work here, and to give, uh, procurement officers and vendors alike a new model, a new framework in which contracts can be written and work can be done. And we believe that by following this model, there will just be millions of dollars, tens of millions, project after project, that just magically isn't getting spent anymore.

00:23:01

And as for the name, um, yeah, they went through a few dozen ideas and they were all trademarked. So they looked out the window and said, well, what street are we on? So we're at 18th and F Streets, uh, Northwest DC, in case you were curious. And so, some of the things we've worked on: we've worked with, uh, the United States Digital Service team from the White House, uh, to work with the United States Citizenship and Immigration Services, uh, to not just, you know, reform the software architecture and delivery process, but also improve the user experience for prospective citizens. Uh, we've worked with the Department of the Interior to deliver Every Kid in a Park, and it's notable because we actually did user research with nine-year-olds. So if you're wondering why there are no social media buttons all over this site, it's because nine-year-olds have no use for them, because they can't get accounts.

00:23:52

Who knew? Um, we recently worked with the Department of Education to make all of their data as accessible and useful as possible so that prospective students and their families can make informed decisions about their, uh, college choices. And then we recently launched this, uh, web design standards project, another joint effort between us and the United States Digital Service team. And the point is to provide a better user experience so that government websites are no longer special snowflakes. Like, you can kind of see all these different buttons from actual government websites. Um, so this design standards work tries to provide design elements and a style guide for a common look and feel. And then we also have our consulting operation that, take all the time you need, oh, okay, <laugh> thank you, Jean <laugh>. Um, so, uh, the consulting wing, they work directly with our partner agencies to give them kind of a taste of agile.

00:24:52

Uh, they'll go in, they'll do discovery sprints for an agency, they'll provide proposals, recommendations, even prototypes, um, and then they'll actually run agile workshops and lead them through problems that they're actually trying to work on. So everything we're doing is off to a fantastic start, but how do we make sure we keep it up, right? Like, how do we make sure we don't just become, oh, you know, that was another great experiment that didn't pan out? Uh, well, first let's go back to the organizational forces that exist within government, and you'll see some parallels in the large here. So typically in government, there is a premium that's placed on compliance with the existing rules rather than a focus on the quality of products and services.

00:25:38

And part of that comes from the fact that in the government, we don't have the same incentives, like, uh, you know, stock shares or, you know, micro kitchens or ski trips or, um, uh, never mind, sorry, uh, <laugh>, uh, um, actually I really missed the espresso machines. But there was also this thing passed in 1883 called the Pendleton Civil Service Reform Act that kind of provides the structure for a lot of this job security. Um, and it's created an environment where I think Jamie Zawinski might characterize it as, uh, where people want to come work for a successful company rather than work to make a company successful. But at the same time, there's this awareness that, you know, while people want to avoid risk and avoid accountability, you know, the really talented people, they're kind of crazy risk takers, so they're not really gonna come here.

00:26:30

But then, what does that make us? So there's a little bit of an inferiority complex. And of course, the waterfall model is still by far the dominant model in the psyche of the government, particularly when it comes to automated testing. Like, that's way at the end of the pipeline, right? Like, you know, why should I have to test, that's somebody else's job? And then of course, there's outdated tools. Of course, there's outdated procedures. And the amazing thing is, sometimes the government can't even get to the code and the data that actually provide its products and services; they can't physically get to the information and the tools they need to do their job. And so, you know, all these barriers were erected because, you know, there's some fear of something going wrong and being held accountable, which, in a lot of people's minds, means they're gonna get fired or dragged before Congress or something like that.

00:27:24

And so again, this fear leads to missed opportunities, pettiness, bureaucracy, all these things that we typically associate with government. And what also happens is, um, nobody on any side of the situation, the policy makers, the administrators, the developers, like, nobody has access to the full information they need to actually meet their objectives. And, you know, this goes beyond ignorance. It's a full communication breakdown, right? Because these groups are isolated; they don't have a common goal, they don't have a common language, they don't have a common understanding of what the objectives and needs are.

00:28:02

So if we're gonna overcome, we at 18F now, not we at Google, we at 18F, if we're gonna overcome these kinds of challenges, we need to build an organization at least as robust as Google's. And, uh, you know, not just to withstand the pressures, but to also then, you know, communicate effectively that yes, this new model can work within the government. And so we've got a ways to go. And let me do a little A/B with you here. So, my first day at Google in 2005, you know, we had all the things already. Like, any question you needed answered was pretty much at your fingertips. I described it as jumping into the fire to drink from the hose. But my first day at 18F, however, all these questions, I had to dig for the answers and I had to run around asking people for them.

00:28:50

And that was not just a drag on my time, but on theirs, 'cause certainly they're answering the same things over and over. And that's when it dawned on me that, uh, when I walked into Google, there were not just, you know, values around transparency and autonomy and collaboration, but there were the tools to support them. And everything that we did in the Testing Grouplet, it was already there. We took it for granted. So I figured we need to have that at 18F; we need to start developing not just these values, but some tooling around them. So I started trying to steal the ideas that I brought with me from Google. I created this thing that we call the Hub. It's an amalgam of several systems that I was familiar with at Google, kind of a prototype that has, you know, lived on perhaps past its usefulness, but it's made the point that scaling our documentation is crucial to scaling as an organization and sharing information.

00:29:40

And one of the things that it tries to do is expose a graph of connections between all these different, uh, you know, these individuals working on projects with other people, in certain locations, with certain types of tools. And we've extracted that kind of graph engine into something that we call the team API, so that instead of it just being a feature of the Hub, it's exposed as an independent set of JSON endpoints. And we've started experimenting with how to keep that current and automate kind of getting that data. And one of the things we've been doing is adding these little files to our repositories, just little metadata files called .about.yml, that talk about, oh, well, this is the project, these are the people working on it, this is the impact, these are the technologies. And we've already started building up a pipeline to harvest that information automatically, directly out of the code repository, you know, munge it through the API, and then publish it who knows where.
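To make that harvesting step concrete, here is a rough sketch (hypothetical directory layout and hypothetical fields, not 18F's actual pipeline and not the real .about.yml schema) of a small script that walks a folder of checked-out repositories, reads each metadata file, and emits one combined JSON document that a team API endpoint could serve.

    # harvest_about.py -- an illustrative sketch of pulling per-repository
    # metadata files into one JSON document. The "repos" directory layout and
    # the fields inside each file are hypothetical, not the real .about.yml schema.
    import json
    from pathlib import Path

    import yaml  # PyYAML


    def harvest(repos_root):
        """Collect the metadata file from every repository checked out under repos_root."""
        projects = []
        for about_file in sorted(Path(repos_root).glob("*/.about.yml")):
            with about_file.open() as f:
                data = yaml.safe_load(f)
            if not isinstance(data, dict):
                data = {}
            # Record which repository the metadata came from alongside its contents.
            projects.append({"repo": about_file.parent.name, **data})
        return projects


    if __name__ == "__main__":
        # Print the combined result as JSON; a team API endpoint could serve this directly.
        print(json.dumps(harvest("repos"), indent=2))

Run against a directory of clones, this prints one JSON array of projects; serving that output as a set of JSON endpoints is the part the team API plays.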

00:30:33

And so the point of this is I'm trying to create a space so that these instigators can more easily discover one another and create their own grouplets, or, in the parlance of 18F, working groups and guilds. They're basically the same thing; guilds are a little more official. But, you know, the idea is that same model that we had with the Testing Grouplet can apply to our own team. And I run three: the first one is the documentation working group, 'cause that's the foundation for all of this; the second being testing; and the third being the working group working group, because we want to help working groups. Um, but to also make it easier for these groups to share their knowledge and information, we've basically copied GitHub Pages, but on our own infrastructure, and started publishing a series of documents we call Guides, which are kind of an exposition of this is how we do our work, and they're very much works in progress.

00:31:29

Um, but we've already gotten not just great discussion and feedback and activity within the team, but from other agencies and the general public through our GitHub issues and pull requests. And then hopefully we'll get an 18F EDU off the ground one day that can emulate Google's EngEDU and become kind of the permanent custodian of all this training material. So just quickly, you can see how a lot of these pieces fit into place. And it's not just us doing this work, and it's not just me doing this work. There's so much work that I'm just scratching the surface with my own story here. Um, but our network across what we call the digital coalition, with the Digital Service team, agencies like the Consumer Financial Protection Bureau, we're all doing great work, building a community, trying to find tools to expose this, because the insights and the methods and the products that are generated by this combination of transparency, autonomy, and collaboration, that's what empowers a team to create products and services that not only satisfy the needs of customers, or society at large, but actually exceed their expectations.

00:32:33

So can it work? I believe it can, because we've got the right people in the right place at the right time, doing the right things for the right reasons. And I actually had a lot more Beatles slides in here, but, you know, time, um, <laugh>. But will it succeed? Well, that's the big question, and I think it can if we really want it to. And I'm gonna share a quote from a colleague from the Digital Service, Charles Worthington: if you think there is a problem with how government does tech, and that you could help government do it better, then the question is, what are you doing to improve it? It's not gonna get fixed on its own, and the people in charge have never been more open to new ideas. If we don't try, who else are we expecting to? So the ask here, I have to be careful: as a government employee, I cannot directly solicit unpaid labor from the private sector <laugh>. However, there are things you can do to help validate the things we're doing, to help inform the government what we're up to, and to inspire change by shining a light on the work we're doing. And, you know, it's the usual thing: it's like blogging, tweeting, writing articles, giving us feedback. I mean, if you want to jump into a GitHub repo of ours, we're not gonna stop you. Um, but by doing this, by helping amplify the voice of our small team, you empower us,

00:34:02

You empower us to build that bridge, to help make government of the people and by the people work better for the people.