
Episode 36 | Reimagining Cyber

A Discussion with the Software Angel of Death, John Keane | John Keane

June 27, 2022 | 25 minutes

Episode Description

John Keane, the Software Angel of Death, discusses securing the supply chain, the importance of contract language, and his unique perspective on the cyber space on the latest episode of Reimagining Cyber.

Show Notes | Links

John Keane

About the Guest

John Keane has nearly five decades of experience in the software industry. Keane has been working on software assurance-oriented contracting language that could shape the standardized contract language required by President Biden’s executive order. In 2016, he worked with developers to craft language defining what could be accomplished simply by complying with the rules. One of his rules is to fully comply with the OWASP Top 10 and either the MITRE, NIST, or SANS Top 25.

Connect with John Keane on LinkedIn

Visit the blog link for this episode

Episode Transcript

Episode 36 | Reimagining Cyber
A Discussion with the Software Angel of Death, John Keane | John Keane


John Keane  00:03 

I was asked to go through and produce evidence of why software quality is not different from software security. And I have done extensive work on that over the years. I went through and scanned that database and just punched in keywords like “availability.” And sure enough, up came, at that point in time, 67 security weaknesses that have an availability characteristic associated with them. 


Rob Aragao  00:42 

Welcome to the Reimagining Cyber podcast, where we share short, to-the-point perspectives on the cyber landscape. It's all about engaging in casual conversations on what organizations are doing to reimagine their cyber programs while ensuring their business objectives are top priority. With my co-host, Stan Wisseman, Head of Security Strategist, I’m Robert Aragao, Chief Security Strategist, and this is Reimagining Cyber. So, Stan, who do we have joining us for this episode? 


Stan Wisseman  01:09 

Rob, our guest today is John Keane. John has more than 40 years of experience both as an active-duty officer and as a civil servant for the U.S. government. One of John's areas of focus has been to be an advocate for and implementer of best practices associated with the full spectrum of software development: code quality, architectural correctness, DevSecOps, as well as security weaknesses and vulnerabilities. John has worked closely with the NSA, NIST, DHS, and many others to advocate for and advance the practice of software assurance within the U.S. federal government. He is more widely known in the software assurance community as the Software Angel of Death. John, it is great to have you with us today. Anything else you'd like to add about your extensive background? For our listeners, perhaps an explanation of how you earned that designation of Software Angel of Death? 


John Keane  02:06 

Great, Stan. Thank you very much. Yeah, my involvement in the business actually goes back, I believe, to when I was in graduate school in 1972. I am that old. My computer architecture instructor had worked on the development of the first British mainframe in Manchester, England in the late 1940s. And when I told this to someone from Britain who was in a meeting with me, he looked at me and said, “Oh my God, you mean he worked with Alan Turing.” So, using that analogy, I am one step removed from Alan Turing. So, I go back that far. When I came back into the government, into the civil service in 2009, I was hired by a good personal friend of mine, the late Dr. Greg Guernsey, to implement a practice called software code quality checking (SCQC), which, as people I've worked with have recently told me on LinkedIn, is the way that they are implementing DevSecOps. Because we were taking a look at code quality, as well as code security, as well as architectural soundness. So, it's a practice that I discovered has been around for quite some time, just under different names, never really formalized until recently. When I took over this job, I was required to go through and take a look at a series of scans that were performed on the software. My team was essentially an IV&V (independent verification and validation) team. At one point in time, I received an ATO package, ATO meaning Authorization To Operate. I took a look at this; there was a request for approval for this system to be implemented, because it was secure. But at the same time, I received a package of SCQC scans from my team. And the scans were absolutely horrible. Nothing I had ever received prior to that was as bad as this particular application. 
So, I wrote a note to the government project manager and said, “I think you should return the software to the vendor and tell the vendor to do it over, and to do it right for a change.” About two days later, I walked into a big conference, a big display where vendors were showing all of their wares for the Military Health System. The program executive officer was talking to the contractor whose software I had written the offensive report on, and as I walked into the room, in a loud voice so everyone could hear, he said, “There he is, the Software Angel of Death.” 



John Keane  04:51 

Dr. Greg Guernsey liked it so much that they never called me by my name again. I was always referred to as the Software Angel of Death. 


Rob Aragao  05:02 

So, John, let's talk about these past 12 to 18 months or so. One of the hot topics obviously has been centered around securing the software supply chain, right? And a lot of it's due primarily to the open-source components and the vulnerabilities that have been found out there, and the impact that we've seen. So, thinking about your approach, you've discussed this with us in the past, and we've seen a lot of it publicly: you have a strong emphasis on enforceable contract language around software assurance, and you've been doing it for a long time. I'd love to hear your perspective, though, on the importance of properly vetting the security of the software organizations are using today, in some cases without having done so. But more importantly, why should they be doing that ahead of time, before actually procuring the software and putting it into production? 


John Keane  05:51 

Well, I was always surprised by what has been written about enforcing things like software assurance and software security, but which for some reason goes largely ignored. I was at a breakfast recently with a fairly high-level executive from the Defense Department. And the two of us agreed very quickly that everything needed to implement the President's executive order within the Department of Defense has already been written and is already in policy. There is, in fact, a policy memorandum, recently reinforced by the DoD CIO, that says, “If you're using open-source software, you must do a full software assurance assessment of the code before you put it into use.” I can tell you that almost no one follows that, or is even aware of it. But the reason why I was so interested in this is that when OWASP (I hope everyone understands what OWASP stands for) came up with the tool Dependency-Check, they asked the National Security Agency (NSA) to vet the tool for its usefulness. Well, NSA unfortunately could not, because it required access to networks that NSA was prohibited from connecting to. So, a friend of mine, who was the head of the NSA Center for Assured Software, asked me to assess the tool with my team, which we did. We validated that the OWASP Dependency-Check did in fact go through and find the vulnerabilities, which were related to the weaknesses we had been discovering in the code. So, we did the full software assurance scan of our libraries and found that not only did the tool work and tell us about vulnerabilities, but Fortify went through and identified multiple other security weaknesses which had yet to be found as vulnerabilities. There were multiple technical weaknesses associated with the code. And there was a lot of architectural unsoundness. 
So, when I told a friend of mine recently that I found the open-source code to be architecturally unsound, technically weak, weak from a security perspective, and filled with security vulnerabilities, that's all I had to say. And his comment to me was that I was too polite. So, what I did on a rather critical application that is still in use today, enabling interoperable sharing of information between the Department of Defense and the Veterans Administration: I engaged my team to train the developer of the software on how to use Dependency-Check, and how to use all of the other tools to make their code right. When I went back and took a look at some of their recent scans, I saw that using Dependency-Check, they were able to reduce the number of vulnerabilities from 600 to 16 in a matter of two months, which included refactoring the code. So, what it proved was that if you use the tool correctly, as a trained practitioner, you can get rid of a tremendous number of flaws in your software. And so, I've always been interested in it from the perspective of: can it be done? And once it can be done, can it be written into contracting language? In fact, in 2016, a federal oversight group adopted and modified some contracting language that I had been working on with several other gentlemen and actually went public with it. And it was endorsed by two major corporations here in America. 
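The triage workflow John describes, running Dependency-Check, then working the findings down, can be sketched in a few lines. The field names below ("dependencies", "vulnerabilities", "severity") follow the layout of a typical Dependency-Check JSON report, but the report excerpt itself is invented for illustration:

```python
import json
from collections import Counter

# A mocked excerpt of an OWASP Dependency-Check JSON report. The CVE IDs
# are real, but the dependency data here is invented for illustration.
report_json = """
{
  "dependencies": [
    {"fileName": "log4j-core-2.14.1.jar",
     "vulnerabilities": [{"name": "CVE-2021-44228", "severity": "CRITICAL"}]},
    {"fileName": "commons-text-1.9.jar",
     "vulnerabilities": [{"name": "CVE-2022-42889", "severity": "HIGH"}]},
    {"fileName": "guava-31.1-jre.jar"}
  ]
}
"""

report = json.loads(report_json)

# Count findings per severity; dependencies with no findings are skipped.
by_severity = Counter(
    vuln["severity"]
    for dep in report["dependencies"]
    for vuln in dep.get("vulnerabilities", [])
)
print(dict(by_severity))  # {'CRITICAL': 1, 'HIGH': 1}
```

A count like this, re-run after each round of library upgrades and refactoring, is one simple way to track the kind of 600-to-16 reduction John mentions.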


Stan Wisseman  09:46 

Let's pull on that thread a little bit more, John, because as you know, if development doesn't have a requirement, they're not going to focus on it. When you're defining security needs, whether you're purchasing software or using open source, because again, you're talking about the need to ensure that the open source you're consuming and your development process are secure, it's critical to understand what those security requirements are, and to help ensure that you're putting the accountability in the right place. You know, is it really the consumer of that software that's responsible for that? Or is it the one producing that software? So, you know, I've also been a big advocate for proper use of contract language when you're looking to ensure that the software you're acquiring does not add additional risk to your organization. I was involved with the DHS acquisition security working group under Joe Jarzombek and the DHS Build Security In project back in the late 2000s, and we produced some example contract language in that effort. But what are some of the things that you're seeing as far as updates? Because we've been trying to do this for a long time. What are some of the new developments that can help ensure that the acquirers of software build in the right language, possibly leveraging templates or things that are already out there, as opposed to trying to come up with it on their own out of whole cloth? 


John Keane  11:13 

Well, one of the requirements of the President's executive order was to come up with contracting language. And we have a lobbyist. One of the things I mentioned to him when I was brought on board as a consultant was this contracting language, which had been developed in 2016. I sent it over to him, and he has forwarded it to certain high-level people within the federal government and also within the congressional staffs. And all of the language was based on what my team, who were software developers, not cybersecurity professionals, said could be accomplished just by complying with the rules. So, we wrote the rules. For example, in 2016, one of the rules said that you will fully comply with the OWASP Top 10, and either the MITRE, the NIST, or the SANS Top 25. And for anything with which you do not comply, you have to fully justify it. Now, we only said that because we were able to teach the people who were making those mistakes how little time it actually takes to write code correctly. SQL injection is a perfect example, given how many of them I've seen over the years, and yet how many times we've had to go back to a developer and say, in 15 minutes you could have done ABC and not have made the mistake. So, it's not just the contracting language; it's also taking time out of your scanning activities and everything else you're doing to teach people. The contracting language will probably be written for the federal government, and possibly my language will be some of the input to that. And one of the big complaints that we have is that our software developers are not trained. 
So, one of the things I've mentioned to you: when we've seen multiple errors of a singular type, I've had my team train people. It takes a little bit of time, a little bit of training; I've never asked anyone to do the impossible. And to show you how simple some of this is, my software IV&V team actually trained me how to use some of the tools. 
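The 15-minute SQL injection fix John alludes to is usually just switching from string concatenation to a parameterized query. A minimal sketch in Python with sqlite3; the table, data, and attack string are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # attacker-controlled string

# Vulnerable pattern: concatenation lets the input rewrite the query,
# so the WHERE clause becomes always-true and matches every row.
vulnerable = "SELECT role FROM users WHERE name = '" + user_input + "'"
unsafe_rows = conn.execute(vulnerable).fetchall()
print(unsafe_rows)  # [('admin',)] — the injection bypassed the name check

# The fix: a parameterized query treats the input as data, never as SQL.
safe_rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
print(safe_rows)  # [] — no user has that literal name
```

The same placeholder pattern exists in essentially every database driver, which is why this class of weakness (CWE-89) is considered cheap to fix once a developer has seen it done.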


Rob Aragao  13:44 

Very nice, very nice, good approach. You know, let me pivot the conversation a little bit to an area around terminology. Terminology, unfortunately, equates to confusion in the cyber space as a whole. You know, security is confusing enough, but let's talk about it from the area you have emphasized and been an advocate for for a long time: software quality and software security as a whole. If you just take the term software security in general, right, it gets completely misunderstood. I'd love to hear your perspective on how we can actually do things in a manner that makes it so much simpler, so much more consistent for people to understand, so there's not all this time lost having to educate people on different terms and confusing them in the first place. So, I know you've been an advocate in this space. I'd love to hear more about your approach. 


John Keane  14:36 

Okay, well, a little bit of background. During my less than illustrious career, I was a part-time instructor. In fact, I was invited back to the school that I graduated from to teach in the math department. And for those of you who are familiar with this, I did not go to a real college. My colleagues in the Navy, Marine Corps, and Air Force contend I went to a trade school that you may have heard of, called West Point. Okay. By the way, that's a mutual insult we have for each other. But I also taught a variety of courses here in the Washington area. And I taught grammar and writing. So, I get very upset when I see poorly worded definitions. So, I don't know how many people are familiar with the new ISO standard, ISO 5055, which was published recently, and which was partially the subject of my presentation yesterday. And so, I was asked by the people who hosted that to refer to that standard. 


Stan Wisseman  15:47 

And John, the presentation was to whom? 


John Keane  15:50 

That was to CISQ. The presentation to CISQ yesterday, yesterday being June 7. I was asked to go through and produce evidence of why software quality is not different from software security. And I have done extensive work on that over the years. I went through and scanned that database and just punched in keywords like availability. And sure enough, up came, at that point in time, 67 security weaknesses that have an availability characteristic associated with them. And the reason for doing that was to produce evidence that if people focused on writing code correctly, to achieve a correct technical outcome, there was a high likelihood that they would also achieve a positive security outcome. And so, I was doing my research and taking a look at these terminologies, because ISO 5055, in fact, becomes the standard that people are supposed to apply in order to meet software quality and software security. Software quality is defined within the standard. It is derived from an ISO glossary, and it also meets John's standards of being plain English and understandable. However, we found out that the standard, which goes through and discusses software security, never defined software security; it defined security, but not software security. And the definition that they have for security is, well, somewhat acceptable, but rather lengthy. So, when I went out to the web to find definitions of software security, I found multiple definitions out there, and most of them are very poorly worded. So, I will be engaging with CISQ on another effort in which new and better standard definitions will be my contribution to the development of a new standard. But I really don't like definitions that start out by saying software security is a specific concept. 
Because when you start out defining something as a concept, I have sat around the table watching people shout at each other over what is the meaning of a concept, and you never get a resolution. So, we need a definition, as I pointed out yesterday, of software security. For those of you who are familiar with the CWEs, the Common Weakness Enumerations: they are engaged in really cleaning up and modernizing them. They have suffered from a lack of interest for the past couple of years. I spoke to the leader of that project for DHS, and they are making remarkable progress. 


Stan Wisseman  19:08 

And John, let me just interrupt for our listeners. There is a distinction between vulnerabilities in software and weaknesses, right? I think that'd be worthwhile to cover, because again, that could be a point of confusion. The CWEs, to your point, are leveraged by a lot of folks: tool vendors, practitioners. But again, when we're speaking to developers, sometimes it is confusing when we're throwing out these terms of vulnerability versus weakness. Perhaps you could help our listeners better understand that distinction. 


John Keane  19:42 

Well, once again, ISO has what I consider to be a reasonably good, human-understandable definition. In fact, this was highlighted at a recent congressional hearing by one of the participants testifying to Congress; she specifically called out the language NIST uses as being, quote, “too scientific, and not understandable.” But a vulnerability, according to ISO, and I'm looking at the slide right now, is a weakness of an asset or group of assets that can be exploited by one or more threats. So that was pretty good. Now, what I didn't like about the website is that they said a threat is something that can exploit a vulnerability. Hence, we have the notorious circular definition: A is defined by B, B is defined by A, and we're left trying to untangle those. But as for weakness, and I had a discussion with the folks working on this at MITRE, there is no formal approved definition of weakness. And it shocks everyone. They all do their research while they're talking to me on the phone and come back and say, “You're right.” Now, within the ISO standard, they have a definition of a weakness, which is allegedly derived from MITRE's work: a specific structure of program elements in the software source code, sometimes referred to as a software anti-pattern. Okay, so John stops right there: what is an anti-pattern, and why did you include that word in the definition? And it says that an anti-pattern is inconsistent with good architectural or coding practices, violates a software quality rule, and can lead to operational and cost problems. Everyone I spoke to at MITRE denied responsibility for that definition. However, within the CWE website, which they did agree to, weaknesses are flaws, faults, bugs, or other errors in software or hardware implementation, code, design, or architecture that, if left unaddressed, could result in systems, networks, or hardware being vulnerable to attack. 
Now listen to those words. Those are nice, clear words. Now, someone might say, define flaw, faults, and bugs. But by and large, as my friends at MITRE put it, the definition says, “Please don't do anything stupid, or you'll get into trouble.” That's what the definition really says. So, what we need is to dig up a good, human-understandable definition of what a weakness is. And we have to add that to our vocabulary. The purpose is, you know, how can you possibly ask someone in a non-IT business to implement the President's executive order on cybersecurity if they don't understand what you're asking them to do? And in one study that I took a look at, in testimony in front of Congress, a businessman said, “I don't do IT. I provide physical supplies to the Department of Defense. And when I read what you were trying to have me do, there was a questionnaire I filled out; I could only address five of the 37 questions. I had no idea what the other 32 were talking about.” So, if you're asking people to support an idea of better cybersecurity, and they're not IT practitioners, they're not IT professionals, we have to do something about cleaning up our terminology, and our acronyms as well. And once you deconflict that and get the definitions right, people are impressed by how simple the concepts really are. Stan, I did an analysis, I haven't put it in briefing slides yet, of the definition of vulnerability management. I went back to my old days, when I ran the federal practice for IT Service Management at a company I used to work for. And if you change the word from vulnerability to problem, the definitions of problem management and vulnerability management are almost identical.  
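The weakness-versus-vulnerability distinction John draws can be made concrete in a few lines of code. A hypothetical sketch, not from the episode, using hard-coded credentials (CWE-798) as the weakness; the key value and function are invented for illustration:

```python
# Weakness (CWE-798, use of hard-coded credentials): a flaw in the code
# itself, present whether or not anyone can currently reach it.
API_KEY = "s3cr3t"  # hypothetical value, for illustration only


def authenticate(presented_key: str) -> bool:
    """Return True if the caller presents the embedded key.

    The weakness becomes a vulnerability — an exploitable instance, the
    kind of thing that earns a CVE — once this code ships somewhere an
    attacker can read the source, the binary, or the repo history.
    """
    return presented_key == API_KEY


# Anyone who recovers the constant gets in:
print(authenticate("s3cr3t"))  # True
print(authenticate("wrong"))   # False
```

In CWE/CVE terms: the CWE names the class of mistake in the code; a CVE names a specific, exploitable occurrence of it in a shipped product.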



John Keane  24:01 

So, after all, isn't a vulnerability nothing more than a special type of problem? That's the standard. That's the reason why people in the government don't like me. 


Rob Aragao  24:19 

John, listen. First off, thank you for taking us back to when you began and, more importantly, for the 50 years of service you've given our industry, and the push that you've given it as well. We also appreciate that the passion and energy are still there today, if not even more so at times, as you're speaking. Your point about the need to understand who you're explaining things to, and to keep it very simple, clear, and concise, is obviously the core of your message, and has been for many, many years. So, we appreciate you coming on, sharing your journey and your emphasis on the things you really care about, and we hope that you continue to make more changes in this regard. Thanks for your time today, John. 


John Keane  25:05 

Great. And please eliminate anything in the audio in which I refer to my wife as being an Italian from Brooklyn, hence the ability to make an offer you can't refuse. She gets very upset when I say things like that. 


Rob Aragao  25:20 

She knows a guy is what I'm hearing.  



Stan Wisseman  25:25 

Thank you, John. Appreciate it. 


John Keane  25:26 

Hey, Stan, thank you very much. 


Rob Aragao  25:28 

Thank you.  

