Good? Yes, of course, here we go. This is going to be good stuff, I know. I know, Jordan. Anyway, I had to cut this talk from about 50-plus minutes down to about 30, so it's going to be a little discombobulated as I try to get to the meat of it. The project is called Security Outliers, and the subtitle is Cultural Cues from High-Risk Professions. The idea was: how do we take a look at other professions that are regulated, licensed, more advanced than us, if you will, in their formality and experience, and see how they incorporate the appropriate aspects of risk management and culture into their training and education, as well as their operations, audit, et cetera? Things like flight, medicine, surgery, special operations — people who have inherently risky jobs, where the tactical is really implicit in strategic outcomes. That's the idea. My collaborator, you might know him from Twitter as Dr. InfoSec: Christophe Veltsos, PhD, CISSP, and a string of other certifications, with a couple more I'm sure he's working on. I'm not a doctor. So what we've done is put together a layer eight approach. We're not dropping 0-days on you; it's not about leet hackers; it's not about anything like that. It's a layer eight type of talk, and we've both done this in our spare time. I need your help to make it more appropriate to the types of teams you're involved with.

InfoSec has become kind of like medicine, or the military, or surgery: hyper-specialized little micro-niches where everybody's really, really good at one thing. Ten years ago, if you knew a firewall, an IDS, and some antivirus, you were a security guru, right? And now it's, I'm an IDS guy. Really? What kind? Or, I'm an endpoint security guy. Oh, you do full disk encryption? What is it that you do? Are you post-intrusion detection? Do you do VDI security? What kind of endpoint security? This has become so frustrating. It's not a technical talk. It is about how we integrate all of that, and how we attack layer eight: the politics, the business, the communication layer that puts it all together. It's based on my experience working with CSOs and tech people in big companies and big universities, and the politics that constrain the big budgets and the technical know-how that's resident there. But this is a layer eight talk, not a technical talk. So I want your help figuring out where some of these analogies or approaches are appropriate in information security, if at all.

So what's the deal with Security Outliers? Let me get this thing up. Security Outliers came to me in a flash as I was watching, I think, C-SPAN Book TV, because I'm not a nerd at all. There's this dude with a really huge afro — and this is my old afro, some of you have seen pictures, it exists — and this guy is Malcolm Gladwell. He was on Book TV talking about his book Outliers, and he talked about a particular part of the book about aviation disasters and aviation safety. Not security per se but, as we heard from David, aviation safety: how do we get planes off the ground and then back on the ground in the same fashion as they took off, same number of live people, functioning aircraft? That's a huge area of research in psychology, crew resource management, a lot of interesting layer eight type of stuff.
Even when all the airplanes and systems are working just fine, there are still problems, and people die in tragic ways. That fascinated me, and it reminded me of a lot of my CSO customers that I consult to and help figure out how to get things done on the network. The idea, again, is to take lessons learned, literally with dead people on the ground and charred remains, and translate those lessons from very high-risk professions into information security. I think you'll see some parallels here. The idea is to create a leadership product where we take people who are technical and are going to move up the ranks and become chief security officers or directors of network security, whatever the title is, and help them understand that political layer and our communications issues. And then also work with people who have a background in, say, law or management, the JD and MBA types who are now becoming all of our bosses, the executives. Information security has changed. As Nick said, money changed the game. There's a lot of money in information security, and maybe even more so in compliance, and politics and communications are a big piece of that. Technology is, in some sense, almost secondary. So in my copious spare time, when I'm not twittering, I'm looking at this. The future state is going to be a wiki, a website, maybe some training, some cons, et cetera. It basically consists of interviews with leaders and practitioners in high-risk professions, as well as some academic research. Dr. Veltsos focused on the academic research, and I focused more on the interviews. I'm actually going to be doing some things with high-risk professionals: checking out their training, going to their bases, going to their trainings, talking to them, trying to experience some of this, and reporting back to you. Hilarity will ensue, I guarantee it.

So, the idea of the aviation disaster: typically an aviation disaster results from about seven consecutive errors. That means any one of those seven errors in and of itself would not have caused a disaster, but chain them together in just the right order and the right circumstances, and you have that perfect storm, and then really bad things happen. People have studied aviation disasters and figured out that in many cases, maybe in most cases, they involve errors of teamwork and communication. It's not that the airplane suddenly blew up, or a fuel line suddenly went bad, or an engine suddenly dropped out of the sky, because interestingly enough, pilots are well trained to deal with those. When those things do happen, they typically solve them with their training; they compensate in some technical fashion. It's a technical issue. The real disaster happens when there are teamwork and communication errors, when something doesn't go right between the wetware and the hardware. Security breaches: the folks from the Verizon Business blog, about a month after Veltsos and I presented at RSA in March 2010, came out and said breaches happen exactly like that, like aviation disasters — the result of a combination of minor or seemingly insignificant errors. Again, if you read one book this Christmas season, I'd recommend the paperback of Outliers; go directly to, I think, chapter 7, where he talks about aviation disasters, and look at this stuff.
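As an aside — this is not from the talk, just a toy illustration: if you assume seven independent error opportunities and an exaggerated, made-up failure chance for each, a quick simulation shows why complete disasters are rare while near misses, where most but not all of the holes line up, are far more common. That ratio is exactly why reporting near misses yields better data. A minimal sketch, with invented numbers:

    import random

    LAYERS = 7        # hypothetical number of chained error opportunities
    P_ERROR = 0.2     # deliberately exaggerated per-layer failure chance, for illustration
    FLIGHTS = 1_000_000

    disasters = near_misses = 0
    for _ in range(FLIGHTS):
        failures = sum(random.random() < P_ERROR for _ in range(LAYERS))
        if failures == LAYERS:
            disasters += 1        # every hole in the Swiss cheese lined up
        elif failures >= LAYERS - 2:
            near_misses += 1      # almost lined up: invisible unless someone reports it

    print("disasters per million flights:  ", disasters)
    print("near misses per million flights:", near_misses)

With these made-up numbers you get on the order of a dozen disasters against thousands of near misses per million flights; the ratio, not the absolute values, is the point.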
So one example we'll go through is Korean Air, the Guam crash among others. Korean Air had a major problem with flying into things they weren't supposed to be flying into and killing people, and that's the technical term. They were at 4.79 accidents per million departures, as opposed to 0.27 per million departures for, say, United Airlines, a major US carrier. So the average major US carrier back then was around 0.27 per million departures, and Korean Air was almost five: roughly 17 times the rate of a normal American airline. Why is that? They hired people to study them. Why? Because they were about to get decertified from flying over North American airspace, which, if you're running a for-profit business, is really bad for your bottom line. You can't come in over Canada or the United States because we don't trust you to operate really big machinery laden with fuel and 500 people over our airspace, and certainly not to land at our airports where you will run into other aircraft. Yes? Yeah? So they really had to understand: something fundamental was missing. What was going on? And they figured out that, in many cases, it was again teamwork and communication errors; it was actually a linguistic and cultural communication issue.

Specifically, some of the people they brought in from the aviation safety world looked at intra-cockpit communications. Cockpit resource management covers the captain, the first officer, and the navigator, and then the cockpit talking to ground control, for example. These are different types of communication. At least when we're talking within the crew, we speak the same language, we're from the same culture, we've done the same training program, we share some of the same SOPs. But now we're talking to ground control at JFK. Now we're talking to ground control in Bogota. Now we're talking to ground control in Tokyo. That's an extra layer of complexity and newness, of novelty, if you will. Every time you land at a different airport, there's a different intercultural gap, and you have to make sure you understand how they do things.

So the previous one we were talking about, Avianca Airlines: they were circling JFK and trying to talk to the ground crew there — and we have a couple of people here from New York. The folks from Colombia were trying to talk to the JFK ground crew and saying, hey, we kind of want to land, and they ended up being brushed aside because the JFK people didn't really feel the urgency. What ended up happening was that the plane literally ran out of fuel circling JFK and fell to the ground. Which is ridiculous, but it happened. And it was strictly a communication issue: well-trained Colombian pilots operating a fairly safe aircraft just plowed into the ground because they ran out of fuel circling a major airport, because of the difference in communication between the JFK ground crew and the cockpit. In the Korean Air case, it was more an issue of what's called power distance. What does that have to do with technology? Nothing at all. It's all about the wetware, right?

So psychologists and linguists figured out that there are several levels of mitigated language, and we all use them every day when we talk to our friends and spouses and our bosses and our subordinates and our families. The first one is a hint: hey, those clouds ahead, they look menacing. Then a preference: wouldn't it be good to do XYZ? And then a query.
How would you like to do this? Then, moving up the stack, a little more of a suggestion: let's do this. Then a crew obligation statement: we need to comply with PCI — I mean, we need to not plow into that mountain. And then a command: do it. Now, when pilots get trained: if you're the captain, you outrank me in the cockpit and I'm the first officer. I report to you, essentially, but I'm almost as well trained as you, if not as well trained, even though formally you're in command. If I see you do something that endangers everybody behind that cockpit door, if you're about to fly the plane into a mountain or do something really dangerous and I don't have time to explain the nature of the disaster you've steered us into, I will actually grab the controls and say, aircraft is mine, and your sole response will be, aircraft is yours. I'm not a pilot, but I talk to people who are, and that's basically it: at that point, we skip over all the mitigating language and just go, aircraft is mine, right? And they train on this kind of stuff, but in some cases, at Korean Air for example, it was not enough. What the linguists found, across all the cockpit crews they studied, was that captains would issue commands, while most first officers would stay all the way down at the level of hints.

So, formality of rank and hierarchy, which in information security is kind of like, ugh, that's gross, right? Who does that? We don't need that. That actually exists in a lot of other professions, most other professions, in a much more formal, strict, and hierarchical way, and in infosec we're a little more averse to that kind of requirement. But in some cases it's actually good to be averse to over-deference to rank, because of what happened at Korean Air. The people who studied it basically said there was a cultural issue. It wasn't the aircraft; they had great aircraft. It wasn't the ground crews turning the knobs and switches wrong, or failing to maintain things, torque the bolts, put in the right fuel, all that good stuff. It wasn't the ground crew, it wasn't the pilot training, it wasn't the aircraft themselves. It was strictly a cultural issue: an over-deference to rank inside the cockpit. So when someone eventually did something stupid that was going to steer them into a mountain, or the ocean, or whatever, the people whose responsibility it was to save everybody on that plane and say, airplane is mine, or to move up the stack a little quicker from hint to crew obligation statement, wouldn't do it, because they were afraid of not being deferential to rank. That was strictly a cultural issue, and they ended up fixing it. They said, hey, nothing personal, Captain, but your first officer, your navigator, and your crew are going to call you on it; that's their job, and you can't do anything about it. That's the big change that happened. You know where this is going, but doesn't this remind you of somewhere you've worked, where you said, screw this place, I can't work here, they don't listen to me, they don't get it, their management is all about rank and power games, not about getting things done? So we've got to be careful not to be too rigid or too soft, and find the balance for the particular situation we're in.
So, looking at how big disasters happen, we talked about the Swiss cheese effect. This comes from looking at all these accidents, human factors analysis, FAA studies, that kind of thing. It's a little bit old, but the general principle holds: aviation accidents cannot be attributed to a single cause or even a single individual. It used to be, in the '70s and '80s, people would say, let's go investigate this aviation accident and see if it was pilot error. People wanted to know, was it the pilot or not? In reality there's a lot more to it, because it's a huge system of interacting components, from the ground crew to the weather to everything going on inside the crew and between the crew and ground control. All these things come together, and they call it Swiss cheese. The Swiss cheese looks like this: the layers are organizational influences, unsafe supervision, preconditions for unsafe acts, and the actual unsafe acts — the active failure, as it's called, kind of a fun term, active failure. And when the holes in the Swiss cheese align just right — think of a hunk of Swiss cheese, and you shine a laser through it, and at just the right alignment it comes out the other side — that's when things go boom. All these things align just right, or just wrong, and then you have dead people and charred wreckage. Yay. Any questions?

Hey, did you talk about the language piece, how they made the Korean crews speak in English? What's that about? No, I didn't want to go into that level of detail. Yeah, it's pretty cool, because it relates to what you should talk about: how security people act when they communicate in a different language. Yeah, we'll touch a little bit on that in terms of the CSO layer. To me, the CSO is the layer between the people who do the technical work and are very good at their technical jobs, and the people who are the executives of the business, or in some cases other pieces of the business, like development or operations. They have to be the liaison, the champion, the translator from our technical talk to finance, to budget, to operations, to dev, et cetera, and get people to cooperate with us. So I think I'll incorporate Mark's suggestion; he's also read the book and has some specifics about how to train people to talk in a language that the other team, across that intercultural gap, will literally understand, using their terminology. Is that right? Yeah. So again, please read Outliers, and help contribute; I'll probably put up a website in a couple of weeks when I have some downtime and a clearer plan.

So, on to medicine. Medicine obviously has its own safety and quality issues, and this one is personal for me. I was in my first probationary period as an EMT at the Red Cross in the '90s. I showed up in my starched and ironed uniform and thought I was cool, and I got attached to a pretty experienced EMT. We got a call, and we ran, probably half a mile, to a 70-something-year-old woman who was having chest pains, pain radiating down the arm, classic heart attack stuff.
I was setting up the oxygen while the senior EMT interviewed her and helped calm her down. I was expecting the hiss when I pressurized the hose to get her oxygen while we waited for the rig, and I heard nothing. Failures like that are very similar to a lot of the problems we have: perfectly good technology, perfectly good people, and a small failure of accountability at the shift change — nobody had swapped in a fresh oxygen bottle. Now, she ended up being just fine, because the rig arrived literally right then, as I was doing the technical tap on the O2 pressure gauge because I thought I was hallucinating; this can't be happening. That would have been an easy Rolls-Royce for some lawyer, because even for volunteers, failing to deliver basic oxygen as an EMT is inexcusable.

So we talked a little bit about checklists with some of the medical people we spoke with, and obviously aviation has a major reliance on checklists and proper procedure. If you talk to pilots, fighter pilots, they'll tell you: if you ever get in an aircraft, especially a small craft, and you see the pilot just skipping stuff, not walking around the airplane, not checking everything against the list, get the hell out of the airplane. Just get out. Doesn't matter. Could be Claudia Schiffer taking you to the Caribbean; get out of the airplane. Whoever the male model equivalent is, a Claudia Schiffer substitute — sorry, I don't know. Chris Nickerson could be taking you to the Caribbean. Well, he's pretty good at that. He is, though. But seriously: medicine, aviation, and the military have specific, rigid education and training requirements, operationally important quality assurance metrics, and huge databases of what works and what doesn't. Where are we as professionals? We're starting to get better at understanding what works and what doesn't, getting a little better visibility into metrics.

The interesting thing about the medical folks is that they actually started learning from the aviation community. I was interviewing an ER doc responsible for training ER docs in the UCLA residency program for emergency medicine. He said they looked at aviation and asked: what have researchers and practitioners there learned about systems thinking, checklists, liability, personal versus systemic failure? A few decades ago, the question was again, was this pilot responsible for this accident, as opposed to, what was the bigger system, the Swiss cheese effect? So they're looking at this now and thinking: how can we get doctors — or maybe even information security folks — to report problems and near misses? It used to be, you're a pilot and you died in a crash; well, people knew you died in a crash and it was either your fault or not. Then people became more sophisticated and said, OK, systems thinking: how did all these seven mistakes line up in the Swiss cheese? But then they did something smart, because they wanted better metrics. They wanted to understand the near misses. What if four or five or six of the Swiss cheese holes were lining up and the seventh one just never did? We want to know what that looks like. How prevalent are they? What are those four or five links in the chain? So they changed the liability model and started sharing information without blaming pilots individually.
Because again, they were looking at the big system. So they got a much better understanding of the big picture of near misses in aviation, and made it a lot safer for everybody to fly. I spoke with a vascular surgeon who talked about implementing what's described in Atul Gawande's book, The Checklist Manifesto, which I also recommend you read. It's an amazing book about how to implement checklists properly: what kinds of checklists there are, where they fit, where they don't, where they come from. And it's all interesting, kind of like when I was an EMT at the Red Cross and they gave me an oxygen kit with an empty oxygen bottle — an inexcusable, ridiculous mistake. How does a surgeon with 20 years of experience, who went to med school, did his internship, his residency, and then 15 more years of practice, go into an operating room and operate on the wrong side? Or, in aviation, how does a crew take a perfectly good aircraft and take off the wrong way? What's up with that? So they came up with a universal safe surgery checklist and started implementing it.

One of the issues is: how do we get these specialty surgeons, who are the rock stars of their industry, kind of the top of the food chain in the medical community — very experienced, very highly trained, specialized people who do amazing things, miracle workers every day — how do we get them to say, hey, we want you to listen to this nurse over here? She's your scrub nurse. She's not a big-shot surgeon, she's not a rock star. The anesthesiologist over here, he's not a rock star. But you know what? You're a team, and you're responsible for the outcome of this surgery as the medical chief of that team. And if you don't listen to the nurse and the anesthesiologist, or some random janitor who came in and noticed something — hey, that hose is making a hissing sound, isn't that weird? — right? Good. These are some of the things they told me about: they actually caught major mistakes and oversights that would have killed or maimed a patient, which is not good for the hospital, not good for the doctor, and certainly not good for the patient and the family. He explained how they went around inside the hospital community and basically tried to sell the concept of the safe surgery checklist. It's so basic and silly: do we have blood of this type on hand if we nick an artery? Which leg are we amputating? Things that look stupid but have tragic, strategic consequences.

And so the point — and I don't have time to go through all the details today, but we'll put these interviews up verbatim on the website — is that you have to operate as a team, understand your role as a leader or as support, and figure out how to get the culture piece, the wetware interfacing with the technology and the procedures, properly allocated and properly communicated when you're doing any sort of high-risk work. In some cases, daily operations are high-risk work, because if you're afraid to go to someone and say, hey, I think I saw something, I'm not sure I quite understand it, is this weird? — or if you see a problem and there's a no-whining mentality. I am scared of people who say "no whining." To me, "no whining" means "I don't care." You come to me with a problem that you may not know how to solve — have you ever heard people say, don't come to me with a problem unless you have a solution?
That's like putting the gun to your own career and pulling the trigger, because what you've done is completely cut off communication from the people who see these kinds of oversights every day. Hey, your oxygen bottle is empty, is that weird? Is it supposed to be like that? Hey, it looks like you're about to cut into the right leg, but I think the cancer was in the left leg, is that weird? "Hey, don't come to me." Come on, people. A lot of the people who see problems aren't necessarily the people who can solve them. And what is "the solution" to a problem anyway these days? These are such complicated systems that you could take a steering committee of doctors with 20 years of experience, lock them in a room, and they'd come back with seven different ideas on how to solve the problem. So don't tell people not to whine; tell them to whine in a productive way. Tell them: come to me and spill it. I want to know what's going on.

Shock Trauma, Baltimore. Anybody here from Baltimore? Yeah, Baltimore. I love that place. It's the best place in the world to get shot. Fair, fair. So the University of Pennsylvania's Wharton business school, which competes for the number one, two, three spot among business schools in the country with Harvard, Stanford, and Chicago, depending on the day, went out to Shock Trauma, which is part of the University of Maryland system, and said, hey, these guys have an amazing save rate. You can get shot, stabbed, shot and stabbed, run over, dropped in by helicopter, and if you get wheeled in or flown in, they do amazing things. They know how to save people; that's their specialty. And what they asked was not, why are these guys' scalpels sharper than the ones across the street at Hopkins? No, that's not what they did. They looked at leadership and at how people interact as a team. They didn't study the scalpels, or whether the echocardiogram was better, or whether they get to the MRI faster. What about the volume? It happens a lot there, so they're the most experienced with it. Yes, there's absolutely an element of experience, but the way they approached it is the specific focus of this article; I recommend you read it, and we'll give you the slides if you want. There's just not enough time to get into all of it here.

So, Martin Fisher — you might know him as armorguy on Twitter. I spent about three hours on the phone with him and typed verbatim until my hands were like claws, and came up with the gist of the interview. He came out of the Army, and until recently he was a director of security incident response at a very large organization. These are some of the leadership perspectives that came out of talking with him. He says: leverage newness versus stability orientation to retain quality talent. If you're out there looking for people to staff your team, or if you're a specialist, know yourself and know the people you're working with, because not everybody is the same, and not everybody is the same as you. And that's good, because some people like stability and some people like innovation and newness. Figure that out about them and give them something to work on that complements that personality type. He definitely came from the Army: he wanted to create planned disruptive change to test readiness for incidents, which was their job as a team.
He also wanted to absolutely address the power distance issues I asked him about from the aviation safety research. He said his style doesn't allow the subordinates who work for him to directly challenge his leadership or his authority, but rather he allows them to be wrong: he gives them the authority and the latitude to take calculated risks. And he had some pretty concrete metrics. He said, when I was in the Army, it was: no more than a million dollars in financial damage, and nobody gets killed. But you can take risks that don't result in that. He had some other metrics too; we're going to skip over some of this. So this he specifically addressed: it's not that I'm not the boss and can't tell you what to do, but you absolutely should take calculated risks, and I want you to tell me when you think I'm wrong and what your opinion is, up until the point where I decide this is what we're going to do. Know your team: who's emotional, who's analytical. I'm going to go a bit quicker here, because I've got to finish this up.

Balance being directive versus collaborative. In certain situations there's a time to be collaborative: hey, what's going on? What have you got? What are you finding? Why don't you try this over here, and that over there, and then come back to me. At some point he'll basically say, OK, it's time to be directive. In other cases, you're doing strategic planning: let's talk about what's important, what new thing we're going to implement this year, what the priorities for the budget should be. That's more collaborative. So he switches between collaborative and directive, and the collaborative mode is: let's figure this out together, but I have the final decision. And facilitate people really talking, without fear of political repercussions. Your job as a leader is to let them come to you and "whine," quote unquote, and tell you what they see. It's their job to tell you what they see that's wrong or needs improvement, even if they don't know the solution. Because again, we're dealing with rapidly evolving, complex situations. Nobody knows the solution, and it's certainly not necessarily the person mired in the technical details who will know it, because a lot of the time it's systems thinking, big picture. We have to get together, put our brains together, and figure it out.

Concepts, drills, and scenarios: high-intensity training lets people make mistakes and learn without actual consequences. Again, this might not be revolutionary to a lot of you, but I don't see it happening in a lot of places, and I've worked with some pretty big enterprise security teams. Mentor people; give them opportunities to make mistakes and succeed in low-risk scenarios. In training, put pressure on them, but don't break them. Know where it's too much for your folks. Don't humiliate them, ever — you're going to create an insider threat that way. But put pressure on them to grow, and help them get the training and resources they need. Leadership is a two-way street. As a leader — and he came from the Army — he understood and explicitly ceded technical superiority to the people on his incident response team. Hey, I'm Martin, and I run this team, and I can tell you what to do when that's appropriate. But what do you think is going on here technically? What is your expert opinion on this particular packet capture, this particular image we created from this hard drive?
So he doesn't pretend to be technically better than the people who are technically better than him. Very big deal. Anybody have a boss like that? Give me their full name. Smart people respond to you being on their side. If you create pain and hostility when they come and give you feedback, when they come and say, hey, I have a problem, or hey, I need to run something by you — if you create pain and hostility when they do that, you're going to lose. You force them to shut up in order to avoid the pain; they change their tone to avoid the pain. Don't be the Zohan in that case. Don't be that guy. Sooner or later — and this is a quote from him, an incident response leader at a very big organization — sooner or later, the circumstances come together to go bad in a catastrophic way. What? That kind of sounds like Swiss cheese. So, interviewing all these people from radically different walks of life — aviation, security, and in 2011 we started talking to some people in special operations, some pretty interesting tier-one type people, not active anymore. Which special forces? Army Special Forces, yeah. So this is power distance and the Swiss cheese again. Every single leader I identified and talked to essentially paraphrased the power distance and Swiss cheese effect. It's as if they conspired to say the exact same thing when asked an open question about their biggest leadership lesson: this, right here. Sooner or later, circumstances come together to go bad in a catastrophic way if you don't treat your people right and you don't let them talk to you.

Ross Donaldson. He's in charge of the ER residency program at UCLA. Half the year he's teaching ER docs in one of the busiest hospitals in the world; the other half he goes to places like Sierra Leone, Iraq, and other fun places where there are a lot fewer technical resources, and a lot less liability as well. He says he really loves practicing medicine in those kinds of places, because he actually gets to practice medicine and not defend himself against theoretical liability, over-test, and do all the defensive medicine things. Interesting. He's the author of a very cool book called The Lassa Ward. As a third-year medical student he went to Sierra Leone, where there was an outbreak of Lassa fever, a really nasty acute hemorrhagic fever, Ebola-like, and he put himself into that situation voluntarily. He wrote a pretty interesting book about it. And he talked about systems versus personal liability. So how do we do that in infosec? I don't know; it seems like maybe Alex Hutton and the Verizon folks are starting to do this with the DBIR and the VERIS framework. How do we get pooled infosec metrics? The idea, for me, is that we want to create safer practices by allowing small mistakes: letting someone say, hey, I made a small mistake, as opposed to, hey, I made a small mistake, oh crap, I'm getting sued. How do we take that system from the aviation community and insert it into medicine, as they did, but also potentially into infosec? Because nobody here has ever been breached, right? Those other people have been breached, because we have firewalls, IDS, and elite hackers working for us. Other companies have been breached, but we're safe — except when SB 1386 and its brethren forced us to report PII issues. But nobody wants to talk about being breached.
That's starting to change a little bit, because everybody's getting owned. But I want to figure out how we create a system that allows people to talk about being breached without some of the downsides of saying, hey, I was breached. VERIS is an interesting approach to getting visibility into what kinds of breaches are going on without some of those problems. So I kind of like that program; I'm going to follow it and see what makes sense. This is essentially the airline model: you're encouraged to report near misses as well as the catastrophes that actually happen, because then the system tends to catch trends before they become catastrophes. Again, Swiss cheese and all that.

Studies of ER admissions. This is a little bit about elite hackers and our community's love affair with rock stars. Love the rock stars, no problem; they're all very smart people. But we have to be careful: nobody in this room, of course, but some people in the information security community have some version of, I don't know, adult ADD or shiny-blinky syndrome. Again, nobody in this room. Over there. We tend to focus on the really cool stuff, right? The 0-days, the new 'sploits, the new ways to bypass existing controls, even if they get detected and you get arrested, whatever. So they looked at ER admissions for heart attacks. Basically, you get rolled into an ER having a coronary. All the cool stuff you see in ER documentaries and ER shows — cracking the chest, the paddles, open-heart massage, all the invasive, very expensive, dramatic stuff that ER people, and security people, tend to gravitate to. Not sure why, but again, not only in this room, some people in this community gravitate toward the dramatic stuff. But here's the interesting part: the most consistent winner in terms of reducing mortality when you get rolled into an ER is aspirin. What? Aspirin. Over the counter, four cents a pill. What the hell, aspirin? I'm going to crack this chest, open-heart massage, there's going to be electricity, it's going to be super cool, there's going to be blood, everything. No, no, no — you can do that if you actually have to, but before you do, the moment that person gets wheeled into the ER, someone opens a little nylon bag, takes out a little aspirin, and says: aspirin, mouth, chew, swallow, check. There's actually someone on the team who is responsible and accountable for doing that stupid, small, undramatic, strategically important procedure, because no one will sue you for giving someone aspirin in the ER, and no one will say it's too expensive. And yet, despite universal agreement that this is absolutely the most effective thing, and that it has literally zero downside, studies found that 10 to 15% of patients were not getting an aspirin. That's ridiculous, right? So again, what could go wrong? It doesn't cost money, no one will sue you for it, it's not an experimental, dangerous procedure, and it takes a second to do. You don't need to be board-certified, you don't need elite surgeon training, to put an aspirin in someone's mouth. So what's the issue? They studied it, said, hey, 10 to 15% are not getting aspirin, that's an astronomical failure, and said, OK, we need to figure out how to get people to actually comply with this concept of aspirin on arrival.
If you go to some hospitals' websites, they actually have an aspirin-on-arrival page with a little bar graph: national average, 95% or whatever; our metric, 98%; and a note that 98 is better than 95. Can't make that up. So, we don't want to miss the evolving threats, the cool stuff, the blended threats — all of those are absolutely important, because our industry changes so fast that we need to understand what's going on over the horizon. In fact, that's my favorite thing to do: talk to CSOs and tell them, in two to three years you will be spending money on this, or else. And they tend to listen, because in some cases I'm right. But we absolutely also have to look at the very basic things: hey, why are we missing this? What's the analog in information security? I don't know — we fire the admin of the UNIX server farm, and what did we forget to do? Whoa, we forgot to revoke his credentials, and now he can crawl back into our servers. How did that happen? Because we forgot the basic stuff. Who was accountable — big word, accountable — for doing the stupid, silly, free thing nobody would sue you for? Hey, Bob over there did something stupid, got fired, and now he's crawling back into the network with his unexpired credentials. What's up with that? So in some cases the shiny blinky stuff is really cool and important, but we forget to do the basic stuff that is absolutely essential.

The problem, someone pointed out, is when you fire, or get rid of, the person whose job it is to give the aspirin, or to terminate the credentials. And he gets props for that, because that's absolutely true. In many cases — I was at a non-profit once; I was the IT guy, the printer guy, the security guy, the bodyguard for the people who came in from overseas and from New York, doing all of that — if I had been the problem, things would have been screwed up. A lot of places have "the IT guy," and if you're really high-speed, you get to be the IT guy and the security guy. So that's absolutely the problem: how do we monitor the people who are responsible for these things? Where's the accountability if you're essentially self-policing? And that's a resource problem and a management problem, not a technical problem — which, again, is layer eight.

One of the interesting things he also said is that while accidents in medicine, much like in aviation, are almost always multi-factor, liability is very doctor-focused. When an airliner crashes into a mountain, you don't go and sue that pilot — unless he was really doing something stupid like drinking or sleeping on the job, which happens very rarely, but does happen. You go and sue the airline, not the pilot. That's because of the insurance, someone said. Yes, and the way insurance works in the medical community is really weird, and I think we're going to see in information security a bit of risk management by insurance in the next three to ten years; it's starting to happen. So that's an interesting artifact: this fear and defensiveness about personal liability, even though the doctor is part of a huge system that should be interlocking and working as a team. But it's the doctor, the MD, who is going to get sued and basically lose their career.
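An aside, not from the talk: the "aspirin on arrival" analog for the unexpired-credentials story above might look as boring as this — a scheduled check, with a named accountable owner, that people who have left can no longer log in. Everything here is hypothetical: the file names, the CSV fields, and the owner address are illustration only.

    import csv

    TERMINATED_FILE = "hr_terminations.csv"          # assumed HR export with a 'username' column
    ACCOUNTS_FILE = "active_accounts.csv"            # assumed directory export with a 'username' column
    ACCOUNTABLE_OWNER = "identity-team@example.org"  # someone is named, like the aspirin nurse

    def load_usernames(path):
        with open(path, newline="") as f:
            return {row["username"].strip().lower() for row in csv.DictReader(f)}

    terminated = load_usernames(TERMINATED_FILE)
    active = load_usernames(ACCOUNTS_FILE)

    leftovers = sorted(terminated & active)          # gone from HR, still able to log in
    if leftovers:
        print(f"ALERT for {ACCOUNTABLE_OWNER}: {len(leftovers)} unexpired credential(s)")
        for user in leftovers:
            print("  revoke:", user)
    else:
        print("Check passed: no terminated users with active accounts.")

Cheap, undramatic, and nobody gets sued for running it — which is exactly the category of control that tends to get skipped.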
One of the big pushes in healthcare IT is putting electronic medical records online: EMR systems that let the doctor navigate through the care procedure and, ideally, spot human errors. Hey, you weren't supposed to give this 180-pound person 0.1 per kilogram; it's 0.001 per kilogram — that's the difference between death and not death, right? Some of these things are really cool because they have the potential to stop human-error accidents. People are sleepy, tired, exhausted — talk to any doctor, surgeon, or resident; they never sleep. But what we've seen is that, in some cases, these systems are effectively engineered, by the way they're specified, to be a liability defense for the hospital and to optimize the workflow for maximizing billing, not necessarily for patient safety. So in a lot of cases, the doctors who are the end users are being forced to use a system that is really complicated, takes a lot of time to get trained on and to operate in the clinic every day, and they're saying, hey, this seems suspicious; it looks like it's really optimized for billing and liability, not for the care of the patient. So what ends up happening — and I know this never happens in information security, where we have informal documentation and conversations between people under friendly arrangements — is what happens in the medical community: because these EMRs are, according to the doctors, not properly designed for patient care, they end up with a parallel chart system where the docs and the nurses keep their own notes ("I'm not going to put that in here"), and then they put the stuff that's in the computer for the lawyers to see, right? And some of the stuff over here doesn't get documented over there, so the next doctor, say at a different hospital the patient gets transferred to, sees an electronic medical record that's incomplete or otherwise problematic. Interesting. Not sure what the infosec analog is, but somewhere this is happening in our community; I'm just not sure where. Any thoughts?

Not documented at all, someone said. Or documented in a way that's more about compliance coverage than about actual security — than the patient's health, maybe. Someone else said: sometimes it comes down to having to have an SOP. You write an SOP, it gets stuck in a folder, but the SOP is three or five years old, and the real knowledge becomes the normal, undocumented lore passed from person to person — this one person knows everything — while the SOP just gets dragged out every year to show the compliance officer: yep, got an SOP, it's right here. So that kind of sounds like reducing liability and maximizing billing, while patient safety is an afterthought. We're also seeing tension between specialization and process, where organizations move toward less specialized people and put the weight on processes, and if things aren't working, the answer is: we need more process, because it's not about the people anymore; we just don't have enough process. That sounds a little like Devin's talk today about TSA, where he said: use technology to support smart people instead of using dumb people to support smart technology. Is that right, Mark? Yeah.

Mark Silver — I'm going to run through this quickly. He's more of a business person who was never really deeply technical.
He's from Australia. He's now a CIO for a major multi-billion-dollar portion of a Global 50 company, I believe. His success in anything came from translating between the business problem and the technical problem. He worked for the Queensland government in Australia, basically at the state level, and he once explained to a minister in charge of multi-billion-dollar budgets the size of encryption keys in terms of planets and atoms. He found a way to translate by analogy and help someone understand the technical problem, and that was the beginning of his love affair with being a translator and advocate for technology. At this very large global company, InfoSec was perceived as an inhibitor, and what he ended up doing is something a lot of people talk about these days but he actually succeeded at — I followed his team for a while: embed InfoSec, and identify InfoSec requirements at the beginning of a project instead of at the tail end. Huh — it minimizes the cost, and it gets embedded in the culture. Amazing. So, SDLC and secure code and getting people to talk to us at the beginning of the project. It's like, hey, we're putting up a website, or, I don't know, we're throwing terabytes of anonymized information up on the web; what could possibly go wrong? I don't know — de-anonymization algorithms running on GPU clusters in the cloud, for example. So these things are happening, and what they did was get specific executive sponsorship: a specific, very powerful executive was responsible and accountable for the success of the project, they created a series of meetings, and they were aware of the security team's style of communication — layer eight again. Now, this might not seem revolutionary if you've worked at that level, but for a lot of people coming up through the technical ranks, this is kind of weird and scary, so I think we need to learn from people who have done this kind of thing and emulate the model. A purely technically focused CSO, according to Mark, is doing a disservice by not having the right conversations with the CEO, CTO, CIO, CFO, the board, et cetera. Technology is secondary; focus on the business conversation, and use your technical people to help you translate the risk. Almost done.

We talked about special operations a little bit. Special operations is glitzy and interesting and cool. Most infosec is not quite so dramatic, right? But there are certain parts of infosec that require extremely high-intensity training and skill sets, that are cutting-edge and revolutionary in some cases, and that demand very flexible thinking. We looked at some academic analysis of what makes special operations teams special. How are they different from the regular military? There are only about 15,000 special operations forces on record, I believe, versus a really big, multi-million-person active military. Booz Allen published an article about what makes them special; it's very readable, you don't have to be a military person to understand it, and I recommend it. Total immersion, very deliberate practice, extreme realism, constant feedback, and after-action reviews. Does this happen in your environment? Are you comfortable with the kind of training you're getting? Special forces folks are deep generalists: they have a core specialty, but they also cross-train with the others.
So if one person goes down, someone else can fill that role; everyone knows something about the others' expertise. They're a deeply linked, cross-functional team, they work in small teams in strange and hostile environments, and they get the job done because of their selection process and the extreme training they go through.

So, almost at the end: ten ways for security leadership to make their case, a little bit of distillation. Achieve a strategic position — there's a whole plan to explain what that means, but basically figure out how to integrate yourself into the culture and the executive group by understanding how they talk. Like you said, Marcus, you want to understand their language and what their priorities are. Work with champions; make security relevant — don't force it, make it relevant to what they're doing. Dispel the media hype instead of being part of it. Make it real in some way: try to quantify risk, get some good metrics if you can — they're coming out; they're not perfect, but they're starting to come out. Be part of the organization; help institutionalize security. And if you can, know the CEO and speak to them, work with them, get to know the executives if that's at all possible, and make them understand what you do; talk to them about it.

So, basically, within information security, some of the things we're starting with. Auditors: look beyond the technology — do you have a firewall, do you have an intrusion detection system — and look at how the people in information security actually talk to other parts of the business, or among themselves. Ask: how does this work? What do you do differently? What kind of teamwork do you have? What kind of leadership? Do you feel comfortable going to them when you see a problem? Have you ever gone to someone and said, hey, I have a problem, and been told to shut up? If you're an auditor and you see that, dig a little deeper and put it in the report, because that's actually very important. Security managers: understand whether you're the kind of person who deters people from coming to you with problems, or the kind who says, relevant or not, I'll listen, and we'll figure out what to do with it. Figure out what the perceived power distance is between you and the people who report to you, and try to chip away at it if it's too hierarchical and formal. Actively manage your people, like armorguy talked about: provide training and live simulation opportunities, send them to training, run exercises in-house. If you're in risk, look at people and processes, not just the traditional risk measures; get some of that softer, layer eight type of data. And then governance: we need more of a security culture, more understanding of communication, sharing, and liability, starting at the very top. That's it. Any questions? Nice. Thank you. One more question? Yes, go ahead. It's more a comment on this idea: I really like it, but we've seen in other areas, the flight industry for example, that a lot of people draw on it because it has this long history, and also in things like supply chain standards, a lot of people are trying to apply similar models — if we look at what's done in the flight industry with chain of custody for airline parts, people developing high-reliability software are trying to do similar modeling as well.
I think it's a good way to go. The biggest problem when they do that is how to acquire that data from the established industry and its practitioners, versus what's established in their own field. Yeah. So the comment was basically that a lot of professions are trying to incorporate the hundred or so years of risk management culture from aviation into their particular risk chain or risk process, and it's not easy to do because there aren't perfect analogs, and again, it's that cultural change. But I think we can learn from aviation and the other high-risk professions and figure it out. Better to start now than later. Cheers. And now, Mark Sattery, introducing our next guest. Hey, I'm Paul. He's absolutely right about that book, by the way. You can actually get it as an audiobook if you're anything like me; around here audiobooks are great, since some people have up to an hour or so to commute. So download that book right now and listen to it on your commute.