Security Economy Episode 5: Don't Trust Your Gut. Cyber Security & Behavioral Economics with Dr. Gleb Tsipursky



You always hear that you should trust your gut. People think that if their intuition says something is right, it must be the right course of action. But what if those beliefs are wrong?

In this episode, Katelyn Ilkani interviews Dr. Gleb Tsipursky, a behavioral economist and disaster avoidance expert. Gleb is going to share with us why trusting your gut is usually the wrong choice, and how all of this applies to cyber security decisions in particular.

Topics covered include how your gut instincts, intuition, and cognitive biases work, why social engineering is so successful, and what COVID-19 could mean for cyber security planning over the next five years.

Dr. Gleb Tsipursky

Katelyn Ilkani

Hi, Gleb. Welcome to the show.

Gleb Tsipursky

Thank you so much for having me, Katelyn. I appreciate it.

Katelyn Ilkani

I'm really looking forward to talking about our topic today, and learning more about why trusting your gut is a bad idea in cyber security.

Before we jump in, can you tell us a little bit more about yourself?

Gleb Tsipursky

Happy to. I'll tell you actually how I got into this field of studying decision making, risk management, cyber security, and other areas.

Actually, I got into this partially because of my experience as a kid. Well, more than partially.

My parents, like everybody else, just felt that they should go with their gut - trust your intuitions, follow your heart, and so on.

And they told me the same thing. Well, I saw them making some bad decisions when they followed their gut and trusted their intuition, because their guts, unfortunately, often disagreed with each other.

For example, my mom liked to buy nice clothing, so she'd go out and she'd buy a $100 sweater. My dad was kind of a cheapskate. When she came home, he'd yell at her and say, "No sweater should be worth over $20."

And then they'd go at it, you know, bringing up past hurts and so on. That taught me that their gut-driven, intuitive decision making really was not very good.

It showed me what not to do.

I was born in '81, so I grew up during the .com boom. When I was 18, that was 1999, when tech leaders were partying like it's 1999. And just a couple of years later, when I was 21, in 2001/2002, came the .com bust.

All the tech leaders who were the heroes of the Wall Street Journal, there for all the right reasons in 1999, were in the Wall Street Journal for all the wrong reasons in 2001/2002.

That taught me that it wasn't only my parents; it was also the people who are lionized in our society as the great decision makers of our times.

All the tech and industry leaders make terrible decisions too. And we just don't know it yet.

Katelyn Ilkani

Exactly, time will tell.

Gleb Tsipursky

Exactly, time will tell. Boeing has made some terrible decisions; or talk about Equifax, which made terrible decisions in covering up its cyber security breach.

But those are examples where people who are lionized, and top leaders of our industries are then becoming demonized in the newspapers for bad decision making.

So, I decided to study decision making. And unfortunately, there was very little material out there on quality decision making, just because we're always told to go with our gut, follow our intuition, trust our heart.

I studied the material that was available and I started doing training and coaching in these topics as I started learning more, but I quickly ran out of quality material. I had to go into academia and study these topics.

In academia, the specific areas where you study decision making are cognitive neuroscience and behavioral economics, where we study the various aspects of our brain and how they cause us to make bad decisions.

Those are called cognitive biases - the specific patterns that cause us to make bad decisions. We'll talk about that later.

I learned about cognitive biases, including in economic settings, and I spent about 15 years in academia, including seven years as a professor at Ohio State, studying these topics while also doing consulting, coaching, training, and speaking on the side.

My background has been a combination of this practical pragmatic experience as a consultant, coach, trainer and speaker, and academic expertise in cognitive neuroscience and behavioral economics.

I brought together both of those aspects of my background in my new book, Never Go with Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters.

Katelyn Ilkani

Before we get too much into a specific topic, I want to make sure all of our listeners understand how intuition works. Can you explain this in more detail?

Gleb Tsipursky

Let's talk about how our gut reactions work before we talk about how our intuition works.

Now, gut reactions, what are those? Well, our gut reactions are actually not adapted for the modern environment.

They're adapted for the savanna environment, when we lived in small tribes of 15 to 150 people. We were hunters and gatherers, doing tribal activities, fighting other tribes. Our main impulse for facing threats, for example, is called the fight or flight response.

You might have heard of it as the saber-toothed tiger response, because our ancestors had to jump at 100 shadows to get away from that one saber-toothed tiger.

So right now, we react in a very inappropriate way to many threats, to many problems. We react to them as though they're a saber toothed tiger.

So, we get a nasty email from an employee at a company who is upset about some cyber security restrictions - restrictions that don't allow them to do what they want to do, because doing it would get their account hacked, right?

So we, the cyber security folks, react to that with an outsized stress response: strong discomfort, and the desire to do one of two things, depending on your personality. You either ignore that email and kind of delete it - it never happened, whatever. That's the flight response.

Or, you want to write back and say, "I'm right; you're a jerk. You'll get hacked; you'll cost the company millions of dollars." That's the fight response. Neither of those responses is the right response.

You want to actually look at the situation, study the underlying causes. Maybe there are 100 other employees who feel the same way, and they don't write emails. Or maybe they're actually not doing what you tell them to do, but they're still not writing emails.

There's one employee who decided to write an email, and that is important and valuable information for you to have. So, you want to look at what's happening underneath - the systemic problems that are causing such responses.

Maybe it's a lack of compliance with the needs of cyber security, or maybe a lack of education or understanding. Maybe there isn't an emotional investment in the problems that would happen if cyber security practices aren't followed.

It might be any of a whole variety of causes, and you want to explore that and understand what's happening. Then, of course, address the specific employee's concerns and calm them down, because ideally, you want to find out more information from this employee.

Again, if one person said something, there are likely 100 other people who feel the same way, and that's important to understand. That's a good and appropriate way of responding to an irate email from an employee at your company about cyber security issues.

But that's not the intuitive thing to do at all, right? It's a very complex and hard behavior to follow, so it's not something that we tend to do intuitively. That's one type of poor response.

We also respond very poorly to major long term threats, like the COVID-19 pandemic, and we were just talking about this before we started the interview. We are not wired to respond to slow moving train wrecks, like the COVID-19 pandemic. Not at all.

There are a number of people who just ignored the information on COVID-19, who just thought, "Oh, it's not a big deal; it'll blow over. It will be just like the Zika virus, just like Ebola, just like SARS. It won't touch me." A lot of people thought that, and some people still think that. That's the flight response.

The fight response is people who respond in a panicked way, whether by going out and buying all the toilet paper, or the equivalent in professional settings.

Not wise decision making, not smart steps - panicking instead of looking at the situation and understanding that we have very quickly suffered a major disruption.

We're in the new normal, and we need to do things very differently, whether in cyber security or in other business areas, from now on. We can talk about that in much more depth.

So that is our gut reaction, and that's how it misfires.

Now, with intuition, unfortunately, we can't tell whether how we feel comes from the gut - from those primal tribal impulses - or from learned patterns of behavior.

Our intuition refers to learned patterns of behavior, where we have effectively learned how to properly deal with certain issues.

That comes from a combination of quick feedback after a certain behavior, quickly determining whether the behavior was right or wrong, and repeating the behavior a lot. You've probably heard the idea that mastery takes about 10,000 hours of practice.

Now, there are many things where cyber security folks have done 10,000 hours of practice, sifting through a lot of information and figuring out what's a real threat and what's not. They can quickly evaluate things at a glance.

If you're working in a company as a cyber security official, you can quickly evaluate whether someone making a pitch to you is the real deal or not, whether they know what they're talking about or not.

But to do that, you have to have a lot of conversations with people and learn who are worthwhile people to have conversations with and who are not.

You can trust your intuition only in those specific areas, and that's what the phrase expert intuition refers to. Those specific areas where experts have a lot of feedback through a lot of practice, and they can quickly tell whether they're right or wrong. That's what expert intuition refers to.

Katelyn Ilkani

It sounds like going with your intuition, in a field like cyber security, could be okay if you're an expert, if you have a lot of experience with a specific situation, but it could be a very bad idea in other instances.

Gleb Tsipursky

The problem is that we can't tell whether it's a good idea or a bad idea, just based on how we feel, because in both cases, it feels very comfortable.

Like when you look at a box of Dunkin' Donuts that somebody left in the break room: eating the whole box of a dozen doughnuts feels very good. That's what your gut is driving you to do.

That's what we feel like doing. That's what feels right to us; it feels comfortable. Because of course in the savanna environment, we had to eat as much sugar as possible when we came across a source of sugar, like honey, like apples, like bananas. It was very important for us to do that.

We are the descendants of those who successfully ate a lot of sugar. We're the descendants of those who had a very strong fight or flight response, because the other folks didn't survive.

We are those descendants. And right now, we feel very comfortable doing both gut behaviors and learned behaviors.

Let's say you're driving your car. When you're switching lanes, it's probably going to be hard for you not to look in the rearview mirror, because you've learned that behavior. You've learned, "Okay, I need to look in the rearview mirror. I need to look in the side mirror. Those are my blind spots. Otherwise, somebody might come out and crash into me."

That's not an intuitive behavior; you have to learn how to do it. But right now, it would be hard not to do that behavior. It feels comfortable; it feels right. And so a number of actions that cyber security folks take feel comfortable, feel right.

But sometimes it's going to be the appropriate learned action, and sometimes it's going to be the dominant gut action, like eating the dozen donuts, so it's very hard to tell in the moment.

That's why you should analyze what's going on - analyze each individual situation - and see if it's the type of situation with which you have extensive familiarity, where you got a lot of feedback and where you have learned whether you're right or not.

Katelyn Ilkani

So in a business sense, cyber security is framed a lot of times as risk management, either business risk management or IT risk management.

Are there situations that you can think of when a gut reaction serves us well, in a risk management context?

Gleb Tsipursky

Sure. Those situations I talked about - where you evaluate whether you have learned a topic over a long time - apply here.

For example, if some employee sends you an email and says, "Hey, is this legit or not?" you can probably quickly tell whether it's legit, based on having seen a lot of spear phishing emails and having a lot of techniques you can use to determine whether it's a legit email or not.

So for example, you can hover over the link the email wants you to click on. You can see if it's going to an actual, quality website, or if it's going to, you know, spamdomain.com, right?
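As a rough illustration of that hover-and-check habit, here is a minimal Python sketch that flags links whose visible text names one domain while the href actually points somewhere else. The helper names and the sample snippet are illustrative assumptions, not anything from Gleb's materials, and a real mail filter would check far more than this.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkExtractor(HTMLParser):
    """Collect (href, display_text) pairs from an HTML email body."""
    def __init__(self):
        super().__init__()
        self.links = []          # list of (href, text) tuples
        self._current_href = None
        self._current_text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")
            self._current_text = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._current_text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.links.append((self._current_href, "".join(self._current_text).strip()))
            self._current_href = None

def suspicious_links(html_body):
    """Flag links whose visible text names one domain but whose href
    actually points somewhere else - the mismatch you would spot by
    hovering over the link."""
    parser = LinkExtractor()
    parser.feed(html_body)
    flagged = []
    for href, text in parser.links:
        real_domain = urlparse(href).netloc.lower()
        # If the visible text looks like a domain, compare it to the target.
        if "." in text and text.lower().replace("www.", "") not in real_domain:
            flagged.append((text, real_domain))
    return flagged

# Hypothetical phishing snippet: the text says one site, the href goes to another.
body = '<p>Reset here: <a href="http://spamdomain.com/login">yourbank.com</a></p>'
print(suspicious_links(body))   # [('yourbank.com', 'spamdomain.com')]
```

The point of the sketch is the comparison at the end: the visible text and the actual destination should agree, and when they don't, that is exactly the signal the hover check gives you.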

Katelyn Ilkani

If you're a super user, if you are in cyber security or IT yourself, I think those behaviors are more of those expert behaviors.

Gleb Tsipursky

Yes. So you have learned how to do that, you know that that's something that you can easily do.

Now, let's say it's a database implementation project.

Your goal is to create a plan for implementing a database and for evaluating the cyber security risks associated with that implementation. That is not at all a project where you should trust your gut.

A lot of people do, and they're wrong. They make bad decisions because of it.

They look at a database project, and they look at how to implement things. They think, "Okay, here are all the steps that need to be taken."

Well, guess what? How often have you done a database implementation project?

That's not something that you do frequently. That's a major, major project. You do that only rarely. So, you don't have a lot of experience with getting immediate feedback on the success or failure of your database implementation project.

It's a long term project that takes a lot of time, so you can't intuitively evaluate whether to trust your gut reactions on how to do database implementation and cyber security in that context.

So that's the kind of thing where you don't want to trust your gut - where you want to make sure to evaluate the risk management implications in that area using your head, as opposed to your gut.

Katelyn Ilkani

At the beginning of our conversation, you mentioned cognitive biases. I think this is a great segue into that topic, where we think about how cognitive biases lead us to leadership disasters. Can you tell us more about those?

Gleb Tsipursky

Cognitive biases, as I mentioned, are the specific decision making errors we make because of how our brain is wired. A lot of them come from our evolutionary background.

I mentioned the fight or flight response, and others, and some of them come just from the structure of our brain because of the way that we process information. Now as an example of a cognitive bias, I mentioned COVID-19. So let's talk about that.

One of the most prominent cognitive bias issues I've seen folks run into with COVID-19, including in cyber security, is something called the normalcy bias.

Now, we intuitively feel that the future will be like the past - at least in the foreseeable, short to medium term future.

It feels like it will be similar to the past, because in the savanna environment that was very much the case. The future was very much similar to the past, so it was not helpful for our survival to perceive the future as being in any way seriously different. But that's not what happened this time.

Now, what happened with COVID-19?

Well, our world has very quickly changed, and it's become very different. And it will be that way a long time. Why? Because we won't be able to deal with COVID-19 until we find a vaccine.

Now, a vaccine will take a very long time to find. We know that a couple of vaccines are already in trials. And the trials will take a minimum of 12 months if we're super lucky, and the first vaccines go through correctly.

So that will take 12 months. And then, of course, if we do find after 12 months that one of the vaccines works, it will take maybe another 12 months to mass-produce it, distribute it, and vaccinate people.

So that's two years - and that's if we are very, very optimistic. It's a long time even in a very optimistic scenario, and that scenario is super optimistic.

More realistically, the earliest we'll have a vaccine that's widely distributed will be sometime in early to mid 2022, and more realistically sometime by 2025, because it's not likely that one of the first vaccines will work. If you look at the percentage of vaccines that make it through trials, it's a pretty small percentage that actually makes it.

We should more realistically assume that one of the first vaccines won't work.

Now, when I talk about this to cyber security professionals, a lot of folks find it hard to believe, find it hard to accept. They feel like it'll blow over in a couple of months.

Not a big deal. You know, maybe the summer will burn out COVID-19, or maybe we'll just have no more restrictions and everything will be fine.

It won't happen.

That's not what will happen; we will continue to have this. It very much looks like COVID-19 keeps spreading in warm weather, so it's very unlikely that the summer will burn it out. It's much more likely that it will keep spreading and keep popping up in places.

So there'll be some loosening of restrictions, and then it will grow into a new outbreak and then there will be a clamping down on the restrictions.

That's what happened in countries like Singapore, Japan, and South Korea that successfully controlled early outbreaks: they loosened the restrictions, then they had secondary outbreaks, and then they had to impose restrictions again.

That is what will tend to happen, and that's what you have to face, what you have to be ready for. That means you need to fundamentally change your cyber security perceptions and plans for, realistically speaking, the next five years, because you don't want to prepare for the most optimistic scenario, right?

Risk Management 101: you want to be realistically pessimistic. And realistically pessimistic means preparing to be in this setting, with waves of restrictions, until at least 2025 - the foreseeable future in a five year plan, right?

So you want to look at your strategic five year plan for cyber security, which I really hope you have, and figure out how you will manage cyber security, given that conditions are not going to be the kind you were used to in January 2020.

You need to accept the new normal - that we will be in this new normal, and that you will have to work in it going forward.

Now, what does that mean? Well, a lot of people are working on virtual teams, right?

And virtual teams are much easier to hack. Virtual connections to somebody's home computer are much easier to hack than the much more secure setup that you have in the office.

So that's one of the big things you have to be thinking about - what happens if there is more frequent hacking of people's computers? So what will you do about that situation?

You have to train people on cyber security differently than you do currently and than you have done in the past. You have to get them used to a different modality of working.

I mean, what if, you know, somebody is working from their home office, and they forget to log out of your secure portal, and their teenager finds it and goes, you know, exploring? You don't want that to happen.

How will you prevent that? How will you train people differently? You will need to think about training people differently, and you will need to think about security differently given the virtual context. You'll also need to think more differently about collaboration and problem solving.
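One concrete answer to "how will you prevent that?" for the forgotten-open-portal scenario is a server-side idle timeout, so a session expires on its own even when nobody remembers to log out. Here is a minimal sketch of the idea in Python; the in-memory session store and the 15-minute window are illustrative assumptions, not a recommendation for any particular product.

```python
import time

IDLE_LIMIT_SECONDS = 15 * 60   # assumed policy: 15 minutes of inactivity

# Hypothetical in-memory session store: session_id -> last activity time.
sessions = {}

def touch(session_id):
    """Record activity on a session (call on every authenticated request)."""
    sessions[session_id] = time.monotonic()

def is_alive(session_id):
    """Reject the request if the session has been idle too long."""
    last_seen = sessions.get(session_id)
    if last_seen is None or time.monotonic() - last_seen > IDLE_LIMIT_SECONDS:
        sessions.pop(session_id, None)   # force re-authentication
        return False
    touch(session_id)
    return True

touch("abc123")
time.sleep(1)
print(is_alive("abc123"))   # True: recent activity keeps the session alive
```

The design choice here is that the server, not the user, owns the logout: the teenager at the keyboard an hour later simply finds an expired session.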

Now, there's a lot of tension that I see in cyber security between loosening restrictions in order to allow companies to do business, versus the kind of restrictions cyber security has to keep in place to prevent being hacked - to prevent the kind of problems that Equifax experienced.

That will have to be dealt with in a different fashion, because people are not used to thinking about their home as a venue for cyber security, and you will not be able to deal with problems and resolve them as effectively.

That's just one consequence of one cognitive bias dealing with one issue: the normalcy bias dealing with COVID-19.

And of course, there are many other cognitive biases. I talk about the 30 most dangerous ones in my book, Never Go with Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters. You want to be aware of each of these cognitive biases and what you need to do to address them.

Katelyn Ilkani

I think the world has changed so quickly, and to your point, it's not how our brains are wired. A majority of people can't fathom what COVID-19 really means going into the future, and that this isn't temporary, right?

I was reading on Twitter that someone was interviewing for a cyber security position and the recruiter was saying, "Well, this is not a remote job. You might start remotely, but you're going to need to come into the office." And there is not a recognition of the fact that we are going to be in this situation for quite some time. Or, it's not broadly recognized, because it's so uncomfortable.

Gleb Tsipursky

Yes. And that feeling of discomfort is really important to learn to recognize. So if a cyber security professional wants to actually address cognitive biases, that's one of the most important things you need to do.

You need to understand that a feeling of discomfort is an indication that you're going against your instincts. By definition, it feels uncomfortable not to eat a box of a dozen doughnuts.

Katelyn Ilkani

That's a good mental pointer to keep. How am I really feeling about this situation? And maybe that should tell me a bit how I should act?

Gleb Tsipursky

Yes. So when you're feeling comfortable with a situation, that likely should indicate to you that you're playing into your existing intuitions, and in many areas, they'll be wrong.

And unfortunately, especially right now with the transition to working from home, there'll be more wrong intuitions, because you're not in the office and you're not used to things - I mean, let's say, collaborating with people.

One of the important cognitive biases is called the empathy gap.

So the empathy gap is where we underestimate the kind of emotional drivers that other people have. It's already pretty easy to underestimate other people's drivers when you're working in the office; it's even easier when you can't actually see the other person and understand what they're feeling, what's motivating them, their body language, their expression, their tone.

If you can't perceive all those things, you'll be falling into that empathy gap more often; you'll be failing to understand those emotions, even heightened emotions. And emotions have been shown by recent research to drive about 80 to 90% of our behaviors and our decisions.

So when we just intuitively go forward and let ourselves go as we will, our emotions will drive about 80 to 90% of our decisions, our behaviors, our beliefs and our thought patterns.

You want to be very much aware of what other people are feeling if you want to:

1) be an effective cyber security professional, and especially if you want to

2) influence other people, which you need to do effectively in order to get them to comply with your cyber security guidelines (especially as you rise up in your career and you interact with other people who don't speak your language, and who don't have the kind of beliefs and perceptions and intuitions that you have).

So that means that you need to be more aware of their emotions, and right now is a time when it's harder to be more aware of their emotions.

You need to pay twice as much attention to focusing on their emotions, and of course on your own emotions - you want to know what's driving you to make certain decisions.

What do you feel comfortable with? And is that feeling of comfort really justified?

One of the worst things that happens to us is that we feel comfortable with certain information and therefore believe it's true, and we feel uncomfortable with certain information and therefore believe it's not true.

Like what Katelyn brought up about the recruiter feeling, "Okay, you know, this is just a temporary thing; you'll need to come back into the office." The recruiter's feeling uncomfortable with the idea that this will be a long term disruption, and that many companies will undoubtedly keep their teams virtual because they'll find that it's good enough for them, and they don't want to pay for the expensive headquarters lease.

That's something that many companies will do. Partially, again, you know, this recruiter might be an extrovert, which is unlike many people in cyber security, and they might feel uncomfortable with the fact that they're not going to be surrounded by people all day.

Many people in cyber security will actually feel more comfortable not surrounded by a lot of people. And I say that as somebody who's an introvert myself, and I do need my alone time.

Katelyn Ilkani

Let's stay on this track of emotions for a minute, Gleb.

When we think about this idea of social engineering in cyber security, people take advantage of other people's emotions. This could take the form of knowing that people are fearful or out of their natural working environment and exploiting that through phishing attempts, and things like that.

We have another layer of complexity here: not only are cyber security professionals at a bit of a disadvantage because the environment has changed so much, but the change actually gives some advantage to the bad guys.

Gleb Tsipursky

Yep, it absolutely does.

And this is one of the things I mentioned: when people are in their home office, they're not used to thinking in a cyber security sense. They're used to thinking, "Oh, I'm at home, where I can relax." That's what they're used to thinking.

They're not surrounded by their colleagues, including their cyber security professional colleagues, who remind them, "Okay, this is a work environment." They're not surrounded by the office, which reminds them with numerous physical cues that this is a work environment that's different from their home environments.

If they're feeling more loose, more relaxed, they're much more likely to click on a spear phishing attack.

There are a number of spear phishing attacks where an accountant is sent an email, supposedly from an executive, saying transfer some money into this account, and they comply.

And of course, that's much more likely to happen right now when an accountant works from home, because they are less likely to be paying attention to this issue.

So in social engineering, what happens is that the bad guys study cognitive biases. They study how to influence people effectively, and they play into people's cognitive biases.

So in order to address this problem as a cyber security professional, you need to do the same thing except in reverse. You need to study the cognitive biases that cause people to make bad decisions.

And you need to take the steps to address these cognitive biases and educate the employees about the kind of cognitive biases that might cause them to make bad decisions.

For example, authority bias will be very relevant to the accountant following the instructions of a supposed executive. Authority bias is kind of like it sounds: we tend to follow authority blindly.

Now, in the savanna environment that was very good; it was the right thing to do to follow the alpha monkey in the room.

And when the boss said, "Jump," you said, "How high?" because that's how you survived as a tribe.

If we didn't have that sort of top down authority structure that we're used to, we wouldn't have survived. The tribe would have fallen apart, and all the tribal members would have died.

That conformity to authority is a really important element of how phishing works - especially spear phishing, specific targeted phishing.

You need to learn that that's what happens, and you need to enable employees to challenge that authority - not challenge in the sense of, you know, "I'm not going to do what the executive says," but to check with the executive whether that is indeed what is happening. And it's uncomfortable to check with them.

You know, there are similar examples with nurses and doctors. It's very well known that nurses often see doctors doing really problematic things - let's say, not following basic hygiene, not washing their hands. That is a big problem, and nurses have not felt able to point it out to doctors and say, "Hey, this is a problem. You should really wash your hands before touching the patient."

So this really caused much higher mortality than it had to. How we learned that was through an intervention driven by behavioral science - behavioral economics and cognitive neuroscience - that was instituted in many hospitals, called the checklist.

The checklist is a very simple intervention: before approaching any patient, the doctor and the nurse have to perform a number of steps, including washing their hands, and of course other things.

Now, there was a lot of resistance to instituting checklists, because doctors said, "What are you talking about? Of course I wash my hands," even though nurses said that very often they don't.

So, as part of instituting the checklist, everyone was empowered to make sure everyone else was following it. Doctors could ask nurses, "Hey, have you followed those steps?" And nurses could ask doctors. With the checklist providing that intermediate authority between the doctor and the nurse, it enabled the nurse to ask the doctor, "Hey, have you followed all the items in the checklist?"

Even though the nurse might know very well that the doctor did not follow the items in the checklist, she didn't have to say, "Wash your hands." Instead, they referred to this common source of authority for both of them - the checklist - and asked, "Hey, did you perform all the steps?" Then the doctor is like, "Oh, yes, I need to go wash my hands." That kind of interaction was really helpful in addressing that authority bias.

One of the things that I work on in my consulting arrangements with companies on cyber security is actually building checklists on cyber security issues for employees.

So having them go through, step by step, any email that is not routine - a checklist that they specifically have to follow in order to make sure they minimize the problems associated with spear phishing.

That really helps to reduce those kinds of problems, because there's an automatic trigger: "Oh, this is an unusual email. Let me take steps X, Y, and Z," and so on. So that might be something you think about instituting in your company, especially now that people are working from home.

It might also be part of the routine, as they close out of their secure VPN when they leave their work for the day, so their teenager doesn't get into their work computer - all those sorts of problems.

So that is something for you to think about - having a checklist that everyone follows.
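To make that concrete, here is one way such a checklist could be encoded so it runs the same way for every non-routine email. This is a hypothetical sketch: the items paraphrase points from this conversation and common phishing guidance, not Gleb's actual consulting checklist.

```python
# A hypothetical checklist for any non-routine email, phrased so that
# "True" is always the safe answer.
NON_ROUTINE_EMAIL_CHECKLIST = [
    "Hovered over every link and confirmed the real destination domain?",
    "Sender address exactly matches previous mail from this person?",
    "Free of unusual urgency or pressure from a supposed authority?",
    "Any money or credential request verified out of band (phone, chat)?",
    "Logged out of the secure portal after finishing?",
]

def run_checklist(answers):
    """Print each item's status; return True only if every item passes."""
    if len(answers) != len(NON_ROUTINE_EMAIL_CHECKLIST):
        raise ValueError("Answer every checklist item before acting on the email.")
    for item, ok in zip(NON_ROUTINE_EMAIL_CHECKLIST, answers):
        print(f"[{'PASS' if ok else 'STOP'}] {item}")
    return all(answers)

# Example: the urgency/authority item failed, so the email is escalated
# to the security team instead of being acted on.
if not run_checklist([True, True, False, True, True]):
    print("Escalate to security; do not act on this email.")
```

The value is exactly what the hospital story illustrates: the checklist, not any individual, becomes the authority that everyone, including top leadership, answers to.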

Katelyn Ilkani

We've talked about a lot of strategies, and I'm wondering if you have a top three that you think our listeners should leave with?

What are the biggest ideas that if they can't remember anything else, you'd hope that they would still keep in their minds?

Gleb Tsipursky

I'll give you one strategy that's composed of five elements.

And this is a strategy that you want to use to make any decision that you don't want to screw up.

So, you want to minimize the problems with any decision? Here are the five questions you should ask about that decision.

First, what important information haven't I yet fully considered? What evidence should I take into account? When we talked about getting the email from the employee, you might feel that the employee is being dumb.

You might feel that way, but there might be elements of the employee's compliance with your cyber security restrictions that you simply haven't considered - needs that might be met in an equally effective way using different policies that would address the employee's concerns.

And if you do that well, it will be much better for the company, for you, and for the employee; it will be a win-win situation.

It's very uncomfortable for us to consider that, because then we have to acknowledge that we're wrong, and we don't like to do that. It feels inherently uncomfortable, right? So that's something for you to consider. That's one.

Second, what dangerous judgment errors haven't I yet fully addressed?

You want to learn about these cognitive biases. I talk about the 30 most dangerous ones in my book, Never Go with Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters. And there's a list of over 100 on Wikipedia, which just describes these cognitive biases generally, without saying how to solve them.

Third, what would a trusted, objective advisor tell me to do? Think about that angel on your shoulder: what would that person tell you to do? What would you tell a close, trusted friend to do in this situation?

You can get 50% of the benefit just by asking the question, and of course you can get the other 50% by calling this person - or, if you're a millennial, texting this person. Fourth, have you addressed all the ways this could fail?

This is a really important one; you want to think about how this decision could fail.

So let's say you're instituting a checklist for everyone to use when they're checking their email, for any non-routine email. Then you want to think, "How can instituting this checklist fail? What are all the ways it can fail?"

So imagine that it completely failed, really failed to achieve your goals and think about why it failed to achieve your goals.

So maybe, for example, it failed to achieve your goals because the top leadership thought it was a dumb idea that only the rank and file employees should do and that the top leadership shouldn't follow this checklist, right?

I have seen spear phishing policies fail so often because of that specific problem: the top leadership.

Of course, spear phishing more often targets top leaders, so that's not smart, not good. But that's a way the checklist can fail, and you want to address that failure in advance. There are many ways you can do that, and addressing failures in advance is very helpful.

Finally, what would cause you to change your mind about this decision? What would cause you to revisit this decision that you made?

You want to think about the kind of evidence that would show you that something is not working - let's say, a new cyber security policy. What would show you it's not working? What would show you that it's a bad idea? Think about that evidence in advance, before you start implementing.

Then it'll be much easier for you to change your mind, revise your opinion, and revise the policy - especially if it's part of a team decision making process - rather than just going ahead and implementing the decision without deciding on the metrics and an evaluation point.

So these five questions - just use them on any decision. If there's one takeaway from this interview, that's the thing I want you to take away: use these five questions.

They only take a couple of minutes to go through. If your decision is right, great; and if your decision is wrong, they will help you discover that.

They will save you many, many thousands of hours, and save your company tens of thousands, or hundreds of thousands, of dollars on a bad decision relating to cyber security.
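For readers who want to keep those five questions at hand, here is a trivially simple sketch that encodes them as a pre-decision prompt. The wording is condensed from Gleb's answer above, and the function itself is just an illustration.

```python
# The five pre-decision questions from this conversation, condensed,
# wrapped in a minimal prompt so they get asked the same way every time.
FIVE_QUESTIONS = [
    "What important information haven't I yet fully considered?",
    "What dangerous judgment errors (cognitive biases) haven't I addressed?",
    "What would a trusted, objective advisor tell me to do?",
    "How could this decision fail, and how will I address those failures?",
    "What evidence would cause me to change my mind and revisit this?",
]

def review_decision(decision):
    """Walk through all five questions before committing to a decision."""
    print(f"Decision under review: {decision}")
    for number, question in enumerate(FIVE_QUESTIONS, start=1):
        answer = input(f"{number}. {question}\n> ")
        print(f"   Noted: {answer}")

# Example use (uncomment to run interactively):
# review_decision("Roll out a non-routine-email checklist company-wide")
```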

Katelyn Ilkani

Thank you so much, Gleb. It's been a fascinating discussion. I've learned so much from you today, and I appreciate you being on the show to share this with us.

Gleb Tsipursky

Thank you so much, Katelyn. It's been a pleasure.

Katelyn Ilkani

And that's a wrap. Thank you for joining us for this episode of Security Economy. Check out our episode lineup at battleshipsecurity.com/blog, and don't forget to subscribe. See you next time.

