Posts tagged risk management

The Endowment Effect in Information Security

There are many reasons why business managers seem to ignore the risks raised by information security professionals. I outlined six of them in an earlier post. In this note, I’d like to add another possible explanation: the endowment effect, which affects how humans value their possessions.

Richard Thaler coined the term endowment effect to describe the tendency of individuals to value the item in their possession more highly than the same item possessed by someone else. In the words of Dan Ariely, “once we own something, its value increases in our eyes.” Dan also points out that ownership isn’t the only way to endow something with higher value:

"You can also create value by investing time and effort into something (hence why we cherish those scraggly scarves we knit ourselves) or by knowing that someone else has (gifts fall under this category)."

This propensity seems irrational, yet it has been observed in numerous experiments.

Information security professionals experience a sense of ownership of the data they safeguard. Therefore, the endowment effect might bias us towards overestimating the value of this data. Business managers, somewhat removed from the data by layers of applications and business processes, aren’t affected by the bias to the same degree.

In other words, business managers might value the data less than infosec professionals do. This would contribute to the disagreement regarding the level of risk associated with securing the data.

If information security professionals are, indeed, irrationally influenced by the endowment effect, what can we do about it? Alternatively, when persuading business managers to agree with our perspective, how might we influence them to experience the endowment effect to the same extent?

— Lenny Zeltser

Are Anxious People More Vigilant in Information Security?

Common wisdom suggests that anxious individuals are better at spotting danger than those with more mellow personalities. However, research by Tahl Frenkel and Yair Bar-Haim indicates that the opposite may be true: People with nonanxious personalities might be more skilled at spotting the early signs of trouble. This finding could highlight the type of people best suited for information security jobs.

Spotting Fearful Faces in Photographs

Some of the individuals selected for the study possessed anxious personality traits according to the State-Trait Anxiety Inventory scale, while others were nonanxious. The participants were shown a series of photographs of a face exhibiting a progressively greater degree of fearfulness. The researchers measured how early in the progression the participants could detect fear in the photos.

"As expected anxious participants needed significantly less stimulus fear intensity for conscious fear detection," researchers discovered. However, only non-anxious participants began exhibiting early signs of fear detection before consciously recognizing fear on the photograph. The Scientific American Mind’s March 2012 issue clarified:

"The brains of anxious subjects barely responded to the images until the frightened face had reached a certain obvious threshold, at which point their brains leapt into action as though caught off guard. Meanwhile nonanxious respondents showed increasing brain activity earlier in the exercise, which built up subtly with each increasingly fearful face."

The researchers concluded that anxious people might lack the ability to detect threats in a granular manner and “therefore might face threats with no prior warning signal—further contributing to their already heightened anxiety level and perhaps associated with their enhanced baseline threat vigilance.”

Early Detection of Threats in Information Security

If there is a stereotype of an information security professional, it is sure to include anxious characteristics, such as concern regarding threats, distrust and perhaps a degree of paranoia. These traits help us recognize the signs of danger, build defenses in anticipation of risks and respond to the situation when the defenses fail.

Yet, those professionals who are calm and nonanxious might be better at spotting early warning signs of an intrusion before it escalates into a major breach. This skill is similar to the ability to detect the subtle signs of fear when looking at a photograph of a face. If this is true, then I wonder whether such individuals trust their instincts and have the time to begin investigating the potential problem early enough.

If this is interesting to you, see my earlier post Are Mistrustful Individuals Better at Information Security?

Lenny Zeltser

Why Are Executives More Prone to Accept Risks?

Information security professionals are often frustrated when their concerns regarding vulnerabilities and associated threats appear to be ignored by the company’s executives. I already discussed six reasons why business managers ignore IT security risk recommendations. I’d like to add a few more to the list, based on recent research into the links between power, prestige and decision-making.

High-Status Individuals Are More Trusting

In one study, Lount and Pettit researched how a person’s social status might influence how much that person trusts others. In one of their experiments “participants were primed to experience either high or low status and then given the opportunity to send money in a trust game.” In this context, high status might be associated with the prestige of being a business executive, while the other extreme, low status, might be associated with an entry-level mail room clerk.

The participants who were assigned a high status were more trusting when sending money, hoping that the recipient would return the funds. Low-status individuals were more cautious. The researchers concluded from this and related experiments that “having status alters how we perceive others’ intentions,” leading us to believe “that others have positive intentions toward us.” They also pointed out that:

"The possession of status can fundamentally alter our expectations of peoples’ motives toward us, and in turn, influence our initial trust in others."

People with prestigious positions, such as executive managers, might be more trusting of others and, therefore, might be willing to accept more risks.

Power Leads to Overconfidence

In another study, Fast, Sivanathan, Mayer and Galinsky explored the links between an individual’s perception of power and self-confidence. Their research found that people who believed themselves to be powerful experienced more certainty in the accuracy of their beliefs and opinions. They confirmed that “power increases overconfidence in the accuracy of one’s thoughts and beliefs.” This matters in organizations because many “high-impact decisions are based on perceived precision of relevant knowledge.”

The effect of this phenomenon is magnified because not only does a subjective sense of power cause people to become overconfident in their knowledge, but also “overconfident people tend to acquire roles that afford power.”

Prestige, Power and Decisions About Risk

My perspective on these findings through the lens of information security and related risks is as follows:

  • Executive managers experience a sense of power and prestige associated with their decision-making abilities and responsibilities.
  • Such individuals might be inclined to make risk decisions while being overly confident in the accuracy of their understanding of the issues.
  • Such individuals are also likely to be more trusting than people whose positions aren’t as prestigious.
  • The result is that executives might accept risks from a perspective that is too trusting or without spending enough effort to understand the issues.

So, there you have it: a few more reasons why executives are more prone to accept risks, in addition to the six explanations I offered earlier. You might also like to know that choice fatigue contributes to the willingness to accept risks and that sleep deprivation contributes to risk-taking behavior. We just cannot help it—it’s in our nature.

Lenny Zeltser

When Successful Security Measures Are Taken For Granted

When information security controls consistently protect networks, systems or applications, there is a risk that the defenses they provide might be taken for granted. Executives might wonder, “We haven’t had any breaches recently. Why do we need a CISO?” Home users might muse, “Why do I need an antivirus tool if I’ve been malware-free for months?”

How might we preclude information security successes from leading organizations and individuals towards complacency? How can we make sure that the safeguards continue to be valued by their beneficiaries?

One way to mitigate the risk that the security measures will be taken for granted is to collect meaningful metrics that show that the safeguards are active and provide value. Determining what metrics to collect and how to do it is hard, and a good way to start learning about this topic might be Andrew Jaquith’s iconic book Security Metrics: Replacing Fear, Uncertainty, and Doubt.

The makers of computer security products also need to remind users that the product is active and providing value. The challenge is to do this without annoying the user with frequent and irrelevant prompts. Some of the tools I’ve seen accomplish this by presenting the user with periodic activity reports, summarizing the number of blocked intrusion or infection attempts. Though the numbers in such reports are rarely meaningful to people, they act as reminders that the system is being protected.

Both metrics and activity reports can be an opportunity not only to show that safeguards are in place, but also to help the audience judge whether the controls need tuning. This might also be a chance to educate the audience on how to improve the security posture even further.

For instance, an antivirus tool might report on the number of times the user clicked links embedded in email messages and explain the risks of this behavior. This would allow the product to not only remind the user that the protection is active, but also provide additional value through education. A CISO might show increased security patch coverage in a given department, using it as an opportunity to gain support for expanding the program to other groups.
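
To make this concrete, here is a minimal sketch in Python of how such a periodic activity report might be assembled. The event categories, counts and message wording are hypothetical, not drawn from any particular product:

    from collections import Counter

    # Hypothetical log of events the product observed during the week.
    events = ["blocked intrusion", "blocked infection", "blocked infection",
              "risky email link clicked", "blocked intrusion"]

    def weekly_activity_report(events):
        """Summarize the week's events to remind the user the protection is active."""
        counts = Counter(events)
        lines = ["Your protection is active. This week:"]
        for category, count in counts.most_common():
            lines.append("  %s: %d time(s)" % (category, count))
        # Add an educational note when risky behavior was observed.
        if counts["risky email link clicked"]:
            lines.append("Tip: links embedded in email can lead to phishing sites.")
        return "\n".join(lines)

    print(weekly_activity_report(events))

The exact numbers matter less than the reminder and the educational nudge that ride along with them.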

Lenny Zeltser

Design Information Security With Failure in Mind

Information security can fail in many ways, despite the best plans and intentions. That’s why we must design the security program with failure in mind. You should be able to detect suspicious activities before they escalate into major incidents. And when a data breach does occur, the security architecture should limit the incident’s scope, dampening the spread of malware or the attacker’s influence to give you time to respond.

We can learn how to account for the eventual failure of controls from more established industries, such as boat-building. Consider a stability certificate issued by the U.S. Coast Guard Marine Safety Center to a passenger boat: the vessel was tested to stay afloat even when one of its compartments is flooded. The authorities expect the vessel to be designed with failure in mind, recognizing that even when it is built and operated perfectly, an unexpected event may cause a hull breach.

Designing an information security architecture with failure in mind involves accounting for possible issues at network, system and application levels. For instance, accomplishing this at the network level might involve deploying multiple firewalls to segment the environment into several tiers according to risk.
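
As a rough sketch of the segmentation idea (the tier names and permitted flows below are hypothetical assumptions, not a reference design), a default-deny policy between risk tiers limits how far an attacker can spread after breaching one tier:

    # Hypothetical risk tiers and the only flows explicitly permitted between them.
    ALLOWED_FLOWS = {
        ("dmz", "app"),  # web-facing tier may reach the application tier
        ("app", "db"),   # application tier may reach the database tier
    }

    def flow_permitted(src_tier, dst_tier):
        """Default-deny: a flow is allowed only if explicitly listed."""
        return (src_tier, dst_tier) in ALLOWED_FLOWS

    # Even if the DMZ is breached, the attacker cannot reach the database directly.
    assert flow_permitted("app", "db")
    assert not flow_permitted("dmz", "db")

Like the boat’s compartments, each tier assumes its neighbor may someday be flooded.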

In the words of Denis Waitley, “expect the best, plan for the worst, and prepare to be surprised.”

Lenny Zeltser

Most of us have been tying our shoelaces incorrectly. We were taught the weaker form of the knot, probably because the stronger version is harder for children to master. As Terry Moore demonstrated in his 3-minute video, tying the stronger knot involves bringing the second loop of the shoelace around the other loop in the opposite direction from what we are used to.

There are two reasons I bring up the shoelace story on this security-focused blog.

Lesson #1: Best Practices

First, just because we’ve been following certain “best practices” for a long time, we shouldn’t assume that our approaches are optimal for the tasks at hand. The reliance on “best practices” is one of the addictions of information security professionals.

What if the security advice we’ve been passing along to each other as tribal knowledge isn’t good? Are there assumptions that we don’t question that prevent us from achieving stronger security or making more practical risk management decisions? What if we rely too much on the common security frameworks? Much about “best practices” is unproven and can probably be improved upon.

Lesson #2: Return on Investment

The second point I want to make involves Return on Investment (ROI). If someone were to offer to teach you a better way of tying shoelaces, how much would you pay for the lesson? The stronger knot comes untied less often, saving you valuable time and mitigating the risk of shoelaces coming untied when you’re being chased by robbers or when you’re rushing to cross the street.

It’s easy to conceive of a formula that puts a value on the secret of a stronger knot based on the cost savings or risk avoidance… Yet I doubt many of us would pay to watch the video that began this post. This is why I suggest being cautious about using ROI to justify the purchase of security technologies. Avoiding a potential loss is different from generating income.
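
For the sake of illustration, here is what such a formula might look like in Python, using the familiar annualized loss expectancy (ALE) approach; every number below is made up:

    # Hypothetical inputs: how often laces come untied at a bad moment each year,
    # the expected cost of such an incident, and how much the stronger knot helps.
    incidents_per_year = 12    # annualized rate of occurrence (ARO)
    cost_per_incident = 5.00   # single loss expectancy (SLE), in dollars
    risk_reduction = 0.9       # fraction of incidents the stronger knot prevents

    ale_before = incidents_per_year * cost_per_incident
    ale_after = ale_before * (1 - risk_reduction)
    annual_savings = ale_before - ale_after

    print("Annual loss expectancy before: $%.2f" % ale_before)
    print("Annual savings from the stronger knot: $%.2f" % annual_savings)

The arithmetic produces a real saving, and yet few of us would pay anything for the lesson—which is precisely the problem with loss-avoidance ROI arguments.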

But, back to the better way of tying shoelaces. The stronger form of the knot really works. I cannot tell you how many car accidents and robberies I avoided by investing 3 minutes to learn how to tie it. The stronger knot has become my new best practice.

Lenny Zeltser

The Role of Rituals in Information Security

"A ritual is a set of actions, performed mainly for their symbolic value," according to Wikipedia. But rituals are more than that, because of the role they play in society and the attention to detail required to perform them. Ritualistic behavior brings a sense of control to otherwise stressful situations. In many ways, information security practices are also rituals that make us feel in control, though without always addressing the risks.

A paper by Boyer and Lienard examined ritual behavior in obsessive and normal individuals. They pointed out that many of the rituals in which people engage dictate specific rules, combinations of actions and compulsions. They also explained that:

"The thoughts that prompt rituals revolve around a limited number of themes, such as contagion and contamination, aggression, and safety from intrusion… Ritualized behaviors also include many recurrent themes, such as washing, cleansing, ordering and securing one’s environments, or avoiding particular places."

Concerns related to information security seem to fit well into these themes, as do the actions we take to ameliorate the situation.

In a paper on behavioral practices associated with threat detection, Eilam, Izhar and Mort suggested that ritual-like behavior is a salient characteristic of precaution in humans. Following explicit instructions during a ritual “confers a sense of controllability and predictability.” Furthermore,

"Since uncontrollability and unpredictability are major stressors, a repeated and precise performance of the same acts can generate a sense of controllability and a consequent reduction in fear from the abstract threat."

The rituals information security practitioners follow typically take the form of best practices, which are also codified as frameworks and standards. Though some of these practices are based on experience, metrics and data, many are collections of steps that we’ve been following out of habit. Like most rituals, following these practices requires painstaking attention to details, rules and actions. Doing so makes us feel in control.

Rituals seem to reduce stress because, according to Boyer and Lienard, compulsory action sequences overload working memory. This “might make it more difficult for intrusive thoughts to become conscious.” In the context of information security, our rituals might relieve stress and offer an illusion of control without actually addressing the risks.

— Lenny Zeltser

Herd Behavior in Information Security - The Good and The Bad

Some information security professionals have expressed concerns regarding the state of the infosec industry, suggesting that many of our practices don’t work, that our interactions resemble speaking into an echo chamber, and that we spend too much time philosophizing about pointless topics. One way to make sense of what’s going on is to consider whether the industry’s dynamics resemble those of a herd—we might find that not all is bad, though there is room for worry.

A Benefit of a Herd: Increased Vigilance

In a paper on behavioral practices associated with threat detection, Eilam, Izhar and Mort highlight a survival benefit to animals that live as a herd or flock: “the larger the group, the lower the level of individual vigilance and the greater the sum of collective vigilance.” Individual animals can spend more time eating rather than watching out for predators, while the collective as a whole remains safer.

In fact, not all members of the herd are equally attentive, which is one of the advantages of this social grouping. The researchers clarified that the individuals at the perimeter of the herd are more vigilant than those in the center. As a result, “higher vigilance by certain individuals enables other individuals to reduce their vigilance” among social animals.

Dare I draw a parallel to participants in the information security industry? Through on-line and in-person interactions we exhibit herd-like social characteristics. Through work, research and writing, some of us pay closer attention to threats, vulnerabilities and risks; this allows others to remain less vigilant without compromising the security of the collective. This seems like a good thing, even if some individuals discuss topics without immediate practical applicability.

A Downside of a Herd: Anxiety is Contagious

Eilam, Izhar and Mort point out that there is a characteristic of herds that might counterbalance the benefit of increased collective vigilance: Their research showed that vigilance, and thus anxiety, among social animals is contagious:

"Being among a group of vigilant, watchful and worried conspecifics might exert a contagious effect and, in consequence, other individuals may also become vigilant, watchful and worried."

This is why the “echo chamber” syndrome is undesirable: By rehashing the same topics among the same groups of individuals, we are infecting each other with anxiety that might be disproportionate to the actual risks. This seems like a bad thing. To address this issue, we should probably:

  • Exercise caution when using FUD to market security
  • Try not to overestimate the repercussions of security breaches or the severity of threats
  • Avoid limiting our social interactions solely to members of the information security industry

Being part of the herd has its benefits, though it’s not foolproof. May we graze long and prosper.

Lenny Zeltser

Fear vs. Anxiety in Information Security: What We Can Do

There are differences between anxiety and fear in the way people experience and react to these conditions. We might find that addressing the fear of information security events is more practical than handling anxiety. It’s worth considering how we might shift people from the state of anxiety to that of fear, so we can ultimately address their concerns.

Anxiety vs. Fear

Lang, Davis and Ohman examined the physiological effects of fear and anxiety on humans, finding a number of distinctions. The authors also explained how fear differs from anxiety:

"Fear is generally held to be a reaction to an explicit threatening stimulus, with escape or avoidance the outcome of increased cue proximity. Anxiety is usually considered a more general state of distress, more long lasting, prompted by less explicit or more generalized cues."

In the context of information security, experiencing fear implies being concerned with a specific event, such as a particular threat agent compromising a specific set of data. We have a way of dealing with such concerns through threat modeling, which involves a structured approach to identifying and rating threats.
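
As a minimal sketch of such a structured rating (the threats and the simple likelihood-times-impact scoring below are illustrative assumptions, not a prescribed methodology):

    # Hypothetical threats rated on likelihood and impact (1 = low, 5 = high).
    threats = [
        {"name": "stolen laptop exposes customer data",  "likelihood": 3, "impact": 4},
        {"name": "insider copies database to USB drive", "likelihood": 2, "impact": 5},
        {"name": "phishing leads to account takeover",   "likelihood": 4, "impact": 3},
    ]

    # Ranking specific, concrete threats turns vague worry into addressable fears.
    for threat in sorted(threats, key=lambda t: t["likelihood"] * t["impact"], reverse=True):
        score = threat["likelihood"] * threat["impact"]
        print("%2d  %s" % (score, threat["name"]))

Each named threat is a specific “predator” we can confront, unlike the formless worry that fuels anxiety.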

Dealing with anxiety related to information security is much harder. Anxiety is a reaction to abstract concerns that are harder to identify and, as a result, harder to address. For instance, a person might be anxious about the company experiencing a data breach.

Handling Anxiety is Harder than Dealing with Fear

Eilam, Izhar and Mort published a paper on behavioral practices associated with threat detection. They wrote:

"An animal that is anxious about the possibility of a nearby predator, might then come face to face with a predator, which will convert the anxiety into a real and perceptible danger that produces a fear response. Alternatively, the anxious animal might not encounter a predator, and the question is then one of when it will calm down and become less anxious. Relief from a state of anxiety is subjective and thus varies among individuals."

Marketing and persuasion practices in information security often incorporate the principles of Fear, Uncertainty and Doubt (FUD). FUD isn’t always bad, as Dave Shackleford suggested. The problem is that much of FUD is designed to induce anxiety, rather than fear. There are several categories of FUD tactics, as Mike Rothman outlined; some of them can incorporate fear and thus persuade people to pay attention to relevant information security threats.

Focus on Fear, Rather Than Anxiety

By focusing on specific threats tied to actual events and meaningful risks, we can focus people’s attention on specific “predators.” We can then consider how to alleviate the associated fears through risk management. Breach and threat reports put out by security vendors (e.g., Verizon Data Breach Investigations Report) make this possible. In contrast, if we focus on inducing the state of anxiety, we might succeed at selling some security products, but we won’t actually address risks in a meaningful manner. It’s possible to react to fear. It’s hard to leave the state of anxiety.

Related: Don’t Underestimate the Importance of Feeling Secure.

Lenny Zeltser

Information security architects use documented frameworks to codify key practices, technologies, objectives and other elements relevant to the organization’s security or risk management program. While there are clear benefits to creating and following such frameworks, we need to be mindful of the risks of adopting them without scrutiny or customization.

Example: Marketing Strategy Frameworks

The notion of frameworks is present in many industries. For instance, among the marketing frameworks taught in business schools is one called Four P’s. Designed to assist in evaluating a marketing strategy for a product, it advises businesses to consider the following elements:

  • Product: What is the product?
  • Price: How is the product priced?
  • Place: Where or how will the product be purchased?
  • Promotion: How will the customer find out about the product?

Another framework speaks of four C’s—Commodity, Cost, Channel, Communication—as important elements of a marketing strategy. Yet another framework, called STP—Segment, Target and Position—advises how to focus the strategy on the appropriate parts of the market.

These, and many other frameworks, sound insightful. Yet it is unclear whether one person’s framework is more useful than another’s. Dan Ariely, a behavioral economist, described how he led a class of executive MBA students through a discussion that used two arbitrary frameworks he made up, without the students even questioning the frameworks’ wisdom.

The moral of Dan’s story is that it’s easy to force the world into some framework without understanding the nuances of the situation and without evaluating the framework’s usefulness.

Information Security Frameworks

We love frameworks in the world of information security, too. We have standards, such as ISO 27001/27002 and PCI DSS, regulations such as HIPAA and FISMA, as well as lots of designs, templates and guidelines often grouped under the heading of best practices. Too often, companies attempt to adhere to these frameworks without understanding their applicability and limitations.

For instance, PCI DSS is pretty prescriptive about its security requirements. Yet, organizations often misinterpret them in a way that suits their budgets and business practices. Some companies even attempt to adopt PCI DSS as an approach to securing non-PCI environments without considering the extent to which the threats and security practices might differ.

As another example, consider the numerous controls listed as part of ISO 27002. Companies, possibly earnest in their desire to build an information security program, attempt to implement all of them. They do this despite ISO 27001 advising that the controls’ applicability depends on the organization’s “needs and objectives, security requirements, the processes employed and the size and structure.”

A related concern is our reliance on advice labeled best practices. These frameworks, according to The New School of Information Security, are “activities that are supposed to represent collective wisdom.” The book warns against relying on them blindly, in part because the groups that codify them have vested interests in security decisions. The book also points out that best practices “typically don’t take into account differences between companies or, more importantly, between industries.”

Usefulness and Dangers of Frameworks

Frameworks aren’t magic. They are put together by individuals like you and me, who usually do our best to codify our experiences and relay advice to other practitioners. This can help by providing a structure for making risk decisions, achieving compliance and thinking through hard security problems. However, we must be mindful of the dangers of blindly following frameworks without considering how they apply to a given situation or customizing them to the specific needs of the organization.

Lenny Zeltser