Overcoming Cognitive Biases During Threat Hunts and Incident Response

The most potent tool for threat hunting and incident response arguably can’t be fully captured in code or automated away into a playbook or security orchestration, automation, and response (SOAR) platform. This is not to diminish the remarkable progress of the artificial intelligence research community since 1956, or to rule out the role Skynet could play in future analysis. AI, machine learning, and automation play a central role in enabling an analyst to dig deeper into a broader set of data during an investigation.

Ultimately, though, analyst intuition drives the decisions that make or break the case at hand. Cognitive biases are systematic human errors that lead an investigation to an irrational or incorrect conclusion based on a particular analyst’s view of the world. Today we are going to look at three common biases that impact incident response and threat hunting, and what can be done to avoid them.

Maslow’s Hammer or the Law of The Instrument

Problem: In his 1966 book The Psychology of Science, American psychologist Abraham Maslow offered the observation now commonly paraphrased as “if all you have is a hammer, everything looks like a nail.” The law of the instrument describes the bias of favoring a familiar tool over all others, whether or not it fits the problem.

Solution: While there are certainly excellent products out there to help perform analysis, analysts should always evaluate the strengths and weaknesses of the instrument at hand. A few questions an analyst might ask (encoded as a simple rubric in the sketch after this list) are:

  1. What data sources does the tool use to calculate an analytic?
  2. What happens if an attacker manipulates the data sources in question one? How hard is it for the attacker to manipulate the data source?
  3. What algorithms or approaches are used to reach a conclusion?
  4. How quickly can the tool reach an analytic conclusion and how accurate is that conclusion? Is the timeframe appropriate for an incident response scenario?
  5. What are the strengths and weaknesses of the tool compared with other options that produce similar analytic output?
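
To make these questions concrete, here is a minimal Python sketch of how a team might encode them as a reusable evaluation rubric. Everything in it, the tool names, scores, and the one-hour time budget, is a hypothetical assumption for illustration, not an assessment of any real product.

    from dataclasses import dataclass

    # Hypothetical rubric encoding the five questions above. All names and
    # numbers are illustrative assumptions, not assessments of real tools.
    @dataclass
    class ToolAssessment:
        name: str
        data_sources: list          # Q1: what the analytic is computed from
        tamper_resistance: int      # Q2: 1 (trivially spoofed) to 5 (hard to forge)
        approach: str               # Q3: algorithm or approach used to reach a conclusion
        minutes_to_conclusion: int  # Q4: time to reach an analytic conclusion
        accuracy: float             # Q4: rough accuracy estimate, 0.0 to 1.0

    def fits_ir_timeline(tool: ToolAssessment, budget_minutes: int = 60) -> bool:
        """Q4: is the turnaround appropriate for an incident response scenario?"""
        return tool.minutes_to_conclusion <= budget_minutes

    def rank(tools: list) -> list:
        """Q5: compare tools with similar output instead of defaulting to a favorite."""
        return sorted(tools, key=lambda t: (t.tamper_resistance, t.accuracy), reverse=True)

    edr = ToolAssessment("hypothetical-edr", ["process events", "netflow"], 4,
                         "rules plus ML ensemble", 15, 0.90)
    siem = ToolAssessment("hypothetical-siem-search", ["windows event logs"], 2,
                          "manual query", 45, 0.80)

    for tool in rank([edr, siem]):
        print(tool.name, "IR-ready:", fits_ir_timeline(tool))

Even a toy rubric like this forces the question of how easily an attacker could tamper with each data source before the tool’s verdict is trusted.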

Belief Bias

Problem: If your incident response case starts with wild guesses not backed by actual analysis, you are likely a victim of belief bias. One example of a wild guess might be “it must be the <insert favorite panda/bear/APT number group here>.” Belief bias is the tendency to accept what you believe to be true despite evidence pointing in another direction. This isn’t to discount that you might find and diagnose a nation-state-level intrusion quickly. However, due diligence means testing a specific, falsifiable hypothesis rather than allowing speculation to drive the investigation. Belief bias is particularly dangerous when it leads a case to an incorrect conclusion or causes it to overlook the actual root cause of the incident under investigation.
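
One lightweight safeguard is to write every hypothesis down in a form that evidence can support or refute. The Python sketch below is illustrative only; the claim text, evidence entries, and the simple tallying rule are assumptions made for the example, not a prescribed methodology.

    from dataclasses import dataclass, field

    # Illustrative hypothesis record: the verdict must follow the recorded
    # evidence rather than the analyst's initial hunch.
    @dataclass
    class HuntHypothesis:
        claim: str                # must be specific and testable
        evidence_for: list = field(default_factory=list)
        evidence_against: list = field(default_factory=list)

        def verdict(self) -> str:
            if not self.evidence_for and not self.evidence_against:
                return "untested speculation -- do not let this drive the case"
            if len(self.evidence_against) >= len(self.evidence_for):
                return "unsupported -- revise or discard the hypothesis"
            return "supported -- have a second analyst corroborate"

    # Hypothetical case data for demonstration purposes.
    h = HuntHypothesis(claim="Initial access was phishing via the finance mailbox")
    h.evidence_for.append("malicious attachment hash observed in mailbox on day 0")
    h.evidence_against.append("no execution artifacts found for the attachment")
    print(h.verdict())

Notice that “it must be APT-whatever” fails the first test: as written, it is not specific enough to gather evidence for or against.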

Solution:

  1. Create an incident response and/or threat hunting program that discourages lone-wolf analysis.
  2. In the threat hunting or incident response program you create in step one, ensure the culture encourages analysts to challenge one another’s assertions and hold each other accountable.
  3. Conduct thorough hotwash sessions at the end of threat hunting and incident response engagements to capture learning points and implement fixes based on engagement feedback.

Bias Blind Spot

Problem: If you made it this far and think it’s others, not you, whose analysis is impacted by cognitive biases, Carnegie Mellon would like a word with you. In 2015, Carnegie Mellon researchers published a paper in Management Science called “Bias Blind Spot: Structure, Measurement, and Consequences” [1]. CMU’s research found that people who think they are less biased than others are actually more prone to cognitive biases. Additionally, people who thought they were less inclined toward cognitive biases were also less apt to listen to others or be open to their feedback. In CMU’s study, everyone exhibited blind-spot bias, and only one participant had the introspection to admit it.

Solution:

  1. Build threat hunt and incident response programs that encourage analysts involved to leave ego at the door.
  2. Build a culture that fosters teamwork and honest but tactful feedback.
  3. Address any resistance to individual feedback related to cognitive biases.

Wrapping Up

Cognitive biases are a natural part of every threat hunt and incident response engagement. A robust threat hunt or incident response program should focus on reducing biases wherever possible. Organizations can blunt much of the negative impact of cognitive biases on investigations through teamwork, challenging analytic assumptions, and awareness of the common biases themselves.

[1] “Bias Blind Spot: Structure, Measurement, and Consequences,” Management Science, 2015: https://www.cmu.edu/news/stories/archives/2015/june/bias-blind-spot.html