Developing Analytic Technique and Defeating Cognitive Bias in Security
In this presentation, I discuss information security's evolution into the analysis era and the challenges that come with it. This includes several examples of cognitive biases and the negative effects they can have on the analysis process. I also discuss analytic techniques that can enhance analysis, such as differential diagnosis and relational investigation.
Analysis is Everywhere
• Making judgments based upon data
• Security Analysis Happens for:
– Malware Analysts
– Intelligence Analysts
– Incident Response Analysts
– Forensic Analysts
– Programming Logic Analysts
• My main focus is network intrusion analysis,
so this talk is framed through that lens.
Network Security Monitoring
• The collection, detection, and analysis of
network security data.
• The goal of NSM is escalation: declaring
that an incident has occurred so that incident
response can occur.
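The collection → detection → analysis → escalation flow above can be sketched as a tiny pipeline. This is a hedged illustration only; the function and field names are my own, and `detect`/`analyze` stand in for IDS rules and human review, not any real NSM tool.

```python
# Minimal sketch of the NSM flow: collect -> detect -> analyze -> escalate.
# Illustrative only; detect/analyze stand in for IDS rules and human review.

def nsm_pipeline(events, detect, analyze):
    """Return only the events that survive both detection and analysis.

    - detect:  cheap, automated filter (e.g. an IDS signature)
    - analyze: the human judgment step; escalation happens only when
      analysis confirms an incident actually occurred.
    """
    alerts = [e for e in events if detect(e)]      # detection
    return [a for a in alerts if analyze(a)]       # analysis -> escalation

# Usage: flag traffic to a watched port, escalate only the large flows.
events = [
    {"dst_port": 4444, "bytes": 90_000},
    {"dst_port": 443,  "bytes": 1_200},
    {"dst_port": 4444, "bytes": 300},
]
escalated = nsm_pipeline(
    events,
    detect=lambda e: e["dst_port"] == 4444,
    analyze=lambda e: e["bytes"] > 10_000,
)
print(escalated)  # [{'dst_port': 4444, 'bytes': 90000}]
```

The point of the shape: detection is cheap and over-inclusive; analysis is the expensive judgment step, and only what it confirms becomes an incident.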
The Need for Analytic Technique
• Kansas State University Anthropological Study
on SOCs - Key Finding:
– “SOC analysts often perform sophisticated
investigations where the process required to
connect the dots is unclear even to analysts.”
• Analysis == “Tacit Knowledge”
Analysis: Thinking About Thinking
• We need to critically examine how we think
about information security analysis.
• We aren’t alone!
Perception vs. Reality
– Perception: “A way of regarding, understanding, or
interpreting something.”
– Reality: “The state of things as they actually exist.”
Let’s take a test…
• Variation of Stroop Test (John Stroop, 1935)
• Measures Cognition
– The Process of Perception
• Identifies Gap Between Perception & Reality
• Used to Measure
– Selective Attention
– Cognitive Flexibility
– Processing Speed
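A Stroop-style trial can be sketched in code to make the perception/reality gap concrete. This is a hypothetical harness; `make_trials` and `score` are my own names, not part of any standardized test battery.

```python
import random

COLORS = ["red", "green", "blue", "yellow"]

def make_trials(n, seed=0):
    """Generate n trials of (word, ink_color, is_congruent).

    Congruent trials: the word matches its ink color (perception == reality).
    Incongruent trials: the word names a different color; reading the word
    instead of naming the ink is exactly the perception/reality gap the
    Stroop test measures.
    """
    rng = random.Random(seed)
    trials = []
    for _ in range(n):
        word = rng.choice(COLORS)
        if rng.random() < 0.5:
            ink = word                                          # congruent
        else:
            ink = rng.choice([c for c in COLORS if c != word])  # incongruent
        trials.append((word, ink, word == ink))
    return trials

def score(trials, responses):
    """The correct answer is the ink color, never the word itself."""
    return sum(1 for (word, ink, _), r in zip(trials, responses) if r == ink)
```

Responding with each trial's ink color scores perfectly; responding with the printed word only scores on the congruent trials, which is the gap the test exposes.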
What is Bias?
“Prejudice in favor of or against one thing,
person, or group compared with another,
usually in a way considered to be unfair.”
• Perception != Reality
• Perception is Everything, but Fallible
• We tend to perceive what we expect or are
conditioned to perceive
Anchoring Bias
• Defined: Heavily relying on a single piece of
information when making a decision.
– Src/Dst Country -> OMG China!
– IDS Alert Name -> It says this is X, so it must be X.
– Timing -> It’s every 5 minutes!
Clustering Illusion
• Defined: Tendency to overestimate the
value of perceived patterns in random data.
– The great “beaconing”
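The “beaconing” trap can be checked numerically rather than by eye. A hedged sketch (the helper below is my own, not a real detection tool): the coefficient of variation of inter-arrival times is near 0 for a genuinely periodic beacon and near 1 for memoryless random traffic that merely looks clustered.

```python
import random
import statistics

def interarrival_cv(timestamps):
    """Coefficient of variation of the gaps between events.

    ~0.0 -> genuinely periodic (a real beacon)
    ~1.0 -> memoryless/random arrivals that merely *look* patterned
    """
    ts = sorted(timestamps)
    gaps = [b - a for a, b in zip(ts, ts[1:])]
    mean = statistics.mean(gaps)
    return statistics.stdev(gaps) / mean

# A true 5-minute beacon vs. random traffic averaging 5 minutes apart.
rng = random.Random(42)
beacon = [i * 300 for i in range(200)]
t, noise = 0.0, []
for _ in range(200):
    t += rng.expovariate(1 / 300)   # exponential gaps, mean 300s
    noise.append(t)

print(interarrival_cv(beacon))  # 0.0
print(interarrival_cv(noise))   # typically near 1.0
```

Quantifying periodicity before declaring “it’s every 5 minutes!” is one small guard against seeing patterns in randomness.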
Availability Cascade
• Defined: Strong belief in something due to its
repetition in public discourse.
– “Chinese Traffic is Bad.”
– “That rule generates a lot of false positives.”
Belief Bias
• Defined: Occurs when a decision is based on
the believability of the conclusion.
– “We wouldn’t be a target for a nation-state
attacker.”
– “This is probably a false positive because it’s
unlikely someone would attack our VoIP system.”
Confirmation Bias
• Defined: Interpreting data during analysis with
a focus on confirming one’s preconceptions.
• Ego is a big factor here
– “I think this is nothing.”
– “I think there is something going on here.”
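One way to blunt a preconception is to make it a number. Below is a hedged sketch using Bayes’ rule; the probabilities are invented for illustration and are not from any real case.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(hypothesis | evidence) via Bayes' rule."""
    num = prior * p_e_given_h
    return num / (num + (1 - prior) * p_e_given_not_h)

# "I think this is nothing" as a number: prior P(malicious) = 0.02.
# An artifact that is 10x likelier on a compromised host still leaves
# the analyst near the preconception -- worth seeing explicitly before
# declaring the case closed.
print(round(posterior(0.02, 0.8, 0.08), 2))   # 0.17
# The same evidence with a neutral prior tells a different story:
print(round(posterior(0.50, 0.8, 0.08), 2))   # 0.91
```

Writing the prior down does not remove the bias, but it makes the preconception visible so a peer can challenge it.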
• Defined: Tendency to overestimate the
significance of something based on how it
is presented or named.
• Signature/Alert Naming + Lack of Experience
Contribute to this.
– “The alert says this is a known APT1 back door, so
I need to spend all day looking at this.”
• Defined: Justifying increased time investment
based on existing time investment when it
may not make sense.
• Sunk Cost Fallacy
– “What do you mean this is nothing? I’ve spent all
day looking at this. I’ll spend all day tomorrow
digging into it; I’m sure I’ll find something else.”
Framing Effect
• Defined: Interpreting information differently
based on how or from whom it was
delivered.
• Important when interacting with other analysts
– Old Vet: “Steve doesn’t know what he is doing, so
if he is telling me this it probably doesn’t mean
anything.”
– New Guy: “None of the more experienced guys
said anything about this, so it must not matter.”
Overconfidence Effect
• Defined: Excessive confidence in one’s own
decisions, especially in light of contrasting
evidence.
• 99% Paradox – “I’m 99% sure this is right.”
• One psych study suggests this statement is
wrong ~40% of the time.
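Calibration can be measured rather than asserted. A minimal sketch (a hypothetical helper, not from the study cited above): record stated confidence against outcomes, then compare stated versus observed accuracy.

```python
from collections import defaultdict

def calibration(claims):
    """claims: iterable of (stated_confidence, was_correct).

    Returns {stated_confidence: observed accuracy} so an analyst can
    check whether "99% sure" actually means 99%.
    """
    buckets = defaultdict(lambda: [0, 0])   # confidence -> [total, correct]
    for conf, correct in claims:
        buckets[conf][0] += 1
        buckets[conf][1] += int(correct)
    return {c: hits / total for c, (total, hits) in buckets.items()}

# Ten "99% sure" calls of which four were wrong -- the paradox in numbers.
claims = [(0.99, True)] * 6 + [(0.99, False)] * 4
print(calibration(claims))   # {0.99: 0.6}
```

Keeping a running log like this over many investigations turns the 99% paradox from an anecdote into feedback an analyst can act on.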
Pro-Innovation Bias
• Defined: Excessive optimism and biased
decisions based on an invention of one’s own
making being involved in the analysis.
• Invention == System / Code / Concept
– “My tool can do that.”
– “I wrote that signature so I know it’s accurate.”
– “This fits perfectly in my model!”
There are over 100 types of bias.
How can we overcome them?
Morbidity & Mortality (M&M)
• Pioneered by Dr. Ernest Codman at Mass. General Hospital
• Post-Case Meetings to Discuss What
Occurred and How to Improve It
• Incident M&M
1. Handler/Analyst Presents Case
2. Followed by Alternative Analysis
Alternative Analysis
• Developed by Richards Heuer Jr. (CIA)
• Series of Peer Analysis Methods
• Designed to Help Overcome Bias and Improve
Quality of Analysis
Group A / Group B
• Group A – Presenting Analyst/Team
• Group B – Secondary Analyst/Team
• Two Independent Analysis Efforts
• Notes Are Compared During the Presentation
• Identify Differing Conclusions from Same Data
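The comparison step can be structured rather than ad hoc. A hedged sketch (the data model is invented for illustration): each group records a conclusion per observable, and the diff of the two sets drives the discussion.

```python
def compare_findings(group_a, group_b):
    """Each argument maps an observable (IP, hash, host) to a conclusion.

    Returns (agreed, disagreed, only_a, only_b) -- the disagreed and
    only-* sets are where differing conclusions from the same data live.
    """
    shared = group_a.keys() & group_b.keys()
    agreed = {k for k in shared if group_a[k] == group_b[k]}
    disagreed = shared - agreed
    return agreed, disagreed, group_a.keys() - shared, group_b.keys() - shared

# Usage: two independent efforts over the same evidence.
a = {"10.0.0.5": "malicious", "evil.example": "benign"}
b = {"10.0.0.5": "malicious", "evil.example": "malicious", "host-7": "benign"}
agreed, disagreed, only_a, only_b = compare_findings(a, b)
print(disagreed)   # {'evil.example'}
```

Items in `disagreed` are the most valuable output: same data, different conclusions, which is exactly what Group A/Group B is designed to surface.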
Red Cell Analysis
• Peer Focus on Attacker’s Viewpoint
• Questioning in Relation to the Attacker’s Perceived Goals
• Requires Some Offensive Experience
• Best Executed by Red Team if Available
What If Analysis
• Focus on Cause/Effect of Actions That May
Not Have Actually Occurred
– What if the attacker had done X? How would you
have changed your approach?
– What if you didn’t stumble across X in Y data?
• Enhances Later Investigations
Key Assumptions Check
• Presenter Identifies Assumptions Made During Analysis
• Peers Challenge Assumptions
• Pairs Well with “What If” Analysis
– “What if it were possible for that malware to
escape that virtual machine?”
– “Would you come to the same conclusion if you
knew this was APT3 instead of APT1?”
Incident M&M Best Practices
• Limit Frequency
• Set Expectations
• Require a Strong Mediator
• Keep it at the Team Level – No Sr. Managers
• Encourage Servant Leadership
• Discourage Personal Attacks
• Write it Down!
Conclusion
• The Era of Analysis is Upon Us
• Bias is Inevitable – Learn to Recognize It
• Overcome Analysis Hurdles With:
– Analytic Technique
– Alternative Analysis