Ok, you know I don't mean that literally! However, "FAIL" is the theme of this month's Wired magazine: specifically, how you can turn losing into winning. As I was reading Jonah Lehrer's "Accept Defeat: The Neuroscience of Screwing Up," I realized that it had some important implications for CIRTs and other security teams.
In the article, Lehrer writes about how scientists often fail to make the most of their experimental results:
Over the past few decades, psychologists have dismantled the myth of objectivity. The fact is, we carefully edit our reality, searching for evidence that confirms what we already believe. Although we pretend we’re empiricists — our views dictated by nothing but the facts — we’re actually blinkered, especially when it comes to information that contradicts our theories.
An intrusion analyst has this problem all the time. While responding to a possible incident, an investigator collects a lot of information and tries to organize it in such a way as to tell a coherent story, the better to judge what happened, whether it was significant, and what to do about it. Creating this story, then, is the equivalent of coming up with a theory about the event.
The problem is that a complex investigation often presents a lot of confusing and/or contradictory evidence. The investigator comes to the table, though, with a built-in bias based on his previous experience with similar incidents, his knowledge of the local user population, his understanding of the organization's IT infrastructure, and a lot of other factors. Can a biased analyst be relied upon to create an unbiased view of events? Yes, says Lehrer:
It’s normal to filter out information that contradicts our preconceptions. The only way to avoid that bias is to be aware of it.
In other words, be aware of your team's biases. If your CIRT is composed entirely of experts in the fields in which you expect to operate, you may hit some real challenges when an attacker does something unexpected, something contrary to the conventional wisdom.