Ok, you know I don't mean that literally! However, "FAIL" is the theme of this month's Wired magazine, specifically, how you can turn losing into winning. As I was reading Jonah Lehrer's "Accept Defeat: The Neuroscience of Screwing Up," I realized that it had some important implications for CIRTs and other security teams.
In the article, Lehrer writes about how scientists often fail to make the most of their experimental results:
Over the past few decades, psychologists have dismantled the myth of objectivity. The fact is, we carefully edit our reality, searching for evidence that confirms what we already believe. Although we pretend we’re empiricists — our views dictated by nothing but the facts — we’re actually blinkered, especially when it comes to information that contradicts our theories.
An intrusion analyst has this problem all the time. While responding to a possible incident, an investigator collects a lot of information and tries to organize it in such a way as to tell a coherent story, the better to judge what happened, whether it was significant, and what to do about it. Creating this story, then, is the equivalent of coming up with a theory about the event.
The problem is that a complex investigation often turns up a lot of confusing and/or contradictory evidence. The investigator comes to the table, though, with a built-in bias based on his previous experience with similar incidents, his knowledge of the local user population, his understanding of the organization's IT infrastructure, and a lot of other factors. Can a biased analyst be relied upon to create an unbiased view of events? Yes, says Lehrer:
It’s normal to filter out information that contradicts our preconceptions. The only way to avoid that bias is to be aware of it.
This is one of the two big takeaways I got from this article: simply being aware that you have a tendency to create a biased story is the most effective way to keep your investigations objective.
The other important point this article makes is that the composition of your research team (or in this case, your CIRT) is crucial, but maybe not in the way you think:
There are advantages to thinking on the margin. When we look at a problem from the outside, we’re more likely to notice what doesn’t work. Instead of suppressing the unexpected, shunting it aside with our “Oh shit!” circuit and Delete key [brain functions discussed earlier in the article], we can take the mistake seriously. A new theory emerges from the ashes of our surprise.
In other words, if your CIRT is composed entirely of experts in the fields in which you expect to operate, you may hit some real challenges when an attacker does something unexpected, something contrary to the conventional wisdom.
To illustrate this point, Lehrer relates a story about two research labs trying to solve the same problem (unwanted proteins sticking to a filter in their equipment). One lab had a team of researchers who were all experts in the same field, while the other was composed of a diverse set of researchers from different scientific disciplines.
The first team "took a brute-force approach, spending several weeks methodically testing various fixes. 'It was extremely inefficient,' Dunbar says. 'They eventually solved it, but they wasted a lot of valuable time.'"
The second, more diverse team had a very different result:
The diverse lab, in contrast, mulled the problem at a group meeting. None of the scientists were protein experts, so they began a wide-ranging discussion of possible solutions. At first, the conversation seemed rather useless. But then, as the chemists traded ideas with the biologists and the biologists bounced ideas off the med students, potential answers began to emerge. “After another 10 minutes of talking, the protein problem was solved,” Dunbar says. “They made it look easy.”
The reason the second team was more successful? They had the outsider's point of view. They were able to examine fresh approaches to the problem, kick them around, and come up with a creative solution far more easily than the "expert" team, which was bound by a sense of "this is how we always do it."
It's pretty easy to see how this could apply to security. We're constantly called upon to come up with creative solutions to problems, though our problems are usually called "incidents."
The key to a successful CIRT is the diversity of the team. You need intrusion analysts, sure, but you also need specialists in other areas: incident response, malware reverse engineering, system administration, threat intelligence, and as many other relevant disciplines as you can find. Put these people together on the same team, give them good communication and collaboration tools, and watch them go!
I'm extremely fortunate that I work in just such an environment, and I can tell you from experience that this approach works. It's almost shocking how effective it is.
I've never met an analyst who thought she was good enough to deal with everything by herself, nor a CIRT that felt it handled incidents well enough. This is a profession with a constant need for improvement and adversaries who are often extremely skilled. To respond well to compromises, it's important to recognize how you can turn failure into excellence. This article shows how that's possible, for both individuals and teams.