So I talked about feedback. Here are some thoughts on implementing one flavor of it in your own security designs. Yes, it’s the aftermarket, attach-it-to-the-current-setup kind of feedback, but that means it can be used just about anywhere.
The original article was oriented towards people working in organizational IT settings. You all know my advice is to keep your soul and avoid such things, so I’ve quoted the bits that apply to the rest of us.
I would add that a bit of axiomatic design thinking (“what’s the goal? ok, what does the system/design have to do to achieve that? ok, what does it have to look like as a result? …now at every stage, is it shaping up the way it’s supposed to?”) goes a long way in such ventures.
Come to think of it, axiomatic design generalizes to just about any task in security. But we’ve already covered that a bunch. (if you missed it, my four-question summary is pretty much what you need to know.)
“Collect and store all event data, even if you don’t think you need it. This is especially important since you don’t always know what you have—or what you will need—in the way of forensic data analysis.[…]
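That first point is cheap to act on. A minimal sketch in Python, assuming an append-only JSON-lines log (the format, file name, and field names here are my choices, not the article’s):

```python
import json
import time

def record_event(event, log_path="events.jsonl"):
    """Append an event as one JSON line, keeping every field verbatim --
    even the ones you don't yet have a use for."""
    stamped = dict(event, recorded_at=time.time())
    with open(log_path, "a") as f:
        f.write(json.dumps(stamped) + "\n")

record_event({"user": "alice", "action": "download", "bytes": 1048576})
```

Append-only and schemaless on purpose: you don’t know today which fields tomorrow’s forensic question will need.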
Establish basic measurements, understand them, then expand. Start somewhere, anywhere, to establish a metric and then work to make that metric useful or replace it with a better one that you’ve discovered in the process. Don’t just poke around or take a whack-a-mole approach to your discovery process – prioritize your effort so that you can accumulate and maintain a portfolio of metrics that maximize the value of your initiative.[…]
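“Start somewhere, anywhere” really can be this small. A deliberately crude starter metric, sketched in Python over the kind of events logged above (failed-login counts are my example, not the article’s):

```python
from collections import Counter

def failed_logins_per_user(events):
    """Starter metric: failed-login count per user. Crude on purpose --
    establish it, learn what 'normal' looks like, then refine or replace it."""
    return Counter(e["user"] for e in events if e.get("action") == "login_failed")

sample = [
    {"user": "alice", "action": "login_failed"},
    {"user": "alice", "action": "login_failed"},
    {"user": "bob", "action": "login_ok"},
]
counts = failed_logins_per_user(sample)
print(counts["alice"])  # → 2
```

The point isn’t this metric; it’s having one function per metric so each can be replaced independently as your portfolio matures.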
Be consistent. Don’t spend a month on analysis then move on if nothing pops up. Maintaining consistent vigilance is the key to spotting trends or variances …erratic monitoring and analysis leads to a false sense of security and reduces your ability to continuously reflect and refine based on known patterns.[…]
Be ready to change. There is a tendency to take a finding, create a counter-measure around it, and then never look back. Be intellectually honest when you make new discoveries, particularly if they show a need to change an established rule, alert or policy. While flexibility and change seemingly conflict with “be consistent,” get comfortable with the idea that you will often learn something new which will require a policy or process change. […]
The dynamic nature of attacks may also lead you to integrate data from systems you didn’t initially consider using to drive critical correlations.[…]
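Here’s a toy example of that kind of late-arriving correlation, assuming badge-reader logs as the system you didn’t initially consider (the scenario and field names are mine): a VPN login close in time to the same user badging into the building deserves a second look.

```python
def vpn_while_badged_in(badge_events, vpn_events, window_s=3600):
    """Cross-source correlation sketch: flag VPN logins that occur within
    window_s seconds of the same user badging into the building."""
    last_badge = {b["user"]: b["ts"] for b in badge_events}
    return [v for v in vpn_events
            if v["user"] in last_badge
            and abs(v["ts"] - last_badge[v["user"]]) <= window_s]

badges = [{"user": "dave", "ts": 1000}]
vpns = [{"user": "dave", "ts": 1500}, {"user": "erin", "ts": 1500}]
print(vpn_while_badged_in(badges, vpns))  # only dave's login is flagged
```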
Test yourself. Conduct a Metrics Penetration Test (MPT), which determines if the analytics you have established will “catch” the behaviors you are trying to isolate. For example, have an employee download a massive amount of data from an unusual location during an odd hour of the day to see if your “Unusual Download Volume” measurement triggers the flags you expect to see. Use results from these MPTs in operational reviews to continue evolving/maturing your analytics methodologies.[…]
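The MPT idea reduces nicely to code. A sketch of the download-volume example, assuming a simple per-user byte threshold as the detector (the threshold and the `mpt_probe` user are hypothetical):

```python
def unusual_download_volume(events, threshold_bytes=10 * 2**30):
    """The detector under test: flag users whose total download
    volume exceeds threshold_bytes."""
    totals = {}
    for e in events:
        if e.get("action") == "download":
            totals[e["user"]] = totals.get(e["user"], 0) + e["bytes"]
    return {u for u, b in totals.items() if b > threshold_bytes}

# The MPT itself: inject a synthetic bulk download and verify the alarm fires.
baseline = [{"user": "carol", "action": "download", "bytes": 5 * 2**20}]
injected = baseline + [{"user": "mpt_probe", "action": "download", "bytes": 11 * 2**30}]

assert unusual_download_volume(baseline) == set()          # no false alarm
assert "mpt_probe" in unusual_download_volume(injected)    # probe is caught
```

Run it in your operational reviews like any other regression test: if the probe stops tripping the metric, the metric has rotted.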
Innovate with new technologies but prune as you go. Defense in depth is a proven strategy but it can also lead to technology bloat, a false sense of protection and – in many cases – open doors for attacks. Examine your digital exhaust to identify devices, systems, applications and tools that are dormant or redundant.”
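One concrete way to read that exhaust: keep a last-seen timestamp per device or tool, and flag anything quiet for too long. A sketch under that assumption (the 90-day cutoff and inventory names are made up):

```python
def dormant(inventory, now, max_idle_days=90):
    """Return inventory entries with no activity in max_idle_days --
    candidates for pruning rather than for a false sense of coverage."""
    cutoff = now - max_idle_days * 86400
    return [name for name, last_seen in inventory.items() if last_seen < cutoff]

now = 1_700_000_000
inventory = {
    "old-ids-appliance": now - 200 * 86400,  # silent for ~200 days
    "edge-proxy": now - 3600,                # seen an hour ago
}
print(dormant(inventory, now))  # → ['old-ids-appliance']
```

Anything this turns up is either a tool to decommission or an attack surface nobody is watching; both are worth knowing.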