The Security Awareness Training Debate (and eschatology in review)

Two noted security experts recently faced off on the issue of “security awareness training.” These guys (Schneier and Winkler) were writing for a corporate audience, so let me translate: for “security awareness training debate” read “do we spend money on making users smarter or making security more idiot-proof?”

Bruce Schneier’s view, “screw training, it never worked, build better systems instead”:

Ira Winkler’s response, “there’s no good technical solution to ignorant people”:

I think you need both: broad education in “this is what security means,” and system design that makes doing the secure thing easy. Security and convenience can’t be enemies if you want good security… even if the two are constantly eyeing each other warily, you need to design in at least an armistice.

On the other hand, the best way to pull off the former is not with the boring classes that Schneier rails against as ineffective. Once people understand what’s actually possible, they get smarter and more responsible in their actions: if you know how security gets broken, you understand what it takes to maintain security far better than you would if you’d been kept in the dark.

Eschatology in review:
Back in mid-February I pointed out some obscure bets on chaos before the beginning of April:
“Some very rich people nevertheless appear to be betting on zombies in March. (“Beware the Ides of March!”) Not only has someone put $11 million on market volatility going up before April, but someone also put 200x more than average on banks taking a dive between March and April.”

So, did that analysis pan out? Well, it’s too early to say on the volatility question — it still has until April 20 to run. The bet in question was that the volatility index (“VIX”) would go above 20, which it hasn’t done yet… though it did touch 19 in late February.

The second one, though, has definitively not panned out. March ended with the relevant indicator (“XLF”) never dropping very far at all, and certainly not below the value of 16 someone was betting on. (Indeed, the relevant stock symbol hit a 2-year high.)

Despite this objective failure, rather a lot happened in the relevant sectors: the Cyprus bank chaos, even if it didn’t affect “XLF,” does indeed represent certain banks taking a big dive.

The VIX, on the other hand, hit 19 as Beppe Grillo was making his way up the Italian polls and the fiscal cliff / sequester debate reached a fever pitch. And that “bet on zombies” still has 20 or so days to run…

“Today, everyone is a computer user. A lot of our computers are simply called digital devices, and they can all potentially impact both the individual using them and the organization for which they work. This is a reality that undermines the hopes and dreams of a security strategy that can be characterized, in the extreme, like this:

Make the systems secure so the users of the systems don’t need to understand security.

I have watched those hopes and dreams falter for more than two decades, but I’m not ready to say “stop wasting money trying to make systems secure because it hasn’t worked so far.” And I am not saying that this “security-purely-through-technology approach” is exactly Bruce’s position, but there are echoes of it in his article:

I personally believe that training users in security is generally a waste of time and that the money can be spent better elsewhere. Moreover, I believe that our industry’s focus on training serves to obscure greater failings in security design.

My first response to this is that we are not spending enough on training users, period. Organizations routinely hire people for positions that require the use of a computer without either (a) asking for proof of user skills, or (b) holding the term “user” to any meaningful standard. Consider just a simple conversation between employer Q and applicant A:

Q. Do you know how to delete a file?
A. Yes.
Q. How?
A. You drag it to the trash.
Q. And then it’s gone?
A. Yes, I guess.

Are you comfortable allowing someone with that level of knowledge and skill to use your computers? Are your systems brilliantly designed by developers trained in security so that the people who use them don’t need to know how to delete files? They are? Then what happens when you hire A and some of your files get onto his smartphone and you tell him that’s wrong, he must delete them?”
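The “drag it to the trash” exchange above can be made concrete: on most desktop systems the trash is just another directory, so “deleting” a file this way merely relocates its bytes rather than erasing them. Here’s a minimal sketch of that behavior (the file name and trash location are hypothetical, for illustration only):

```python
import os
import shutil
import tempfile

# A scratch area standing in for a user's home directory.
home = tempfile.mkdtemp()
trash = os.path.join(home, ".Trash")  # the real name varies by OS
os.makedirs(trash)

# Create a hypothetical sensitive file.
secret = os.path.join(home, "payroll.csv")
with open(secret, "w") as f:
    f.write("alice,100000\n")

# Dragging to the trash is essentially a move, not an erase.
shutil.move(secret, trash)

# The original path is gone...
assert not os.path.exists(secret)

# ...but the data still sits on disk, fully readable.
with open(os.path.join(trash, "payroll.csv")) as f:
    print(f.read())
```

Even emptying the trash typically just unlinks the file; the underlying blocks remain recoverable until overwritten, which is exactly the gap between applicant A’s mental model and reality.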
