Take responsibility for what you do. This counts double for security: understand how the things around you work, because it’s up to you whether they do or not. (And think about how what you do will affect other people’s lives, but that’s a larger issue.)
Cryptography (and how cryptographers think) can sadly be a huge barrier to this.
Consider the link. To illustrate how an algorithm can be held accountable to the public, the otherwise laudable Ed Felten designs a hypothetical algorithm for randomly selecting people for special airport security screening.
The security authority commits to a particular random value at the beginning of the day, which in turn seeds an algorithm that chooses whether a traveler gets the latex glove of justice. If said traveler feels the selection wasn't random, he or she can re-run the algorithm at the end of the day, once the random value is revealed, and thus check whether the goons followed the rules.
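The commit-and-reveal scheme can be sketched in a few lines. This is a minimal illustration, not Felten's actual proposal: I'm assuming SHA-256 for the commitment, HMAC for the per-traveler selection, and a made-up one-in-twenty selection rate; traveler IDs are likewise invented.

```python
import hashlib
import hmac
import os

def commit(seed: bytes) -> str:
    # The authority publishes this hash at the start of the day,
    # binding itself to the seed without revealing it.
    return hashlib.sha256(seed).hexdigest()

def selected(seed: bytes, traveler_id: str, rate: int = 20) -> bool:
    # Deterministic selection: keyed hash of the traveler's ID,
    # flagging roughly one traveler in `rate` (rate is illustrative).
    digest = hmac.new(seed, traveler_id.encode(), hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") % rate == 0

# Morning: authority picks a secret seed and publishes the commitment.
seed = os.urandom(32)
commitment = commit(seed)

# Evening: the seed is revealed. A traveler verifies the commitment
# matches, then re-runs the selection for their own ID.
assert hashlib.sha256(seed).hexdigest() == commitment
was_selected = selected(seed, "traveler-123")
```

The point of the commitment is that the authority can't pick a seed after the fact to justify whoever it actually searched; the point of the critique that follows is that almost no traveler will ever run this check.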
Unfortunately, this sort of thinking discourages people from taking responsibility for their own security. Very few can comprehend such an algorithm, and even fewer would bother to run it. So while there are situations where complex math is the only way, things like airport security are not among them.
(From the comments, a better idea: a box plugged into nothing but the wall, which when you push a button lights either ‘red’ or ‘green’ to determine whether you’re selected. I would suggest an even simpler way: one of those pop-up bubbles used in board games to automatically roll the dice, mounted to a table in front of a guard. Put in a D20 with one face painted red, and everyone who goes by pushes down on the bubble once under the eye of the guard.)
Yes, I realize this was just an example. But this sort of thinking pops up all over the place: think of the various crypto-electronic-voting schemes, which do electronic voting in a verifiable way. Wonderful idea, but perfectly incomprehensible to most victims of the education system.