One of my favorite security gurus, Bruce Schneier, has an entertaining and yet infuriating article on The Security Mindset in which he tries to explain how security professionals think differently from other engineers.
SmartWater is a liquid with a unique identifier linked to a particular owner. “The idea is for me to paint this stuff on my valuables as proof of ownership,” I wrote when I first learned about the idea. “I think a better idea would be for me to paint it on your valuables, and then call the police.”
Really, we can't help it.
This kind of thinking is not natural for most people. It's not natural for engineers. Good engineering involves thinking about how things can be made to work.
It's fun and you should read the whole thing.
But it's also a bit frustrating — because Bruce restricts his discussion to how engineers think. To me, what he is describing is a big part of “thinking like a lawyer”. And when Bruce asks whether this sort of demented worldview, one in which you shake things to see how they break, can be taught, I think, “Hell, yes: I've been doing it for years.”
Most lawyers don't have the math to be a cryptographer or the technical chops to do a security analysis of a complex program. But good lawyers — whether transactional or litigation oriented — do have a “security mindset”: a big part of learning to “think like a lawyer” is learning, again and again, how things broke. That equips you to try to build things that won't break (or at least won't break in old ways); it also trains you in how to break them.
It’s certainly worth learning how things can be broken… If you don’t respond to that training by regarding breaking things as just another tool… Just because you know how to break something doesn’t mean you should.
I’m an engineer, and we do indeed learn about breaking things. But it’s a different approach: We learn how things can be broken so we can avoid breaking them. Not so that anything that annoys us ends up in pieces.
The history of literature and the literature of history are both no more than stories of good intentions and unintended consequences.
It’s not called a “security mindset,” it’s called “an imagination,” and no, that’s not something engineers, or rationalists, are trained to have much of. Enthusiasts are rarely skeptics.
Ah. I’m just saying. If you’re going to dabble in analogies. I mean. Well. Where X ~ Lawyer, just possibly you should consider selecting a group X less capable of … ah … acting out? Just a thought. 🙂
Funny… this guy had a take on Bruce’s article from his own career perspective:
Maybe the more general observation is that this so-called “security mindset” is critical in many fields of endeavor?
Well, in the engineering field there is a function called quality assurance. We commonly describe the testing that engineers perform as “proving the product works” and the testing that QA performs as “trying to make the product break”. A gifted quality engineer will intuitively figure out the product’s weak points and focus on them.
A while ago, in the 1990s, I was involved in the R&D of various security products. We had many gifted engineers. But the only ones truly gifted at breaking the product (that is, breaking into the system) were in QA. We would often have engineers from one product try to break another product’s security. Their approaches typically involved techniques they’d learned from past experience, and those approaches were often effective. However, they rarely found anything the best QA engineers hadn’t found.
Security hackers, for the record, are more like the non-QA engineers. They tend to try approaches that worked for them in the past, or stuff they learned from others. Very few are really skilled at inventing new break-in techniques.