A Common Security Fallacy? Too Big to Fail (KISS)

Often journalists ask me to answer questions or send advice for a story. My reply takes a bit of time and reflection. Then, usually, although not always, I get an update that goes something like this:

Loved what you had to say but had to cut something out. Editors, you know how it is. Had to make room for answers from my other experts…I’m sure you can understand. Look forward to hearing your answer next time

I DO understand. I see the famous names of the people they’re quoting and the clever things those people are saying. They won, I lost. It happens. Then I started to wonder: why not just publish my answers here too? That really was the point of having a blog. Maybe I should create a new category.

So without further ado, here’s something I wrote that otherwise would probably never see the light of day:

Journalist: Tell me about a most common security fallacy

Me: let me start with a truism: KISS (keep it simple, stupid)

this has always been true in security and likely always will be. simpler systems are easier to secure because they are less sophisticated and more easily understood. complex systems need to be broken down into bite-sized, KISS-style pieces, with the relationships between them modeled carefully, or they’re doomed to unanticipated failures.
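
to make that concrete, here’s a minimal, purely hypothetical sketch in python (the roles, the User/Report fields, and every policy rule are invented for illustration, not taken from any real system): the simple check can be audited at a glance, while each extra branch in the complex one adds interactions that must be modeled and tested, or they become exactly those unanticipated failures.

```python
from dataclasses import dataclass

# hypothetical policy: which roles may view reports at all
ALLOWED_ROLES = {"admin", "auditor"}

def can_view_simple(role: str) -> bool:
    # one rule, one place to look: the entire policy fits in a glance
    return role in ALLOWED_ROLES

@dataclass
class User:
    id: int
    role: str
    department: str
    clearance: int
    suspended: bool = False

@dataclass
class Report:
    owner_id: int
    department: str
    confidential: bool
    required_clearance: int

def can_view_complex(user: User, report: Report, delegated: bool) -> bool:
    # every extra branch interacts with the others; each pair of
    # conditions is a relationship that has to be modeled and tested
    if user.suspended:
        return False
    if report.owner_id == user.id:
        return True
    if delegated and user.department == report.department:
        return (not report.confidential
                or user.clearance >= report.required_clearance)
    return user.role in ALLOWED_ROLES

if __name__ == "__main__":
    print(can_view_simple("admin"))  # True, and trivially auditable
    alice = User(id=1, role="analyst", department="risk", clearance=2)
    q3 = Report(owner_id=2, department="risk",
                confidential=True, required_clearance=3)
    print(can_view_complex(alice, q3, delegated=True))  # False: clearance too low
```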

so the answer, one of the most common security fallacies, is…

too big to fail. also known as “they’re big and have a lot to lose, so they wouldn’t do the wrong thing,” or “there’s no way a company that big doesn’t have a lot of talent, so i don’t need to worry about security.”

we’ve seen the largest orgs fail repeatedly at basic security (google, facebook, dropbox, salesforce, oracle!) because internal and external culture tends to give them a pass on accountability. i just heard a journalist say the giant anti-virus vendors would never ship a back door because it would not be in their best interest. yet tell me how accountable they really are when “oops, we overlooked that” is already a routine part of their existing business model.

for a little historical context, it’s the type of error made at the turn of the century with meat production in chicago. upton sinclair’s “the jungle” pointed out that a huge, fast-growing industrial giant could have atrocious safety practices yet be protected from any correction by sheer size and momentum. it took an object of equal or greater force (e.g. an authority granted by governance over a large population, as with the federal meat inspection act of 1906) to make an impact on their safety.

so the saying should be “too big to be simple”. the larger an organization, the more likely it is to harbor hidden breaches or lingering risks, which is what we saw with heartland, tjx, target, walmart and so on. and the larger an organization, the less likely it is to have the culture or incentives in place to do the right thing for customer safety.

there’s also a converse fallacy, assuming something is safe just because it’s simple, but it’s not nearly as common.
