Protecting your trail

A recent decision of the Bankruptcy Appellate Panel of the 9th Circuit (VEE VINHNEE v. AMEX: Dec 16, 2005) seems to suggest that adequate controls to protect audit logs must be in place in order to prove the authenticity of digital information.

I have heard some conclude that this leads directly to cryptographic protections, but it seems plausible to me that proper access controls and strong identity management might also be argued to be sufficient, if not compensatory.

The testimony by the AMEX employees who routinely accessed the data was non-expert, and it suggests that they could only assume controls were in place but never actually verified them. This appears to have opened up the possibility that the data could not be proven authentic.

The decision explores the issue of authenticity and has some interesting citations, such as George L. Paul, The “Authenticity Crisis” in Real Evidence, 15 PRAC. LITIGATOR No. 6, at 45–49 (2004). It also calls out a specific “scientific” methodology to help examine the “validity of the theory underlying computers and of their general reliability”:

Professor Imwinkelried perceives electronic records as a form of scientific evidence and discerns an eleven-step foundation for computer records:
1. The business uses a computer.
2. The computer is reliable.
3. The business has developed a procedure for inserting data into the computer.
4. The procedure has built-in safeguards to ensure accuracy and identify errors.
5. The business keeps the computer in a good state of repair.
6. The witness had the computer readout certain data.
7. The witness used the proper procedures to obtain the readout.
8. The computer was in working order at the time the witness obtained the readout.
9. The witness recognizes the exhibit as the readout.
10. The witness explains how he or she recognizes the readout.
11. If the readout contains strange symbols or terms, the witness explains the meaning of the symbols or terms for the trier of fact.

The decision then suggests that step four is of particular importance, given the lack of proof that controls existed to ensure the accuracy of data:

The testimony of the records custodian at trial regarding the computer equipment used by American Express was vague, conclusory, and, in light of the assertion that “[t]here’s no way that the computer changes numbers,” unpersuasive.

If you read the testimony yourself, you can see the issue the decision is referring to…

I couldn’t testify to exactly what – what the model is or anything like that. It’s – you know, our computer system that we’ve used for, you know, quite some time to produce the documents, to gather the information, to store the information and then, you know, produce the statements to the card members. And we – you know, it’s highly accurate. It’s based on the fees that go in. There’s no way that the computer changes numbers or so.

I can imagine a million ways to be more convincing and better prepared with regard to the controls used to protect the data in question. But the real question, I guess, is whether cryptographic controls should now be considered a minimum requirement.
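For what it’s worth, the kind of cryptographic control in question doesn’t have to be exotic. Here is a minimal sketch of a tamper-evident audit log using HMAC chaining, where each entry’s authentication tag covers both the entry and the previous tag, so altering or deleting any record breaks every tag that follows. The key name and entry strings are my own illustrations, not anything from the decision:

```python
import hashlib
import hmac

# Illustrative key only; a real deployment would keep the key in an HSM
# or other protected store, away from the operators who write the log.
KEY = b"audit-log-secret"

def chain_entries(entries, key=KEY):
    """Tag each entry with an HMAC over (previous tag + entry), forming a
    chain: changing any entry invalidates its tag and all later tags."""
    tagged = []
    prev = b"\x00" * 32  # fixed initial value for the first link
    for entry in entries:
        tag = hmac.new(key, prev + entry.encode(), hashlib.sha256).digest()
        tagged.append((entry, tag))
        prev = tag
    return tagged

def verify_chain(tagged, key=KEY):
    """Recompute the chain and compare tags in constant time."""
    prev = b"\x00" * 32
    for entry, tag in tagged:
        expected = hmac.new(key, prev + entry.encode(), hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            return False
        prev = tag
    return True
```

A custodian who could testify “each record carries a keyed checksum chained to the one before it, and the chain verified” would be in a rather different position than one who could only say “there’s no way that the computer changes numbers.”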

Controls Map

With the recent release of ISO17799:2005 and CObIT4, I guess I need to rewrite my controls map (not to mention the long list of privacy laws debated in California during 2005). I really like the ISO revision, but am still catching up with CObIT. One of the challenges of helping organizations stay on top of their controls is choosing the right blend of guidance and frameworks. I’m not saying you have to use a blend, but since they are never a perfect fit and different groups have their favorites (auditors love COSO/CObIT, engineers go for ISO, ex-government folks bring up the NSA and NIST, etc.), I find it helps to pull it all together into a shared map. For example:

SYSTEM INTEGRITY – Controls that ensure the integrity of the environment by utilizing proactive measures to prevent and detect unauthorized changes.

  • Gateway Filtering
  • Anti-virus
  • Encryption
  • Access Controls

  • ISO.17799 (8)(3) – Protection against malicious software
  • ISO.17799 (8)(7) – Exchange of information & software
  • ISO.17799 (10)(3) – Cryptographic controls
  • ISO.17799 (10)(5) – Security of system files
  • NIST.800-14 (3)(14) – Cryptography
  • NSA IAM (9) – Virus protection
  • AB 1950 (Wiggins) – California State Personal Information Security
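One way to keep a shared map like this maintainable is to treat it as data rather than a document, so you can query which frameworks back a given control domain. A minimal sketch, using the example above; the structure and helper function are my own assumptions, not part of any framework:

```python
# A cross-framework controls map kept as plain data: each control domain
# carries its description, its controls, and the framework citations that
# back it. Entries below are taken from the SYSTEM INTEGRITY example.
CONTROLS_MAP = {
    "SYSTEM INTEGRITY": {
        "description": "Controls that ensure the integrity of the environment "
                       "by utilizing proactive measures to prevent and detect "
                       "unauthorized changes.",
        "controls": ["Gateway Filtering", "Anti-virus", "Encryption",
                     "Access Controls"],
        "citations": {
            "ISO.17799 (8)(3)": "Protection against malicious software",
            "ISO.17799 (8)(7)": "Exchange of information & software",
            "ISO.17799 (10)(3)": "Cryptographic controls",
            "ISO.17799 (10)(5)": "Security of system files",
            "NIST.800-14 (3)(14)": "Cryptography",
            "NSA IAM (9)": "Virus protection",
            "AB 1950 (Wiggins)": "California State Personal Information Security",
        },
    },
}

def frameworks_covering(domain, controls_map=CONTROLS_MAP):
    """Return the framework references mapped to a control domain."""
    return sorted(controls_map[domain]["citations"])
```

The payoff is that when the auditors ask for CObIT, the engineers ask for ISO, and the lawyers ask about AB 1950, everyone is looking at one record instead of three spreadsheets.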

Security vendors and trust

RSA 2006 is coming soon, so I am being barraged by security vendors hawking their wares. How do we separate the wheat from the chaff?

Here’s a hint: there is nothing more annoying than someone dangling an iPod in front of my face and asking me to tell them whether I am able to comply with some regulation. “Tell us if you violate the GLBA and we’ll give you an mp3 player” is downright insulting. It baffles me that someone who is basically anonymous would even ask that question and expect to get accurate data. And putting a picture of some cute person in front of me doesn’t improve things. Appropriate response: ignore or, if pressured, present bad data and walk away.

If you represent a security company, please help stop the madness. Random drawings for popular electronics based on contact information alone are one thing. Overtly saying “we’ll pay you to give us dirt on your employer” without establishing any modicum of trust should be grounds for being barred from security conferences.

Spy Rock

Come here my sweet pet

You’ve heard of the pet rock? Russian intelligence is accusing the British of using one to spy on them, according to the BBC. The article has a fun Q&A format, with answers like this:

from what we know it appears that those who allegedly stole the confidential information walked close to the rock and then uploaded data to the device beneath it. Later, others came and downloaded the data and walked off with it

“Sir, can I help you?”
“No, thanks, just taking my pet rock for a walk.”

Update: the Russians reportedly claim the rock cost “several tens of millions of dollars” to develop. Funny.