Category Archives: History

hʊkt ɒn fɒnɪks

Excellent research in a paper and presentation from the IEEE Symposium on Security and Privacy: “Phonotactic Reconstruction of Encrypted VoIP Conversations: Hookt on fon-iks”.

Encryption of voice conversations on IP networks does not obscure them sufficiently to prevent reconstruction. The attack essentially applies the way our brains process spoken language to the sounds of an encrypted VoIP channel: we hear sounds that resemble those in our memory, and then we pattern-match (e.g. search for collisions). Those patterns can still be found even in encrypted VoIP.

The study has numerous references to related work, and there have even been similar presentations at the IEEE, but this one emphasises a proof that attacks are far easier than previously thought.

In this work, we make no such assumption about a priori knowledge of target phrases. Rather, our ultimate goal is to reconstruct a hypothesized transcript of the conversation from the bottom up: our approach segments the observed sequence of packets into
subsequences corresponding to individual phonemes (i.e., the basic units of speech).
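The quoted bottom-up approach can be sketched with a toy example. Everything below is invented for illustration (the packet sizes, the phoneme labels, the nearest-profile classifier); the paper itself uses far more sophisticated segmentation and language models. The point is only that a variable bit-rate codec emits different packet sizes for different sounds, so encrypted packet lengths still leak phoneme information:

```python
# Toy sketch (not the paper's actual models): "train" an average
# packet-size profile per phoneme, then classify observed ciphertext
# segments by nearest profile. All sizes are hypothetical.
from statistics import mean

# Hypothetical training data: phoneme -> observed packet-size sequences
TRAINING = {
    "ah": [[62, 63, 61], [60, 64, 62]],
    "s":  [[41, 40, 42], [43, 41, 40]],
    "k":  [[50, 48, 49], [49, 51, 50]],
}

# Build a mean-size profile per phoneme
PROFILES = {ph: mean(x for seq in seqs for x in seq)
            for ph, seqs in TRAINING.items()}

def guess_phoneme(segment):
    """Classify a segment of ciphertext packet lengths by nearest profile."""
    avg = mean(segment)
    return min(PROFILES, key=lambda ph: abs(PROFILES[ph] - avg))

# An eavesdropper sees only packet lengths, yet can still guess sounds:
observed = [[41, 42, 40], [61, 62, 63], [49, 50, 48]]
print([guess_phoneme(seg) for seg in observed])  # ['s', 'ah', 'k']
```

A real attack has to segment the packet stream into phoneme-sized subsequences first, and then apply phonotactic rules to rank candidate transcripts; this sketch skips both steps.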

The paper’s illustration of the problem is superb.

Color me impressed. Great adaptation of linguistics to information security. However, they only propose two mitigation options:

…a knee-jerk reaction to thwarting this and other aforementioned threats to VoIP is to simply use constant bit-rate codecs or block ciphers. […] Another alternative might even be to drop or pad packets…

I wonder why they did not mention mixing an entire stream of noise/data into the payload, like a salt in a hash. Maybe that is what they mean by “pad packets”? The goal would be at least to fill gaps and obscure phrasing from eavesdroppers, much like the techniques used in WWII to hide the increase in radio traffic before attacks were launched. Yet that kind of padding defence is different in my mind from actively sending fake data that hostile recipients would want to process rather than ignore (e.g. the Aspidistra high-power (600 kW) medium-wave broadcast transmitter used to confuse German attackers).

Latency in encoding and decoding a message is usually cited as an obstacle to filling gaps and running interference to obscure IP communications, but if someone needs the privacy then a short delay or echo on a call seems a small price to pay, and bandwidth/memory/processing keeps getting cheaper.
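A minimal sketch of the “pad packets” idea, assuming a made-up fixed frame size: if every payload is padded to a constant length before encryption, ciphertext sizes stop varying with the sounds being encoded, which closes the length side channel. The `MAX_LEN` value and length-prefix scheme below are arbitrary choices for illustration, not any codec’s actual format:

```python
# Sketch of constant-size padding for VBR codec frames. After padding,
# every packet is the same length on the wire, so an eavesdropper can
# no longer distinguish sounds by ciphertext size.
MAX_LEN = 64  # assumed maximum codec frame size, in bytes

def pad(payload: bytes) -> bytes:
    """Pad a variable-length frame to a constant size: one length byte,
    then the payload, then zero fill."""
    if len(payload) > MAX_LEN:
        raise ValueError("frame exceeds fixed size")
    return bytes([len(payload)]) + payload + b"\x00" * (MAX_LEN - len(payload))

def unpad(packet: bytes) -> bytes:
    """Recover the original frame from its length prefix."""
    n = packet[0]
    return packet[1:1 + n]

frame = b"\x10\x22\x33"            # a short VBR frame
padded = pad(frame)
assert len(padded) == MAX_LEN + 1  # constant on the wire
assert unpad(padded) == frame
```

The cost is exactly the bandwidth overhead the authors allude to: every short frame is inflated to the maximum size, which is why constant padding is usually traded off against partial padding schemes.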

The Eight Roles of DCID 6/3

Only five short of a baker’s dozen, there are eight roles provided in Director of Central Intelligence Directive (DCID) 6/3 “Protecting Sensitive Compartmented Information Within Information Systems”.

(pdf) (doc)

  1. Principal Accrediting Authority — with responsibility for all intelligence systems within their respective purviews; the PAAs are the DCI, EXDIR/CIA, AS/DOS (Intelligence & Research), DIRNSA, DIRDIA, ADIC/FBI (National Security Div), D/Office of Intelligence/DOE, SAS/Treasury (National Security), D/NIMA, and the D/NRO
  2. Data Owner — final statutory and operational authority for specified information
  3. Designated Accrediting Authority — authority to assume formal responsibility for operating a system at an acceptable level of risk based on the implementation of an approved set of technical, managerial, and procedural safeguards
  4. Designated Accrediting Authority Representative (DAA Rep) — technical expert responsible to the DAA for ensuring that security is integrated into and implemented throughout the life cycle of a system
  5. Information System Security Manager (ISSM) — responsible for an organization’s IS security program
  6. Information System Security Officer (ISSO) — responsible to the ISSM for ensuring that operational security is maintained for a specific IS
  7. Privileged Users — access to system control, monitoring, or administration functions
  8. General Users — can receive information from, input information to, or modify information on, a system without a reliable human review

They provide a good exercise in defining relationships with compartmentalised information; it’s fun to try and make a diagram that shows the connections and overlap.

DCID 6/3 in 1999 superseded DCID 1/16, which had the much more fun title of “Security Policy for Uniform Protection of Intelligence Processed in Automated Information Systems and Networks”.

DCID 1/16 was from 1988 and superseded a 1983 version of DCID 1/16, issued at a time of great US government concern about outsider attacks and of NSA’s first attempt to wrest control of the Internet away from NIST.

Restitution for Hacks

I wrote earlier about a recent decision on computer fraud related to ATMs. I did a little history reading to jog my memory and see if I could figure out what about the case sounded familiar. I found Section 6-1 of my HP-UX System Security Manual, from October of 1989, with the following warning:

The U.S. Computer Security Act of 1987 casts new urgency on computer security in all business segments. It stipulates that if financial loss occurs due to computer fraud or abuse, the company, not the perpetrator, is liable for damages. Thus, ultimate responsibility for safeguarding information lies with individual businesses themselves.

Ronald Reagan’s Computer Security Act (CSA) was repealed by FISMA in 2002. Could it be relevant to today’s attacks?

The CSA was a reaction to the news of computer attacks in the early 1980s, especially by seven teenagers from Milwaukee. An eager Congressman from Kansas (Glickman) called House hearings that pointed out attacks were successful mostly because of weak and default passwords, as well as missing patches.

Here’s an amusing excerpt from InfoWorld in 1983:

…the FBI had implied that [a perpetrator] had violated the law when he sent electronic mail on the Telenet network. “We weren’t even aware that using the [stolen] passwords was illegal,” he said.

Obviously attacks have not changed much. What really has changed is restitution.

The major difference between pre-CSA regulation and today seems to have more to do with an attacker’s liability to pay restitution than with any radical shift in system vulnerabilities.

Note the details in a case earlier this year. A man in New Hampshire was set to pay restitution of more than $2 million and forfeit another $8 million after running a four-year malware operation.

PALA and his co-conspirators infected German citizens’ computers with a program that would force the computers’ telephone modems to surreptitiously dial premium telephone numbers rented from German telephone companies by PALA’s co-conspirators. …from 2003 through 2007, PALA made approximately $7,941,336 from the computer hacking conspiracy. PALA also allegedly failed to pay approximately $2,287,993 in income taxes during this time.

Modems? He was expected to pay a hefty restitution to the IRS for undeclared profits from (unauthorised) dial-up fees.

Another interesting restitution case earlier this year was in Massachusetts, where a prisoner hacked the common computers and then was ordered to pay to protect the identity of other inmates.

Souter conceded that individual current and former employees could have paid for their own credit monitoring when they learned of the hacking, “but this in no way diminishes the reasonableness of the Facility’s investigation prompted by the risk that its security failure created.”

[Retired U.S. Supreme Court Associate Justice David] Souter rejected Janosko’s timeliness argument. “An employer-victim contemplating the resolution of a charge like the one here could be expected to press the prosecutor to demand any terms that would be necessary to make the members of the employer’s workforce whole, and a credit check even up to the moment of a plea agreement would therefore be timely,” he wrote.

The BofA case thus fits the trend of hefty restitution awards ordered from perpetrators. Unlike at the time of the CSA, the laws now seem headed towards large recovery awards, which some argue are a disincentive to attackers. Hopefully the restitutions will not prematurely reduce the pressure to enhance technical controls.