It has always bothered me that the 419 scams out of Nigeria seem to be linked to people who claim they are just playing the game of open markets. In other words, attackers ask why they should be blamed if they simply prey on others’ greed.
A new story appeared last Thursday in the Guardian that reinforces much of what was reported a few years ago:
The email scammers here prefer hitting Americans, whom they see as rich and easy to fool: maghas [slang from a Yoruba word meaning fool] are avaricious and complicit. To them, the scams – known as 419 after the Nigerian criminal statute against fraud – are a game.
A “game” that has victims rather than players can hardly be called a game at all. Instead, it is an example of carefully crafted social engineering that allows attackers to transfer value (from victims to themselves) without proper authorization. The interesting thing about the attack, in this case, is how it uses political or even cultural prejudice to establish credibility.
I presented a report on this with Harriet Ottenheimer at the Central States Anthropological Society’s meetings in 2004. It was called “Urgent/Confidential — An Appeal for your Serious and Religious Assistance” and provided details on the attack taxonomy and social engineering methodologies.
Might be time to publish the paper to help clarify how people remain susceptible and what can be done to reduce the risk.
The Register reports that a teenage girl in England apparently convinced a judge that an ankle-tag would look odd with her choice of clothing, and she therefore was able to easily circumvent the curfew conditions of her bail. One must wonder whether the judge also called for a more effective/discreet tag (is their purpose to be obvious to others?) or just did not really think the teen would recidivate after “grievous bodily harm against another woman”.
The BBC report mentions that the girl said she did not answer the door during her curfew because “she was asleep at the time”, which presents an interesting value map for security:
Fashion > IDTag > Sleep
I posted portions of the following comment on Schneier’s blog today. Thought it deserved a place here as well.
This is an excellent quote, discovered in a Wired story called “Drivers Want Code to their Cars“:
“‘There is really no time in my schedule for sitting around a car dealership listening to some fat guy in a clip-on tie tell me that the problem is my fault,’ [a 2002 car owner] said. ‘Instead of explaining anything to me they just pull out a warranty sheet with a highlighted portion indicating that they don’t cover Check Engine light problems.’
A bill floating through Congress could help people like Seymour by forcing automakers to share diagnostic codes with car buyers and independent mechanics. The Motor Vehicle Owners’ Right to Repair Act would give Seymour the means to determine whether the Check Engine light signaled another gas cap vagary or a major oil leak. The legislation would also allow Seymour to choose an independent — and possibly cheaper — repair shop instead of being forced to go to the dealership.
The legislation argues that consumers own their vehicles in their entirety and should be able to access their onboard computers.”
I think that’s “own” as in “beer”, not speech…
I know it’s a stretch, but imagine personal computer users making the same kind of demand (ok, forget the clip-on tie part). You would have a legal precedent for a “right to repair”, which could be extended to a need for source, no? How does IP get protected when you give away the details needed to make repairs, or should IP rights be placed above the right to prevent harm or even just maintain value for a buyer? More research required.
Wired picked up some of the details of the Prius software bug that I mentioned this past Sunday. It looks like several major news outlets carried a story on this as far back as May 2005. Wired mentioned the Prius troubles in an article called “History’s Worst Software Bugs“. I am disappointed that they didn’t bring up the fact that dealers are still selling the buggy version of the car.
One of Wired’s “worst” is the Ariane 5 flight 501 disaster. Since I am personally familiar with the event (from work at the UIowa Dept of Physics and Astronomy) I might be biased, but I must say that while I’m not sure it was one of history’s worst, it certainly makes a great case study. I use it regularly in presentations on risk management. For example, the backup unit ran the exact same code revision as the primary, and thus the bug (a floating-point conversion error) that caused a failure in the primary…yup, you guessed it…took down the backup too. Oops.
Wired suggests a Wikipedia version of events, but their second link points to the original European Space Agency (ESA) report “ane5rep.html” (also found hosted at MIT). The ESA provided a very clear analysis of the source of the problem:
“The reason why the active SRI 2 did not send correct attitude data was that the unit had declared a failure due to a software exception. The OBC could not switch to the back-up SRI 1 because that unit had already ceased to function during the previous data cycle (72 milliseconds period) for the same reason as SRI 2.”
But even more interesting is that the floating point error itself could have been handled many ways, or the trajectory tested more accurately, but “It was the decision to cease the processor operation which finally proved fatal. […] The reason behind this drastic action lies in the culture within the Ariane programme of only addressing random hardware failures.”
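The failure mode is easy to sketch. Below is a minimal Python analogue (the actual flight software was Ada, and the names and values here are hypothetical) of an unguarded 64-bit float to signed 16-bit integer conversion: on Ariane 4 trajectories the value always fit, but on Ariane 5 it exceeded the range, and the resulting exception was left unhandled, so the processor shut down.

```python
INT16_MIN, INT16_MAX = -32768, 32767

def bias_to_int16(value: float) -> int:
    # Rough analogue of the unguarded float -> 16-bit integer
    # conversion in the inertial reference system's alignment code.
    result = int(value)
    if not (INT16_MIN <= result <= INT16_MAX):
        # In the flight software the analogous Operand Error was not
        # caught, so the SRI processor simply ceased operation.
        raise OverflowError(f"{value} exceeds the signed 16-bit range")
    return result

# An Ariane 4-like value fits comfortably...
print(bias_to_int16(20000.0))

# ...but an Ariane 5-like horizontal velocity does not.
try:
    bias_to_int16(64000.0)
except OverflowError as exc:
    print("unhandled in flight:", exc)
```

A range check, a saturating conversion, or a handler that kept the processor running would each have avoided the shutdown; the point is that the code assumed only random hardware failures, not out-of-range software inputs.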
Dare I say, the risk of software bugs was mismanaged?
The Inquirer reports today that Sony is getting sued by ALCEI (Electronic Frontiers Italy):
“According to the press release here, and the complaint here, the Italian group ALCEI is suing Sony over the rootkitting DRM infection.”
This is a response to Mark Russinovich’s rather thorough and powerful complaint about his discovery of a Sony rootkit on his Windows PC after installing a player from one of their music CDs.
No luck with the trackback yet, so I’ve cross-posted some of this on Schneier’s blog as well.