Category Archives: Energy

Red Means Go, Green Means Slow

While riding in late night taxis in Brazil I noticed they hit the accelerator through red lights. When we approached a green light, they would slow down and look around for people running the reds.

I had to ask why. The drivers said this is a risk mitigation strategy.

Because of the danger of assault, Brazilians drive through red traffic lights at night, treating them only as a warning.

Since stopping at a red light, especially late at night, makes you an easy victim for car-jacking or robbery…we didn’t stop.

And because everyone there knows drivers run red lights to stay safe, drivers with green lights slow down before crossing an intersection.

Just another example of why we should seriously reconsider stop-lights and their overall impact on risk (inefficiency of idling, yellow-light behavior, etc.).

#HeavyD and the Evil Hostess Principle

At this year’s ISACA-SF conference I will present how to stop malicious attacks against data mining and machine learning.

First, the title of the talk uses the tag #HeavyD. Let me explain why I think this is more than just a reference to the hip-hop artist or nuclear physics.

HeavyD
The Late Great Heavy D

Credit for the term goes to @RSnake and @joshcorman. It came up as we were standing on a boat and bantering about the need for better terms than “Big Data”. At first it was a joke and then I realized we had come upon a more fun way to describe the weight of big data security.

What is weight?

Way back in 2006 Gill gave me a very tiny and light racing life-jacket. I noted it was not USCG Type III certified (65+ newtons). It seemed odd to get race equipment that wasn’t certified, since USCG certification is required to race in US Sailing events. Then I found out European authorities believe sailor survival requires about five fewer newtons of buoyancy than the US authorities do.

Gill Buoyancy Aid
Awesome Race Equipment, but Not USCG Approved

That’s a tangent, but perhaps it helps frame a new discussion. We often think about controls to protect data sets of a certain size, which implies a measure at rest. Collecting every DB we can and putting it in a central Hadoop cluster: that’s large.

If we think about protecting large amounts of data relative to movement, then newton units come to mind. Think of measuring “large” in terms of a control or countermeasure: the force required to accelerate one kilogram of mass at a rate of one meter per second per second:

Newtons

Hold onto that thought for a minute.
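To make the units concrete, here is a quick sketch in Python of the definition above. The buoyancy figures are the ones mentioned earlier in this post (USCG Type III at 65+ newtons, European standards roughly five newtons less); treat them as approximate.

```python
# F = m * a: one newton is the force that accelerates
# one kilogram of mass at one meter per second per second.
def force_newtons(mass_kg, accel_mps2):
    return mass_kg * accel_mps2

# The definition of a newton:
print(force_newtons(1, 1))  # 1 N

# Approximate buoyancy figures from the post:
uscg_type_iii_min = 65.0           # USCG Type III minimum, in newtons
euro_min = uscg_type_iii_min - 5.0 # European standard, ~5 N less
print(uscg_type_iii_min - euro_min)  # the 5 N gap between standards
```

A small gap in newtons, in other words, is the difference between two authorities’ opinions of what keeps a sailor afloat.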

Second, I will present on areas of security research related to improving data quality. I hinted at this on Jul 15 when I tweeted about a quote I saw in darkreading.

argh! no, no, no. GIGO… security researcher claims “the more data that you throw at [data security], the better”.

After a brief discussion with that researcher, @alexcpsec, he suggested instead of calling it a “Twinkies flaw” (my first reaction) we could call it the Hostess Principle. Great idea! I updated it to the Evil Hostess Principle — the more bad ingredients you throw at your stomach, the worse. You are prone to “bad failure” if you don’t watch what you eat.

I said “bad failure” because failure is not always bad. It is vital to understand the difference between a plain “more” approach versus a “healthy” approach to ingestion. Most “secrets of success” stories mention that reaction speed to failure is what differentiates winners from losers. That means our failures can actually have very positive results.

Professional athletes, for example, are said to be the quickest at recovery. They learn and react far faster to failure than average. This Honda video interviews people about failure, and they say things like: “I like to see the improvement and with racing it is very obvious…you can fail 100 times if you can succeed 1”

So (a) it is important to know the acceptable measure of failure. How much bad data are we able to ingest before we aren’t learning anymore — when do we stop floating? Why is 100:1 the right number?

And (b) an important consideration is how we define “improvement” versus just change. Adding ever more bad data (more weight), as we try to go faster and be lighter, could just be a recipe for disaster.

Given these two, #HeavyD is a presentation meant to explain and explore the many ways attackers are able to defeat highly-scalable systems that were designed to improve. It is a technical look at how we might set up positive failure paths (fail-safe countermeasures) if we intend to dig meaning out of data with untrusted origin.

Who do you trust?

Fast analysis of data can be hampered by slow processes to prepare the data, and using bad data can render analysis useless. Projects I’ve seen lately have added weeks to get source material ready for ingestion: decreasing duplication, increasing completeness, and working toward some ground rule of accurate and present value. Already I’m seeing entire practices and consulting businesses built around data normalization and cleaning.

Not only is this a losing proposition (e.g. we learned this already with SIEM), the very definition of big data makes this type of cleaning effort a curious goal. Access to unbounded volumes with unknown variety at increasing velocity…do you want to budget to “clean” it? Big data and the promise of ingesting raw source material seems antithetical to someone charging for complicated ground-rule routines and large cleaning projects.

So we are searching for a new approach. Better risk management perhaps should be based on finding a measure of data linked to improvement, like Newtons required for a life-jacket or healthy ingredients required from Hostess.

Look forward to seeing you there.

Diesel = Winning

The Economist in 2011 made a salient point about the future of gasoline vehicles:

For Toyota, taking BMW’s diesel engines is a tacit admission that its hybrid strategy does not cut it in Europe.

That means a gasoline-hybrid strategy is failing. A diesel-hybrid strategy, however, would have worked.

Two years later, today, the Economist admits the race is over. Diesel has won. Gasoline is dying.

The Toyota Prius hybrid? A lowly twentieth on the league table of the most economical fuel-sippers, at 4.2 litres/100km, along with higher emissions of carbon dioxide. The 19 cars with better fuel economy than the Prius hybrid are all clean diesels.
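For American readers, the European litres-per-100km figure is easy to convert into US miles per gallon. A minimal sketch of the arithmetic, using the 4.2 L/100km Prius figure quoted above:

```python
# Convert litres per 100 km (European convention) to US mpg.
LITRES_PER_US_GALLON = 3.785411784
KM_PER_MILE = 1.609344

def l_per_100km_to_mpg(l_per_100km):
    km_per_litre = 100.0 / l_per_100km
    miles_per_litre = km_per_litre / KM_PER_MILE
    return miles_per_litre * LITRES_PER_US_GALLON

# The Prius figure from the league table:
print(round(l_per_100km_to_mpg(4.2), 1))  # 56.0 mpg
```

So the hybrid everyone holds up as the economy benchmark lands around 56 mpg, and nineteen diesels beat it.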

It really makes you wonder why we don’t have a hybrid-diesel option instead of a hybrid-gasoline in America. At least GM has finally taken the lead domestically by releasing a passenger-car diesel option. It’s not the car they talked about in 2008, but it’s a start.

The Economist, however, is still thinking about the past. Instead of pointing out modern diesels already are available in America, they tell us diesels are on their way and Mazda is “leading”. This sounds like nonsense to me:

Later this year, Americans will get their first chance to experience what a really advanced diesel is like—and why Europeans opt for diesels over hybrids, plug-in electrics and even petrol-powered cars. [Mazda’s] diesel has 30% better fuel economy and provides oodles more pulling power. Good as the petrol version is, motorists who choose it over the diesel will miss out on a lot.

I’ll tell you why their “first chance” talk is nonsense. Look at the Economist analysis of what has changed.

“What marks this latest generation of diesel engines from even their ‘common-rail’ predecessors of the late 1990s…”

That’s a nearly 20-year-old reference, too far back to speak of predecessors. Several major generations of engine have reached American shores between the “late 1990s” and today. Why not compare current engines with those Americans have been driving through the 2000s?

Most notable was a major shift in 2004, when California regulated small passenger diesels out of the state (politics prevented regulation of the larger engines until later). Another major point was in 2008, when they re-appeared. This was no secret to the Economist. They wrote about it in an article called “Diesel’s second coming.”

America’s first chance to experience new diesel engine technology was around 2005 (when new VW TDIs were introduced) and then again in 2008 (when VW’s diesel won car of the year). I would call 2004 and 2008 models the predecessors, taking us into the late-2013 technology.

So much for America’s “first chance”.

Perhaps it is fair to say some people haven’t really been excited by diesel engines since the 1990s (the Economist gave Fiat Research an award), but who cares if those people have been asleep at the wheel since then. The Economist is not excused from doing research on the past decade of innovation.

It has been clear over the past ten years that diesel has achieved pole position for increasing mpg while reducing emissions and providing performance and power. Combined with an electric motor, we’re talking over 100 mpg and a fun driving experience. Volvo sold out every diesel-hybrid targeted for France before they even left the factory.

Still, some are confused. Editors at Forbes offer us this non-prediction:

…it’s not yet obvious whether electric vehicles, diesels, or even compressed natural gas vehicles (there are millions in countries such as Iran and Brazil) will ultimately take the checkered flag in the race for efficiency

Millions in Iran? Never thought I would hear Forbes suggest Iran as a model of efficiency. Experts on Iran seem to paint it in the opposite light, as a study in waste and unsustainable inefficiency:

…poor resource management have contributed to rapidly growing energy consumption and high energy intensity for the past decades.

What Forbes is really saying is “let’s study engines in countries with domestic natural resources”. This is NOT a good equation. No wonder Forbes is confused.

To me there is no question. The race for efficiency is really a race for freedom from resource dependence, related to national security. Diesel = winning.

The answer is obviously diesel-hybrid, given American driver habits/needs. No other engine competes when it comes to sustainability. As I have written here since 2005, it is the safest, easiest to deploy, most-resilient and yet best performing of all. It is no coincidence the military prefers diesel.

Natural gas presents an interesting alternative to the pollution of coal plants but, as the Economist itself has written in the past, it fails miserably with a car engine. It gets about the same mpg as gasoline with far less performance. Same problem as ethanol. That means a significant cost added to manufacturing with marginal benefits.

Electric vehicles are great performers, perfect for urbanites, yet they lack range and cost far too much to enter the market at a broad base. They also have major additional costs to factor such as battery maintenance and replacement, not to mention the occasional unexplained explosion and fire (e.g. even Boeing claimed they didn’t see it coming).

Both natural gas and electric also require an overhaul to American infrastructure to enable vehicles with special engines. That’s a pipe-dream (pun intended) and about as likely as hydrogen. When you think about how long broadband has taken to be upgraded in America, and how inexpensive that infrastructure is compared to fuel lines…

I wouldn’t bet against diesel-hybrid.

American Fear of a Non-Motorized Planet

The area around Polk Street in San Francisco has experienced a high level of bicycle accidents. It has been ranked in the top five most dangerous streets. I can support this both quantitatively and qualitatively. Through nearly three decades of commuting by bicycle in the Twin Cities, London, Los Angeles and San Francisco, the only place I have been hit by cars (twice!) is San Francisco.

In fact, in 1993, for more than six months, day-or-night, rain-or-shine, I rode 20 miles every day in central London and never once was hit by a vehicle. (Other risks were higher: I was stopped and detained by anti-terror police once, and I eventually was forced to reduce my daily ride time after a GP diagnosed me with serious respiratory damage from diesel-sulfur pollution.)

Naturally the San Francisco Bicycle Coalition is looking at the same data. They work with urban planners on traffic-flow changes to encourage cycling and reduce harm spots. This means increasing non-motorist traffic, supporting a higher density of consumers, and increasing sales for local businesses.

Studies have shown that an increase in safety for cyclists creates so much more non-motorized traffic that flow calculations have to be adjusted. Urban planners in London used to assume non-motorized traffic would equate to a quarter of the space used by motorized vehicles. It turns out to be much higher. This is an amazing development when you consider the potential density of bicycles and how much space is wasted by automobiles.

Separate Transport for London figures already show that cyclists now make 570,000 trips in London every day compared with 290,000 trips in 2001.

Blackfriars, Waterloo and London bridges are all now among the top 10 busiest cycle streets in London. On all of these, cyclists make up 42 per cent of traffic and 15 per cent of people – though they take up just 12 per cent of road space.

Almost 9,300 riders – 11 a minute – cross London Bridge a day.

Why aren’t more people cycling, given the obvious advantages? Turns out that even as bicycling is soaring it has been held back by safety concerns linked directly to automobile-centric thinking.

The inquiry heard that more than 42 per cent of Britons own a bicycle, but only 2 per cent of journeys in the UK are made by bike.

Many people who would like to cycle do not feel safe enough, and the inquiry heard that all road projects and urban development must include high-quality cycle lanes as part of the planning process.

In 1904, 20% of traffic in London was by bicycle. The city is now planning to return to that level because of the value it brings across the board (healthier citizens, cleaner air, higher density, lower infrastructure cost, more resilience to disasters, etc.). Look at how safety factors into the mayoral plan, a plan to reduce threat and harm from automobiles:

  1. A Tube network for the bike. London will have a network of direct, joined-up cycle tracks, with many running in parallel with key Underground, rail and bus routes
  2. Safer streets for the bike. Spending on the junction review will be significantly increased and substantial improvements to the worst junctions will be prioritised. With government help, a range of radical measures will improve the safety of cyclists around large vehicles
  3. More people travelling by bike. We will ‘normalise’ cycling, making it something anyone feels comfortable doing
  4. Better places for everyone. The new bike routes are a step towards the Mayor’s vision of a ‘village in the city’, with more trees, more space for pedestrians and less traffic

Let’s look at what higher density of more consistent-speed traffic really means.

In terms of commercial return, ask any shop owner or real-estate agent whether they would rather see 1 customer enter their sales funnel or 10X as many. Instead of a single person taking up a giant parking space, or a full lane, we see the potential for 10X the traffic at less cost. We also know that street-level advertisements (signs and store-fronts) are more effective on pedestrians and cyclists. That’s the kind of low-impact scalability model any modern urban space should be rushing towards. You’d expect retailers to be leading the charge.

Despite these facts, many American businesses seem to be up in arms instead. Lose a parking space? Never. Look at the data? Impossible.

Believe it or not, some Americans consider pedestrian harm collateral damage acceptable in their automobile-centric life. And if the subtext to that culture isn’t obvious enough, it’s based in racism cloaked in the “privilege” of wealth to afford a vehicle. When non-whites are disadvantaged systemically into remaining below a line of poverty, whites use concepts like property owner (car, house) to declare themselves “better” and more deserving of rights. Indeed, there are huge fines and felony charges for a car that damages property yet often none at all for killing a human.

A resistance to progress, despite obvious gains in traffic and better living conditions, comes from those who argue every parking spot translates to direct positive impact to the value of their property. The same group also seem to believe everyone can win a dead-end race to own the biggest vehicle on the road to stay safe and that pollution is a necessary evil within wealth accumulation.

It turns out that more carefully planned parking spaces, and even reduced motorized traffic flow, increase the value of property investments. This is the converse of what car-parking extremists believe. Of course common sense supports this: suburbs, with the highest percentage of parking, struggle to hold value, while urban spaces attract more people than ever despite a lack of car parking.

Consider also that the rate of driving is declining as people realize the American model of excessive automobile ownership (e.g. being stranded without a car) is the opposite of true quality-of-life values. Transportation options that make spaces clean and quiet with a lower barrier to entry are where people are wisely spending money now (see “Young People Aren’t Buying Cars” and “Young Americans Lead Trend to Less Driving”).

So I just noticed that Boston is facing a debate similar to the one in SF.

“There seems to be a knee-jerk reaction to the eliminated parking wherever something like this is proposed.” said Pete Stidman, executive director of the Boston Cyclists Union. “We think if it was fixed up and made safer, you’d see an even huger increase in cycling there.”

Hayes Morrison, Somerville’s director of Transportation & Infrastructure, said the city has reviewed the street with two parking studies, finding that there is ample parking available, and would continue to be after the reconstruction and loss of spaces.

Think about why a Boston or San Francisco is far more desirable to live in than a Los Angeles. It is like asking why people prefer to live near green spaces, parks and bicycle lanes instead of dangerous and polluted petroleum gulches like Polk St.

Trust me, I commuted by bicycle in Los Angeles in 1995 after I left London. It was nearly impossible. Bike lanes literally dead-ended at freeways with no options. I spent hours trying to map out routes that someone could actually ride and not be stranded by planners who ignored non-motorized traffic.

Bottom line is that areas of a city that have safe bicycle traffic will be the areas of density and prosperity growth. Cleaner, quieter…fewer cars, less parking, yet more people means better living. By comparison, neighborhoods that emphasize car parking are higher-risk, less desirable, less able to sustain heavy traffic and will lose value. Wasted space makes commerce more expensive with less return.

Until Polk street allows reasonable pathways for non-motorized traffic we all should avoid spending money in that area. Take your business elsewhere (e.g. Mission, Haight, FiDi), places that are working to maximize quality of life, reduce injury, and let us breathe easier. It’s time to support areas invested in sustainable value. Stop protecting dead spaces for empty cars.