CVE-2012-2118: X.org input device format string

The xorg-x11-server logging code has a format string flaw in certain forms of its log messages. An attacker would have to give a device a specific name and then get the local Xorg server instance to use that device; the flaw could then cause the server to abort or malfunction.

IBM rated this vulnerability high risk, yet gave it a base score of only 4.4:

X.org could allow a local attacker to execute arbitrary code on the system, caused by a format string error when adding an input device with a malicious name. By persuading a victim to open a specially-crafted file containing malicious format specifiers, a local attacker could exploit this vulnerability to execute arbitrary code on the system or cause the application to crash.

NIST gives it a perfect 10, however.

Access Vector: Network exploitable
Access Complexity: Low
Authentication: Not required to exploit
Impact Type: Allows unauthorized disclosure of information; Allows unauthorized modification; Allows disruption of service

The flaw is explained on Patchwork:

The culprit here was not the user-supplied format string but rather xf86IDrvMsg using the device’s name, assembling a format string in the form of

 [driver]: [name]: [message]

i.e. “evdev: %n%n%n%n: Device \"%s\"\n”, and using that as the format string to LogVWrite.

I also mentioned a serious vulnerability related to evdev in an earlier post.

…crash is related to changing the driver for input devices using evdev (xserver-xorg-input-evdev) — the kernel event delivery mechanism that handles multiple keyboards and mice as separate input devices

Terrorist Use of Fire as a Weapon

I have had several people ask about the possibility of terrorist activity related to the Colorado fires. There are two main factors to explore at this point: recent intelligence reports, coupled with evidence of arson in proximity to inhabited areas, make it impossible to rule out terrorism.

Intelligence reports

On May 2nd, 2012, terrorist groups published details on how to use fire as a weapon, as reported by ABC News on May 3rd.

Several days later, on May 7th, 2012, the Denver Division of the FBI released a Criminal Activity Alert to raise awareness about wildfires and Al-Qaida. It mentioned the terrorist group was monitoring for the right conditions, such as high winds and dryness, to spread fires. By the end of the month a Department of Homeland Security (DHS) report was issued, titled “Terrorist Interest in Using Fire as a Weapon.”

For at least a decade, international terrorist groups and associated individuals have expressed interest in using fire as a tactic against the Homeland to cause economic loss, fear, resource depletion, and humanitarian hardship. There is no evidence that international terrorist groups and inspired individuals are responsible for any purposeful destruction of public or private property by setting wildfires in the United States. Using fire as a weapon, however, is inexpensive and requires limited technical expertise, giving it a strong advantage over other methods of attack. Statements advocating this tactic are most often found on violent extremist Web forums and in violent extremist propaganda.

[…]

We have no indications AQAP or unaffiliated violent extremists are planning to act upon the suggestions contained in Inspire.

[…]

We do not have any current intelligence indicating terrorists or violent extremists in the United States are planning to use fire as a weapon as described in Inspire magazine and online violent extremist forums. However, because terrorists have long shown interest in this tactic, which is inexpensive, low risk, and requires little technical knowledge, we encourage first responders to remain vigilant to indicators of the potential use of fire as a weapon

Evidence of Arson

It is not clear whether the fires were set by “inspired individuals” or terrorists, but reports indicate a human cause under the exact conditions mentioned by the FBI.

Fire investigators believe the fire was sparked by someone participating in recreational shooting over the weekend, possibly by people shooting a gun at a fuel tank.

The latest Detailed Situation Report from the Geographic Area Coordination Center (GACC) for the Rocky Mountains (RMCC) has the following data showing rapid spread of fires in populated areas:

New Fires: 18
Acres: 54,105
Uncontrolled Fires: 7

InciWeb shows the proximity of the Waldo fire to Colorado Springs, which has forced the evacuation of more than 30,000 people.

Here are recent photos taken by Phillip Smith in the Colorado Springs area

The GACC Large Incident Report gives current estimates such as the following:

Waldo Canyon Structures Threatened: 20,085 PRIM , 160 COMM
Waldo Canyon Costs to Date: $1,950,194
High Park Structures Threatened: 1,969 PRIM , 6 COMM
High Park Structures Destroyed: 257 PRIM , 52 OUTB
High Park Costs to Date: $33,100,000

Although these numbers are incredibly large and show the fires to be disastrous, the Incident Information System and National Interagency Fire Center put them in the context of national activity:

Large fire activity increased yesterday as 15 new large fires were reported: four in Montana, three in Alaska and Alabama, and one in Arizona, Colorado, New Mexico, Utah and Wyoming. Firefighters contained five large fires, including the Dump and Grease fires in Utah.

[…]

10 Year Average 2003-2012
Fires: 36,407
Acres: 1,931,551

The large fire data across the western states corresponds with their map of high risk areas.

It can also be compared to the current Large Fire Incidents map from the USDA Forest Service Active Fire Mapping Program.

So there is a strong correlation of intelligence data with evidence of arson, but the cause still needs to be fully investigated.

Good Security Questions

A domain has been registered called goodsecurityquestion.com. It’s hard to believe it isn’t a phishing site but it seems legit, albeit a bit sarcastic. It warns emphatically, for example, “there really are NO GOOD security questions” while at the same time it provides a list called “Good Security Questions”.

Maybe they should have said there is NO goodsecurityquestion.com.

My real issue with the site is that it does not explain in much detail why a good question differs from a fair question. Answering with a city in the good list does not seem any better than answering with a city in the fair list. In other words, the start of the question in the good list is “In what city…” and the start of the question in the fair list is “In what city…”, which seems to violate their own rules.

Perhaps an argument could be made for types of city questions. Threats will use brute force or research to guess the answers, so a question that asks for a honeymoon city might be easier to guess (a shorter list) than a question that asks about a birth city. But that seems unlikely, and counterarguments are easy to make (e.g. if you ask for a city, the rest of the question is ignored).

We used to spend long days and nights at Yahoo! trying to craft tough questions. A lot of research has been done since then but what I haven’t seen yet is a compound question system. It would be nice if the user could select from two parts (e.g. favorite sports team and then color, or favorite sports team and MVP). When answering they only get the first half of the question. The second part would be hidden from the attacker and therefore brute force and research would both be seriously disadvantaged.

Will Engineers Save the Network?

Lindsay Hill has posted a lament about change called “Will Engineers Hold Networking Back?”

I’m sure anyone that’s used CiscoWorks over the last decade has been incredibly frustrated with it at times. I have heard it’s better now, although I can’t personally verify that. So engineers tried doing things with CiscoWorks in 2003 and it was buggy, so they gave up and went back to their PuTTY terminal. And they never looked back to see what might have changed in the meantime. Well, maybe it’s time we looked again. No, we won’t find all the tools ready now – but maybe we’ll be able to see a future, and we’ll be able to demand solutions from our vendors.

In short, he throws onto the table the proverbial question of how established societies should manage transition to new technology, procedures and skills. Can engineers see a future and get on board with it?

Who does he think created the last future? Cue the change management consultants.

But seriously this reminds me of moments in my life when I have tried to explain change in technology. For example in 1992 I stayed up all night alone in the computer lab at college. Why? I was hard at work, downloading less than a minute of video from George of the Jungle.

Today it seems inconceivable I was alone with rooms full of unused computers; but in the early 1990s not many people seemed to care about sixteen idling processors with dedicated connections. To me it was like winning the lottery. I just needed to find a good use.

It was fairly easy to find video on the Internet through Gopher but it was a challenge and novelty to download a media segment to a tiny storage device over thin lines and then get it to play aloud.

The moment of true change came when Professor Green walked into the lab and asked me for my homework. He always came in early, cup of hot coffee in hand, to turn the lights on and get the lab ready for the day. He gave me a look of disappointment and I wondered if it was obvious I hadn’t slept in days.

I tried to distract him by sitting up and saying with excitement “look at the future, soon we ALL will be downloading video and watching the news from around the world over the Internet! CNN, MTV, everything! Watch this video…” He glared and shook his head and said “just do your f#^ng homework.”

Maybe it would have been different if I had downloaded something other than George of the Jungle. A 10 second segment of a 1960s cartoon wasn’t the best example to support my argument.

Alas, I did my homework. And I wrote for him a mock grant proposal to study the effects of teaching rural communities using the Internet and video. It was based on my own experience leaving a rural part of America to access better education. I wanted to test whether the Internet might help make that problem go away; if it worked in America, maybe America could make it work overseas and then it was just a short step before we’d be watching news video over the Internet!

Professor Green gave me an A- on the paper, which I barely remember the details of now. SPSS was involved somehow. The paper was probably filed away just as another shard of evidence that I could pass a milestone set by management. What I really remember were his words to me when we both stared into the face of change.

Why didn’t we both see the same thing? It is tempting to say it was an age disparity, or a training gap. Those are the usual suspects in discussions about why people resist change. He was a well-respected expert and into his later years. Why would he alter course and take a chance on spending time with some new-fangled experiments? The risk was high for him versus projects he had underway.

And therein lies the answer. Change has risk. There are those who take risks (knowingly or naively) and those who don’t. It’s a phenomenon that has been studied extensively and there are entire bookshelves of libraries filled with social and cultural theories. But in network engineering it’s fairly simple (ask me sometime about spice trades in the 1600s).

Bringing it back to Lindsay Hill’s lament, the answer is not binary; we should not settle for a yes or a no. Some engineers embrace risk and therefore are open to change. They even find other risk-takers to support them, on the principle of making a large return on investment. Others, however, do not embrace change so easily because they want to minimize risk. The world needs both, and it is unreasonable to expect everyone to be the same.

And with all that in mind, here’s a video from Plexxi that gives a hilarious view of some of the challenges today in network management. They’re a new company you should definitely keep an eye on if you are interested in change.

We aim to resolve two major issues in the network—network automation and network scale—by leveraging the concept of affinities (the complete set of data center resources required to execute a given workload).