Captain Morgan Hated Being Called a Pirate Because He Hated Democracy

Someone just suggested to me that the Spanish loved pirates while the British hated them.

This isn’t even remotely true, and it reminded me how a Spanish city official (Don Juan Pérez de Guzmán, a decorated veteran of wars in Flanders) once called Captain Morgan a pirate, meaning it as an insult because the Spanish monarchy hated pirates.

The story then goes that Morgan indeed hated the exchange, and was so enraged that he planned a devastatingly brutal siege of the Spanish city Guzmán defended, torturing residents and pillaging the area for weeks.

Here’s how one historian has referred to Morgan’s style of leadership:

Behind him were smoldering ruins, pestilence, poverty, misery and death.

A first-hand account of Morgan’s battles was written by Alexandre Exquemelin, a doctor serving him, in a book called Buccaneers of America. Exquemelin wrote that Morgan lashed together Spanish nuns and priests to use as human shields while he attacked the Spanish military, and that he regularly imprisoned and raped women.

Painting Morgan commissioned of himself while “under arrest” in London after 1672. Source: National Trust of the United Kingdom
Morgan’s argument to the Spanish was that he was a proud privateer in service of the British monarchy during war (the Governor of Jamaica gave Morgan a letter of marque in 1667 to attack Spanish ships).

He ran an autocratic and ruthless mercenary operation, accused by his own men of “cheating” them of promised wages and benefits as he pillaged cities, which he wasn’t even authorized to do. But hey, that’s privateer life in immoral service to monarchy (ultimately the charges against him were dismissed and instead he received a formal appointment to government, where he proudly owned hundreds of slaves to operate Jamaican sugar plantations). How dare anyone accuse him of being fair to his own people, or of being a democratic leader? He surely would have tortured and killed them if they did.

In that sense, pirates seem almost like entrepreneurs challenging the brutality of unjust monarchical political systems. Pirates fought against those who had expressly denied human rights and trafficked in human exploitation. They weren’t going to fight in wars that benefited only a few elites; pirates often used a democratic system of leadership based on votes and qualifications.

Privateers functioned almost in the opposite way to pirates: as business operators appointed by authority, they served awful political systems to exploit high-risk and unregulated markets. They operated as ruthless mercenaries milking a corrupt system for personal gain.

It’s a significant difference between an owner-operator business in undefined territory and exploitative vigilantism. Somehow pirates have become associated with the latter when historically they seem to have operated more as the former.

This perhaps is best explained in Chapter 8 of “The Invisible Hook: The Hidden Economics of Pirates” by Peter T. Leeson.

“Knowledge Wins”

The U.S. Army JFK Special Warfare Center and School has released a video called “Knowledge Wins Episode 4 – Great Power Competition – Part 1”.

The video starts by asking for a definition of competition, and the answer is… open. There are many different and relative definitions of competition, although in my research so far I’ve universally found that knowledge competes with privacy.

The video opens with this war-time poster encouraging people to gain knowledge:

And that reminded me of these two posters that hinted at war-time issues of privacy, information and knowledge:

Death of the Dust Seeker

By Abdukhebir Qadir Erkan

Source: Uighur Poets on Repression and Exile

Building his dwelling in the winds,
gifting the grubs the sun of his skies,
he left for the roads that run dark among letters.
Thirsting for seas that flow from night drops,
living his days outside of the seasons,
sketching his cry in a blossoming chest,
he left his flower with his dark lover.
The buds of his comforting shadows
dug ever deeper in his chest
as he stuttered like a speechless man
through canyons with word-choked memories.

*

Now grant him permission
to die as gloriously as a grub.
Let the tongue that darkened as his hair grew white
be a grave in his soul’s ruined temple.
Make a coffin from the blackboard that ate his lungs,
as we mourn him let it be our wake.

—August 3, 2017

This Day in History: 1945 US Dropped Atomic Bomb on Hiroshima, Japan

Japanese cities destroyed by strategic bombing in World War II. Source: “Tokyo vs. Hiroshima,” Alex Wellerstein, September 22, 2014

The usual story told in American history classes is that dropping two atomic bombs on Japan saved American lives. This is mostly false.

Studies now show nearly as many Americans died from nuclear radiation and fallout during the creation of these bombs as died in Japan from the bombs being dropped.

Source: “Some Unintended Fallout from Defense Policy: Measuring the Effect of Atmospheric Nuclear Testing on American Mortality Patterns,” Keith Meyers, University of Arizona

One might still argue that American soldiers’ lives were saved at the time the two bombs were dropped (instead of an invasion), even if many Americans were killed at shockingly high rates for decades afterwards.

The problem with this theory is the atomic bombs didn’t force surrender either.

Nonetheless a story told in American policy circles has been that dropping two bombs on Japan proved such a level of superiority in warfare (“assured destruction”) that it somehow compelled the Japanese to immediately give up… not to mention the story also told that atomic bombs held the Soviets at bay afterwards. All of this unfortunately is false history (see “Hidden Hot Battle Lessons of Cold War” for additional perspective).

Here is Truman’s famous June 1st, 1945 speech calling on Japan to surrender, just to set the context of what the public was hearing at the time:

Take note that the warning came after massive bombing campaigns like that of March 9-10, 1945, when some 330 B-29 bombers burned 40 square miles of wood-built Tokyo to the ground, killing over 100,000 civilians.

Source: “A Forgotten Horror: The Great Tokyo Air Raid,” Time, March 27, 2012

However, Japan didn’t fear civilian casualty loads, and couldn’t really have understood at the time why this new bomb mattered in August after a long summer of entire cities being destroyed. In a chillingly ironic manner, US military leaders also didn’t fear civilian casualties.

Source: “Dar-win or Lose: the Anthropology of Security Evolution,” RSA Conference 2016

Japanese leaders instead greatly feared a Soviet declaration of war on them. They thought Stalin’s shift to formal enemy would very negatively alter the terms of surrender (the Soviets would no longer mediate a surrender that Japan had been asking about for weeks before the bombs were dropped).

I don’t write these things to be provocative, rather to help us better educate people about the past and also to plan for the future. Perpetuating a false narrative doesn’t do America any favors. And most of what I’m writing here is old news.

In 2013, for example, Foreign Policy published “The Bomb Didn’t Beat Japan … Stalin Did”.

Japanese historians contended it was the USSR declaring war against Japan that convinced their Emperor and government that surrender was the only option.

Japan referred to atomic bombs as a “single drop of rain in the midst of a hurricane”, given that they already had seen months-long fire-bomb raids of Tokyo that left it over 50% destroyed, with 300,000 burned alive and 750,000 injured.

The reason Tokyo wasn’t targeted with atomic bombs was that it was already too destroyed for the atomic effect to be measurable (125,000 were killed in the atomic attacks on Hiroshima and Nagasaki, which would mean they were similar in effect to, or even less than, a single night of the fire-bomb raids that hit Tokyo for months).

Two years before the Foreign Policy piece, a 2011 article in the Boston press offered the following insightful analysis in “Why did Japan surrender?”

“Hasegawa has changed my mind,” says Richard Rhodes, the Pulitzer Prize-winning author of “The Making of the Atomic Bomb.” “The Japanese decision to surrender was not driven by the two bombings.” […] “The bomb – horrific as it was – was not as special as Americans have always imagined. …more than 60 of Japan’s cities had been substantially destroyed by the time of the Hiroshima attack, according to a 2007 International Security article by Wilson, who is a senior fellow at the Center for Nonproliferation Studies at the Monterey Institute of International Studies. In the three weeks before Hiroshima, Wilson writes, 25 cities were heavily bombed. To us, then, Hiroshima was unique, and the move to atomic weaponry was a great leap, military and moral. But Hasegawa argues the change was incremental. “Once we had accepted strategic bombing as an acceptable weapon of war, the atomic bomb was a very small step,” he says. To Japan’s leaders, Hiroshima was yet another population center leveled, albeit in a novel way. If they didn’t surrender after Tokyo, they weren’t going to after Hiroshima.

It’s very hard to argue with these common sense points. Massive civilian casualties were mounting and having little effect. Did the novelty of a bomb that had been a secret suddenly change minds? Even common sense would say no, and the historical record increasingly confirms this.

Or as DW puts it in their documentary, why did America drop a second bomb on Nagasaki if the Hiroshima one supposedly could send a message to surrender?

Civilian suffering had never coerced Tokyo into changing tactics, and these bombs also failed in that sense. Hiroshima was the 69th city in Japan destroyed by bombing, and Nagasaki wasn’t even the primary target on the day it was destroyed just for the sake of bombing someplace at all.

In the end, America most probably dropped these bombs to see what the effects of dropping atomic bombs would be (expressed in the DW video above as “…my mother fell apart like dry sand when I touched her foot…”), and then the US Air Force created a supporting narrative to justify continuing the program.

Historians have been trying to explain the false stories away ever since.

Cultural Spectrum of Trust

Are you more likely to believe a prince in Africa is coming to give his wealth to you (get rich quick), or that AntiFa is coming to take your wealth away from you (get poor quick)? American cognitive trust has a dangerous vulnerability called… bias.

Often I speak about the cultural relativity of privacy. Americans and Swedes will sacrifice privacy for selfish reasons, while on the other end of the spectrum China and India will sacrifice privacy for social good… according to a study by Microsoft buried in a 2013 “connected world” transcript of the FTC.

The last variation is when we look at the value exchange from no benefit to community benefit. And what we see here, and this is a trend throughout the rest of the survey, is that the value exchange for community benefit is much, much larger proportionally in China and India than in the western countries.

Another interesting area of cultural relativity is the notion of trust. The following HBR study, “Getting to Yes Across Cultures”, may help explain why the 419/AFF scam is so effective on US targets.

Source: Getting to Yes Across Cultures, HBR 2015

Our research has shown how the 419/AFF attack uses an emotional appeal mixed into a cognitive blindness test to disarm even the most rational, trained and intelligent people.

On the linear chart above you can perhaps see the issue more easily (note the spread between the US and Nigeria).

A purely emotional appeal alone would not work on the cognitive end, since affection sits far away on the trust spectrum from business deals that require a cognitive-style presentation. That is why people assume intelligence is a defense, and that they are invulnerable because they are typical rational thinkers.

However, the emotional appeal becomes very dangerous, weaponized if you will, by building a short-cut bridge to the other end based on a vulnerability in cognition (cognitive bias). It’s dangerous because each end has its own set of expertise, tools and skills to stay safe.

Thus, evidence of bias should be seen as a key predictor of why highly intelligent people may still be vulnerable to emotive fraud campaigns that bridge the two ends (e.g. AntiFa, AFF). Victims act when they have an impulse/motivation towards an emotional appeal that has successfully breached their attention, such as greed or fear.

People who connect with false sudden wealth (greed) fall for AFF being real opportunity. People who connect with false sudden loss (fear) fall for AntiFa being real threat.

Again, it is wrong to think that intelligence or success in life is an antidote to these attacks. Someone wise to their own world of defense, law, finance, medicine, etc. is actually at high risk of developing a false cognitive trust when they harbor a bias.

In the case of AFF that bias tends to be ignorance about blacks and specifically Africans (racism), which means victims believe a rich prince or relative of a dictator really might have some money that needs laundering. We’ve seen a lot of this cognitive bias attack since we started formal research on it in 2005.

The movie “Coming to America” gives a good sense of what some people in America would not register as comedy, but would instead take as how the world actually works.

More recently, in the case of AntiFa, we’re seeing a new bias vulnerability. It looks to be class-based ignorance (modern versions of racist McCarthyism, or misogynist Birchism) with fears of progressive movements causing loss of establishment power. Targets are triggered by the idea of impoverished youth redistributing power (perceived loss) and threatening assets or disrupting their sense of control.

Narratives warning of AntiFa seem to follow the same attack patterns as AFF in how they engineer target behavior, yet completely inverted. While the “Coming to America” comedy is about joy from sudden wealth, the AntiFa story is about fear of sudden wealth loss. Perhaps a new and updated movie is needed.

Think of it this way. Saying to a hawkish policy thinker there is no chance of sudden loss from AntiFa is like saying to a racist banker there is no chance of sudden gain from an African prince.

It is an emotional appeal to a deep-seated bias that explains why far-right sympathetic Americans dismiss report after report that AntiFa is not a threat, while ignoring the obvious and mounting deaths from far-right terrorists:

Perhaps most convincing to the unbiased thinker is a simple fact of history: AntiFa is “anti-fascism”. While it promises to negate threats to life, it offers little or no substantial directive power towards any political movement, even during troubled times.

…the labor movement’s failure to defeat Hitler and the fact that Germany had required liberation from without drove antifascists to a largely reactive policy, vigorously pursuing former Nazi officials and purging society of collaborators, but neglecting to build a plausible vision for a “new Germany” beyond both fascism and Cold War machinations.

Being anti-fascist thus is a negation of fascism, and historically it has lacked the vigor for anything more directed. At best it is a centrist’s guard rail against extremism, because it serves as a movement towards the defense of basic rights. At worst it’s a nuisance cost when property needs restoration. It’s the opposite of any generalized threat, as it mainly negates an actual and specific threat called fascism. Here are two historic examples that may help clarify:

First, Birchism manifested in being anti-ERA. That didn’t mean it was not a threat, but it raises the question of whether its negation of equal rights should be taken as such a generalized threat that it demands a militarized violent response and the classification of being anti-ERA as a form of terrorism.

Second, AntiFa is like a seat belt being called an anti-head-injury movement. Does it threaten American freedom to stop deaths of Americans? There were indeed Americans who used to argue against seat belts in this way (and against air bags, for that matter), yet it turns out seat belts enabled freedom by preventing huge numbers of deaths (and yes, death is the most definitive end of freedom).

Of course, it is still true there are both dictators in Africa attempting to launder money (gain wealth) and youths attempting to stop fascism (redistribute power) when they see it. The point is not to say these facts are untrue, rather to say that a grain of truth can be made explosive in asymmetric information warfare and turn facts into completely false narratives.

Counter-terrorism expert Paul Cobaugh of Narrative Strategies perhaps put it best:

U.S. Department of Homeland Security and others are running around trying to make AntiFa into some type of grand, orchestrating terrorist org that’s a threat to the US. This is not true. They do show up in a semi-organized fashion to physically oppose those they consider “fascist”. I don’t condone any violence in our streets but when it comes to being a national threat, they are very low on the priority list, unless of course, you’re a fascist.

Americans on average are no more likely to get rich from African dictators laundering money than they are at risk from liberal youths storming their McMansion walls to take wealth away in the name of racial justice. However, in both cases cognitive thinkers can be seen flipping into very emotional yet unregulated territory and being set up for errors in judgment (manipulated by threat actors hitting them with “get rich/poor quick” attacks).

In conclusion, beware: false emotional appeal triggers cognitive thinkers by attacking a dangerous vulnerability known as… bias. Disinformation trackers/destroyers constantly need to be updated.

Chocolate Chip Cookie History and The Myth of “Butter Drop Do”

The traditional drop cake (also called drop biscuit) was a popular historic treat in America copied from Europe. However, somehow in America the act of baking a common and popular British drop cake with common and popular chocolate turned into a fancy narrative about how chocolate chip cookies had just been “invented” by a woman in 1938.

Is the invention story true? Are they even American?

Let’s start by scanning through the typical drop cake recipes that can easily be found in the first recipe book publications in English:

  • 1883: Ice-cream and Cakes: A New Collection
  • 1875: Cookery from Experience: A Practical Guide for Housekeepers
  • 1855: The Practical American Cook Book
  • 1824: A New System of Domestic Cookery
  • 1792: The London Art of Cookery
  • 1765: The art of cookery, made plain and easy

Now let’s see the results of such recipes. Thanks to a modern baker who experimented with an 1846 “Miss Beecher’s Domestic Recipe Book” version of drop cake, here we have a picture.

Source: FourPoundsFlour, Sarah T.

Adding raisins would have made this a fruit drop cake (or a fruit drop biscuit). There were many variations possible and encouraged, based on different ingredients such as rye, nuts, butter or even chocolate.

Here’s an even better photo to show drop cakes. It’s from a modern food historian who references the 1824 “A New System of Domestic Cookery” recipe for something called a rout cake (rout is from French route, which used to mean a small party or social event).

Source: A Taste of History with Joyce White

That photo really looks like a bunch of chocolate chip cookies, right? This food historian even says that herself by explaining “…[traditional English] rout cakes are usually a drop biscuit (cookie)…”.

Cakes are cookies. Got it.

This illustrates quickly how England has for a very long time had “tea cakes with currants”, which also were called biscuits (cookies), and so when you look at them with American eyes you would rightfully think you are seeing chocolate chip cookies. But they’re little cakes in Britain.

More to the point, the American word cookie was derived from the Dutch word koek, which means… wait for it… cake, which also turns into the word koekje (little cake):

Dutch: een koekje van eigen deeg krijgen = a little cake of your own dough (literal) = a taste of your own medicine (figurative)

So the words cake, biscuit and cookie all can refer to basically the same thing, depending on what flavor of English you are using at the time.

Expanding now on the above 1855 recipe book reference, we also see exactly what is involved in baking a drop cake/koekje/cookie:

DROP CAKES: Take three eggs, leaving out one white. Beat them in a pint bowl, just enough. Then fill the bowl even full of milk and stir in enough flour to make a thick, but not stiff batter. Bake in earthen cups, in a quick oven. This is an excellent recipe, and the just enough beating for eggs can only be determined by experience.

DROP CAKES. Take one quart of flour; five eggs; three fourths of a pint of milk and one fourth of cream, with a large spoonful of sifted sugar; a tea-spoon of salt. Mix these well together. If the cream should be sour, add a little saleratus. If all milk is used, melt a dessert-spoonful of butter in the milk. To be baked in cups, in the oven, thirty to forty minutes.

I used the word “exactly” to introduce this recipe because I found it so amusing to read the phrase “just enough” in baking instructions.

Imagine a whole recipe book that says use just enough of the right ingredients, mix just enough and then bake just enough. Done. That would be funny, as it’s the exact opposite of how the very exact science of modern baking works.

Bakers are like chemists, with extremely precise planning and actions.

And finally just to set some context for how common it became in America to eat the once-aristocratic drop cakes, here’s the 1897 supper menu in the “General Dining Hall Bill of Fare” from the National Home for Disabled Volunteer Soldiers:

Source: Report of Inspection of State Soldiers and Sailors’ Homes for Year Ending June 30, 1897, by National Home for Disabled Volunteer Soldiers

Back to the question of chocolate chip cookies versus drop cakes, and given all the current worry about disinformation, a story researched by Mental Floss explains that a myth has been created around someone making “Butter Drop Do” and inventing cookies because she accidentally used a “wrong” type of chocolate.

The traditional tale holds that Toll House Inn owner Ruth Wakefield invented the cookie when she ran out of baker’s chocolate, a necessary ingredient for her popular Butter Drop Do cookies (which she often paired with ice cream—these cookies were never meant to be the main event), and tried to substitute some chopped up semi-sweet chocolate instead. The chocolate was originally in the form of a Nestle bar that was a gift from Andrew Nestle himself—talk about an unlikely origin story! The semi-sweet chunks didn’t melt like baker’s chocolate, however, and though they kept their general shape (you know, chunky), they softened up for maximum tastiness. (There’s a whole other story that imagines that Wakefield ran out of nuts for a recipe, replacing them with the chocolate chunks.)

There are three problems with this story.

One, saying “butter drop do cookies” is like saying butter cake do little cakes. That’s hard on the ears. I mean “butter drop do” seems to be some kind of misprint or badly scanned text.

This uniquely named recipe can be found under a cakes category in the 1796 American Cookery book (and don’t forget many drop cake recipe books in England pre-dated this one by decades).

Butter drop do .

No. 3. Rub one quarter of a pound butter, one pound sugar, sprinkled with mace, into one pound and a quarter flour, add four eggs, one glass rose water, bake as No. 1.

The butter drop cake (do?) here appears to be an import of English aristocratic food traditions, which I’ve written about before (e.g. eggnog). But what’s really interesting about this 1796 American Cookery book is that actual cookie recipes can be found in it, and they are completely different from the drop cake (do?) one:

Cookies.

One pound sugar boiled slowly in half pint water, scum well and cool, add two tea spoons pearl ash dissolved in milk, then two and half pounds flour, rub in 4 ounces butter, and two large spoons of finely powdered coriander seed, wet with above; make roles half an inch thick and cut to the shape you please; bake fifteen or twenty minutes in a slack oven–good three weeks.

And that recipe using pearl ash (an early version of baking powder) is followed by “Another Christmas Cookey”. So if someone was knowingly following the butter drop cake (do?) recipe instead, they also knew the author explicitly did not call it a cookie.

Someone needs to explain why the chocolate chip cookie “inventor” was very carefully following a specific cake/koekje recipe instead of a cookie one yet called her “invention” a cookie.

Two, a drop cake with chocolate chips looks almost exactly like drop cakes have looked for a century, with chips or chunks of sweets added. How inventive is it really to use the popular chocolate in the popular cake and call it a cookie?

Three, as Mental Floss points out, the baker knew exactly what she was doing when she put chocolate in her drop cakes and there was nothing accidental.

The problem with the classic Toll House myth is that it doesn’t mention that Wakefield was an experienced and trained cook—one not likely to simply run out of things, let accidents happen in her kitchen, or randomly try something out just to see if it would end up with a tasty result. As author Carolyn Wyman posits in her Great American Chocolate Chip Cookie Book, Wakefield most likely knew exactly what she was doing…

She was doing what had been done many times before, adding a sweet flavor to a drop cake, but she somehow then confusingly marketed it as a chocolate chip cookie. I mean she came up with a recipe, sure, but did she really invent something extraordinary?

Food for thought: is the chocolate chip cookie really just Americans copying unhealthy European habits (e.g. tipping) to play new world aristocrats instead of truly making something new and better?

What about the chocolate chip itself? Wasn’t that at least novel as a replacement for the more traditional small pieces of sweet fruit? Not really. The chocolate bar, which precipitated the chips, has been credited to a British company started by Joseph Storrs Fry in 1847.

Thus it seems strange to say that an American putting a British innovation (chocolate bar chips) into a British innovation (drop cake/biscuit/cookie) is an American invention; it is more a case of Americans copying and trying to be more like the British.

The earliest recipe I’ve found that might explain chocolate chip cookies is from 1912 (more than 20 years before claims of invention) in “The Twentieth Century Book for the Progressive Baker, Confectioner, Ornamenter and Ice Cream Maker: The Most Up-to-date and Practical Book of Its Kind” by Fritz Ludwig Gienandt.

Source: Twentieth Century Book for the Progressive Baker, Confectioner, Ornamenter and Ice Cream Maker, by Fritz Ludwig Gienandt

This Day in History 1947: U.S. National Security Act

A recent post I published gave some of the backstory of modern intelligence and information warfare in America, from the 1930s through WWII. That post actually culminates on this day in 1947, when the CIA was officially established.

The history department of the CIA doesn’t put things lightly when it describes the agency’s founding in the 1947 U.S. National Security Act (NSA):

President Harry S. Truman signed the National Security Act of 1947 (P.L. 80-235, 61 Stat 496) on July 26, 1947. The act – an intricate series of compromises – took well over a year to craft. […] The importance of the National Security Act cannot be overstated. It was a central document in U.S. Cold War policy and reflected the nation’s acceptance of its position as a world leader.

You can read more details about that “intricate series of compromises” they mention in a 1996 document hosted by the State Department: 1945-1950 Emergence of the Intelligence Establishment

Speaking of the State Department, their historian is far more muted in assessing the NSA and takes a weird tangent that relegates the CIA to a secondary story:

Each President has accorded the NSC with different degrees of importance and has given the NSC staff varying levels of autonomy and influence over other agencies such as the Departments of State and Defense. President Dwight D. Eisenhower, for example, used the NSC meetings to make key foreign policy decisions, while John F. Kennedy and Lyndon B. Johnson preferred to work more informally through trusted associates. Under President Richard M. Nixon, the NSC staff, then headed by Henry A. Kissinger, was transformed from a coordinating body into an organization that actively engaged in negotiations with foreign leaders and implementing the President’s decisions. The NSC meetings themselves, however, were infrequent and merely confirmed decisions already agreed upon by Nixon and Kissinger.

To be fair, while Truman had a particular take on it, those following him into office haven’t been entirely different. The Act created a National Security Council (NSC) with an Executive Secretary to advise the President indirectly (arguably through the Department of State), yet said nothing about a National Security Advisor (NSA). Nonetheless, after Eisenhower appointed Robert Cutler in 1953 to be “Special Assistant to the President for National Security Affairs”, a role that elevated and oversaw the Council, every president since has appointed an NSA.

The Air Force historian, for some additional perspective, takes the opportunity to thumb its nose at the Army and Navy, pumping up its own balloon while ignoring the CIA altogether:

This act officially established the United States Air Force as a separate and co-equal branch of the United States Armed Forces. The U.S. Air Force’s quest for independence was a long and often contentious struggle between air-minded officers and the entrenched Army and Navy bureaucracy.

To be fair, the NSA also replaced the Department of War (started in 1789) with an Army Department in a new National Military Establishment (NME). It seems unfair for the Air Force to be talking about independence from an entrenched Army, given that an Army department also was brand new and conjoined to the Air Force in the NME (by 1950 called the Defense Department).

However, back to the CIA claiming acceptance of a world leader position in 1947: it would take another whole year, to this very same day in 1948, before Truman signed Executive Order 9981 to formally push civil rights and declare an end to discrimination in America’s own military.

The CIA historian is not wrong about the NSA being a significant event in American history. It completely shifted the entire country to a discussion of National Security along the lines that the CIA’s father, Donovan of “room 109”, had envisioned. It seems obvious now because the shift is complete, but back in 1947 it was revolutionary for the term “security” to bring more expansive thinking than prior terms such as defense, adversary or threat.

Somehow both this creation of the National Security mindset and the seminal civil rights order needed for it to work properly have always taken a back seat, if mentioned at all. Almost all narratives given about America during the Cold War focus instead on the Truman Doctrine and Marshall Plan. Check out my BSidesLV presentation called “Hidden Hot Battle Lessons of the Cold War” for more on this topic of American security, leadership and civil rights.

Permanent Improvisation: Nazi Dictatorship Was the Opposite of Law and Order

Important insights come from reading “The German Dictatorship” by Karl Dietrich Bracher, who was a professor of politics and history at the University of Bonn.

The German dictatorship did not mean ‘law and order.’ The Third Reich lived in a state of permanent improvisation: the ‘movement’ once in power was robbed of its targets and instead extended its dynamic into the chaos of rival governmental authorities.

Nazi Germany was a state of permanent improvisation.

Today this method of unaccountable governance is seen in headlines such as “[White House occupant] and Woody Johnson act as if the rules don’t apply to them”

Bracher goes on to say it was in democracy, through regulation and governance, that the foundations of prosperity could be found, because it offered a meaningful level of stability (true order based on justice).

Perhaps the next time someone says they love the “fail faster” culture of Facebook, ask them if they also see it as a modern take on the state of permanent improvisation favored by Hitler.

Facebook’s staff now claim to be in opposition to their own failure culture, as reported in “Hurting People at Scale”:

“We are failing,” [a seven-year Facebook engineer] said, criticizing Facebook’s leaders for catering to political concerns at the expense of real-world harm. “And what’s worse, we have enshrined that failure in our policies.”

The failures and real-world harm are intentional and orchestrated by Facebook officers who somehow manage to escape responsibility:

…growing sense among some Facebook employees that a small inner circle of senior executives — including Chief Executive Mark Zuckerberg, Chief Operating Officer Sheryl Sandberg, Nick Clegg, vice president of global affairs and communications, and Joel Kaplan, vice president of global public policy — are making decisions that run counter to the recommendations of subject matter experts and researchers below them, particularly around hate speech, violence and racial bias…

It raises the question again: can the Security Officer of Facebook be held liable for atrocity crimes and human rights failures he facilitated?

After reading Bracher’s wisdom on Nazi platform design, and seeing how it relates to the state of Facebook, now consider General Grant’s insights of 1865 at the end of the Civil War when Lee’s treasonous Army of Northern Virginia surrendered:

I felt like anything rather than rejoicing at the downfall of a foe who had fought so long and valiantly, and had suffered so much for a cause, though that cause was, I believe, one of the worst for which a people ever fought, and one for which there was the least excuse.

It should be no surprise then that it was Grant who created the Department of Justice.

We won’t rejoice at the downfall of Facebook, despite them being one of the worst companies for which a people ever worked, and for which there was the least excuse. Their unregulated state of permanent improvisation — a fast-fail culture used to avoid accountability for real-world harms for profit at scale — needs to end.

Facebook is a digital slavery plantation. “Fail faster” turns out to be just “fail” without accountability, which turns out to be just the privilege to do known wrongs to people and get rich.

Grant wasn’t opposed to change or failure, of course; he just put it all in terms of being on the right side of history, which he forever will be (PDF, UCL PhD Thesis), unlike the Facebook executives who should be sent to jail:

My failures have been errors in judgment, not of intent.

The 18th Chairman of the Joint Chiefs of Staff, General Martin Dempsey, frames Grant’s memoirs for us like this:

Our intentions matter. They reflect our motivations, our beliefs, our character. If we start with good intentions, and hold ourselves accountable to them, we start in the right place.

Facebook management has continuously had bad intentions since the platform was first conceived as a way for men to amass power and do wrongs (a failed attempt to invite crowds into physically shaming women who refused to go on a date with the founder).

…opened on October 28, 2003—and closed a few days later, after it was shut down by Harvard execs. In the aftermath, Zuckerberg faced serious charges of breach of security, violating copyrights, and violating individual privacy. Though he faced expulsion from Harvard for his actions, all charges against him were eventually dropped.

Bad intentions. No justice.

Fast forward to today, and officers of the company haven’t truly been held accountable. They definitely did not start in the right place and they continue to wrong people around the world. Their state of immoral and permanent improvisation has been a human rights disaster and needs to be stopped.

Slow is smooth, smooth is fast.

Photo of me applying smooth and fast theory at the 2007 North American Championships of the A-Class Catamaran
