The cups, which plug into outlets on cargo planes to reheat liquids such as water or coffee, have a faulty plastic handle that easily breaks when the cups are dropped. And because replacement parts for the cup are no longer made, the Air Force has had to order a whole new cup when the handle breaks.
In an Oct. 2 letter to Air Force Secretary Heather Wilson, Grassley said that 25 replacement cups, each costing roughly $1,280, have been bought this year alone, for a total of roughly $32,000.
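For what it’s worth, the arithmetic in the letter checks out; a quick back-of-the-envelope sketch using the figures cited above:

```python
# Back-of-the-envelope check of the figures in Grassley's letter:
# 25 replacement cups at roughly $1,280 apiece.
cups_bought = 25
unit_cost_usd = 1280
total_usd = cups_bought * unit_cost_usd
print(f"${total_usd:,}")  # roughly $32,000, as the letter says
```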
That’s a latte money.
Congress apparently wants to get a grip on the situation and a brewhaha has started.
Quick, someone introduce these air crews to iced coffee before the bean counters bring the entire program to a grinding halt.
You might be wondering if this post is about raising the physical performance bar for a soldier, and it actually is the opposite. When I say bar I mean food. And by new bar, I mean something tasty like chocolate, which lowers the dangers from physical stress.
“Research showed compliance was better when calcium and vitamin D were provided in a fortified bar,” said Army Maj. Kayla Ramotar, dietitian with the Army’s Training and Doctrine Command. “Trainees don’t get a lot of treats during basic training, and since this bar is made of chocolate, we know compliance won’t be an issue. It’s a lot more enticing than having to swallow a bunch of pills.”
I’m imagining a poster now that says “Basic training. It’s no treat.”
Bottom line is that bone fractures were causing high numbers of drop-outs after strenuous physical tests. So the military has turned the sage old theory of “milk and cookies before bedtime” into a vitamin D-enriched calcium bar. I suppose the tryptophan angle of this could mean people sleep better at night, which stimulates better recovery, but it seems like they’re going for the more direct vitamin-to-bone-strength results.
From a personal perspective I do believe a high consumption of vitamin D and calcium (I often drank a gallon of milk per day) prevented fractures many times over. One day, as I sat up on an examination table and my eyes involuntarily poured water, doctors repeatedly questioned me about incident details because they expected to see fractures where there were none.
This performance bar sounds more convenient than how I managed my diet, for sure, and I am going to wager right now that the study of 4,000 soldiers who eat the bar reveals positive results.
It has been five years since Czech climate change researchers highlighted in a report that there are ancient markers to warn when rivers drop dangerously low:
Hydrological droughts may also be commemorated by what are known as “hunger stones”. One of these is to be found at the left bank of the River Elbe (Děčín-Podmokly), chiselled with the years of hardship and the initials of authors lost to history (Fig. 2). The basic inscriptions warn of the consequences of drought: Wenn du mich siehst, dann weine [“If you see me, weep.”]. It expressed that drought had brought a bad harvest, lack of food, high prices and hunger for poor people. Before 1900, the following droughts are commemorated on the stone: 1417, 1616, 1707, 1746, 1790, 1800, 1811, 1830, 1842, 1868, 1892, and 1893.
The extreme drought period in summer 2015 enabled the levelling of historical watermarks on the “Hunger Stone” (Hungerstein) in the Elbe in the Czech town of Děčín. The comparison of the obtained levels of earlier palaeographic records with systematic measurements in the Děčín profile confirmed the hypothesis that the old watermarks represent the minimal water levels.
So far 22 grenades, mines or other explosives have been found in the Elbe this year, Saxony-Anhalt police spokeswoman Grit Merker told DW. “We ascribe that to the low water level. That’s pretty clear,” she said.
July was the hottest month in Germany since temperatures have been recorded, while July 31 was the hottest day, with temperatures reaching 39.5 degrees Celsius (103.1 degrees Fahrenheit) in Bernburg, Saxony-Anhalt.
Earlier this week the water level was down to 51 centimeters (20 inches) in Magdeburg, the capital of Saxony-Anhalt. The historical low point was 48 centimeters in 1934.
“If you see me, weep” has a poetic meaning, almost like writing “cry me a river” on the hunger stones, which tourists come to soak up…if you’ll pardon the pun.
Explosives being revealed is such an opposite story, perhaps the Germans soon will inscribe their stones with typically dark humor: “Achtung! Allen Kindern steht das Wasser bis zum Hals, nur nicht Beate, die fängt die Granate.” (Warning! Water too high for children, except for Wade, who found the Grenade.) It expresses that drought brings war for poor people.
Former CIA director John Brennan, whose security clearance you revoked on Wednesday, is one of the finest public servants I have ever known. Few Americans have done more to protect this country than John. He is a man of unparalleled integrity, whose honesty and character have never been in question, except by those who don’t know him.
I would consider it an honor if you would revoke my security clearance as well, so I can add my name to the list of men and women who have spoken up against your presidency.
Those are the very choice words of the famous William H. McRaven, retired Navy admiral and commander of US JSOC from 2011 to 2014, including the 2011 raid in Pakistan that killed al-Qaeda leader Osama bin Laden.
I am reminded, of course, of General Beck’s lament in 1938, which history warns us was done far too late and far too dependently on foreign intervention (Beck estimated Chamberlain’s cautious approach was evidence domestic German resistance could not count on UK support):
Also I have been told these “More Becks” beer ads in Germany in the decades after WWII were no coincidence:
After decades of seeing activists lay out the obvious economics of meat, and reading research by economists confirming the obvious, it looks as if the market finally is shifting. Eating meat is by far the number one contributor to climate change and executives are starting to execute on the meatless menu, as you will see in a minute.
It always has seemed weird to me that if you wanted to remove meat from your work meals, or airplanes for that matter, you had to check a special box. Really it should be the other way around. If someone wants to add meat, let them be the “special” case.
I suppose executive dinners and board meetings should have something like this:
please check box if you want a major global catastrophic impact from your meal
Makes little to no sense to have meat automatically, and people should have to choose to accelerate global destruction, rather than set it as the mindless default.
Let me be clear here. I’m not saying I would never check the box. I’m not saying there would never be need for meat. I would always want the default to be meatless. When I say make it rare I mean it both ways. The economics of why are obvious, as I will probably say continuously and forever.
For example, years ago I was running the “Global Calculator” created for economic modeling, and reducing meat consumption undeniably had more impact than any other factor.
The Global Calculator is a model of the world’s energy, land and food systems to 2050. It allows you to explore the world’s options for tackling climate change and see how they all add up. With the Calculator, you can find out whether everyone can have a good lifestyle while also tackling climate change.
A sad and ironic side note here is the fact that meat consumption is the top factor in the “extinction crisis”, as 3/4 of earth’s animal population is disappearing at an alarming rate.
I think it still may be counter-intuitive for a lot of folks when they hear they should stop eating meat to reduce climate change to prevent extinction of animals.
If you really like meat you will eat it rarely. Get it?
Thus a logical approach to solving many of the expensive problems people face today and into the future is to limit meat consumption within commercial space, because that’s where some expansive top-down decisions easily are made.
Imagine Google removing meat from its school-lunch-like program for its school-campus-like facilities for its school-children-like staff running its school-peer-review-like search engine. Alas, that probably means real executive leadership (not exactly what you get with kids trying to stay in school forever) where someone issues a simple order to reflect a principled stand (pun intended).
The first step on this path really should be Mar-a-Lago converts to vegan-only menus and becomes a research center for climate change, but I digress…
…told its 6,000 global staff that they will no longer be able to expense meals including meat, and that it won’t pay for any red meat, poultry or pork at WeWork events. In an email to employees this week outlining the new policy, co-founder Miguel McKelvey said the firm’s upcoming internal “Summer Camp” retreat would offer no meat options for attendees.
“New research indicates that avoiding meat is one of the biggest things an individual can do to reduce their personal environmental impact,” said McKelvey in the memo, “even more than switching to a hybrid car.”
It’s crazy to me that someone is calling out new research here when there is so much legacy work, but I guess that covers the question why they waited so long to do the right thing.
To find the most compelling climate change falsehood currently influencing public opinion, van der Linden and colleagues tested popular statements from corners of the internet on a nationally representative sample of US citizens, with each one rated for familiarity and persuasiveness.
The winner: the assertion that there is no consensus among scientists, apparently supported by the Oregon Global Warming Petition Project. This website claims to hold a petition signed by “over 31,000 American scientists” stating there is no evidence that human CO2 release will cause climate change.
The study also used the accurate statement that “97% of scientists agree on manmade climate change”. Prior work by van der Linden has shown this fact about scientific consensus is an effective ‘gateway’ for public acceptance of climate change.
Bring out the facts! I’ve noticed security professionals often ignore climate change harm and need facts as a gateway to accept that there are risks. Maybe a good time to drop facts on these self-proclaimed risk management elites is when they head to Las Vegas this summer…observe them carelessly gorging on meat while claiming to care about threats to their environment, and hand them an invite to an exclusive WeWork party.
A friend recently went through my liquor cabinet and pulled out a mostly-empty bottle of Knob Creek. I had forgotten about it, although in the early 1990s it had been a favorite. It was introduced to me by a Milwaukee bartender in an old dark wooden dive of a bar on the city waterfront.
“I’ll take whatever” meant he poured me a glass of seltzer, stirred in a spoonful of very dark jam, threw an orange peel twist on top and told me “enjoy life, the old-fashioned way.” It sounded corny (pun not intended), especially when he also growled “this ain’t a bright lights and gin or vodka type place” (pre-prohibition, not a speakeasy).
“What’s with the jam?” I asked. He threw a thumb over his shoulder at a cast-iron looking tiny pot-belly stove against a black wall under a small brightly-lit window. I squinted. It was almost impossible to focus on except for its small red light. Steam was slowly rising from its top edges into the bright window. “Door County cherries” he said as he wiped the bar “pick’em myself. That’s my secret hot spiced mash.” This was an historic America, with heavy flavors from locally-grown ingredients, which contrasted sharply with what “popular” Milwaukee bars were serving (gin or vodka).
It was a very memorable drink. For years after I continued to have Knob Creek here and there, always thinking back fondly to that waterfront dive bar, and to the advice to avoid “bright lights and gin or vodka”. Knob Creek wasn’t exactly a replacement for the rye I really wanted, yet it was a good-enough alternative, and I didn’t drink it fast enough to worry about its rather annoyingly high price of $15 a bottle.
Ok, so my friend pulls this old bottle of Knob Creek out of my cabinet. He’s drinking it and I’m telling him “no worries, that’s an old cheap bottle I can grab another…”. He chokes. “WHAAAT, nooo. Dude the Knob is one of Beam’s best, it’s a $50 bourbon. It’s the really good stuff.” Next thing I know my old Knob Creek bottle is in the recycling bin and I’m on the Internet wondering if I should replace it.
African-American Distillers May Have Invented Bourbon
A lot has changed in the world of American whiskey marketing since Knob Creek was $15
This year is the 150th anniversary of Jack Daniel’s, and the distillery, home to one of the world’s best-selling whiskeys, is using the occasion to tell a different, more complicated tale. Daniel, the company now says, didn’t learn distilling from Dan Call, but from a man named Nearis Green — one of Call’s slaves.
The real kicker to this Jack Daniel PR move is that it explains that master distillers came from Africa, and slavery meant they ended up in regions that give them almost no credit today:
“[Slaves] were key to the operation in making whiskey,” said Steve Bashore, who helps run a working replica of Washington’s distillery. “In the ledgers, the slaves are actually listed as distillers.”
Slavery accompanied distilling as it moved inland in the late 18th century, to the newly settled regions that would become Tennessee and Kentucky.
American slaves had their own traditions of alcohol production, going back to the corn beer and fruit spirits of West Africa, and many Africans made alcohol illicitly while in slavery.
It makes sense, yet still I was surprised. And after I read that I started to pay attention to things I hadn’t noticed before. Like if you’ve ever watched “Hotel Rwanda” its opening song is “Umqombothi”, which has lyrics about a tradition of corn-mash used for beer in Africa.
Both the use of charred casks and corn mash foundations are being revealed by food historians as African traditions (even the banjo now, often associated with distilleries, is being credited to African Americans). Thus slaves from Africa are gradually being given credit as the true master distillers who brought Bourbon as a “distinctive product of the United States” to market.
Slave owners were not inclined to give credit, let alone keep records, so a lot of research unfortunately still is required to clarify what was going on between European and African traditions that ended up being distinctly American. That being said, common sense suggests a connection between African corn mash and the master distiller role of African slaves that simply is too strong to ignore.
Prohibition Was Basically White Supremacists Perpetuating Civil War
If we recognize that master distillers using corn mash to invent Bourbon were most likely slaves from Africa, and also we recognize why and how Prohibition was pushed by the KKK, there is another connection too strong to ignore.
My studies had led me to believe anti-immigrant activists were behind banning the sale or production of alcohol in America. Now I see how this overlooks the incredibly important yet subtle point that master distillers were ex-slaves and their families on the verge of upward social mobility (Jack Daniel didn’t just take a recipe from Nearis Green, he hired two of his sons). The KKK pushed prohibition to block African American prosperity, as well as immigrants.
Let’s take this back a few years to look at the economics of prohibition. Attempts to ban alcohol had been tried by the British King to control his American colonies. In the 1730s a corporation of the King was charged with settling Georgia. A corporate board (“trustees”) was hoping to avoid what they saw as mistakes made in settling South Carolina. Most notably, huge plantations were thought to be undesirable because they caused social inequalities (ironic, I know). So the King’s corporation running Georgia was looking at ways to force smaller parcels to create better distribution of wealth (lower concentrations of power) among settlers. The corporation also tried to restrict use of Africans as slaves to entice a harder working and better quality of settler and…believe it or not, they also tried to ban alcohol, presumably because of productivity loss.
These 1730s attempts to limit land grabs and ban slavery backfired spectacularly. It was the South Carolinian settlers who were moving into Georgia to out-compete their neighbors, so it kind of makes sense wealth was equated to grabbing land and throwing slaves at it instead of settlers themselves doing hard work. It didn’t take more than ten years before the corporation relented and Georgia regressed to South Carolina’s low settler standards. The alcohol ban (restricting primarily rum) also turned out to be ineffective because slaveowners simply pushed their slaves to distill new forms of alcohol from locally sourced ingredients (perhaps corn-based whisky) and smuggle it.
By the time a Declaration of Independence was being drafted, including some ideas about calling their King a tyrant for practicing slavery, it was elitist settlers of Georgia and South Carolina who demanded slavery not be touched. Perhaps it’s no surprise then that 100 years later, as Britain was finally banning slavery, the southern states were still hung up about it and violent attacks were used to stop anyone even talking about abolishing slavery. While the rest of the Americas, still under French, British and Spanish influence, were banning slavery, the state of Georgia was on its way to declaring Civil War in an expansionist attempt to spread slavery into America’s western territories.
So here’s the thing: the King’s corporation heads inadvertently had taught their colonies how slaves, alcohol and land were linked to wealth accumulation and power. White supremacists running government in Georgia and South Carolina (aspiring tyrants, jealous of the British King) wanted ownership for themselves to stay in power.
Prohibition thus denied non-whites entry to power and ensured racial inequality. Cheaters gonna cheat, and it seems kind of obvious in retrospect that prohibition by both the British King and the US government were clumsily designed to control the market.
The current era of bourbon enthusiasm is based on the products of about seven US distilleries. But before Prohibition, the US had thousands of distilleries! 183 in Kentucky alone. (When the Bottled-in-Bond act took effect in 1896, the nationwide count was reportedly over eight thousand). Each distillery produced many, many different brands.
Prohibition destroyed almost all of those historic distilleries.
From 8,000 small to 7 monster distilleries because…economic concerns of white supremacists running US government.
The KKK criminalized bourbon manufacturing. Thousands, including emancipated master distillers, were forced out of their field. Also in that Bottled-in-Bond year of 1896, incidentally, southern white-supremacists started erecting confederate monuments to terrorize the black population. By the time Woodrow Wilson was elected President in 1912 he summarily removed all blacks from federal government, which one could argue set the stage for a vote undermining black communities, and restarted the KKK by 1915. Prohibition thus arose within concerted efforts by white supremacists in America to reverse emancipation of African Americans, deny them social mobility, criminalize them arbitrarily, and disenfranchise them from government.
What’s War Got to Do With the Price of Knob Creek?
Otho H. Wathen of National Straight Whiskey Distributing Co. points out: “The increase in 1934 (in drunken driver automobile accidents) for the entire country was 15.90 per cent. The increase in the repeal states, which included practically every big city where traffic is heaviest, was 14.65 per cent. …in the states retaining prohibition the increase was 21.56 per cent.”
Knob Creek was first in use in 1898, by the Penn-Maryland Corp. I have looked through our archives here (I have the old history books from the companies we acquired when we purchased National Brands).
The blog even shows this “Cincinnati, Ohio” label as evidence of its antiquity:
This is an awkward bit of history, when you look at the origin story told by the Jim Beam conglomerate:
When the Prohibition was lifted in 1933, bourbon makers had to start from scratch. Whiskey takes years and years to make, but the drinking ban was overturned overnight. To meet their sudden demand, distillers rushed the process, selling barrels that had hardly been aged. Softer, mild-flavored whiskey became standard from then on. Full flavor was the casualty.
But we brought real bourbon back. Over 25 years ago, master distiller Booker Noe set out to create a whiskey that adhered to the original, time-tested way of doing things. He named it Knob Creek.
They’ve removed the text about Knob Creek being a physical place. When I first bought a bottle it came with marketing that referenced Knob Creek Farm, a non-contiguous section of the Abraham Lincoln Birthplace National Historical Park. That’s definitely no longer the case (pun not intended) as all the marketing today says white distillers of Jim Beam are resurrecting pre-prohibition traditions, without specifying the traditions came from slaves.
From that perspective, I’m curious if anyone has looked into the Penn-Maryland decision to name its whiskey after an Abraham Lincoln landmark. Does it imply in some way the emancipation of distillers, which Beam now is claiming simply as pre-prohibition style? More to the point, if Jack Daniel is finding slavery in its origin story and making reference to the injustices of credit taken, will Beam take the hint or continue to call Knob Creek their recent innovation?
My guess, based on reading the many comments on the “post-age” Knob Creek now being made (the bottles used to say 9 year), Beam is moving further away from credit to master distillers who were emancipated by Lincoln. So I guess, to answer my original question, buying another bottle of Knob makes little sense until I see evidence they’re giving credit to America’s black master distillers who invented the flavor and maybe even that label.
In the meantime, I’ll just keep sipping on this 1908 Old Crow (Woodford)…
Artificial Intelligence, or even just Machine Learning for those who prefer organic, is influencing nearly all aspects of modern digital life. Whether it be financial, health, education, energy, transit…emphasis on performance gains and cost reduction has driven the delegation of human tasks to non-human agents. Yet who in infosec today can prove agents worthy of trust? Unbridled technology advances, as we have repeatedly learned in history, bring very serious risks of accelerated and expanded humanitarian disasters. The infosec industry has been slow to address social inequalities and conflict that escalates on the technical platforms under their watch; we must stop those who would ply vulnerabilities in big data systems, those who strive for quick political (arguably non-humanitarian) power wins. It is in this context that algorithm security increasingly becomes synonymous with security professionals working to avert, or as necessary helping win, kinetic conflicts instigated by digital exploits. This presentation therefore takes the audience through technical details of defensive concepts in algorithmic warfare based on an illuminating history of international relations. It aims to show how and why to seed security now into big data technology rather than wait to unpoison its fruit.
Watching Richard Bejtlich’s recent “Revolution in Intelligence” talk about his government training and the ease of attribution is very enjoyable, although at times for me it brought to mind CIA factbook errors in the early 1990s.
Slides that go along with the video are available on Google Drive.
Let me say, to get this post off the ground, I will be the first one to stand up and defend US government officials as competent and highly skilled professionals. Yet I also will call out an error when I see one. This post is essentially that. Bejtlich is great, yet he often makes some silly errors.
Often I see people characterize a government as made up of inefficient troglodytes falling behind. That’s annoying. Meanwhile often I also see people lionize nation-state capabilities as superior to any other organization. Also annoying. The truth is somewhere in between. Sometimes the government does great work, sometimes it blows compared to private sector.
Take the CIA factbook I mentioned above as an example. It has been unclassified since the 1970s and by the early 1990s it was published on the web. Given wider distribution its “facts” came under closer scrutiny from academics. So non-gov people who long had studied places or lived in them (arguably the world’s true leading experts) read this fact book and wanted to help improve it — outsiders looking in and offering assistance. Perhaps some of you remember the “official” intelligence peddled by the US government at that time?
Bejtlich in his talk gives a nod towards academia being a thorough environment and even offers several criteria for why academic work is superior to that of some governments (not realizing he should include his own). Perhaps this is because he is now working on a PhD. I mean it is odd to me he fails to realize this academic community was just as prolific and useful in the 1990s, gathering intelligence and publishing it, giving talks and sending documents to those who were interested. His presentation makes it sound like before search engines appeared it required nation-state sized military departments walking uphill both ways in a blizzard to gather data.
Aside from having this giant blind spot to what he calls the “outsider” community, I also fear I am listening to someone with no field experience gathering intelligence. Sure image analysis is a skill. Sure we can sit in a room and pore over every detail to build up a report on some faraway land. On one of my private sector security teams I had a former US Air Force technician who developed film from surveillance planes. He hated interacting with people, loved being in the darkroom. But what does Bejtlich think of actually walking into an environment as an equal, being on the ground, living among people, as a measure of “insider” intelligence skill?
Almost three decades ago I stepped off a plane into a crowd of unfamiliar faces in a small country in Asia. Over the next five weeks I embedded myself into mountain villages, lived with families on the great plains, wandered with groups through jungles and gathered as much information as I could on the decline of monarchial rule in the face of democratic pressure.
One sunny day on the side of a shoulder-mountain stands out in my memory. As I hiked down a dusty trail a teenage boy dressed all in black walked towards me. He carried a small book under his arm. He didn’t speak English. We communicated in broken phrases and hand gestures. He said he was a member of a new party.
Mao was his leader, he said. The poor villages felt they weren’t treated well, decided to do something about it. I asked about Lenin. The boy had never heard the name. Stalin? Again the boy didn’t know. Mao was the inspiration for his life and he was pleased about this future for his village.
This was before the 1990s. And by most “official” accounts there were no studies or theories about Maoists in this region until at least ten years later. I mention this here not because individual people with a little fieldwork can make a discovery. It should be obvious military schools don’t have a monopoly on intel. The question is what happened to that data. Where did information go and who asked about it? Did others have easy access to data gathered?
Yes, someone from private sector should talk about “The Revolution in Private Sector Intelligence”. Perhaps we can find someone with experience working on intelligence in the private sector for many, many years, to tell us what has changed for them. Maybe there will be stories of pre-ChoicePoint private sector missions to fly in on a moment’s notice into random places to gather intelligence on employees who were stealing money and IP. And maybe non-military experience will unravel why Russian operations in private sector had to be handled uniquely from other countries?
Going by Bejtlich’s talk it would seem that such information gathering simply didn’t exist if the US government wasn’t the one doing it. What I hear from his perspective is you go to a military school that teaches you how to do intelligence. And then you graduate and then you work in a military office. Then you leave that office to teach outsiders because they can learn too.
He sounds genuinely incredulous to discover that someone in the private sector is trainspotting. If you are familiar with the term you know many people enjoy as a hobby building highly detailed and very accurate logs of transportation. Bejtlich apparently is unaware, despite this being a well-known thing for a very long time.
A new record of trainspotting has been discovered from 1861, 80 years earlier than the hobby was first thought to have begun. The National Railway Museum found a reference to a 14-year-old girl writing down the numbers of engines heading in and out of Paddington Station.
It reminds me a bit of how things must have moved away from military intelligence for the London School of Oriental and African Studies (now just called SOAS). The British cleverly setup in London a unique training school during the first World War, as explained in the 1917 publication “Nature”:
…war has opened our eyes to the necessity of making an effort to compete vigorously with the activities — political, commercial, and even scientific and linguistic — of the Germans in Asia and Africa. We have discovered that their industry was rarely disinterested, and that political propaganda was too often at the root of “peaceful penetration” in the field of missionary, scientific, and linguistic effort.
In other words, a counter-intelligence school was born. Here the empire could maintain its military grip around the world by developing the skills to better gather intelligence and understand enemy culture (German then, but ultimately native).
By the 1970s SOAS, a function of the rapidly changing British global position, seemed to take on wider purpose. It reached out and looked at new definitions of who might benefit from the study and art of intelligence gathering. By 1992 regulars like you or me could attend and sit within the shell of the former hulk of a global analysis engine. Academics there focused on intelligence gathering related to revolution and independence (e.g. how to maintain profits in trade without being a colonial power).
I was asked by one professor to consider staying on for a PhD to help peel apart Ghana’s 1956 transition away from colonial rule, for only academic purpose of course. Tempted as I was, LSE instead set the next chapters of my study, which itself seems to have become known sometime during the second World War as a public/private shared intelligence analyst training school (Bletchley Park staff tried to convince me Zygalski, inventor of equipment to break the Enigma, lectured at LSE although I could find no records to support that claim).
Fast forward five years to 1997 and the Corner House is a good example of academics in London who formalized public intelligence reports (starting in 1993?) into a commercial portfolio. In their case an “enemy” was more along the lines of companies or even countries harming the environment. This example might seem a bit tangential until you ask someone for expert insights, including field experience, to better understand the infamous pipeline caught in a cyberwar.
Anyway, without me dragging on and on about the richness of an “outside” world, Bejtlich does a fine job describing some of the issues he had adjusting. He just seems to have been blind to communities outside his own and is pleased to now be discovering them. His “inside” perspective on intelligence is really just his view of inside/outside, rather than any absolute one. Despite pointing out how highly he regards academics who source material widely he then unfortunately doesn’t follow his own advice. His talk would have been so much better with a wee bit more depth of field and some history.
Eastman Kodak investigated, and found something mighty peculiar: the corn husks from Indiana they were using as packing materials were contaminated with the radioactive isotope iodine-131 (I-131). Eastman Kodak at the time had some of the best researchers in the country on its team (the company even had its own nuclear reactor in the 1970s), and they discovered something that was not public knowledge: those farms in Indiana had been exposed to fallout from the 1945 Trinity Test in New Mexico — the world’s first nuclear bomb explosion, which ushered in the atomic age. Kodak kept this exposure silent.
The American film industry giant by 1946 realized, from clever digging into the corn husk material used for packaging, that the US government was poisoning its citizens. The company filed a formal complaint and kept quiet. Our government responded by giving Kodak advance warning of nuclear tests, helping the company protect its film while any signs of dangerous fallout stayed hidden from the public.
Good work by the private sector, helping the government screw the American public without detection, if you see what I mean.
My point is we do not need to say the government gives us the best capability for world-class intelligence skills. Putting pride aside, there may be a wider world of training. So we also should not say the private sector makes someone the best in the world at uncovering the many ongoing flaws in government intelligence. Top skills can be achieved in different schools of thought, which serve different purposes. Kodak clearly worried about assets differently than the US government did, while the two still kind of ended up worrying about the same thing (colluding, if you will). Hard to say who evolved faster.
By the way, speaking of relativity, I also find it amusing that Bejtlich’s talk is laced with his political preferences as landmines: Hillary Clinton is set up as so obviously guilty of dumb errors you’d be a fool not to convict her. President Obama is portrayed as maliciously sweeping a clear and present danger of terrorism under the carpet, putting us all in grave danger.
And last but not least we’re led to believe if we get a scary black bag indicator we should suspect someone who had something to do with Krav Maga (historians might say an Austro-Hungarian or at least Slovakian man, but I’m sure we are supposed to think Israeli). Is that kind of like saying someone who had something to do with Karate (Bruce Lee!) when hinting at America?
And one last thought. Bejtlich also mentions gathering intelligence on soldiers in the Civil War as if it would be like waiting for letters in the mail. In fact there were many more routes of “real time” information. Soldiers were skilled at sneaking behind lines (pun not intended) tapping copper wires and listening, then riding back with updates. Poetry was a common method of passing time before a battle by creating clever turns of phrase about current events, perhaps a bit like twitter functions today. “Deserters” were a frequent source of updates as well, carrying news across lines.
I get what Bejtlich is trying to say about speed of information today being faster and have to technically agree with that one aspect of a revolution; of course he’s right about raw speed of a photo being posted to the Internet and seen by an analyst. Yet we shouldn’t under-sell what constituted “real-time” 150 years ago, especially if we think about those first trainspotters…
One might think history would be trivially easy, given how these days every fact is on the Internet at the tips of our fingers. However, being a historian still takes effort, perhaps even talent. Why?
The answer is simple: “the value of education is not the learning of many facts but the ability of the mind to think”. I’ll let you search to figure out who said that.
A historian is trained to apply expertise in thinking, running facts through a system of sound logic for others to validate, rather than just leaving facts on their own. It is a bit like a chef cooking a delicious meal rather than offering you a bowl of raw ingredients. Analysis to get the right combinations of ingredients cooked together can be hard. And on top of finding the results desirable, we also need ways to know the preparations were clean and can be trusted.
Take for example a BBC magazine article written about long distance communication, that cooks up a soup called “How Napoleon’s semaphore telegraph changed the world”.
This article unfortunately offers factual conclusions that are poorly prepared and end up tasting all wrong. Let’s start with three basic assertions the BBC has asked readers to swallow:
The last stations were built in 1849, but by then it was clear that the days of line-of-sight telegraphy were done.
The military needs had disappeared, and latterly the operators’ main task was transmitting national lottery numbers.
The shortcomings of visual communication were obvious. It only functioned in daytime and in good weather.
First point: Line-of-sight telegraphy is still used to this day. Anyone sailing the Thames, or any modern waterway for that matter, would happily tell you they rely on a system of lights and flags. I wrote it into our book on cloud security. The BBC itself has a story about semaphore adoption during nuclear disarmament campaigns. As long as we have visual sensors, these signal days will never be done. Dare I mention the line-of-sight communication scene in the futuristic sci-fi film The Martian?
Second point: Military needs are not the only need. This should be obvious from the first point, as well as from common sense. If this were true you would not be reading a blog, ever. More to the stupidity of this reasoning, the French system resorted to a lottery because it went bankrupt. The inventor had pinned all his hope for a very expensive system on military financing and that didn’t come through. So the lottery was a last-ditch attempt to find support after the military walked.
A sad footnote to this is the French military didn’t see the Germans coming in latter wars. So I could dive into why military needs didn’t disappear, but that would be more complicated than proving there were other needs and the system just wasn’t funded properly to survive.
Third point: Anyone heard of a lighthouse? What does it do best? Functions at night and in bad weather, am I right? Fires on a hill (e.g. pyres) also work quite well at night. Or a flashlight, such as the one on your cell-phone.
Try out the Jolla phone app “Morse sender” if you want to communicate over distance at night or in bad weather using Morse code. The real shortcomings of visual communication come with thick smoke (e.g. old gunpowder battles or near coal power), which leads to audio signals such as the talking drum, fog horns, bagpipes and songs or cries.
Ok, so all three points above are false, easily disproved, and tossed into the bin. Now for the harder part, the overall conclusion in two sentences from BBC magazine:
Smoke, fire, light, flags – since time immemorial man had sought to speak over space.
What France did in the first half of the 19th Century was create the first ever system of distance communication.
It is a shame the writer acknowledges fire and flags here, because those are exactly the facts we used above to disprove the article’s own analysis (they work at night, and are still in use). Now can we disprove “first ever system of distance communication”?
I say this is hard because I’m giving the writer benefit of the doubt. Putting myself in their shoes they obviously see a big difference between the “immemorial” methods used around the world and a brief French experiment with an expensive, unfunded militaristic system.
As hard as I try, honestly I don’t see why we should call the French system first. Consider this passage from archaeologist Charles Jones’ 1873 “Antiquities of the Southern Indians”:
Note this is a low-cost and night-time resilient system that leaves no trace. Pretty damning evidence of being earlier and arguably better. We have fewer first-hand proofs from earlier yet it would be easy to argue there were complex fire signals as far back as 150 BCE.
The Greek historian Polybius explained in The Histories that fire signals were used to convey complex messages over distance via cipher. A flame would be raised and lowered, turned on or off, to signal column and row of a letter.
6 The most recent method, devised by Cleoxenus and Democleitus and perfected by myself, is quite definite and capable of dispatching with accuracy every kind of urgent messages, but in practice it requires care and exact attention. 7 It is as follows: We take the alphabet and divide it into five parts, each consisting of five letters. There is one letter less in the last division, but this makes no practical difference. 8 Each of the two parties who are about to signal to each other must now get ready five tablets and write one division of the alphabet on each tablet, and then come to an agreement that the man who is going to signal is in the first place to raise two torches and wait until the other replies by doing the same. 10 This is for the purpose of conveying to each other that they are both at attention. 11 These torches having been lowered the dispatcher of the message will now raise the first set of torches on the left side indicating which tablet is to be consulted, i.e. one torch if it is the first, two if it is the second, and so on. 12 Next he will raise the second set on the right on the same principle to indicate what letter of the tablet the receiver should write down.
It even works at night and in bad weather!
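The torch scheme Polybius describes is essentially a coordinate cipher, simple enough to sketch in a few lines of Python. As an assumption for illustration, the sketch uses the modern Latin alphabet with I and J merged to fill a 5×5 grid; Polybius divided the 24-letter Greek alphabet the same way, leaving the last tablet one letter short, exactly as he notes.

```python
# A sketch of the torch cipher Polybius describes above.
# Assumption: modern Latin alphabet with I and J merged (25 letters),
# giving five "tablets" of five letters each.

ALPHABET = "ABCDEFGHIKLMNOPQRSTUVWXYZ"  # J folded into I

def torch_signal(message):
    """Encode a message as (left, right) torch counts: the left-hand
    torches name the tablet (row), the right-hand torches the letter's
    position on that tablet (column)."""
    signals = []
    for ch in message.upper().replace("J", "I"):
        if ch not in ALPHABET:
            continue  # the scheme carries letters only; drop spaces etc.
        idx = ALPHABET.index(ch)
        signals.append((idx // 5 + 1, idx % 5 + 1))
    return signals

def decode(signals):
    """Recover the text from a sequence of torch counts."""
    return "".join(ALPHABET[(row - 1) * 5 + (col - 1)]
                   for row, col in signals)

signals = torch_signal("ENEMY NEAR")
print(signals)          # [(1, 5), (3, 3), (1, 5), (3, 2), (5, 4), ...]
print(decode(signals))  # ENEMYNEAR
```

Three torches on the left and two on the right would thus mean third tablet, second letter, with the receiver writing letters down one at a time.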
Speaking of which, there may even have been an earlier system, perhaps as early as 247 BCE. Given the engineering marvel of the lighthouse Pharos of Alexandria, someone may know better about its use for long-distance communication by line-of-sight.
Has the point been made that the first ever system of distance communication was not the French during their revolution?
I think the real conclusion here, in consideration of BBC magazine’s attempt to persuade us, is someone was digging for reasons to be proud of French militarism. Had they bothered to think more deeply or seek more global sources of data they might have avoided releasing such a disappointing article.
When native Americans demonstrated excellent long distance communication systems, European settlers mocked them. Yet the French build one and suddenly we’re supposed to remember it and say…oh la la? No thanks, too hard to swallow. That’s poor analysis of facts.
If there is a quintessential American dessert it is the banana split. Why? Although we can credit Persians and Arabs with invention of ice-cream (nice try China) the idea of putting lots of ice-cream on a split banana covered in everything you can find but the kitchen sink…surely that is pure American innovation.
After reading many food history pages and mulling their facts a bit I realized something important was out of place. There had to be more to this story than just Americans love big things — all the fixings — and one day someone put everything together. Why America? When?
I found myself digging around for more details and eventually ended up with this official explanation.
In 1904 in Latrobe, the first documented Banana Split was created by apprentice pharmacist David Strickler — sold here at the former Tassell Pharmacy. Bananas became widely available to Americans in the late 1800s. Strickler capitalized on this by cutting them lengthwise and serving them with ice cream. He is also credited with designing a boat-shaped glass dish for his treat. Served worldwide, the banana split has become a prevalent American dessert.
The phrase that catches my eye, almost lost among the other boring details, is that someone with an ingredient “widely available…capitalized”; capitalism appears to be the key to unlock this history.
Immigration and Trade
The first attribution goes to Italian immigrants who brought spumoni to America around the 1870s. This three-flavor ice-cream often came in the three colors of their home country’s flag. No problem for Americans. The idea of a three-flavor treat was taken and adapted to local favorites: chocolate, strawberry and vanilla. Ice-cream became more widely available by the 1880s and experimentation was inevitable as competition boomed. It obviously was a very popular food by the 1904 St. Louis World’s Fair, which infamously popularized eating it with cones.
In parallel, new trade developments emerged. Before the 1880s there were few bananas found in America. America bought around $250K of bananas in 1871. Only thirty years later the imports had jumped 2,460% to $6.4m and were in danger of becoming too common. Bananas being both easily sourced and yet still exotic made them ideal for experiments with ice-cream. The dramatic change in trade and availability was the result of a corporate conglomerate formed in 1899 called the United Fruit Company. I’ll explain more about them in a bit.
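Those two trade figures are consistent with the quoted growth rate: going from roughly $250K to $6.4m is a 25.6× multiple, or a 2,460% increase. A minimal sanity check:

```python
# Sanity check of the banana-import growth figure quoted above.
imports_1871 = 250_000    # dollars, approximate 1871 figure
imports_1901 = 6_400_000  # dollars, roughly thirty years later

pct_increase = (imports_1901 - imports_1871) / imports_1871 * 100
print(f"{pct_increase:.0f}%")  # 2460%
```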
At this point what we’re talking about is just Persian/Arab ice-cream modified and brought by Italian immigrants to America, then modified and dropped onto the newly marketed banana of capitalism. Serving up all the fixings over a banana split makes a lot of sense if you put yourself in the shoes of someone working in a soda/pharmacy business of 1904 trying to increase business.
Back then Bananas and Pineapples Were The Exotic New Thing
Imagine you’re in a drug-store and supposed to be offering something amazing or exotic to draw in customers. People could go to any drugstore. You pull out the hot new banana fruit, add the three most-popular flavors (impressive yet not completely unfamiliar) and then dump all the sauces you’ve got on top. You now charge double the cost of any other dessert. Should you even add pineapple on top? Of course! The pineapple arrived fresh off the boat in a new promotion by the Dole corporation:
In 1899 James Dole arrived in Hawaii with $1000 in his pocket, a Harvard degree in business and horticulture and a love of farming. He began by growing pineapples. After harvesting the world’s sweetest, juiciest pineapples, he started shipping them back to mainland USA.
I have mentioned before on this blog how the US annexed Hawaii by sending in the Marines. Food historians rarely bother to talk about this side of the equation, so indulge me for a moment. Interesting timing of the pineapple, no? I sense a need for a story about the Dole family to be told.
The arrival of James Dole to Hawaii in 1899, and a resulting sudden widespread availability of pineapples in drugstores for banana splits, is a dark chapter in American politics.
James was following the lead of his cousin Sanford Ballard Dole, who had been born in Hawaii in 1844 to Protestant missionaries and nursed by native Hawaiians after his mother died in childbirth. Sanford was open about his hatred of the local government and had vowed to remove and replace them with American immigrants, people who would help his newly-arrived cousin James protect their family wealth.
1890 American Protectionism and Hawaiian Independence
To understand the shift Dole precipitated and participated in, back up from 1899 to 1890, when the US Republican Congress approved the McKinley Tariff. This raised the cost of imports to America by 40-50%, striking fear into Americans trying to profit in Hawaii by exporting goods. Although the Tariff left an exception for sugar, it still explicitly removed Hawaii’s “favored status” and rewarded domestic production.
Within two years of the Tariff, sugar exports from Hawaii had dropped a massive 40%, throwing the economy into shock. Plantations run by white American businessmen quickly cooked up ideas to reinstate profits; their favored plan was to remove Hawaii’s independence and deny sovereignty to its people.
At the same time these businessmen were convinced they would need to remove Hawaiian independence, Queen Lili`uokalani ascended to the throne and indicated she would reduce foreign interference in the country, drafting a new constitution.
These two sides headed directly at each other, and toward disaster, in 1892, despite the US government shifting dramatically to Democratic control (leading straight to the 1894 repeal of the McKinley Tariff). The Republican damage had been done; Dole used his own party’s platform as an excuse to call himself a victim needing intervention. As Hawaii hinted at more national control, the foreign businessmen in Hawaii begged America for annexation to protect their profits.
An “uprising” in early 1893 (a loyalist policeman who accidentally noticed large amounts of ammunition being delivered to businessmen planning a coup was shot and killed) was used as the premise to force the Queen to abdicate power to a government inserted by the sugar growers, led by Sanford Dole. US Marines stormed the island to protect the interests of elitist businessmen exporting to America, despite those businessmen only recently operating under a government that wanted a reduction of imports. Sanford’s pro-annexation government, ushered in by shrewd political games and US military might, now was firmly in place, as he had vowed.
The Hawaiian nation’s fate seemed sealed, yet it actually remained uncertain, as the newly elected US President opposed on principle any imperialism and annexation. He even spoke of support for the Queen of Hawaii. Congressional (Republican) pressure mounted and by 1897 the President seemed less likely to fight the annexation lobby. Finally in 1898, as war with Spain unfolded, Hawaii was labeled by the military as strategically important and abruptly lost its independence definitively.
Few Americans I speak with realize that their government basically sent the Marine forces to annex Hawaii based on increased profits for American missionaries and plantation owners delivering sugar to the US, and then sealed the annexation as convenient for war.
Total Control Over Fruit Sources
Ok, segue complete. Remember how President Sanford’s cousin James arrived in Hawaii in 1899, ready to start shipments of cheap pineapples? His arrival and success were a function of that annexation of the independent state; the creation of a pro-American puppet government lured James there to facilitate business and military interests.
This is why drugstores in 1904 suddenly found ready access to pineapple to dump on their bananas with ice cream. And speaking of bananas, their story is quite similar. The United Fruit Company I mentioned at the start quickly was able to establish US control over plantations in many countries:
Nearly half of Guatemala fell under control of the US conglomerate corporation, apparently, and yet no taxes had to be paid; telephone communications as well as railways, ports and ships all were owned by United Fruit Company. The massive level of US control initially was portrayed as an investment and benefit to locals, although hindsight has revealed another explanation.
“As for repressive regimes, they were United Fruit’s best friends, with coups d’état among its specialties,” Chapman writes. “United Fruit had possibly launched more exercises in ‘regime change’ on the banana’s behalf than had even been carried out in the name of oil.” […] “Guatemala was chosen as the site for the company’s earliest development activities,” a former United Fruit executive once explained, “because at the time we entered Central America, Guatemala’s government was the region’s weakest, most corrupt and most pliable.”
Thus the term “banana republic” was born to describe those countries under the thumb of “Great White” businessmen.
And while “banana republic” was intended by white businessmen to be pejorative and negative, it gladly was adopted in the 1980s by a couple of Americans. Their business model was to travel the world and blatantly “observe” clothing designs in other countries, reselling them as a “discovery” to customers back home. Success at appropriating ideas led to the big brand stores selling inexpensive clothes that most people know today, found in most malls. The irony of saying “banana republic” surely has been lost on everyone, just as “banana split” isn’t thought of as a horrible reminder of injustices.
In other words the banana-split is a by-product or modern representation of America’s imperialist expansion and corporate-led brutal subjugation of freedoms in foreign nations, during the early 1900s. Popularity of “banana republic” labels and branding, let alone a dessert, just proves how little anyone remembers or cares of the history behind these products and terms.
Nonetheless, you know now the secret behind widespread availability of inexpensive ingredients that made this famous and iconic American dessert possible.