Let Them Eat Recipes for Cake: Why Consciousness Never Will Be Code

Security professionals are intimately familiar with the tension between formalization and practice.

We can document every protocol, codify every procedure, and automate every response, yet still observe that the art of security requires something more. Making things Easy, Routine and Minimal-judgement (ERM) depends on a reliable source of Identification, Storage, Evaluation and Adaptation (ISEA).

A recent essay by astrophysicist Adam Frank in Noema Magazine explores a similar tension in consciousness studies, one that has profound implications for how we think about all intelligence, both human and artificial.

The tension here is far from new. Jeremy Bentham’s ambitious attempt to create a mathematical model of ethics—his utilitarian calculus—ultimately failed because it tried to reduce the irreducible complexity of moral experience to quantitative formulas. No amount of hedonic arithmetic could capture the lived reality of ethical decision-making. His codified concept of “propinquity” was never made practical, foreshadowing the massive deadly failures of driverless AI two centuries later.

In sharp contrast, Ludwig Wittgenstein succeeded in understanding language precisely because he abandoned the quest for mathematical foundations, despite his deep training in engineering and the philosophy of mathematics. His practical and revolutionary language games emerged from what he called “forms of life”—embodied, contextual practices that resist formal reduction. We depend on them heavily today as foundational to daily understanding.

Frank’s central argument is that modern science has developed what he calls a “blind spot” regarding consciousness and experience. The idiocy of efficiency—the rush to reduce everything to computational models and mathematical abstractions—means something fundamental to success has been forgotten:

Experience is intimate — a continuous, ongoing background for all that happens. It is the fundamental starting point below all thoughts, concepts, ideas and feelings.

The blindness of the efficiency addict (e.g. DOGE) isn’t accidental. It’s built into the very foundation of how we practice science, dangerously lowering the safety bar. As Frank explains, early architects of the scientific method deliberately set aside subjective elements to focus on what Michel Bitbol calls the “structural invariants of experience”—the patterns that remain consistent across different observers. That baseline, a reductive approach, may be far too low to protect against harm.

The problem emerges when abstractions are allowed to substitute for reality itself, without acknowledging fraud risks. Frank describes this as a “surreptitious substitution” where mathematical models are labeled as more real than the lived experience they’re meant to describe.

Think of how temperature readings replaced the embodied experience of feeling hot or cold, to the point that thermodynamic equations became regarded as more fundamental than the sensations they originally measured.

Meta is Fraud, For Real

This leads to what Frank identifies as the dominant paradigm in consciousness studies: the machine metaphor (meta). From this perspective, organisms are “nothing more than complicated machines composed of biomolecules” and consciousness is simply computation running on biological hardware.

And of course there’s a fundamental difference between machines and living systems. Machines are engineered for specific purposes, while organisms exhibit something far more substantive in what philosophers call “autopoiesis”—they are self-creating and self-maintaining. Meta is extractive, reductive, a road to death without a host it can feed on. As Frank notes:

A cell’s essence is not its specific atoms. Instead, how a cell is organized defines its true nature.

This organizational closure—the way living systems form sustainable unified wholes that cannot be reduced to their parts—suggests a different approach to understanding consciousness. Rather than asking how matter creates experience, we might ask how experience and matter co-evolve through embodied symbiotic healthy interaction with the world.

You Can’t Eat a Recipe

To understand this distinction, consider consciousness through the act of cooking to eat, rather than through mere computation. The recipe captures the structural patterns and relationships—the “how” and “what” that can be systematized and shared.

Actual cooking involves embodied skill, responsiveness to the moment, intuitive adjustments based on how things look, smell, and feel. There’s a tacit knowledge that emerges through the doing itself.

A skilled chef can follow the same recipe as an unskilled one and produce something entirely different. Ratatouille wasn’t really about a rat so much as about the lived experience that turns analysis of an environment into what I like to call, in AI security work, “compost in, cuisine out” (proving “garbage in, garbage out” a false and dangerously misleading narrative).

A lightning strike enlightens the animated film’s protagonist, like a Frankenstein turned chef

The consciousness-as-cooking metaphor isn’t just about following instructions—it’s about lived engagement with materials, real-time adjustments, the way experience shapes perception, which shapes action in an ongoing loop. OODA, PDCA… we know these loop models of audit and assessment are fundamental to winning wars.
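As a minimal sketch only (the function name, smoothing factor, and threshold of 5 are hypothetical assumptions, not from the essay), the point about loop models can be made concrete: each pass folds a fresh observation back into a running orientation, so decisions adapt to context instead of replaying a fixed recipe.

```python
# Hypothetical sketch of an OODA-style feedback loop over a numeric signal.

def ooda_step(orientation, observation):
    """One Observe-Orient-Decide-Act pass."""
    # Orient: blend the new observation into the running context
    orientation = 0.8 * orientation + 0.2 * observation
    # Decide: does the observation diverge from oriented context?
    decision = "investigate" if abs(observation - orientation) > 5 else "monitor"
    # Act: here we only record the decision; a real loop would change
    # its environment, which then feeds the next observation
    return orientation, decision

orientation = 10.0
decisions = []
for signal in [10, 11, 9, 30, 12]:
    orientation, decision = ooda_step(orientation, signal)
    decisions.append(decision)
print(decisions)  # only the sudden spike (30) triggers "investigate"
```

The same spike would be missed by a static rule tuned to yesterday’s baseline; the adaptation is the point, not the arithmetic.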

Frank’s emphasis on “autopoiesis” fits here perfectly. Like cooking, consciousness might be fundamentally about self-creating and self-maintaining processes that can’t be fully captured from the outside. You can describe the biochemical reactions in bread rising, but the seasoned baker’s sense of when a proper bagel is ready involves a different kind of knowing altogether.

AI Security is Misunderstood

This perspective has serious implications for how we think about artificial intelligence and its role in information security. When we treat intelligence as “mere computation,” we risk building systems that can process information but lack the embodied understanding that comes from being embedded in the world.

Everyone using a chatbot these days knows this intimately: ask about the best apple and the machine answers with the fruit when you meant the computer, or vice versa.

Frank warns that the deceptive reductionist approach “poses real dangers as these technologies are deployed across society.” When we mistake computational capability for intelligence, we risk creating a world where:

…our deepest connections and feelings of aliveness are flattened and devalued; pain and love are reduced to mere computational mechanisms viewable from an illusory and dead third-person perspective.

In security contexts, this might mean deploying AI systems that can detect patterns but lack critical contextual understanding that comes from embodied experience. They might follow the recipe perfectly while missing the subtle cues that experienced practitioners would notice.
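A toy illustration of that gap (the rule, addresses, and log events are all invented for this sketch): a naive signature detector follows its “recipe” perfectly, flagging any burst of failed logins, yet misses exactly the context a practitioner would weigh.

```python
# Hypothetical example: pattern matching without embodied context.
from collections import Counter

def naive_detector(events):
    """Flag any source with 3+ failed logins: pure pattern, zero context."""
    failures = Counter(e["src"] for e in events if e["type"] == "login_fail")
    return {src for src, count in failures.items() if count >= 3}

events = [
    {"src": "10.0.0.5", "type": "login_fail"},     # an admin mistyping
    {"src": "10.0.0.5", "type": "login_fail"},     # their own password
    {"src": "10.0.0.5", "type": "login_fail"},     # three times in a row
    {"src": "203.0.113.9", "type": "login_fail"},  # a patient attacker
]                                                  # probing once a day

# The rule flags the clumsy admin and ignores the slow, deliberate probe.
print(naive_detector(events))
```

The recipe was followed exactly; the cuisine was wrong.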

Palantir is maybe the most egregious example of death and destruction from fraud. They literally tried to kill an innocent man, with zero accountability, while generating the very terrorists they had begged millions of dollars to help find. I call them the “self-licking ISIS-cream cone” because Palantir is perhaps the worst intelligence scam in history.

Correct Approach: Embedded Experience

Rather than trying to embed consciousness in physics, Frank suggests we need to “embed physics into our experience.” This doesn’t mean abandoning mathematical models, but recognizing them as powerful tools that emerge from and serve embodied understanding.

From this perspective, the goal isn’t to explain consciousness away through formal systems, but to understand how mathematical abstractions manifest within lived experience. We don’t seek explanations that eliminate experience in favor of abstractions, but ones that account for the power of abstractions within the structures of experience.

Cooking School Beats Every Recipe Database

This might be why the “hard problem” of consciousness feels so intractable when approached mathematically—it’s like trying to capture the essence of cooking by studying only the recipe. The formalization is useful, even essential, but it necessarily abstracts away from the very thing we’re most interested in: the lived experience of cooking itself.

Perhaps consciousness studies—and by extension, our approach to AI and security—needs more public “cooking schools” and fewer Palantir “recipe databases.” More emphasis on cultivating the capacity for analysis and a curiosity for lived inquiry, rather than just dumping money into white supremacist billionaires building racist theoretical machine models.

This is the opposite of abandoning rigor or precision. It means recognizing that some forms of knowledge are irreducibly embodied and contextual. The recipe and the cooking are both essential—but they operate in different domains and serve different purposes.

For those of us working in security, our most sophisticated tools and protocols will always depend on practitioners who can read the subtle signs, make contextual judgments, and respond creatively to novel situations. The poetry of information security written here since 1995 lies not just in developing algorithms, but in the lived practice of protecting systems and people from harm in an ever-changing world.

The question isn’t whether we can build machines that think like humans, but whether we can create technologies that enhance rather than replace the irreducible art of human judgment and response. Like Bentham’s failed calculus, purely computational approaches to intelligence miss the embodied nature of understanding. But like Wittgenstein’s language games, consciousness might be best understood not as a problem to be solved, but as a form of life to be lived.

Perhaps the poet Wallace Stevens captured this tension best in “The Idea of Order at Key West,” where he writes of the sea and the singer who shapes our perception of it:

She sang beyond the genius of the sea.
The water never formed to mind or voice,
Like a body wholly body, fluttering
Its empty sleeves; and yet its mimic motion
Made constant cry, caused constantly a cry,
That was not ours although we understood,
Inhuman, of the veritable ocean.

The sea was not a mask. No more was she.
The song and water were not medleyed sound
Even if what she sang was what she heard,
Since what she sang was uttered word by word.
It may be that in all her phrases stirred
The grinding water and the gasping wind;
But it was she and not the sea we heard.

Consciousness, like the singer by the sea, is neither reducible to its material substrate nor separate from it. It emerges in the dynamic interaction between embodied beings and their world—not as computation, but as the lived poetry of existence itself.
