Sacrifice, Truth, and Why We Fight
From Gettysburg to AI Alignment—How Human Meaning Survives Across Time
This is, indeed, a philosophy blog. But lately, I’ve been fighting a different battle: making moral interventions in a democratic crisis.
At the risk of immediately losing readers, I want to take a stab at demonstrating to you why philosophy matters—urgently—right now. It’s not just abstraction; it’s how we fit the fragments of reality into coherence. It’s how we see clearly—especially when clarity is under siege.
That’s what we’ll do here. We’ll connect some seemingly disparate dots—from the sacrifice of soldiers at Normandy to the challenge of AI alignment, from how societies determine truth to how humans transmit meaning across time.
Recently, a friend of mine asked me how my apparent romanticism about sacrifice—all this talk of Gettysburg and Normandy—fits into what I’ve described as a purely epistemic liberal ethic. It’s a fair question. I get it. “Epistemic liberal ethic” sounds technical, bloodless—reductionist, even. How does that square with reverence for men who bled into the soil?
Yes, it’ll get a little technical in places. But don’t mistake this for some nihilistic exercise in reduction. Quite the opposite. We’re going to use philosophical rigor to understand why certain things matter so deeply—and how different ways of seeing the same truth can sharpen, not dull, its significance.
Let me dive right in: To understand why this matters so urgently today, we need to look at sacrifice through an unusual lens. I believe sacrifice functions as a form of information transmission across time. When someone dies for an idea, they're encoding a message to the future at the highest possible cost. Think about what that means. These aren't just emotional appeals or rhetorical flourishes—these are data points authenticated by the ultimate human price.
For those less versed in philosophical jargon, let me break this down clearly. When I talk about an “epistemic” liberal ethic, I'm talking about how societies figure out what's true. And when I talk about sacrifice as “temporal information,” I'm suggesting that people who die for ideas are trying to tell us something they thought was desperately important for future generations to understand. Across time.
But why does this matter? Well, that's what we're about to explore.
When someone sacrifices their life for an idea—when they decide that transmitting a message to the future is worth dying for—they're engaging in the most expensive form of signaling possible. A soldier dying at Normandy isn’t just fighting Nazis; he’s speaking to us. He’s saying: ‘This system of self-governance, this belief that humans can rule themselves through law rather than force, is worth everything—even my life.’
The Nazis were also sending a signal through time, weren't they? Through their methodical documentation of their own atrocities, through their careful record-keeping of genocide, they were transmitting their own message to the future. They wanted future generations to know exactly what they did and why they thought it was justified. They too paid a massive price to encode their information—not through noble sacrifice, but through the systematic destruction of human dignity itself.
This creates a stark contrast in the signals being transmitted across time. On one side, we have people willing to die to defend the idea that all humans deserve equal dignity and the right to self-governance. On the other, we have people willingly destroying human dignity to transmit a message about racial supremacy and authoritarian control. Both signals were enormously costly to transmit—but they encode fundamentally opposing truths about human nature and possibility.
Think about what that means in information theory terms. If you wanted to design a system for transmitting extremely important information across time, you'd want some way to distinguish crucial messages from noise. You'd want a mechanism that makes sending false signals prohibitively expensive. Well, that's exactly what sacrifice does—it authenticates the message through the highest possible human cost.
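For readers who like to see the mechanism laid bare, here is a minimal sketch in Python of how a cost threshold filters signal from noise. Every name, weight, and number below is my own illustrative invention, a toy model of the intuition rather than anything from the essay itself:

```python
import random

# Toy model of costly signaling across time. Senders broadcast messages
# at a cost they choose to pay. Crucial messages are worth paying almost
# anything to transmit; noise and false signals are not. The receiver
# (a future generation) cannot read sincerity directly. It can only
# observe the cost that was paid.

def worth_paying(message_value: float, cost: float) -> bool:
    """A sender only pays the cost if the message matters more than the price."""
    return message_value >= cost

def simulate(n_senders: int = 10_000, cost_of_sacrifice: float = 100.0) -> float:
    received_crucial = 0
    received_total = 0
    for _ in range(n_senders):
        # Most messages are ordinary; a rare few are crucial to the sender.
        value = random.choices([1.0, 1000.0], weights=[0.99, 0.01])[0]
        if worth_paying(value, cost_of_sacrifice):
            received_total += 1
            if value == 1000.0:
                received_crucial += 1
    # Fraction of received messages that were genuinely crucial.
    return received_crucial / received_total if received_total else 0.0

if __name__ == "__main__":
    # With a trivial cost, the channel is almost all noise;
    # with a prohibitive cost, nearly every surviving signal is crucial.
    print(f"cheap channel : {simulate(cost_of_sacrifice=0.5):.2%} crucial")
    print(f"costly channel: {simulate(cost_of_sacrifice=100.0):.2%} crucial")
```

The numbers are arbitrary; the point is structural. Once transmission is prohibitively expensive, nearly everything that survives the filter is something a sender valued above the price.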
I want to be very clear: I don’t mean that every soldier storming Omaha Beach was consciously thinking, I am transmitting a message to future generations. Of course not. Many fought for their comrades, out of duty, or simply to survive. But sacrifice becomes legible as a signal because of what society does after. It’s the collective—through remembrance, through memorials, through the stories we tell—that encodes the meaning of their sacrifice. That’s how societies speak across time. The individual act is personal; the collective act is civilizational.
Of course, not every sacrifice speaks to a noble cause. People have died for fascism, apartheid, and holy wars. Sacrifice authenticates the signal—it doesn’t vindicate the message. The blood cost forces us to listen, but we must still judge what it is saying. Sacrifice is a transmission; context is the decoder.
Now, for those wondering what this has to do with my “epistemic liberal ethic”—stay with me. When I argue that liberalism is fundamentally about how societies discover truth, I'm talking about the frameworks we build to process reality collectively. Free speech, independent institutions, democratic processes—these aren't just nice ideas, they're the machinery through which human societies figure out what's real.
But here's where sacrifice enters the picture: Some truths are so fundamental, so crucial to human flourishing, that previous generations literally died to transmit them to us. When I express contempt for those who would casually dismantle democratic institutions, I'm not just being romantic or emotional. I'm recognizing the weight of information that was transmitted to us at devastating human cost.
Well, I am being romantic and emotional, actually. I can and do tear up when talking about this stuff. And you know what? That emotional response isn't separate from the epistemic framework—it's part of it. When we feel moved to tears thinking about sacrifices made at Normandy or Gettysburg, those tears themselves become signals, creating links back to our shared human condition.
Looking at sacrifice as temporal information isn't meant to reduce these profound human moments to cold data points. It's just one way of understanding the relationship between pieces of information across time. Another is the poetic frame—the way a soldier's last letter home can reach across decades to touch something deep in our hearts. The way a battlefield can feel like hallowed ground not because of any rational calculation, but because we can feel the weight of what happened there.
These aren’t competing frameworks—they’re complementary ways of understanding the same profound truth. When I analyze sacrifice as costly signaling across time, I’m not trying to strip it of its emotional power. I’m trying to understand why it moves us so deeply, why it should move us so deeply. The fact that I can examine this philosophically doesn’t make my tears any less real when I contemplate what those sacrifices mean.
And this—this collision of analysis and feeling, of reason and grief—is precisely what brings me to the frontier where philosophy meets our technological future. If we can't fully capture this meaning in philosophical analysis alone, how can we possibly encode it in artificial intelligence? If sacrifice transmits meaning across generations at the highest possible human cost, we have to ask: how do we carry that signal into a world increasingly governed by AI?
This is, I must warn you, dear reader, the part that makes AI alignment so damned hard. When I sit in rooms with brilliant AI researchers who reduce human values to optimization problems—who try to encode ethics into utility functions—I want to ask them: How do you plan to transmit the signal contained in a soldier’s sacrifice to an artificial intelligence? How do you encode the meaning carried in our tears when we stand on hallowed ground?
This isn't just about sentiment—it's about how human consciousness processes profound meaning. We can understand sacrifice both as costly information transmission across time and as something that moves us to tears, because our consciousness can hold multiple frameworks of understanding simultaneously. We can analyze and feel, calculate and weep, all at once. This isn't a bug in human consciousness—it's a feature. It's part of how we process the deepest truths about human existence.
But try encoding that in an AI system. Try reducing the multi-layered meaning of sacrifice—its function as temporal information, its emotional resonance, its role in binding human communities across time—into machine-readable parameters. The problem isn't just technical complexity; it's that something fundamental about human meaning-making resists this kind of reduction.
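To make the failure concrete, here is a deliberately naive sketch of such a reduction. The class, fields, weights, and figure are hypothetical, invented purely for illustration; the point is what the encoding silently discards, not what it captures:

```python
from dataclasses import dataclass

# A deliberately naive "machine-readable" encoding of a sacrifice.

@dataclass
class SacrificeRecord:
    lives_lost: int
    cause: str    # a label, not a meaning
    year: int

def utility(record: SacrificeRecord, weight: float = 1.0) -> float:
    """Collapse a multi-layered human event into a single scalar."""
    return weight * record.lives_lost

# Figure illustrative only.
normandy = SacrificeRecord(lives_lost=4414, cause="self-governance", year=1944)

# The scalar below is perfectly computable. What it cannot represent:
#   - why the cost authenticates the message (the signaling layer)
#   - the grief and reverence the event still evokes (the emotional layer)
#   - the remembrance through which a society encodes its meaning
#     (the cultural layer)
# Each layer does epistemic work that a single number throws away.
print(utility(normandy))  # -> 4414.0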
This is why I insist that humans are not optimization problems and society is not a problem to be solved. The way we process and transmit the most important truths about human existence operates on multiple levels—logical, emotional, cultural, temporal. When we try to reduce this rich complexity to pure computation, we lose something essential about how human consciousness actually works.
So I took a risk there, delving back into philosophical territory to show you something crucial: how philosophical thinking can illuminate deep patterns that connect seemingly disparate truths. I also just needed some “self-care.”
From sacrifice as temporal signaling to the AI alignment problem, from the preservation of democratic institutions to the transmission of human meaning across time—these aren't separate issues. They're interconnected parts of the same profound story about human consciousness and collective truth-seeking.
But more importantly, this philosophical framework reveals something simpler and more urgent: these people trying to overthrow the liberal democratic project are grand imposters. When they treat human society as an optimization problem, when they dismiss democratic institutions as inefficient code to be rewritten, when they ignore the costly signals transmitted to us through sacrifice—they reveal something deeper than mere error.
They reveal their blindness to how human meaning and truth actually work.
They’re not just wrong. They are wrong in ways that previous generations paid an enormous price to warn us about. The signals are there, authenticated by the highest possible human cost—if we’re still willing to receive them.
Philosophy, in the end, is not mere abstraction. It’s a tool for seeing these truths—truths that hold firm even when power insists otherwise. A tool for remembering why some things are worth defending, even dying for. And why those who would casually dismantle them deserve not just our opposition, but our contempt.
So I’ll try to keep the heavy theory to a minimum—because, in truth, there is a coup in progress in America. And perhaps this is, at bottom, an intellectual justification for something simpler: a reminder—to you, to myself—of why we fight.
“The struggle itself toward the heights is enough to fill a man’s heart. One must imagine Sisyphus happy.” — Albert Camus