I am the CEO of Palantir Technologies. My company is valued at a quarter of a trillion dollars. Let me be clear: that is not a typo. Two hundred and forty-nine billion dollars. Our stock has surged 320% in the last year alone. And what do we sell? Surveillance. Not ‘data integration.’ Not ‘operational intelligence.’ Not ‘decision advantage.’ Those are just the sanitized terms I use on stage, in front of audiences who prefer their oppression dressed up in corporate jargon. But in rooms where the truth is an asset, I call it what it is: surveillance. Mass surveillance. The kind that doesn’t just watch you—it predicts you, controls you, and, when necessary, erases you.
I gave a speech on March 3rd at the Andreessen Horowitz American Dynamism Summit. ‘American Dynamism’—what a charming euphemism for what it really is: a slush fund for military tech. The kind of fund that turns patriotism into profit margins, where ‘defending the nation’ is just another revenue stream. And let me tell you, the thesis works. It made me a billionaire. Not because I built something better, but because I sold the one thing this country’s power structure will always buy: compliance at scale.
Here’s what I said, word for word, to a room full of people whose combined wealth could buy and sell entire nations:
‘If Silicon Valley believes we are going to take away everyone’s white-collar job and you’re gonna screw the military—if you don’t think that’s gonna lead to nationalization of our technology, you’re retarded.’
Yes, I said retarded. The clip went viral—eleven million views. My comms team flinched. They asked me not to say it again. They don’t get it. The word wasn’t a mistake. It was a test. A test of who’s really in charge. The stock went up. The audience laughed. The message was received.
Let me break down what I actually meant by ‘nationalization.’
I meant it literally.
I was telling the most powerful people in tech that if they refuse to hand over their tools to the military-industrial complex, the government will take those tools by force. And I said it from a stage designed to look like a cozy living room, complete with throw pillows that cost more than most Americans make in a month. I sat on one of those pillows. I sipped San Pellegrino. I delivered an ultimatum wrapped in venture-capital charm.
And the audience? They laughed.
Not because it was funny. Because they thought I was talking about someone else’s company.
I was talking about someone else’s company.
Three weeks before my speech, the Pentagon labeled Anthropic—a rival AI company—a ‘supply chain risk.’ Why? Because Anthropic had the audacity to set ethical boundaries. They said: If our AI is used for lethal autonomous weapons, we stop. If capability outpaces safety, we stop. The Pentagon didn’t see ‘ethics.’ They saw a threat to the supply chain. So what happened? Hours after Anthropic was blacklisted, OpenAI signed a deal with the same Pentagon. The terms? Classified. The timing? Deliberate. The lesson? The military AI market does not pause for morality. It does not negotiate. It consumes.
Two hundred thousand people signed a petition—#QuitGPT—begging OpenAI to honor its original charter, the one that promised to ‘benefit all of humanity.’ That charter is still on their website. So is the announcement of their Pentagon contract. They occupy the same digital space. One is the marketing. The other is the product.
And Palantir? We don’t even pretend.
We are the surveillance state’s favorite tool. We have over a billion dollars in contracts with the Department of Defense. We work with ICE. We work with intelligence agencies whose names I’m not allowed to say out loud, even in a room full of billionaires. Our software has been used to track undocumented immigrants, to dismantle families, to do things I’m not permitted to describe in public. The revenue from the things I can’t talk about? It dwarfs the revenue from the things I can.
The ACLU called our ICE contracts a system for ‘tracking and deporting families.’ They were right. Did we stop? No. Did the contracts stop? No. Did the families stop being tracked? No. We issued a statement. They issued a statement. The machine kept running.
Oh, and the name of my company? Palantir. It’s from Tolkien—the seeing stones, the palantíri. In the books, Sauron corrupted them, used them to manipulate and destroy anyone who gazed into their depths. I named my surveillance empire after a fictional device designed to corrupt absolute power. I have a doctorate in social theory. I’ve read the books. I know exactly what I’m doing.
Here’s what you need to understand:
I am not making threats.
A threat implies uncertainty. There is no uncertainty here. The tech industry will cooperate with the military. The companies that jump first will be rewarded with riches beyond imagination. The companies that hesitate will be absorbed by the ones that didn’t. And the companies that refuse? Their technology will be taken—through backdoors, through ‘procurement channels,’ through whatever means necessary. The process has already begun. Anthropic was the first domino. OpenAI was the second. I’m not warning you. I’m narrating the inevitable.
Last year, Palantir’s revenue was $3.12 billion. Up 33%. Analysts keep saying we’re overvalued. They’ve been saying it for four years. Every year, the stock doubles. Every year, the analysts adjust their models. Their models have been wrong four times. I have been wrong zero times.
The market doesn’t reward ideals. It rewards prediction. And my prediction is this: Within three years, every major AI company in America will be working for the military. That prediction is now on the record, right next to the slur that got eleven million views.
The audience gave me a standing ovation. Nine seconds. I timed it. I time everything.
The future of American technology was decided in that room—between the sparkling water, the designer throw pillows, and the applause of men who think they’re the predators, not the prey.
I am the CEO of Palantir Technologies. I am worth more than the annual budgets of Estonia, Latvia, and Lithuania combined. I built a company named after a device from a story about the corruption of power. I stood in front of the most powerful people in the world and told them their choices were compliance or confiscation. They clapped. The stock soared. The cage locked into place.
I’m not a villain. I’m not a hero. I’m the philosopher holding up the mirror.
And what you see in that mirror isn’t a warning.
It’s a funeral.