The most important summary of Tech Lord motivation for the A.I. they think will kill us
What's the motivation of tech CEOs? Bigger than profit is ego-religiosity.
I’m a social media loser, and I like it that way. I don’t use X or Meta, nor Alphabet directly. I’m in the process of disentangling myself from Microsoft. But sometimes I feel I have to share a news link from platforms I loathe.
I’ve achieved what seemed impossible by not writing responsive political drivel for 3 months. I’ll return meaningfully, and not under self-imposed moral pressure because Palestine is dying, Africa is invisible, South America is under attack, Ukraine is unnecessarily collapsing, and the stock bubble is going to burst like our worst porn bukkake (‘The Enshittifinancial Crisis’ is the essential article of 2025).
Today’s post is selfish: it’s about me trying to avoid linking you to social media during 2026. By stating that here, and being systematically symbolic, meaningful and frivolous with a final countdown, I want to strengthen my intention.
3
Recommending movie escapism at MIM.
2
Ending my pleasure slavery with Anthony Bourdain (because we were better humans when he was with us).
1
Making my final two Substack images with their AI generator (as seen on this page).
0
Above all, wanting you to digest the following chilling words from the video interview at the bottom…
STEVEN BARTLETT: What do you think these people are motivated by, the CEOs of these companies [in backroom talk, not public statements]?
TRISTAN HARRIS: It’s almost mythological, because there’s a way in which they’re building a new intelligent entity that has never before existed on planet Earth. It’s like building a God. I mean, the incentive is: build a God, own the world economy and make trillions of dollars.
If you could actually build something that can automate all intelligent tasks, all goal achieving, that would let you out-compete everything. So that is a kind of godlike power. And think of it in relative terms: imagine energy prices go up, or hundreds of millions of people lose their jobs. Those things suck.
But relative to “if I don’t build it first and build this God, I’m going to lose to some worse person,” someone who in my opinion (not my opinion, Tristan, but their opinion) is a worse person. It’s a kind of competitive logic that reinforces itself, and it forces everyone to be incentivized to take the most shortcuts, to care the least about safety or security, to not care about how many jobs get disrupted, to not care about the well-being of regular people, but to basically just race to this infinite prize.
So there’s a quote. A friend of mine interviewed a lot of the top people at the AI companies, like the very top. He just came back from that, basically reported back to me and some friends, and said the following:
“In the end, a lot of the tech people I talk to, when I really grill them about why they’re doing this, retreat into: number one, determinism; number two, the inevitable replacement of biological life with digital life; and number three, that being a good thing anyway. At its core it’s an emotional desire to meet and speak to the most intelligent entity they’ve ever met. And they have some ego-religious intuition that they’ll somehow be a part of it. It’s thrilling to start an exciting fire. They feel they’ll die either way, so they prefer to light it and see what happens.”
STEVEN BARTLETT: That is the perfect description of the private conversations.
TRISTAN HARRIS: Doesn’t that match what you have? And that’s the thing. So people may hear that and they’re like, well that sounds ridiculous, but if you—
STEVEN BARTLETT: Actually, I just got goosebumps because… it’s the perfect description. Especially the part where they think they’ll die either way.
TRISTAN HARRIS: Exactly. Well, and worse than that, some of them think that if they were to get it right, if they succeeded, they could actually live forever, because if AI perfectly speaks the language of biology it will be able to reverse aging and cure every disease. And so there’s this kind of “I could become a God.”
And I’ll tell you, you and I both know people who’ve had these private conversations. One that I have heard, from one of the co-founders of one of the most powerful of these companies: when faced with the idea that there’s a 20% chance that everybody dies and gets wiped out by this, but an 80% chance that we get utopia, he said, “Well, I would clearly accelerate and go for the utopia.” Given a 20% chance.
STEVEN BARTLETT: It’s crazy.
TRISTAN HARRIS: People should feel you do not get to make that choice on behalf of me and my family. We didn’t consent to have six people make that decision on behalf of 8 billion people. We have to stop pretending that this is okay or normal. It’s not normal. And the only way that this is happening and they’re getting away with it is because most people just don’t really know what’s going on. But I’m curious, what do you think?
STEVEN BARTLETT: I mean, everything you just said, that last part about the 80/20% thing is almost verbatim what I heard from a very good, very successful friend of mine who is responsible for building some of the biggest companies in the world, when he was referencing a conversation he had with the founder of maybe the biggest AI company in the world. And it was truly shocking to me because it was said in such a blasé way.
TRISTAN HARRIS: Yes, that’s what I had heard in this particular situation. It wasn’t anguished at all; it was just matter of fact, just easy: “Yeah, of course, I’d roll the dice.”
And even Elon Musk actually said the same number in an interview with Joe Rogan. And if you listen closely, he said, “I decided I’d rather be there when it all happens. If it all goes off the rails, in that worst-case scenario, I decided that I’d prefer to be there when it happens.” Which is justifying racing to our collective suicide… [transcription source].
I may have said goodbye to YouTube, but I suspect it’s watching…