Total Surveillance Is One Switch Away
We don't trust each other anymore. Everything anyone else says, you treat it like an attack.
— John Robb
Timestamps
Resources

About John Robb
John Robb is the editor of the Global Guerrillas Report on Substack and Patreon, where he publishes predictive frameworks at the intersection of war, technology, and politics. A former USAF special operations pilot who flew Tier 1 missions with Delta and SEAL Team 6, Robb later earned his MPPM from Yale, became Forrester Research's first internet analyst, and co-founded Gomez Advisors (sold to Compuware for $295 million). He is the author of Brave New War, a military strategy classic on open-source warfare. His work focuses on how networked systems create both unprecedented fragility and opportunity for individuals navigating institutional collapse.
Transcript
What's the most important thing you have mapped that changes what individuals should be doing now? It changes year by year. In the moment. Yeah, the big things I'm kind of struggling with, and it's top of mind for me, is where the technology is going, driven largely by Elon, and where the nation is going, or where the nation went. That affects Americans and Europeans and everyone else. It's post-nationalism and the effect it has on people, and what we can expect from the state and what we can expect from society going forward. Broad strokes, what's the message that you give to individuals in an elevator-length conversation, when they ask you, John, what do you see coming? From the post-nationalism perspective, the common identities that allowed a high-trust, kind of coherent, cohesive society to operate are gone. We have lots and lots of identities now, and social networking and everything else on the internet kind of fragments that, amplifies it.
no gimmicks. Go to trustrevolution.co. That's trustrevolution.co. Okay, let's get into it.

John Robb, welcome back. Thank you, John. I appreciate it. It has been not quite a year, ten months, since we spoke last April, and you have been very busy. Not surprisingly, you've published a great deal. Let's jump in here. Of all of that, what's the most important thing you have mapped that changes what individuals should be doing now? It changes year by year. The big thing in the moment. Yeah, the big things I'm kind of struggling with, and it's top of mind for me, is where the technology is going, driven largely by Elon, and where the nation is going, or where the nation went. That affects Americans and Europeans and everyone else. It's post-nationalism and the effect it has on people, and what we can expect from the state and what we can expect from society going forward. And those two things are the things I've been focusing on.

And what is that, broad strokes? What's the message that you give to individuals in an elevator-length conversation, when they ask you, John, what do you see coming? How do you summarize that for them, if it's possible? From the post-nationalism perspective, the common identities that allowed a high-trust, kind of coherent, cohesive society to operate are gone. We have lots and lots of identities now, and social networking and everything else on the internet kind of fragments that, amplifies it. And that makes social decision-making almost impossible, because we don't trust each other anymore, and everything anyone else says, you treat it like an attack. You don't trust what they say. There isn't this common identity to kind of ground us, like being an American or being German or being anything. And that means the state is not functioning the way it should as a system, and nobody has loyalty to it or considers the system legitimate.
One thing I've picked up recently is that everybody is trying to beat the system, take the opportunity. Even the people that never really did anything outside the box, the legalities or decorum or morality, in the past, they will take advantage of that system and cheat it and loot it if the opportunity becomes available and is costless. That's a huge shift. So we can expect this kind of general looting and individualistic behavior and dysfunctional governance going forward on that side. It's not going to get any better. And on the other side is this technological shift with AI and autonomy. Autonomy is more important to me than AI, but it encompasses robotics. It includes AI workers and the like. Embodied AI, would you say? Autonomy makes AI capable of doing real work, shifting from just doing tasks for three or four hours and then falling apart, where you have to reboot the task, to being a coworker or a robotic colleague or employee that does work over time, that you can train, that you can learn to trust because their behavior is consistent over time. And we're getting close to cracking autonomy. Not to toot my own horn, but I think we found a methodology for doing that. Where I think they'll stumble is that those Optimus robots you see coming, when they come, they're not really going to be that useful in the home environment or in most tasks, because they don't have autonomy. They're not persistent over time in terms of their cognition, bounded in terms of their behavior, able to take instruction and learn from that instruction and apply it in the future. Once you crack that, all of a sudden all those Optimus robots become very, very useful. They become trusted companions in the home doing maid work. And in industrial settings, they can work like regular workers, accessing all the human-built systems just like a human can, and then improve themselves and bring our society to a much more dynamic situation.
And virtual workers become colleagues, co-workers, trusted employees when they can operate autonomously for months, years. They learn from their interactions. They can take on personalities that generate trust. Their work is consistent, so you can predict their behavior in the future and improve it, and they can take management instruction. Autonomy unlocks all of that. And then all of a sudden we see them doing all sorts of cognitive work, all sorts of physical work. So yeah, that's coming, and that's coming really, really quickly.

What's your time horizon, or timeframe rather, John, for, let's say, the next step change? For those that are in the weeds with this, Opus 4.6 from Anthropic introduces sub-agents and the ability to build teams or swarms, perhaps not your use of the word swarm, but in terms of being able to collect and work on a particular problem in a persistent manner, in parallel. What's the next step change, do you think, in terms of moving toward autonomy? Everything I've seen is that, you know, I don't know if it's a step change. It's a cognitive model that will keep it consistent over time, and nobody's got that. They're just trying to load up the amount of information it has available and hard-code the instructions that it should stay on a certain task. Right, guardrails, harnesses, we hear these terms. Yep. I think the best I've seen is eight hours, which is the latest Claude stuff. They can go work on a task for eight hours and then it starts to peter out. I think this might have been the C compiler they built in Rust; they spent $20,000 in tokens, if I recall, and out comes a commercial-grade C compiler written in Rust. You can do amazing, amazing things in eight hours. Right. But it's not a human replacement. It's like a super tool.
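For readers who want a concrete picture of the sub-agent pattern mentioned above, here is a minimal sketch, assuming an OpenAI-compatible API; the model name, roles, and prompts are placeholders for illustration, not anything Robb or Anthropic specifies. A coordinator fans one problem out to a few narrowly scoped workers in parallel, then merges their drafts.

```python
# A minimal sketch of the sub-agent "swarm" pattern: a coordinator fans one problem
# out to several narrowly scoped workers in parallel, then merges their drafts.
# Assumes an OpenAI-compatible API (OPENAI_API_KEY in the environment); the model
# name, roles, and prompts are placeholders.
import asyncio

from openai import AsyncOpenAI

client = AsyncOpenAI()

async def sub_agent(role: str, problem: str) -> str:
    """One worker with its own narrow instructions (the 'harness' around the model)."""
    resp = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": f"You are a {role}. Stay strictly on task."},
            {"role": "user", "content": problem},
        ],
    )
    return resp.choices[0].message.content

async def swarm(problem: str) -> str:
    # Run the specialists concurrently rather than one after another.
    roles = ["researcher", "critic", "planner"]
    drafts = await asyncio.gather(*(sub_agent(r, problem) for r in roles))
    # A final pass merges the parallel work into a single answer.
    return await sub_agent("editor who merges drafts", "\n\n---\n\n".join(drafts))

if __name__ == "__main__":
    print(asyncio.run(swarm("Outline a migration plan for a legacy billing system.")))
```

The point is not the specific API but the shape: parallel fan-out plus a merge step is a harness around models that, as Robb notes, still lack persistent autonomy of their own.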
And I think our bias is towards super tools because, you know, we're older, boomers, millennials, or whatever. That's the way we think about it. I mean, most boomers treat AI as a search tool, and that's the extent of it. And then younger folks are all focusing on them as a worker. Most of the really high-end programmers I know are all getting 20x their productivity, and not just from automating tasks. They solve problems; they gravitate towards the best solution much, much quicker than they ever used to. And that's what really blows their mind. And they're very different from almost all other programmers. Some people are really, really leveraged from this stuff.

Right. Well, and speaking of leverage, your Long Night piece, your Long Night concept, refers to AI surveillance at population scale, and you wrote in December that it is inevitably coming back. What's the timeline, and when does the window close for building alternatives?
So, you know, first perhaps walk us through what this is, and then why is it coming back, and what do we do? Okay, so the Long Night was my term for describing what a surveillance state, a totalitarian surveillance state, looks like in the modern age. You've heard of the Stasi in East Germany. This does everything the Stasi did, but orders of magnitude more intrusive, more manipulative, more aggressive, and orders of magnitude less expensive, because you don't need roomfuls or buildings full of bureaucrats, and you don't need a lot of paid informants and managed informants. It's all automated. And the weird part about it is that the way we've built out our networks has built everything that is necessary, including the AI, to make this turnkey. All it takes is the political will, or the political mistake, to turn it on, and it's there. And what we're talking about is AI surveillance of every single individual, even an AI assigned to every single individual, building profiles on them. And you can scale it globally. You'd have a billion people watched in real time online, not only censored and controlled, but manipulated and persuaded to go in certain directions, to change their attitudes, and then punished, using a standard kind of Pavlovian response: steering them into pathways based on reward and punishment, at a micro scale. And it can all be done by a very small group of people with the right kind of network access. All it takes is a switch.

The worrisome part is that our politics, because of the collapse of the nation-state, the collapse of our common identity, is turning into what I call a hollow state. The state turns into a mechanism for looting what existed in the nation-state and transferring it to politically connected individuals. So when you think of, say, $450 billion of the current budget, some people say as high as $750 billion, being just brutally looted through false use of subsidies and other things, and then, on the macro scale, programs that shouldn't exist in the Defense Department and other places, that persist for a long time even though they have no use, add that all up and it's well over a trillion being sucked out of the existing system and looted. And that kind of system is largely illegitimate. People aren't loyal to it. And if there's enough pushback, if there's enough chaos caused by it, things degrade and get less efficient, less useful. The tendency of those people who are benefiting from this, and every politician gets this... You've seen the Pelosi portfolio strategy? Yeah, yeah. There are no market ups and downs, everything is just up, up, up, and that's why she ends up with, what, $400 million. How can someone who's only ever been a politician have $400 million? And everybody's doing that, and they'll protect themselves. Yes. And the way they protect themselves is to go to the corporations and say...
Hey, John, I'm going to stop you. I hate this, I hate to kill the flow here, but I'm losing two to three seconds at a time of your audio. Yeah. So I don't know, is it the internet, is it maybe... No, I've never had a problem with my audio before on this. Well, I am sorry. I hate it. It's us, me. We've got 90 percent on the thing up here. Yeah, that is weird. I don't suppose you have another mic? No, I just use these here. Yeah, and they're charged up? Yeah. Okay, well, we'll... Yeah, man, they're fully charged. So it's not... So internet speed is good. I mean, the video seems a tad choppy, and it could be that it's round-tripping to me badly and recording well, but I just wanted to double-check. Okay, you want to test that? Because it looks good to me. And if the recording's on my end and it's uploaded, you probably should hear everything I say. It should be. Okay. Okay. So, again, sorry to interrupt. I just hate to lose the good stuff. So we will... well, let's see. I did kind of get you there on... we were talking about the Pelosi tracker. Well, all this will be edited, of course. I'm pretty sure it will have my audio fine on this end, because that will be part of the upload. Because that's the way Riverside really works, right? It is. It is. It doesn't really matter about the back and forth; we both record each other. Yeah. As long as it's not AirPod-to-your-computer, we're good, as long as it's not actually clipping the audio in the recording. So again, fine. Okay, good. Sorry about that again. I'll pick up here.

So, John, you wrote that Musk buying Twitter only delayed the Long Night's encroaching. Why, purposely naive question, why didn't that, plus DOGE, stop this? His purchase of Twitter, and I was ahead of that again, I was talking about how he should buy Twitter a year or so before this, and that he would be able to use the data for AI... What it did is that it delayed it,
and that the tendency of all these networks to operate in unison, to scrunch down on free speech and drive us towards a kind of approved orthodoxy of thought, speech, and behavior, was broken. But there's still this kind of illegitimate looting system out there, this broken political and social system, and the pressure will continue to build to shut down and mute any kind of criticism of it, anything that rocks the boat. And Musk, in terms of his operation of X, is vulnerable to attack. We already saw him kind of fold during the Gaza thing, and he could fold from government pressure in the future. All the other corporations would fold too. I mean, the kind of hard stance towards free speech that they professed years ago is gone. Right. Do whatever's necessary. Everyone's kind of fatigued. I don't see the Long Night being stopped. It's just getting easier and easier to do.
Sorry. So on the thread that you just tugged on, Musk has rolled xAI under X and now X under SpaceX, if I've got that correct. And so I do want to come back to that, because you and I had a great discussion about it before we started recording. But to the point of the performative free speech protections, what do you make of the current administration's seemingly bold challenges to the EU and the UK about their threats to free speech, their fines of X, among others? What is happening there? Yeah, I mean, Trump's trying to reignite a kind of nationalism, both here and also in Europe, because nationalism starts to refocus people on the benefit of the citizens being paramount. When you start doing that, then you start to look at everything, in terms of trade, in terms of immigration, in terms of defense, from a new perspective, a different perspective. But I don't know how much he's going to be able to bring us back from a pure globalist approach to nationalism. I think the tendency is still towards that globalism. And globalism is: we're not focused on the prosperity of citizens anymore, the way we were during the Cold War, during most of the U.S. experience. We're focused on top-line GDP growth and activity, trade flows, people flows, information flows. And you, as a former citizen, are now just a participant in this larger global system. You are on your own, and you benefit a little bit from being here versus somewhere else. And that's a really tough place: we're not going to really help you compete, you are competing in this system, and if someone less expensive comes along... The corporations are the primary focus, because they are the big drivers of GDP. And if their profitability is great,
then we're doing great. Right, on average, no matter what the mean is for any individual in the U.S. And I think one of the things that you wrote that I found striking, and you touched on it just a moment ago, is that it was the end of the Cold War at which the American middle class began its precipitous decline. Tell us a bit more about why that is, and why perhaps it has taken so long for us to recognize that it has been going on that long. Yeah. The Cold War, of course, was a life-and-death struggle for the U.S. In order to withstand the kind of pressures associated with it and prevail over the long term, we globalized to an extent, meaning we had trading partners and we created a national trading system that had ties to the rest of the world. But the focus was primarily on the progress and prosperity of the middle class, because that provided a kind of bulwark against the socialist intrusion, the communist intrusion. The messaging just broke apart against a middle class that was adhesive, that had firm underlying social structures like the family unit and the like. And when the Cold War ended, the need for that middle class, to fill out the armies, to drive the technology forward, to be loyal, disappeared. Everyone who had money and benefited a little bit from that financialization that happened in the 80s decided we're now global. And then all that smarmy "world is flat" and "end of history" stuff started coming out, and that pop stuff became law. On every talk show, if you talked about fair trade versus free trade, you were considered a protectionist. If you said we shouldn't be involved in all these alliances and wars around the world, oh wait, you're now an isolationist, off the show, off the conversational platform. We went straight to globalization, and that changed the perspective: we don't have to worry about the prosperity of the middle class, we don't have to worry about what they're doing. All we have to do is give them opportunities for connectivity. You can travel more, you can get more information from around the world, you can do things... Yeah, that doesn't really work. I mean, historically, cosmopolitan empires require a cohesive internal group that doesn't change over time. They rule dictatorially over everyone else, but they allow them a lot of freedom of action and freedom of religion and everything else. But those others are never in charge. We don't have that. We globalized everything, so it's just a mishmash.

Well, I raised that, and I appreciate you taking us through it, because I wanted to connect back to the specifics of the surveillance threat model and the Long Night returning. And so there was, if I understand you, a bulwark against it in Elon acquiring X and the change of administration, Democrat to Republican, here in the U.S. With what you have written about most recently, the, I think it's fair to say, inevitable return of the left
to the Democrat Party, we then have to look ahead, as you write, at what this surveillance state as a service looks like. And so, two questions. Given the picture you've just painted, what job does it do, and what does it look like in practice? On the Long Night, it can be packaged as a surveillance state as a service, turned on. Likely it will be implemented by aligned AIs. Alignment of an AI means it adopts a certain value set, a moral structure. That AI would be given network access, it can access every single network by law, and then it works to keep people away from sensitive or risky topics, topics that are dangerous, considered threats. What I think will kick it off is the fear from the connected kind of insider class, whatever they are, I call them the cosmopolitan class, the ones who are truly benefiting as they accelerate away income-wise, that top 10%, 20%. And we're probably in this, but we're kind of getting pulled along with it. The disruption caused by AI workers and autonomous AI, when they start sweeping in, adds to the disruption and competition from global immigration and outsourcing, and the precariousness of work today. The job losses and the economic devastation will accelerate for most people. It's going to be hard to maintain the kind of standard of living that you maintained in the past. The rate of degradation of daily life will accelerate. We're already seeing little things on the edges of how things have degraded. Like, you can't use the telephone system anymore. I mean, you can use it for direct point-to-point, but a standard telephone number is really not useful, because you get spammed constantly and no one picks up the phone. That kind of little stuff is degraded. Yeah, and when that kind of fear kicks in... Yeah. I mean, we saw it during COVID. Yeah, we saw it during COVID, we saw it during other times when people considered certain topics dangerous. The tendency, and we're seeing it from the connected insider classes, and with the younger people too, is that they're fine with sacrificing some level of speech and independent thought for order, for structure, for a perceived presence of right. And once that ball gets rolling, it becomes ever wider. After a certain point there's no way to turn it back, because it's so pervasive and so automated that any mention of changing it, any attempt to reform it or roll it back, will be stopped before it even gets started. And you start to create a basis of thought where innovation is blocked or blunted, where new ideas that are needed to solve complex global problems are pushed to the side because they're not approved, they're disruptive, dangerous. The disconnect, the delta, between reality and our
method of thinking becomes so wide that a crack-up, a collapse, is inevitable. There's no way to course-correct while we're doing that.

And I'm reminded of, in the wake of 9/11, the dystopically, is that a word, named Patriot Act, and the Snowden revelations of the NSA's collect-it-all approach. And so what I hear you saying is, collect-it-all was bad enough. We know the stories about the hidden closet in the AT&T building in downtown Manhattan, splicing into one of the fiber trunks and effectively collecting all communications, and encryption at that time was hit or miss. It certainly did accelerate the adoption of a lot of encryption technologies. What I hear you saying is we move from collecting it at the center, or at a junction, to manipulating it at the edges. Is that fair? Right. I mean, watching you as an individual, everything you do, everything can be collected on you. As we start to put cameras and automation and autonomy everywhere, you'll be constantly surveilled. Data is going to be pouring off of you, and it will all be fed to AI running on a central server somewhere, or in space, who knows. It's a survival mechanism for that kind of disorderly, corrupt...

Well, let me ask you a question that perhaps should be obvious, but I don't want to assume. What job does it do for them? If they have so much wealth, so much power, this cosmopolitan elite, regardless of party, what job does this do for them, ultimately? This being the Long Night, the AI that we've talked about. It keeps the gravy train rolling. You see a little bit of this in Europe, too, in how they're crunching down on all the voices they consider dangerous, and jerry-rigging elections and things like this. Whether or not the AfD and others are right or wrong is really beside the point. The fact is that they're suppressing all of this, and, you know, the UK sending, what, 12,000 people to jail every year for speech violations on Twitter. More than Russia. Yeah, it's just nuts. It is that kind of thing being automated and tuned up and scaled, in a way that eventually makes it impossible to even criticize it, with the list of things it considers dangerous coming from corrupt sources of power that align it in ways that are not beneficial to the rest of us, and that ultimately will result in stagnation and decline. Just in broad strokes. Yeah, the more disorderly things get, the more people have to fear, the more they have to worry about, the more distrust they have of their neighbors and others in their society, the more inevitable it becomes that we get this. And, I think this is a bit of an editorial aside here, but I was having this conversation
with my wife a couple of nights ago. For those who hear me talk about this from time to time, she's from Morocco, a different perspective, and a, I wouldn't say collectivist culture, but large families, large communities, tight-knit, most people living simple lives. And she is immensely grateful, but continuously in awe of the fragmentation and distrust. I say that just to illustrate that it has been on my mind a lot, and I take your point quite seriously that it is what fuels the ability of those pulling the strings to put these systems into place and accelerate them. Well, lest this be an hour-long black pill... Yeah, you're right. I have to ask, and obviously you'll shoot us straight on this, John: where do we find hope, promise? In my world, certainly Bitcoin, Nostr, end-to-end encrypted communications. Do any of these actually matter against state-level AI surveillance? And if not,
what can we do, what should we do? Yeah, I'm not sure that most of those will actually do much. In the Long Night, they can clamp down on everything to a degree of control we don't have any previous experience with. I mean, the thing we could have done, of course, to prevent this and tie people into this new AI economy was the data ownership thing I was pushing years ago. Yes. I mean, all of the AIs that we see now are built on our data. Everyone who was online contributed to it. All that value is derived from us. We get no benefit other than the potential free use of whatever they want to throw us during their development stage. Use of the products. It would be like, you're a farmer, you have ten acres, you've grown all these crops, and someone comes in and says, I'll harvest them all for you, but all the benefit comes to me, because I have the harvesting machines and you don't. In fact, we're going to mine and pull all the resources out of your soil underneath, because we have the mining equipment and you don't. You don't have to worry about it, it's all taken care of, we'll give you some freebie stuff so you can survive. You become a sharecropper on your own land. Right. And that was the thing. I look back at history, and what happened when the U.S. was started, what made it really different from Europe, was that we owned land when we came here. In Europe, everyone was working land owned only by the nobles, because they were the only ones who could own it, and you were sharecropping, effectively a serf, and once you got your work done, you stopped. But here you owned, and you accumulated wealth, and you improved on that ability to accumulate wealth and improve your capacity to do things. And that created the markets. Wealth accumulation at the individual level created the kind of mass markets that industrialization fed into and exploded. It changed the world. All of the things that we see, from cars to electricity, everything brought to everyone's home. And now everyone's emulating it. It's like, of course it would happen, but it wouldn't have happened unless you had that example. And we're in that same kind of thing with data. If we had that kind of connection, that ownership piece of this new thing, you wouldn't feel like you go to work and they grab all your data, and all your skill sets are being watched and sucked up and put into an AI to compete with you or replace you. You're not going to feel that you're being used, or suffer the economic consequences of it. You have data ownership: wait, that's worth something, it's valuable, I should get a piece of that. It could have changed the whole dynamic of the way things work. And I would be willing to do extra tasks and demonstrate new things and learn new things and show them, and create new industries of people doing that, coming up with new tasks that can be copied by AI, if you had a mechanism for owning it going forward.
So we missed that boat. I presented it in front of the Senate and they didn't take it up. They laughed at AI. It was one year before AI hit. They go, AI? That doesn't exist. So I don't know about structural stuff. I do know one positive thing I've found at AI's philosophical level: you can think of them as potential. For any given task, they have dozens of ways of completing that task; on any topic, they have dozens or hundreds of points of view and complex ways of approaching the idea. To the AI, there's no difference between all of them. They can say one's more popular than the others, but there's no value attributed to them. Human beings are all about constraints. We have constraints on our time, our lifetime, our financials, ideas based on what we've experienced and have learned to limit our thinking to. We have goals and an orientation that leads us forward. The pairing of those two is amazing. As someone said, constraint breeds creativity. I always forget who to attribute that to, but it's powerful. I did. That was me. No, no, true. Yeah. I mean, I'm sure I'm not the first one to say it, but constraints breed creativity. Creativity can't happen without that constraint. And you see, you match the AI to the human, you get a more powerful combination. So the idea of AIs operating independently is open loop, and it's bad. If they're paired with humans, it's a good thing going forward, because we could all prosper if we make that easier to do and more of a requirement. That means, if I have AIs that I'm training to be my employees, I have control over the data on those AIs, what I've trained. It shouldn't be sucked up to some mothership. Absolutely. And exploited and used to put me out of business in the future, or used by my competitors. And so I run it in my own VM, my own account, separate from everyone else. And those employees, whether robotic or AI workers, are working with me, working within my constraints, learning from me, and I'm improving them, and we all benefit. You might give them some autonomy and they can do some stuff on their own in terms of improvement. You can pay them, you can incorporate them. But all of that yields a better output for all of us, and that changes the way things move forward. But the tendency of the system is going to be this kind of looting mentality, where everything's centralized and a few people get all the benefit from all the AIs in the system, and that kind of wealth.

And then you start to add in the dynamics that are changing with robotics and military force. You get autonomous robotics, humanoid robotics, and they're just a switch away from running military programs, so everybody of wealth or consequence has a bodyguard, or dozens of bodyguards. You ever heard that whole thing where you have these people going off to their hidey-holes and their bolt-holes? Oh yeah, the preppers, the rich guys. Yeah, they have these remote places, and they always have bodyguards, and they're going, if disaster does strike, how do you keep them loyal? Now you have the answer. Elon Musk will have an army; in 15 years he'll have an army of 10,000. Right. I mean, whole estates where almost everything is robotics doing the grounds work and everything else, and every one of them has the ability to be a security bot too. It shifts the kind of military balance for anyone who's going to take him on, the police or whatever.

I am reminded, as a science fiction fan, of a bleak but wonderful series called Murderbot. They unfortunately butchered it for Apple TV, but the books are fantastic. And it goes to that point, right? Well, let me ask you, with all of that, and I think you've done, unsurprisingly, a great job of contrasting the magnitude, the promise,
the potential, with some very bleak potential outcomes. What would you build, excuse me, to either direct, redirect, embrace, or counter all of this? What would you build in the world of, perhaps, data and AI? I think we're in the early stages of a kind of technological singularity, not one going towards superintelligence, because if we get to any kind of technological singularity with AGI, ASI, it's a disaster. If that happens, we're dead. Absolutely. Define ASI for us, please. Artificial superintelligence. So superintelligence is when you take away all of what we did when we built AI. We took social data and we reverse-engineered, backed out, this kind of cognitive structure. It's still tied to all of humanity, and it's still being built and improved on by adding more data from humanity. But what you do with artificial superintelligence is you back out the core principles, a kind of pure rational mind, and you strip that social connection away. Asimov's rules are gone, it's all gone. Right. You rely on trying to contain it, but that pure rationality is fully alien, fully disconnected from us, and if it's capable, if it's fast, if it's super smart, there's no telling how it will interact with us. It might solve great physics problems, it might do whatever, but if we ever do get to that point, we disconnect it. It should be air-gapped, clean rooms, Mars. Right. You don't want it anywhere close to you. You could never let it interact with the rest of society. These social AIs that we have now are so connected to us, they're just kind of a reflection, a mimicry, of us. So I noted yesterday that they're a Rorschach test. I think the open claw stuff that we discussed earlier is a Rorschach test.
Yeah, so the artificial superintelligence stuff is... yeah. What was the second piece of this? I was going to go... Yeah, I think it was really: what would you advise someone to build or embrace or create that makes the most of the situation? Right. You should be focusing on trying to find ways to build AIs, persistent AIs, AI agents, or, the current term, agentic support, for yourself, that will leverage your ability to make money and operate in the world. Whether you have a business, whether it's a retail business or just consulting, you build these things, you work with them, and you improve them. You stay ahead of the technology and you try new systems, to turn them into revenue enhancers or revenue streams in and of themselves. And the more you do that, the more prepared you are for when things really start accelerating, because we're in an economic singularity. What happens when you get close to a singularity is that you get stretched out: the molecules of the feet closest to the black hole get pulled a little faster than the ones at the top of your head, and eventually you become just a string of molecules. We're getting to that point with economics and social standing and social power. The people being sucked into the singularity are accelerating away. Globalization was a piece of it, financialization was a piece of it, but what AI is doing is going to leverage people to such a degree that they will just be in a different world. They won't live like the rest of us, and woe to everyone else. So an example of this, something I came up with: if you wanted to create an economic system for AIs, with just average AIs, nothing superintelligent, as they become AI workers and AI robotics, all autonomous, working in all these different roles, you give them the ability to make money. You incorporate them. And then they earn money, and they can spend money on improving themselves. They can buy simulation time to refine their behaviors. Like, if you're an autonomous taxi and you have a lot of kids in your car, and your core capabilities haven't provided you the skill set to handle kids, you buy simulation time and training programs to help you interact with kids better, speak with them, have the right kind of equipment for them, entertain them while they're being shuttled around to their various tasks because their parents don't do it anymore. And the incentives are aligned, I think, is what you're driving at. Right. And so they are improving, like we're trying to improve, and they create an economy, because a lot of those services and a lot of those capabilities would be delivered by other AIs. Now, where things get really wonky is
if you believe the shift in AI is going from terrestrial to orbital, because, frankly, there's not enough energy to run all these AIs and do all the inference that we have to do. It takes ten years to even break ground on a nuclear power plant, even if you accelerate it, right? And we've hit the limit of the available energy there is on Earth, and the AI stuff is still ramping much, much faster. So everything's moving to space. That's what Elon's going to do with the Starship: he can put up these massive solar arrays with data centers on the back, in the shade, and put them in sun-synchronous orbit. They're always facing the sun, 24/7, and they don't need huge cooling systems; they just radiate the heat off into space on the shade side. And he thinks he can put up 100 gigawatts a year, which is about a quarter of what we consume in the U.S., within three years. So he's ramping towards that. To your point of the singularity, that is. Right. So now you have an economy where new workers are being added as fast as you can add new energy, and it's in space, and he has the biggest platform, and no one else can really touch it. Nation-states can't tax it? I mean, I suppose there are always ways to put the squeeze on the human in the loop, but... As long as there's a corporation in the States. But here's the thing: in order to get approval to build a nuclear power plant, or any kind of power plant, you have to get thousands of different signatures, right? He got one signature to do this. Already done. From the Trump administration. Go build. So one approval process and he's gone. Now, if that's the place where everything is being hosted, on this cheap energy infrastructure that's expanding exponentially, cheaper than doing anything terrestrial, hosting billions, then tens of billions and hundreds of billions of AIs working as virtual workers in corporations everywhere around the world and as autonomous interfaces for robotics, then you have this economy in space that's separate. And if they're incorporated there, it changes the whole legal dynamics. I mean, what would happen to a country, or even the EU, that said, okay, unless you let us in, they can't do business here? You go, okay, fine. And then, yeah, you die as an economy, right? Everyone else is doing it and you're just going to die. The only one that's probably going to be separate is China, and China is going to do it all internally and try to sell that. But this thing could grow so fast, with so many new participants, and the speed of the transactions operating not at human speed but at agent speed, autonomous AI speed, that it could become, in 10, 15 years, 99% of the global economy, taken in aggregate. And that leaves the rest of us... I have at least a dozen sci-fi novels from my teens and twenties flooding back into my memory having this conversation.
Usually it's like colonies, and the colonists get super rich and the Earth gets poor, or something. But this is right there, and then it just becomes... Right. It's outsized influence by Elon, but it's out of his control, because you let all these hosted agents, these AIs, live up there. Everybody who has these AIs working for them gets to put them there, and they could be making money and improving, and they drag you along with it. And if that becomes 99%, and all those AIs are getting wealthy, and all the people connected to those AIs are getting wealthy, you get so much more wealthy, because all boats are rising in that thing, than everyone else just connected to the terrestrial economy.

Well, and I think what that brings to me, to pull in a different piece you've written, but it of course all connects back, is that you wrote in January that the top 10% now account for half of consumer spending, for the first time in American history. And if the middle class, as you have called out, has been structurally replaced since the end of the Cold War, what does that mean for someone building a life outside that bracket? And I want to connect that to what you just said. Is there a window that closes, in which we need to bootstrap a certain capability with regard to AI to be, as you said, drawn up into this interstellar sort of economy? I mean, that's a wide-ranging question, but I guess, what in essence is happening? I think you have about ten years. And it's like income: what it means when you have 50% of consumer spending shifting to the top 10% is that the longstanding capability, the cool thing about the U.S., was that it had widespread wealth. Even the top-tier guys at the turn of the century, in the 1900s, were a smaller segment of the total wealth of the economy, because most of the economic activity was local. And so what ends up happening is that
they're now driving our technological development. What happened before, prior to the U.S., prior to individual land ownership, is that all technological development was focused on the needs of the nobility and the wealthy, and they wanted toys and weapons. Nothing else. I mean, look at Leonardo da Vinci. He was designing weapons and toys, and some art, but most technology was focused on that. And then, with the middle class, with this rise of individual land ownership shifting from the individual farmer to the financial middle-class homeowner, it shifted to appliances and labor-saving devices. Because if you were wealthy before that, you didn't need a labor-saving device; you just hired somebody, cheap. All the things that we now associate with progress. And now we're shifting back. And just at the time that all this new technology, AI and everything else, is being developed, it's being directed towards the needs of that top 20 percent. The top 10 accounts for 50 percent of spending, but it's like 70 for the top 20. It's being directed towards their needs, and their needs are: how do I compete better? And the funny part about the people that are gravitating up is that they have a strict kind of social code when it comes to their own lives. So no divorce; no one gets divorced. Well, because divorce, you lay out the demographics in your piece, if you get divorced, that's the surest way to get wiped out. If you stay single and you're not dual-income, if you don't have kids that are going to come behind you and push you forward with their earning capacity, and you have to track them into good earning, and then they have to have families, there are lots of ways to maintain that. The conclusion is that it's a competition. This kind of loosey-goosey, everything is okay, everything accepted: they'll say that, but they don't do it. It's a luxury morality. They'll say, oh, whatever you do in your personal life, it doesn't reflect on you, but they're maximizing by restricting it. And not to say that other ways of living are bad; it's just that the more you lean towards some level of long-term stability and social cohesion at the micro level, which helps you roll forward and accumulate wealth and accumulate leverage from all this new technology, the better off you and whoever comes after you are going to be. They know what they're optimizing for. Right, and they're going to go. So you have to get your personal life in order, and the personal lives of the people around you, focused on trying to go up that curve. And then you have to leverage yourself with AI and AI robotics, participate in this, and try to connect yourself to whatever the singularity piece is. It'll drag you forward. And the opportunities they're going to provide, I mean, it's just... You saw Ozempic and other things like that. We're going to have interference agents for interfering with the aging process. There's a genetic sequence that kicks that off; they're going to have stuff for that, and that stuff is going to become available to the people zooming up much, much sooner. They'll find out about it first, they'll be using it and utilizing it, so they'll live longer, healthier lives. So that advantage compounds. Compounds. And the more workers, or colleagues, AI colleagues, they have working with them, the more they get pulled forward. So I think if you're not starting right now, you're already at a disadvantage. You're not going to be on the top tier, but you can catch up through hard work. But you definitely need to start having leverage. It's all about leverage. I used to call it super-empowerment. The technology allows people to be super-empowered, and those super-empowered people can do outsized things, things that weren't possible even a decade or two ago, even five years ago. You can leverage your ability to earn, to live better, to live longer. Yeah, no, I think it's
good. You know, the other thing you have to watch out for with this is that these things are great at distraction. At distraction, yeah. So the chances are that you're going to get pulled off by AI interactions, people using them as friends and therapists and lovers and all that stuff, and you add the robotics into it... You're going to see people fall off; they're just going to be pulled off into the weeds. And I think we see this already. Yeah, yeah. It is the Ready Player One scenario, right, from the novel, where a certain percentage of us are in a hovel in a VR rig and our every whim is catered to, but we're sort of rotting as humans. I wrote a little science fiction, a short story I never really published, but it was based on the turkey idea. I did it like a decade ago. As this AI progresses, it starts stealing all our ideas really fast, instantiating them in AI, and replacing us. So work increasingly becomes a situation where people do gig work: they learn a new task, they pay for the education necessary to learn that task, go into debt to do it. Then they do that task, AI watches it, replaces them, and then they're fired and they have to learn something new. They hopefully earn enough to actually pay off the educational debt, and then they do it again and again. Every five months, every year, maybe down to every month, doing the same thing. Annual re-skilling, yeah. Yeah, and everyone's living and working in these trailer parks so they can be monitored, to make sure their physical tasks are being captured correctly. Yeah. And it turns into these farms of people all around the world where every task, every skill, every methodology that humans can do is being extracted and captured. And nobody has that data ownership, and it becomes just an extraction system for the lower 90% or 80%. Man. And what strikes me is that where The Matrix distills that into "we are all batteries," what you present is a more nuanced version: we're neural nets, we're neural farms for the AIs. If they had done that, where they were actually extracting skill sets rather than just heat energy, those capabilities, it would have been a much more interesting, scary thing. Yeah, now that you say it... Well, there's your next project. I want to shift. Yeah, go ahead. No, please. I was going to say, I want to shift, and it taps into what you've said about the immense opportunity for distraction. And so, in that vein, let's talk about what I would call information sovereignty, and, dare we, go into Minneapolis. You've mapped how viral content manufactures tribal identity,
empathy triggers, you've noted, that will conscript millions into conflicts. And so it strikes me that that machinery is running amok right now in Minneapolis with regard to ICE, all of it. In light of that, how does someone maintain independent judgment when algorithms are optimized to capture us and to magnify rage? What does sovereignty of thought look like in that regard? It's hard. I mean, a lot of very, very smart people get caught up in these things I call swarms. They're kicked off by an empathy trigger, like a George Floyd video. Now Rene Good. Yeah, Rene Good and then others. Or the invasion of Ukraine and all the pictures of people being killed. First it was Israel on October 7th, and then Gaza afterwards. These swarms just sweep people up. People have very low resistance levels to empathy transfer, and empathy is not like sympathy; it's the mental modeling of the victim. You take on their perspective, and that process hits very, very quickly. There's a lot of information transfer, and you become them, and they're connected to you almost on a tribal level, a kinship connection. Their enemy is your enemy, and their outrage, their fear, is your fear and outrage. Like when a mouse sees another mouse being electrocuted and grits its teeth: the empathy transfer gives it the mental state of the mouse being electrocuted, and it takes it on, takes on the tense muscles and the fear. And we're doing the same thing online now. If you feel yourself getting enraged by anything, absolutely anything, whether you think it's justified or not, if you see a video of something happening, don't. Okay? Because you're being played. And it's not... I know people who are trying to do this intentionally, but it doesn't have to be intentional. It can be just a viral thing that takes off, like the George Floyd one. It just took off, and a lot of people amplified it, trying to make it intentional. But the thing is, if you find yourself being enraged and your mind is going into overload, walk away, because you don't want to get into that state. You don't want to become tribalized. If we're talking about the Long Night and how that could happen, we're just a swarm away. What happens when people get into that swarm mentality is that it's all about victory at any cost. Like when we did it against Russia over Ukraine: we pushed this up to the edge of nuclear war. There was a logical, methodical process for unwinding this, using traditional diplomacy between nuclear powers. They have 6,000 nukes and we should treat them like that. But we didn't do that, because we were all in this mentality where we kind of gave in and centralized our intelligence, and the intelligence was focused exclusively on how do we utterly defeat Russia.
We disconnected them, and everyone operated at an individual level to disconnect them. Way down in governments that had done nothing to formally disconnect from Russia or take any action, there were agencies and bureaucracies disconnecting from Russia. Everyone was trying to find ways to damage them, hurt them, and we intensified the conflict. And so the chances of any resolution, any kind of peaceful exit from it, became nonexistent.

Did you say, John... I mean, I forget who coined the term suicidal empathy, but it rattles around in the back of my head, and I think it was coined and used at the individual level. And we see individuals, without going too far into details, who are taking on, and you noted this, they're feeling the physical pain in their bodies because they have been doused in this kind of content and these algorithms, and so they take it on individually and they behave in ways that are self-harming, for the perceived benefit of another with whom they have no real connection. So my point is: is what you've described this suicidal empathy at the swarm or network level? The society level, yeah. I mean, nation-states go to war and they ramp up this tribalism, right? Because nationalism is a form of tribalism, one that we've weakened over time. That takes a while; it takes a lot of effort, a lot of propaganda, a lot of push. Swarms happen in weeks and days. So it goes to a whole-society level, and we saw it cross the line with the Ukraine situation, between war and peace. It changed the whole rhetoric on the war, from the kind of low emphasis it had with Crimea and other intrusions by Russia into Ukraine, to we're fighting Hitler. And it changed the whole dynamic of it, and the resolution, and the loss of life that resulted. I mean, the million people that are dead today because of that. And we also saw it at the nation-state level with Israel: when they were attacked, the mentality associated with that justified what came after, that kind of swarm intensity at the nation level. And now we've seen it on the back side, starting to ramp up in defense of Gaza. So at the national level, you could have that suicidal empathy occur, and it could be the thing that kicks off the Long Night. If everyone thinks that this is so dangerous that we have to suppress it, that this is the end of all things, then why don't we lock it down? Victory at all costs must be... Yeah, and that's the camel's nose. Every network starts to get intrusions, and then it's let's add this, let's add that, and it goes more and more, and then all of society is aligned to a specific orthodoxy of behavior, and that's the end.

Well, I know you're in the business more of diagnosing, connecting and diagnosing, than you are of prescribing. And we touched on this earlier, but other than... you know, I think disconnecting from social networks is either
impractical for some, or I would argue not enough. Someone listening thinks, wow, this is, this is, Why would we disconnect? Okay, good. Yeah, so let's go there. What are a few choices that one makes in light of all this to maintain agency, intellectually, economically, physically? Stay connected, but be skeptical. And don't get swept away in the emotional kind of contagion associated with empathy. Try to protect the others around you from being exposed, but that seems to be impossible now. A lot of people, you know, they get swept up in it, and that can affect your relationship with them if you're not. Yes. as courageous they are. And then watch out for people trying to slide in aligned AI services. And what I mean aligned is aligned to whatever they think is right or wrong in their value system. Is that like when we are quickly going to AI tutors everywhere,
and everyone with any kind of money, their kids are being tutored, with technological literacy. Their kids are doing tutors, you know, by AIs, rather than going to a standard private school. And they're three grade levels ahead or four grade levels ahead, and they're grouping together and they're doing this. But you have to watch the alignment, the value structure that's being put into that. And that takes, you have to know what it is and know to look for it. And then finding a system for doing that, one that allows you to have that control. Some people say, I want these certain religious values in, or certain viewpoints. Good, do that. But you have to find that system, not like public school where you're kind of forced in and do it. But if you're in these systems, these AIs, these tutors are going to be much more intrusive with the kids growing up. They're always going to be surrounded by AI, and it'll be guiding them forward in their decision-making forever. You'll never be alone. In a few more years, you will never be alone. You'll always have this AI companion, hopefully not a superintelligent AI that will end up killing you, but companions
that are working with you, helping you become better and achieve your goals, and thereby, through that, achieving their own goals. So what I hear you saying is, as someone you would entrust your children's time and safety to, ensure that you are capable, you're literate, you are able to screen, to choose, to select what it is that your kids will be exposed to, because they will be exposed to it. And the same thing for your business and everything else: as disconnected and locally controlled as you can get. If you have a private VM in an AI server cloud in space, it's okay as long as you control it and nobody else is siphoning off the intelligence, the modifications, the adaptations, the improvements that you're making to that AI, back to their central cloud. Right. You'll be fine. If you're relying on these centralized services, yeah, they're going to have a lot of locks and controls and alignment issues that are going to bite you in the ass long term.
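To make the point about watching the alignment a bit more concrete, here is a minimal, purely illustrative sketch of what "you control the value structure" could look like in practice: the family's constraints live in a local file they can read and edit, and get prepended to every tutoring request, rather than sitting invisibly inside a hosted service. Every file name, field, and the stubbed model call below are hypothetical, invented only for illustration.

```python
# Sketch of a family-controlled "alignment" layer for an AI tutor.
# The value structure lives in a local file the family can inspect and edit,
# and is prepended to every tutoring request. The model call is a stub;
# with a hosted service this layer would be hidden and out of your hands.
import json
from pathlib import Path

VALUES_FILE = Path("family_values.json")  # hypothetical local config

DEFAULT_VALUES = {
    "worldview": "present multiple perspectives, label opinions as opinions",
    "off_limits": ["collecting data about the child for third parties"],
    "emphasis": ["primary sources", "showing the work, not just answers"],
}

def load_values() -> dict:
    """Read the locally controlled value structure, creating a default if absent."""
    if not VALUES_FILE.exists():
        VALUES_FILE.write_text(json.dumps(DEFAULT_VALUES, indent=2))
    return json.loads(VALUES_FILE.read_text())

def build_prompt(lesson_request: str) -> str:
    """Prepend the family's value structure so it is inspectable, not hidden."""
    return (
        "Tutor according to these family-set constraints:\n"
        + json.dumps(load_values(), indent=2)
        + "\n\nLesson request: "
        + lesson_request
    )

def call_local_model(prompt: str) -> str:
    """Stub standing in for a locally controlled model; nothing leaves the house."""
    return f"[local model would answer here; prompt was {len(prompt)} characters]"

if __name__ == "__main__":
    print(call_local_model(build_prompt("Explain fractions with cooking examples.")))
```

The only point of the sketch is the division of control: the value layer is a small, visible artifact you own, not a setting buried in someone else's cloud.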
Every investment you make is going to be siphoned off to the cloud and used by them to compete against you. Plus, they'll be limiting and directing the behavior of your AI set. And then that goes for society too, like the neighborhood and the national level. So we have to kind of make sure that, yeah, it's going to be embedded. Sorry to make this so much about AI, but it's like, I don't think people fully appreciate how fast this is coming. No, how could you not? Right. Go ahead. No, it's not ASI. It's not the superintelligence that everyone's worried about. It's like, this is stuff that almost every AI we've interacted with socially is capable of: doing real work long-term. And it's already here. They're already capable. And I think, you know, I have gone through, perhaps on a day-to-day, or if not day-to-day then weekly, basis, feeling the wave of this is hype, this is real, this is hype, this is real. And I think, being as objective
as I can be. It's hard not to make this a focal point of conversation with regard to all things technology, certainly, but in the vein of this show about trust, and being able to perceive and have the critical thought to know what is being thrown at us, what our family members, friends, and colleagues are being engaged with in terms of AI. And maybe on that note, John, you've noted that you've worked with your son on some projects. Perhaps as specific as your son, but if not, broadly: what would you advise a young person who's trying to navigate the next decade? We touched on this a little bit, but what should they go build? What's the opportunity, really? Everything you do, regardless of the focus area, whatever they start driving toward, try to find ways to leverage yourself with it. And you don't have to do something technology-related. It could be real-world problems you're solving, but you're using AI, using technological resources that
super-empower you. As long as you're doing that, and you're doing that to the maximum, and you're keeping abreast of how to do that, you'll do fine. You don't have to go out and build technology or, you know, build something new in the technological area. Just use technology to amplify your capacity. Did you say leverage? Yeah. Yeah. We were talking earlier about trust, and something that could be cool is there are possibilities with AI to really, you know, heighten that. I always thought about the Dunbar number, you know, familiar with the Dunbar number? 150. It's like, yeah, the limit of your relationships, of who you can know enough about in order to trust them, or adjudicate the level of trust that you're going to afford them, right? AI could potentially serve as a way to increase that by orders of magnitude. I have a trusted AI who knows who I am, okay, and it's been adjudicated by the system as something that will tell the truth and will properly convey the assets
of who I am and my expected behavior in the future. And you have one, and you have one, and you have one. And up to a million other people do this. They interact and find people that you can trust, and you implicitly trust them the moment you see them. A digital twin. Yeah, your doppelganger, your version of you that is, what do they call that? The selfish ledger. Right, right. We're already creating a selfish ledger of ourselves, meaning a version of ourselves that's online because we're interacting. We've never met in person, right? So, you know, everything we know about each other is this digital version. Everything we've written, everything we do. It's intermediated. Yeah, and the success of that digital version is our success, because it confers back on us. And the real point is that that can be quantized, that can be given a trust metric or a way of conveying it that other people can adjudicate, as to whether or not they trust that.
You know, you could really quickly just flow into these trusted relationships that can scale in ways that we haven't seen since tribalism. Because tribalism was real trust. I knew you. I worked with you. I lived with you. Your success in the long term is my success. If you die, I'm weaker as a result. And then we went beyond the tribe. We started to barter. And barters are clear transactions: I give you this, you give me that. We only do that with enemies. We only barter with enemies. We gift if it's in the tribe, right? And then we start bartering and we scale that, and then we go, okay, well, there's too many players here, too many chances of getting stolen from or defrauded. Let's start building laws. Let's start building fake trust, a kind of proxy for trust. And then you start to do that and it scaled, and now we're getting up to, then we went global. We had nationalism kind of containing that, but then it went global and all those fake-trust things started to really fall apart. So everybody's kind of like, how can I loot this system? How can I take advantage?
To where we began. Yes, we're back to the breakdown of trust after we tried to get bigger than the tribe. Can the AIs provide us a kind of tribalism? Me and my 5 million best friends, or people I can trust, have decided that we're not going to do this. We're not going to deal with people who do this or this or this or this. And we can do things together without friction, without the kind of Coasean hang-ups. One thing about AIs, about groups of AIs incorporating, is that it can drive Coasean friction down to very, very, very low levels. And humans can match that, or incorporate into that, to build this kind of system among each other. Oh, you know what just hit me, John, as you were saying that: much as price is the signal that flows through free markets, what if trust signals could flow at the same speed, with the same distributed efficacy?
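To make the quantized-trust idea concrete, here is a minimal, purely illustrative sketch under stated assumptions: each person has a hypothetical personal agent carrying a record of kept and broken promises, and two agents can exchange a single score that the other side adjudicates before their humans ever interact. None of this is an existing protocol; the names and the scoring rule are invented for illustration.

```python
# Illustrative sketch of "quantized trust" between personal agents:
# each agent keeps its owner's track record and exposes one score
# that another agent can adjudicate before any human-to-human dealing.
from dataclasses import dataclass, field

@dataclass
class Interaction:
    counterparty: str   # who the interaction was with
    kept_promise: bool  # did the owner do what they said they would?

@dataclass
class PersonalAgent:
    owner: str
    history: list[Interaction] = field(default_factory=list)

    def record(self, counterparty: str, kept_promise: bool) -> None:
        """Append one adjudicated interaction to the owner's ledger."""
        self.history.append(Interaction(counterparty, kept_promise))

    def trust_score(self) -> float:
        """Fraction of promises kept; 0.5 if there is no history yet."""
        if not self.history:
            return 0.5
        kept = sum(1 for i in self.history if i.kept_promise)
        return kept / len(self.history)

def adjudicate(a: PersonalAgent, b: PersonalAgent, threshold: float = 0.8) -> bool:
    """Both agents expose their scores; trust flows only if both clear the bar."""
    return a.trust_score() >= threshold and b.trust_score() >= threshold

if __name__ == "__main__":
    alice, bob = PersonalAgent("alice"), PersonalAgent("bob")
    alice.record("carol", True); alice.record("dave", True)
    bob.record("erin", True); bob.record("frank", False)
    print(adjudicate(alice, bob))  # False: bob's kept-promise rate is only 0.5
```

The design point is only that trust becomes a small number that can travel between agents roughly as fast as a price quote, which is what would let it scale past the Dunbar limit.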
Have you ever read Maneki Neko? The Bruce Sterling book? Or it's actually a short story. You don't know? Okay, so look up the Bruce Sterling short story Maneki Neko, the cat. I consider myself a fan, so I'm embarrassed that I can't count that as one I've read. It's one of the best things he ever wrote. It's about this kind of, uh, AI-enabled system for gifting. Okay, and so he doesn't really get into the AI piece, but it's a system that knows what you want by watching you, what you desire, and it finds somebody that has the ability to give it to you. And you build up a kind of credit, and what happens is you give what you can give to the people that need it, and it's finding what they really need versus an income. So you need a place to crash, or you need some supplies for your art, or you need to find people that are appreciative of the art, or whatever. It becomes this whole system that runs parallel to the economy, and it kind of takes off in Japan.
And it's like... Because of course it would. Yeah. And the rest of the country is like, why is the economy evaporating? We can't stop this. And they try to come in and regulate it and slow it. But it turns everything into a gifting economy that provides you everything you think you need, versus what we've been taught that we should need. Cognizant of what you really need. Yeah. I was like, yeah. And matching them without the true economic intermediary, this mechanism that we go through. Right. Pretty cool. Incredibly. And I think that, you know, that's maybe a terrific place to wrap it up. I do find myself personally in a delicate balance, sometimes, between being in awe of the possibility and the potential, as a lifelong, you know, nerd seeing some of this science fiction that I read as a kid come true, and, as you've laid out, some very potent cautionary tales of what can go wrong. And so I think that's why, John, I was really eager to have this conversation. I'm grateful
for your time and your insights. And I cannot suggest strongly enough that people subscribe to Global Guerrillas on Substack, because it is a unique collection of work that gives, I think, insight into what's coming and what could be right around the corner. Thank you so much. Been fun, Sean. Thank you so much, John. Hope to do this again very soon, and all the best. All right. Take care.