Teachable Moment: Will the digital control grid inevitably fail, or is it already here?

By Gabriel
Published: Dec 09 2025
Teachable Moment Technocracy Government Remoralization Digital ID

Are people truly choosing digital tyranny?

This is my contribution to a larger discussion started by Terry Wolfe (AKA The Winter Christian), referencing the piece Is Global Technocracy Inevitable Or Dangerously Delusional? Terry Wolfe states, “Brandon Smith has been a more sober analyst than most for the last decade. He says the digital prison planet is fragile and unlikely.” I’ll certainly agree on the fragility of a global control grid, but I wholeheartedly disagree that digital totalitarianism is somehow a remote possibility. Any frank and sober analysis of our existing digital landscape reveals how overtly tyrannical and abusive much of it is, yet the blame is often placed on the victims.

For example, Brandon Smith’s piece argues that digital tyranny is built entirely on the consent of those ruled by it, and therefore that if enough people simply ‘smarten up’, the problems will never come to pass. This ignores the very real and pressing impacts that people are already experiencing. For decades people have lamented countless others throwing away their lives scrolling on screens, but hardly anyone really asks why. Why do so many people carry smartphones, despite the numerous privacy, security, and mental health dangers? Why do people put up with our present digital experience when it is so overtly malevolent?

‘Convenience,’ we can sneer. This simple word allows us to shift the blame away from billions of dollars in social engineering research and manipulative tactics and onto those who merely play the cards they’ve been dealt. It is the height of hubris to believe that, just because we may have taken proactive steps to protect ourselves in this evolving virtual environment, those who haven’t must simply be too dumb and lazy to care. Resistance to modern digital intrusions requires a certain luxury of time, if not an investment in skills or particular sacrifices. While, yes, smartphones and other modern systems and tools are very much responsible for accelerating human dispossession, to identify them as the cause is to mistake the symptoms for the illness.

It is my position that the nature of the control grid is primarily financial. This is plain to see in the technological realm. Surveillance capitalism is a fascinating feat of financial engineering that has built the foundations of our terrifying and tragic digital experience online. The “attention economy” as it exists was not an inevitable outcome of the Internet Age, but rather a deliberate construction to enable mass data collection. The smartphone was a convenient beachhead for these intrusions but was certainly not the only means of delivering this kind of manipulation.

This is where Brandon Smith’s piece makes some fairly important points about AI technology, but still misses the forest for the trees. The AI bubble is undeniably a creature of the national security state, just as Big Tech is. We can point and laugh at people who succumb to asking LLM chatbots for life advice, but in doing so we forget the full-spectrum assault involved in deploying that particular payload to desperate, vulnerable people. My point is that digital tyranny is downstream of actual totalitarianism, not the other way around. It is absurdly obtuse to begin human history at the invention of the transistor and then use that as the “blank slate” from which to condemn people for falling into particularly nefarious traps.

It is genuinely fascinating to me that for all the shouting about the evils of smartphones, there has been very little interest in understanding why they continue to thrive and dominate with hardly any real resistance. It is my belief that placing all the blame on digital tools (even where it is warranted) achieves two disastrous aims: 1) it distracts from the social, economic, and other real-life pressures that drive adoption, and 2) it pushes simplistic discussions about digital tools being wholly good or wholly bad, without real nuance and due consideration. To use a metaphor, smartphones are an apex predator adapted to a particular environment; the predator will remain until the environment changes. In concrete terms, people aren’t ‘on the road to slavery’ because they have smartphones. They have smartphones because they are on the road to slavery.

This inversion of cause and effect is not only simplistic, but places the blame on those being assaulted rather than on those doing the assaulting. It is a familiar pattern that rarely ventures deep into the specific mechanisms of how power is consolidated over time and how that consolidation structurally shapes people’s behavior. A lack of domain knowledge can easily be papered over by writing everything off as evil, but this approach rarely leads to useful discussion of a better path forward. On the other hand, those with the critical technical skills can struggle on two fronts: 1) being unable to bridge the knowledge gap with the general public, and therefore sounding incomprehensible; and 2) being entirely reliant on the control grid for income, and so staying silent as long as the credits roll in.

What one needs to consider is that one of the primary drivers of digital control systems is risk management. Almost every mass-surveillance technology has a simple, small-scale security project as its proof of concept. Facial recognition, access control, and invasive data collection are all vital features of any robust security solution. It just so happens that those systems are also very valuable to tyrannical regimes. This goes even further in the realm of cyberspace, where control over access to information itself can maintain order in unstable situations. We have already seen many countries in the ‘free world’ move towards not only regulating social media, but also developing control measures over the Internet itself.

Scale is the enemy: a deep analysis of the control grid

In response to Terry’s points, DJ put together a very valuable presentation: Technocracy: Okay but can they do it? (No not really.). I disagree with the title and with many of the conclusions drawn. That said, the presentation is absolutely worth watching for its excellent historical overview of how our digital experience was constructed over time. A main point of DJ’s presentation is that the entire digital landscape went through various phases of development. He explains how digital systems started as research projects, then moved into institutions, and were eventually picked up by industry and used to “model reality” in order to track people and transactions. This detailed technical explanation of where we have been and where things are headed is unparalleled and certainly worth your time. Unfortunately, I do worry that it isn’t particularly accessible to those without significant technical backgrounds, so I will do my best to charitably explain the main ideas.

Transcript

DJ explains that for a technological control grid to function, it requires particular dependencies. In the presentation he does a fantastic job outlining how those dependencies were logically constructed together. To properly control individual behavior, a digital ‘model’ is required to conceptualize the person or entity in virtual space. Doing this for billions of people makes the development of immensely scalable and interoperable systems critical. He explains that these advances have happened through a few major ‘revolutions’ in technological infrastructure. What makes his presentation invaluable is that he outlines precise technical details and explains how they are directly connected to particular tyrannical measures.
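To make that idea concrete for non-technical readers, here is a minimal, entirely hypothetical sketch (in Python) of what a digital ‘model’ of a person can amount to at its most basic: a handful of shared identifiers plus an ever-growing log of observed events tied to them. The names and structure are my own illustration, not anything taken from DJ’s presentation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SubjectModel:
    """A bare-bones digital 'model' of a person: identifiers that let
    separate systems agree they are talking about the same individual,
    plus a growing log of observed events tied to those identifiers."""
    national_id: str            # hypothetical state-issued identifier
    device_ids: list[str]       # phones, wearables, etc. linked to the person
    events: list[dict] = field(default_factory=list)

    def record(self, source: str, action: str) -> None:
        # Every interaction, from any participating system, lands in one place.
        self.events.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "source": source,
            "action": action,
        })

# Example: two unrelated systems contributing to the same profile.
subject = SubjectModel(national_id="ID-0001", device_ids=["phone-abc"])
subject.record(source="transit_gate", action="entered station 12")
subject.record(source="payment_app", action="purchased groceries")
print(len(subject.events), "events linked to", subject.national_id)
```

The only point of the sketch is that once identifiers are shared across systems, every new data source enriches the same profile; scaling that to billions of people is the infrastructure problem DJ’s presentation walks through.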

Where DJ and I disagree is on the assertion that we are not ‘advanced enough’ for digital totalitarianism. We could nit-pick back and forth over whether it is complete or scalable enough, but that is to miss the forest for the trees. The ‘digital control grid’ is not a prerequisite for human tyranny, but rather an enhancement on top of it. Tyrannical digital systems are creatures of the environments they are developed in. If it were truly the case that a system was itself sufficient to be the difference between tyranny and liberty, then liberty could simply be delivered to less-free nations via cargo plane. While delivering entirely Free and Open Source systems to people in tyrannical nations is certainly worthwhile assistance, it is, by itself, wholly insufficient to transform their lives.

In fact, I would argue that Brandon Smith is correct that digital tyranny is brittle and requires human cooperation, but it is the height of naivete to believe that human cooperation can only be coerced via technological means. This is where I disagree with the conclusion that a digital control grid is something that exists in the future rather than the present. Digital totalitarianism exists alongside tyranny in other domains, not merely before it. Both Brandon and DJ are correct that the public is being lied to about the nature and applicability of AI. However, DJ raises the important point that ‘on-site’ LLMs can be ‘good enough’ to dictate behavior inside systems that are gated behind various control measures.
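As a rough illustration of what that could look like (not a description of any real deployment), imagine an access-controlled service that asks a locally hosted model to approve or deny each request before processing it. The model call below is a made-up placeholder, since the specifics would vary by system; the point is only that the model does not need to be especially smart to serve as an automated gatekeeper.

```python
def query_local_model(prompt: str) -> str:
    """Placeholder for a locally hosted LLM. In a real system this would
    call whatever on-site model the operator runs; here it is stubbed out
    so that only the control flow of the 'gate' is being shown."""
    return "DENY" if "restricted" in prompt.lower() else "ALLOW"

def gated_request(user_id: str, action: str) -> bool:
    # The model is handed the user's identity and intended action and asked
    # for a verdict; behavior is dictated by whatever policy the prompt encodes.
    prompt = f"Policy check. User {user_id} wants to: {action}. Reply ALLOW or DENY."
    verdict = query_local_model(prompt)
    return verdict.strip().upper().startswith("ALLOW")

print(gated_request("user-42", "download a restricted document"))  # False
print(gated_request("user-42", "read the public newsletter"))      # True
```

A mediocre model placed at a chokepoint like this still decides what people can and cannot do, which is exactly why ‘good enough’ is the operative phrase.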

Terry, DJ, and Brandon are mistaken to focus on the concept of a global and total control grid. This all-or-nothing approach to the problem of technological tyranny dismisses real, present dangers in the name of hypothetical future ones. Cyberspace is certainly becoming more hostile, but this is largely because the governments and corporations that have so much control over it are themselves becoming more overtly tyrannical. It is my position that, for many, the perception of digital tyranny is shaped by how they experience abuse of power in their real lives. If one is lucky enough to be (even relatively) insulated from the consequences of abusive governments and corporate coercion, digital tyranny seems like a remote and fanciful idea. Yet ironically, for those who genuinely struggle with the worst of the digital assaults, technology isn’t actually that high on the list of concerns. For many people whose data is directly leveraged against them and their vulnerable circumstances, the need to acquire necessities is a much more principal concern.

If a concrete example is needed, try searching for jobs online without any personal connections to leverage. Job seekers are expected to divulge an absurd amount of personal information online to countless institutions, with no real recourse if their information is misused and abused. Your typical ‘gig worker’ is unlikely to have privacy at the top of their concerns when the choice is between ‘opting out of tyranny’ and attempting to make a day’s wage. Both of these people are, by their precarity, ripe for exploitation in ways that some will never even contemplate. This “class divide” on digital tyranny is no accident; it is a powerful way to enforce rigid barriers on social mobility. And this is before you consider concrete financial incentives, such as the comforts of working in line with state preferences and the punishments of working against them. Consider how politicians can pick winners and losers by giving particular industries (such as AI) preferential treatment, while those working in more independent pursuits (like your local farmer) struggle with red tape and burdensome regulation.

Fight tyranny now to prevent the control grid

As stated earlier, the control grid is here now, not in the future. Any future instantiation of a digital control system is the direct consequence of present-day power plays. Our hostile digital experience, and its grave impacts on much of the public, should be evidence enough that there is an urgent threat of technological totalitarianism. Instead of giving up on the public as helpless ‘sheeple’, we should be more willing to dissect and interrogate the incentives driving behaviors. The ‘big picture’ is a lot more complex and messy than the surface-level public discussion would lead you to believe. If any of this were simple, it would have reached a conclusion by now.

It is all too easy to define, a priori, complete digital totalitarianism as impossible or unsustainable, but that does nothing to address the real problems that exist today. Instead of patting ourselves on the back for resisting measures that are yet to come, why not consider what needs to be done today? There are innumerable ways in which we can make our current digital landscape better. Some of those decisions are personal, but others are much more complicated collective action problems. Instead of giving up and treating those coordination challenges as impossible, we should focus our efforts directly on confronting them. I think it is not just prudent but entirely critical to fight to leave our descendants a better digital future, no matter what is in store.

Ultimately, the hardest part is humanizing those who struggle with these assaults. In the broader discussion around global tyranny, it seems that the most seductive explanation is the one that transfers blame from the tyrants to the people. I am convinced this is a critical part of how such tyranny is maintained. People will exclaim that it would all end if the people just refused to tolerate it anymore, but how many of us tolerate the abuse of our fellow citizens when it happens? What red lines get crossed as we watch on indifferently, merely because it doesn’t yet impact us? What are we actually doing in our day-to-day lives to enable, or to add friction to, abuses of power? What should be done to safeguard people’s liberties both in their real lives and in cyberspace?

These questions aren’t easy. They have messy answers that require consistent work. Nothing is as simple as a single decision that eradicates all the problems. As such, we should be more surgical and precise when analyzing them. While I may disagree with DJ’s conclusions, I am thoroughly impressed with how he outlines his thought process in a way that allows us to get to the meat of these issues. It is this level of analysis I find terribly lacking in many dissident circles. It is so much easier to chant the consensus than to openly inquire into the messy reality we find ourselves in. Preventing egregious overreach, technological or not, requires an uncharacteristic level of humility and courage. We don’t just need technical skills to fight digital tyranny; meaningful compassion and understanding are far more essential.

The simple fact is that none of us alone has any real control over what the future brings. There are many good reasons to believe that technological tyranny is a dire and urgent threat that is unlikely to be stopped. On the other hand, there are also a wide variety of reasons to hold on to hope that we can prevent future abuses and address existing ones. The real discussion isn’t about what is and isn’t likely; so much of that is out of our hands. The important part is identifying the real risks and what can be done to address them today.




Sharing is caring!

Please send this post to anyone you think might be interested.
If you haven't already, don't forget to bookmark this site!
You can always subscribe with the RSS feed or get posts sent to your inbox via Substack.
