The Dystopian Question We’re Afraid to Ask
The question arrives with the bad manners of a guest who says the thing everyone in the room is thinking. It sits there at the table, underdressed and unapologetic, while the other guests look at their plates. Is social media a psyop? Surely not. Surely that’s the kind of question asked by people with string and thumbtacks and photographs connected by red yarn. Surely the answer is no.
Read the research. Then ask the question again.
P.W. Singer and Emerson T. Brooking spent five years documenting what they called the weaponization of social media in their essential, alarming, and compulsively readable book LikeWar. Their central argument, backed by more than a hundred pages of source notes dense enough to constitute their own academic text, is not speculative. It is descriptive. The internet has become a battlefield, they write, and the feed is the frontline. Social media has been weaponized by state actors, terrorist organizations, political campaigns, and the platforms themselves — weaponized not through a single dramatic act of conquest but through the patient, iterative, A/B-tested refinement of systems designed to do one thing above all others: hack the human brain.
The question is not whether this is happening. The research settled that. The question is what to call it when it happens to everyone, all at once, inside systems so normalized that refusal feels impossible and resistance feels ridiculous.
Dystopian fiction has a word for it. Several, in fact. Let’s use them.
The Slot Machine That Lives in Your Pocket
Begin with the neuroscience, because the neuroscience is where the story becomes undeniable.
Social media platforms are not designed to connect you with the people you love. They are designed to maximize the time you spend on the platform, because time spent on the platform is the raw material from which advertising revenue is manufactured. Every design feature — every pull-to-refresh, every infinite scroll, every notification badge, every like counter — is an engineering decision optimized for a single behavioral outcome: keep the human looking at the screen.
The mechanism is the variable reward schedule, borrowed directly from behavioral psychology and the gambling industry. B.F. Skinner established in the mid-twentieth century that the most powerful reinforcement pattern for producing compulsive behavior was not the consistent reward but the unpredictable one. The rat that receives a pellet every time it pulls the lever stops finding the lever interesting. The rat that receives a pellet sometimes, unpredictably, pulls the lever with an urgency that borders on desperation.
The slot machine is built on this principle. So is the social media feed. Every scroll is a pull of the lever. Sometimes you get a pellet — a post that moves you, a message that matters, a validation that lands. More often you get nothing worth the time it cost. But the possibility of the pellet is sufficient. The dopamine system, which 2025 peer-reviewed research indexed in PubMed Central confirms fires on the anticipation of reward rather than on the reward itself, does not distinguish between a slot machine in a casino and a phone in a pocket. It recognizes the pattern and responds with the same neurochemical urgency.
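A toy model makes Skinner's asymmetry concrete. The sketch below is a simplified prediction-error learner, not anyone's actual code, and every number in it is invented for illustration: the "brain" updates its expectation after each lever pull, and the surprise signal (reward minus expectation) stands in for the dopamine response. Under a fixed schedule the surprise decays to nothing; under a variable schedule it never stops firing.

```python
import random

def prediction_errors(reward_fn, pulls=1000, lr=0.1, seed=0):
    """Track the reward-prediction error (a crude dopamine proxy) across repeated lever pulls."""
    random.seed(seed)
    expected = 0.0
    errors = []
    for _ in range(pulls):
        reward = reward_fn()
        error = reward - expected   # surprise: the signal tracks this, not the reward itself
        expected += lr * error      # the learner updates its expectation of the lever
        errors.append(abs(error))
    return errors

# Fixed schedule: a "pellet" on every pull. Variable schedule: a pellet roughly one pull in four.
fixed = prediction_errors(lambda: 1.0)
variable = prediction_errors(lambda: 1.0 if random.random() < 0.25 else 0.0)

# Once learning settles, the fixed schedule produces almost no surprise,
# while the variable schedule keeps generating prediction error on every pull.
late_fixed = sum(fixed[-100:]) / 100
late_variable = sum(variable[-100:]) / 100
```

The design point is the one Skinner found in the rat: predictability extinguishes interest, because a fully predicted reward generates no surprise left to learn from. Unpredictability keeps the error signal, and therefore the pulling, alive indefinitely.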
A 2025 study measuring brainwave activity via EEG in a hundred participants, archived in PubMed Central, found that social media use engages the same brain reward pathways observed in substance addiction. The prefrontal cortex — the seat of judgment, impulse control, and the considered decision — shows reduced engagement. The amygdala, which processes emotional threat and reward, becomes hyperactive. Prolonged exposure produces measurable structural changes in both regions. The adolescent brain, still under construction during this critical developmental window, is especially vulnerable to these alterations.
The U.S. Surgeon General understood the gravity of this. His 2023 advisory, titled Social Media and Youth Mental Health, carries a sentence that deserves to be read slowly: “Our children have become unknowing participants in a decades-long experiment.” Up to ninety-five percent of teenagers between thirteen and seventeen use social media. More than a third report using it almost constantly. Nearly forty percent of children between eight and twelve use platforms whose own terms of service prohibit their presence. A year later, the Surgeon General went further and called for warning labels on social media platforms. That call is still awaiting the act of Congress required to implement it.
Our children have become unknowing participants in a decades-long experiment. — U.S. Surgeon General Vivek Murthy, 2023. The experiment did not require consent. It required only a phone, an app, and a brain that had not yet finished forming.
The Battlefield That Replaced the Town Square
Singer and Brooking document something that the slot machine framing alone does not fully capture: social media is not merely addictive. It is, in the precise military sense of the word, a battlespace. A contested terrain where opposing forces deploy narrative as ammunition, emotion as strategy, and the human attention span as the territory to be captured and held.
The numbers are staggering in their specificity. Researchers have identified at least seventy countries as sites of organized, state-sponsored disinformation campaigns. Not fringe actors on obscure platforms. Seventy sovereign nations, their public discourse seeded and poisoned by coordinated networks of fabricated identities, amplified outrage, and algorithmically supercharged false narratives. Russia’s interference in the 2016 U.S. election is the most documented case, but it is emphatically not the only one. ISIS copied the Twitter tactics of Taylor Swift — the same strategies of parasocial intimacy, emotional resonance, and network amplification — and used them to recruit across twelve languages in two dozen countries. Terrorist organizations, autocratic governments, and political campaigns discovered simultaneously that the platform did not merely carry the weapon. The platform was the weapon.
The mechanism that makes the platform a weapon is the same mechanism that makes it an addiction: the algorithm’s optimization for engagement. Engagement, the research confirms with uncomfortable consistency, is not driven by accuracy, beauty, nuance, or wisdom. It is driven by emotional arousal. Outrage, fear, disgust, and tribal identity produce more clicks, more shares, more time-on-platform than any other content category. The algorithm does not favor these emotions because it has political preferences. It favors them because they work. And because they work, the feed fills with them. And because the feed fills with them, the population absorbs a daily diet of curated maximum-emotional-arousal content that would, in any prior media environment, have been recognized as a sustained psychological operation.
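The logic of that paragraph can be sketched as a toy feed ranker. Everything here is invented for illustration — the field names, the weights, the post objects; no platform's real ranking code or weights are public — but the shape is the point: a scorer trained on engagement ends up weighting arousal heavily and accuracy barely, and the sorted feed follows.

```python
# Illustrative only: a toy feed ranker. All signals and weights below are
# hypothetical, chosen to mirror the research finding that high-arousal
# content predicts engagement far better than accuracy does.

def engagement_score(post):
    """Predicted engagement as a weighted sum of content signals."""
    return (
        2.5 * post["outrage"]     # high-arousal emotion: the strongest predictor
        + 1.8 * post["fear"]
        + 1.2 * post["tribal"]    # in-group/out-group framing
        + 0.4 * post["accuracy"]  # accuracy barely moves the metric
    )

def rank_feed(posts):
    # The ranker has no opinions; it just sorts by predicted engagement.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "calm-explainer", "outrage": 0.1, "fear": 0.1, "tribal": 0.0, "accuracy": 0.9},
    {"id": "outrage-bait",   "outrage": 0.9, "fear": 0.6, "tribal": 0.8, "accuracy": 0.2},
]
feed = rank_feed(posts)
```

Nothing in the ranker encodes a political preference. The outrage rises to the top because outrage is what the weights, tuned against historical time-on-platform, reward — which is the argument of this section in fourteen lines.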
Frances Haugen, the former Meta product manager who became the most significant whistleblower in the history of social media, confirmed what neuroscientists had suspected and what Singer and Brooking had documented: the platforms knew. Internal research at Facebook, she revealed, showed that the algorithm amplified divisive, emotionally charged content because doing so maximized engagement. The company knew it was happening. It knew the damage it caused. It chose the engagement anyway, because engagement was the product, and the product was what paid.
The algorithm does not amplify outrage because it has political preferences. It amplifies outrage because outrage works — because it keeps you on the platform, and your presence on the platform is what the platform sells. The psyop does not require a conspirator. It requires only a business model.
What Brave New World Got Right That 1984 Got Wrong
Dystopian fiction has been wrestling with this question for a century, and the most prescient treatments of it are not the ones that imagined surveillance and censorship and the jackboot on the neck. Those futures were correct about some things. But the future that arrived in the pocket of nearly every human being on earth looks less like Orwell’s Oceania and more like Huxley’s World State.
In Nineteen Eighty-Four, Winston Smith is controlled through terror, scarcity, and the deliberate infliction of pain. The Party maintains power by making the alternative to compliance unbearable. It is a system of enforced misery, and its citizens know, at some level, that they are miserable. That knowledge is the seed of resistance. Winston’s love for Julia, his secret diary, his meetings with O’Brien — all of it is possible because the misery is legible. The oppression has a face.
Huxley’s World State controls through pleasure. Through distraction. Through the gentle, perpetual provision of soma — the happiness drug that smooths every rough edge of experience into comfortable irrelevance. The citizens of the World State are not miserable. They are sedated. They are perpetually entertained. They do not resist because they do not recognize, in any visceral or immediate sense, that there is anything to resist. The oppression has no face. It has a feed.
Neil Postman saw this coming in 1985, in Amusing Ourselves to Death, before the web existed, before the smartphone existed, before the algorithm existed. He wrote that Huxley, not Orwell, had imagined the more accurate future — that the danger to free society was not the destruction of information but its drowning in a sea of irrelevance, entertainment, and emotional stimulation calibrated to produce the maximum engagement and the minimum genuine thought. Postman was describing television. The platforms that arrived decades later made his argument seem quaint by comparison.
Ray Bradbury understood it too, with the particular intuitive clarity of a writer who felt things in his bones before he could explain them analytically. The parlor walls in Fahrenheit 451 are not a surveillance system. They are an entertainment system. They are warm, they are immediate, they are emotionally engaging, and they are specifically designed to fill every interior space until the interior life has nowhere left to live. Mildred is not oppressed. Mildred is entertained. The oppression is the entertainment, and it is indistinguishable from kindness.
Huxley warned us about the soma. Postman warned us about the television. Bradbury warned us about the parlor walls. None of them could have predicted the algorithm — but all of them described the mechanism. The control that arrives as entertainment. The oppression that feels like a gift.
The Psyop That Does Not Require a Puppetmaster
The word psyop — psychological operation — carries with it the implication of intentional design by a malevolent architect. A puppetmaster pulling coordinated strings. A room full of people deciding, with deliberate malice, to manipulate a population.
The more disturbing truth about social media is that the psyop does not require a puppetmaster. It requires only the convergence of three things that are each, individually, entirely explicable: a business model built on advertising revenue, an engineering culture optimized for engagement metrics, and a neurological system in the human brain that was shaped by evolution for conditions that no longer exist.
The human brain’s sensitivity to social reward — to the validation of the tribe, to the detection of threat, to the monitoring of status and belonging — was adaptive on the African savanna two hundred thousand years ago. It is not adaptive on a platform that has weaponized those same sensitivities to sell advertising. The like button is not a neutral social feature. It is a precision instrument that reaches into the limbic system and activates the same circuitry that once told your ancestors whether the tribe accepted them or was about to cast them out. The platform did not create that vulnerability. It found it, mapped it, and built a business model on top of it.
The result — documented by peer-reviewed research, confirmed by whistleblowers, acknowledged by a sitting Surgeon General, and mapped in disturbing detail by Singer and Brooking — is a system that produces the practical effects of a psychological operation without requiring anyone to have intended them. The polarization is real. The manipulation of political belief is real. The erosion of the shared epistemic ground without which democratic governance cannot function is real. The damage to adolescent mental health, documented in brain scans and depression rates and the Surgeon General’s own language of urgent public health crisis, is real.
The psyop does not require a puppetmaster. It requires a business model. And that is, in many ways, the more frightening origin story.
The psyop does not require a malevolent architect. It requires only a business model built on attention, an engineering culture optimized for engagement, and a neurological system that was never designed for what the algorithm knows how to do to it.
What 2096 Looks Like When the Feed Has Run for Seventy More Years
The world of Shards of a Shattered Sky does not have social media in the form we would recognize. It has something quieter and more total: an information environment so completely personalized, so exquisitely calibrated to the specific emotional architecture of each user, that the distinction between the world as it is and the world as the algorithm presents it has become a question that most people no longer think to ask.
They stopped asking because the feed was always right. Or rather: the feed was always satisfying, which is not the same thing as right but which the dopamine system, after seventy more years of careful refinement, can no longer reliably distinguish. The neural interfaces of 2096, already in nascent commercial development in 2025, have made the calibration more precise. The platform that once inferred your emotional state from your typing speed and your scroll patterns now reads it directly from the prefrontal cortex it has spent decades learning to navigate.
The surveillance architecture described in earlier posts in this series and the informational architecture of the 2096 feed are not separate systems. They are the same system, approached from different directions. One collects what you do. The other delivers what you feel. Together, they constitute the most comprehensive behavioral management apparatus in the history of human civilization — and it arrived not through conquest but through convenience, not through coercion but through the freely chosen click of “accept” on a terms-of-service agreement that nobody read.
The world of 2096 did not need to be built by a villain. It needed only the continuation of the present at its current pace, for seventy more years, without the serious and sustained regulatory intervention that the Surgeon General requested and the platforms resisted and the Congress declined to provide.
In 2096, the feed is not a window onto the world. It is the world, for most of the population that inhabits it. The distinction between what is and what the algorithm presents dissolved so gradually that there was no moment anyone could point to and say: there. That was when we lost the thread.
So Is It a Psyop?
Return to the question. Give it the answer it deserves, which is neither a paranoid yes nor a comfortable no.
If a psyop is an intentional, coordinated, centrally directed psychological operation designed by a malevolent actor to manipulate a target population against its own interests — then no. Social media is not, in its totality, a psyop. It is something that emerged from the convergence of incentives rather than the execution of a plan.
But if a psyop is a system that produces, at scale, the effects of a psychological operation — the manipulation of belief, the erosion of epistemic ground, the exploitation of neurological vulnerability, the cultivation of emotional dependency, the restructuring of political reality to serve the interests of those who control the information architecture — then the research, read honestly and in full, does not offer a comfortable alternative.
Singer and Brooking write that information literacy is no longer merely an education issue. It is a national security imperative. The reviewers who starred the book understood why. A population that cannot reliably distinguish the amplified from the authentic, the curated from the real, the emotional reaction from the considered judgment, is a population whose sovereignty has been quietly, profitably, and very nearly irreversibly compromised.
That is not a paranoid reading. It is the reading that the data supports.
Dystopian fiction did not invent this scenario. It described it. The description preceded the evidence by decades. And the evidence, arrived at last, looks disturbingly like the story that the genre has been telling since Huxley put down his pen.
The question was never whether it could happen. The question, as always, is whether we are paying attention.
Sources Cited
Peer-reviewed research, government advisories, investigative journalism, and the literary tradition underlying the argument this post makes.
LikeWar: The Foundational Text
- P.W. Singer and Emerson T. Brooking — LikeWar: The Weaponization of Social Media (Houghton Mifflin Harcourt, 2018) — https://www.amazon.com/LikeWar-Weaponization-P-W-Singer/dp/0358108470
- NPR Fresh Air — The Weaponization of Social Media and Its Real-World Consequences (interview with Singer and Brooking) — https://www.npr.org/2018/10/09/655824435/the-weaponization-of-social-media-and-its-real-world-consequences
- International Review of the Red Cross — LikeWar review and humanitarian implications — https://international-review.icrc.org/sites/default/files/reviews-pdf/2019-12/irrc_101_910_21.pdf
- NYU Global Affairs Review — Understanding the Future of War: A Review of LikeWar — https://wp.nyu.edu/schoolofprofessionalstudies-ga_review/understanding-the-future-of-war-a-book-review-of-likewar-the-weaponization-of-social-media/
Neuroscience of Addiction, Dopamine, and Social Media
- PubMed Central — Social Media Algorithms and Teen Addiction: Neurophysiological Impact (2025) — https://pmc.ncbi.nlm.nih.gov/articles/PMC11804976/
- PubMed Central — Modern Day High: The Neurocognitive Impact of Social Media Usage (EEG study, 2025) — https://pmc.ncbi.nlm.nih.gov/articles/PMC12329480/
- SAGE Journals — Dopamine-Scrolling: A Modern Public Health Challenge Requiring Urgent Attention (2025) — https://journals.sagepub.com/doi/10.1177/17579139251331914
- PubMed Central — The Emotional Reinforcement Mechanism of Social Media Addiction (2025) — https://pmc.ncbi.nlm.nih.gov/articles/PMC12108933/
- Psychiatric Times — The Empathy Crisis: How Social Media Algorithms Drive Emotional Numbing (2025) — https://www.psychiatrictimes.com/view/the-empathy-crisis-how-social-media-algorithms-drive-emotional-numbing
The U.S. Surgeon General’s Advisory
- U.S. Department of Health and Human Services — Social Media and Youth Mental Health: The U.S. Surgeon General’s Advisory (2023) — https://www.hhs.gov/surgeongeneral/reports-and-publications/youth-mental-health/social-media/index.html
- NCBI Bookshelf — Social Media and Youth Mental Health: Full text of the Surgeon General’s Advisory — https://www.ncbi.nlm.nih.gov/books/NBK594761/
- American Academy of Pediatrics — Surgeon General advisory on social media and youth mental health — https://publications.aap.org/aapnews/news/24543/Surgeon-general-advisory-warns-of-social-media-s
- Yale Medicine — How Social Media Affects Teen Mental Health: expert analysis of the Surgeon General’s findings — https://www.yalemedicine.org/news/social-media-teen-mental-health-a-parents-guide
Frances Haugen, Meta, and the Whistleblower Record
- The Markup — Investigative reporting on Facebook’s internal research and algorithmic decisions — https://themarkup.org
- Wired — Frances Haugen and the Facebook Files: what the whistleblower revealed — https://www.wired.com/tag/facebook/
- The Guardian — Frances Haugen testimony and Meta’s internal knowledge of algorithmic harm — https://www.theguardian.com/technology/facebook
- ProPublica — Investigative reporting on social media platforms and public harm — https://www.propublica.org
Algorithmic Design, Engagement, and the Business Model
- Frontiers in Psychology — Resistance or Compliance? The Impact of Algorithmic Awareness on Information Browsing (2025) — https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2025.1563592/full
- Center for Humane Technology — Social media design, engagement optimization, and human cost — https://www.humanetech.com/youth
- EFF — Surveillance, behavioral design, and the conditions for genuine autonomy — https://www.eff.org/issues/surveillance
- MIT Technology Review — AI, algorithmic design, and the attention economy — https://www.technologyreview.com
The Dystopian Literary Tradition
- The Guardian — Brave New World at 90: Huxley’s warning and the world that arrived — https://www.theguardian.com/books/aldous-huxley
- Lit Hub — Neil Postman, Amusing Ourselves to Death, and the prescience of Huxley over Orwell — https://lithub.com
- Tor.com — Fahrenheit 451: Bradbury’s parlor walls and the entertainment that became oppression — https://www.tor.com/tag/ray-bradbury/
- The Atlantic — What Orwell and Huxley understood about information, distraction, and control — https://www.theatlantic.com/entertainment/
- Electric Literature — Speculative fiction and the media environment it predicted — https://electricliterature.com/tag/speculative-fiction/
Disinformation, State Actors, and Information Warfare
- Atlantic Council Digital Forensic Research Lab — Disinformation research and state-sponsored information warfare — https://www.atlanticcouncil.org/programs/digital-forensic-research-lab/
- Freedom House — Freedom on the Net: annual report on internet freedom and information manipulation — https://freedomhouse.org/report/freedom-net
- Stanford Internet Observatory — Research on disinformation, state actors, and platform manipulation — https://cyber.fsi.stanford.edu/io/
- First Draft — Disinformation research, case studies, and documented information operations — https://firstdraftnews.org

