The Great Neurodiversification of America
This was not merely the spread of “wokeness” but the quiet triumph of procedural morality, hypersensitivity, and literal-rule reasoning as governing virtues.
In the summer of 2020, I read a story about an engineer at Google who had been fired for writing an internal memo expressing his opinion of the then-budding practice of DEI. It was a turning point in America. The “woke” era was in full swing and was now openly enforcing its moral verdict: prioritize psychological “safety” over the open exchange of opinions. What happened to Larry Summers at Harvard fifteen years earlier had been a gender revolution. What happened at Google was a cognitive one. The new cultural hierarchy that emerged from that episode is what I choose to call the Great Neurodiversification, the slow but decisive transfer of social authority from neurotypical intuition to neurodiverse rationality. Everything that followed—speech codes, trigger warnings, algorithmic fairness audits—can be traced to that shift.
The woke argument is simple but unsettling. I do not believe wokeness is a new ideology, nor a revival of Marxism, nor a spasm of moral panic. In my view, it is the political expression of a demographic fact: the rising influence of brains that think literally, feel intensely, and organize morality as a system of rules. The culture of cancellation, for instance, is not merely emotional hysteria but the moral logic of pattern recognition applied to situational ethics. Once our institutions fill with people who experience inconsistency as danger and ambiguity as threat, moral life begins to resemble computer code. Every violation of procedure must be corrected, every improvised remark investigated. Appeals to reason or common sense are useless because the offense is not cognitive but structural; it lies in the pattern itself.

Most people still think of neurodiversity as a medical category rather than a social force. They see accommodation policies or diagnostic labels, not an attempt to rewire public life. Yet over the past two decades the norms associated with neurodiverse cognition—explicit communication, emotional precision, intolerance for uncertainty—have become the default expectations of our hyper-digital culture. The university classroom is redesigned around trauma awareness, the workplace is reoriented toward psychological safety, the social network punishes ambiguity with public outrage—each reflecting a progressive civilization rearranged to match the sensitivities of a new cognitive majority. The result is not who sits in the corner office but how everyone inside the office now thinks.
The transformation is visible in the same places Helen Andrews, the author of Boomers: The Men and Women Who Promised Freedom and Delivered Disaster, has identified as markers of feminization: law, journalism, medicine, education. The Education for All Handicapped Children Act of 1975 (later renamed the Individuals with Disabilities Education Act) and the Americans with Disabilities Act of 1990 established a permanent legal infrastructure for accommodation. The fifth edition of the Diagnostic and Statistical Manual of Mental Disorders in 2013 widened the spectrum, normalizing self-identification and embedding neurodiversity language into human-resources codes. As diagnoses and self-diagnoses proliferated, the cognitive profile of entire professions shifted. By the early 2020s, nearly one in five college students reported some form of neurodivergent condition, and universities adjusted accordingly. The guiding principle of the modern institution is no longer excellence but predictability. Emotional safety has become the new fairness. Microaggressions, anyone?
Every ideology has its value hierarchy. The older American one prized freedom, risk, spontaneity, and informal trust. The new psychosocial order inverts these priorities. Consistency outweighs creativity, procedure replaces discretion, emotional regulation counts far more than courage. Where previous generations treated empathy as intuition, the new generation treats it as compliance with approved language. Feeling is measured by form. The result is a kind of algorithmic morality—rational, exacting, and profoundly humorless. Have you noticed how dour the expressions are of people marching for one progressive cause or another these days?
The shift is easiest to observe in personal communication itself. The neurotypical mind relies on context, tone, and improvisation; the neurodiverse mind depends on explicit rules and predictable cues. Digital media, being text-based and asynchronous, favors the latter. Online, every statement stands naked of irony; every misstep can be easily archived and scored. “Speech equals violence” is not hysteria but cognitive literalism. Ostracism, the old name for cancel culture, functions not as vengeance but as moral housekeeping. In this sense, social media is less a public square than a perpetual group project run by its most rule-sensitive participants.
Evolutionary psychology offers a clue to why this cognitive style dominates the digital age. Baron-Cohen’s systemizing intelligence—the ability to build and maintain abstract systems—was once confined to narrow technical domains. In a world mediated by code, it is at risk of becoming the governing intelligence of civilization. The engineer, the lawyer, the compliance officer, and the HR specialist now speak the same procedural language. Conflicts once settled by intuition or reconciliation are now prolonged indefinitely because the system has no mechanism for forgiveness. Every error awaits correction; every correction can spawn new rules. Many people just give up trying to be right.
The Covid pandemic years made this temperament universal. Bureaucratic empathy fused with technological rationality and produced a culture of cognitive safety. Lockdowns were justified on grounds of emotional risk once closures and distancing could no longer be justified in medical terms; protests were reclassified as therapeutic necessity. Workplaces migrated to screens, where affect was flattened into protocol. What looked like political polarization may in fact have been cognitive sorting: the intuitive versus the procedural, the context-sensitive versus the rule-bound. By 2025 the neurodiverse ethos—predictability as virtue, spontaneity as threat—has become institutional orthodoxy in academia, the mainstream press, and many government institutions.
Law and governance illustrate the outcome. If feminization once softened the law through empathy, neurodiversification may harden it through over-specification. In today’s America, statutes have multiplied to cover every contingency, leaving no space for prudence or even for common sense. Discretion becomes bias; mercy becomes inconsistency. Administrative justice now functions like software: if input A produces output B, no appeal to circumstance can intervene. Human judgment, being unpredictable, is treated as error. In many quarters, what was once the moral conscience of the profession has been automated out of it.
This American experiment is historically unprecedented. Other civilizations honored seers, monks, or savants but never allowed their mental style to shape the whole social order. The United States may be the first to institutionalize cognitive variance as moral authority, the Neurodiverse States of America. Has diversity of brains replaced diversity of beliefs? I am afraid the resulting polity may end up being neither democratic nor tyrannical but procedural: a government of checklists, by checklists, for checklists, with a list of lists to govern them all.
The mechanisms that enforced feminization now enforce neurodiversification. Anti-discrimination law required numerical parity between sexes; accommodation law requires psychological comfort for every mind—especially minds that have been arbitrarily classified as particularly in need of such accommodation. The proliferation of diagnostic categories turned therapy into infrastructure. Corporate “neuro-inclusion” programs reward emotional sensitivity and penalize spontaneity. What began as compassion became compulsion: the right to never feel uncertain, disturbed, or confused. Employers comply not because they believe in this ethic but because the penalties for non-compliance are as subtle and wide-ranging as they are immense. A culture designed to include everyone inevitably privileges those least able to tolerate ambiguity. Could this be the reason why a discredited political-economic model such as Marxism is suddenly in fashion?
Once established, these norms reinforce themselves. Every new rule generates another. Universities, fearing liability, extend accommodations from exams to syllabi, from syllabi to discourse itself. The classroom becomes a padded room. In workplaces, mediation replaces conversation; in journalism, sensitivity review replaces editing. The logic of protection is infinite. Karens everywhere are the new thought, speech, and behavior police. Institutions no longer stop at parity; they overshoot balance entirely, becoming ever more homogeneous in thought while congratulating themselves on diversity of brain type.
The enormous cost of this cognitive monoculture is growing withdrawal by those whose strengths lie in intuition and social flexibility. Neurotypical individuals—the improvisers, the negotiators, the socially fluent—find little room in an environment governed by protocols and scripts. These are the people so aptly described by Alexis de Tocqueville. The very qualities that once stabilized communities—humor, contextual judgment, informal consensus—are recast by some of the new literati as liabilities. As these Tocquevillian Americans retreat, innovation declines and bureaucracy fills the vacuum. The country gains emotional safety and loses vitality. The uniformity of Marxism replaces the energy of capitalism.
Empirical research may help explain the imbalance. Studies by Baron-Cohen and Soulières show that many on the autism spectrum score higher than average on nonverbal intelligence tests, particularly in pattern recognition and abstract logic. Yet meta-analyses by Norton and Zeinoun (2017) and Korpela et al. (2022) confirm lower scores in emotional intelligence and affective empathy. The very same neural architecture that enables exceptional analysis often impairs social inference. Institutions that equate intelligence with systemizing ability thus privilege high-IQ, low-EQ cognition—minds brilliant at detecting inconsistency but extremely, I would say pathologically, uneasy with human complexity. As workplaces, universities, and media adopt these priorities, the neurotypical majority becomes the new cognitive underclass. Their gift for reading context, managing emotion, and maintaining cohesion is replaced by procedure. Compassion turns bureaucratic, conversation therapeutic, friendship conditional. The social contract is rewritten in pseudo-clinical language.
The marginalization of the neurotypical, once known as the “salt of the earth,” does not make neurodiverse people villains or destroyers of culture. It simply exposes the danger of any single cognitive type dominating the cultural ecosystem. When the intuitive are silenced, institutions lose warmth; when the systemizers prevail unchecked, empathy becomes formulaic. As Daniel Goleman argued, emotional intelligence is the social technology that keeps the intellect fully human. A civilization that forgets this balance may achieve perfect order and still collapse from loneliness. In fact, loneliness is said to be on the rise in America, especially among the younger generations. Why?
Criticism of this new social order remains taboo. To question neurodiversification is to risk charges of ableism or cruelty or racism. Yet even sympathetic scholars now note the strain of “context collapse” in public discourse—the inability to interpret statements beyond their literal text, the bureaucratization of compassion, the algorithmic policing of morality. Behind these symptoms lies a deeper anxiety: that the culture’s moral software has outgrown its emotional hardware. The question is not whether neurodiverse perspectives should be included, because it is lapalissian that they should be, but whether inclusion has become assimilation—the replacement of social intuition by cognitive process.
The Great American Neurodiversification, then, is not a tragedy, but it is fraught with amber warnings. Is the problem the rise of different minds or the disappearance of complementarity between them? I believe a truly healthy civilization needs both: the pattern-seeker and the storyteller, the coder and the diplomat. I am afraid that in many quarters we have traded spontaneity for stability, and context for control. If it is true that America has become more precise and more self-aware, could it be that it is also less alive? Our institutions are filled with intelligent people; why do they no longer know how to forgive? If the country, and any other country for that matter, is to remain humane, it must relearn how to let the rational and the relational coexist—how to make room once again for the messy genius of the neurotypical alongside the ordered brilliance of the neurodiverse.
