This is part one of a review which is continued here.
Concerns about the psychological and sociological effects of social media phone apps are as myriad and complex as the devices themselves, and as the economic sector attached to them.
Talking about them is a very difficult affair.
When Shoshana Zuboff distilled her research on the subject to fit between the covers of The Age of Surveillance Capitalism in 2018, her sprawling account was built upon a sociological framework spanning hundreds of years, bending around the narrative of the origins of behaviourism, an obscure school of psychology, and its challenge to the weighty question of human free will. Hanging from this framework was an endless inventory of news stories, confessions, technological white papers, university studies, quotes from tech executives, and succinct summaries of the precedents and theory required to make sense of it all.
Zuboff’s book is a dense, relentless read, designed to slowly chip away at every conceivable doubt in the minds of those most likely to resist its message. The work is done by the hammer, not the chisel. Reading it was not a happy experience but a necessary one. Every few pages, I found myself lowering the book from my eyes and just sitting alone for long minutes of somber contemplation.
The Anxious Generation, by comparison, is a slim, sleek, and finely pointed missile, self-propelling and slicing precisely toward immediate targets. The bowling pins that books like Surveillance Capitalism and documentaries like The Social Dilemma have set up, Jonathan Haidt aims to knock down. It does not intend so much to be a book of why. Rather, it is a book about what to do about it regardless of the why.
It’s a call to action, not for tech companies but for parents and school administrators—those most empowered to make an immediate impact on children's lives.
To deliver this call, Haidt pays only the scantest of lip service to criticisms of the tech companies whose business models overwhelm us today. So much so that you might find my relating of his The Anxious Generation to Zuboff’s work specious. The only common point of contact, really, is in the fingering of the direct culprit: the always-on, always-connected, sensor-laden supercomputers that have been smuggled into everyone’s pocket and purse over the past fifteen-odd years in the poor disguise of being “phones.”
Shame on us for falling for it.
Squint your eyes, and almost every book or conversation on this subject can be reduced to four words: “It’s the phone, stupid!” That is the one point the entire field of discourse holds in common. But talk is cheap. Haidt’s book, and the larger publicity campaign surrounding it, intends to organize society to actually do something about it, at least regarding our most vulnerable population.
“Put the phone down, kiddo!”
I’d like to see children playing more, growing up more, socializing more, and fiddling on their phones less. So would you, I assume. I flinch when I see young children mindlessly flicking through endless streams of garbage on a tablet or phone.
I die a little inside, actually. You know it’s a bad thing. I know it’s a bad thing.
The only thing stopping us from doing something about it is the head games of rationalizing it all. In that arena, there is room for a dozen more books as dense and thick as Zuboff’s to argue these affairs—and while we’re reading and arguing, the kids are losing one more year of their education to the internet.
It seems to me that a) we all want the same thing, b) we even agree about what we might do to get that thing, and c) we only disagree over why it would be good; why we’re doing it.
This is a very fortuitous place to be, provided those concerned about technology and media and their effects can overcome their differences. If we can, we can actually organize collectively and force saner norms and rules regarding free, embodied playtime for kids and phones in schools.
We need only do the work while remaining careful not to thwart our common goals with pedantry about who can articulate the justifications for achieving those goals the best.
That’s my worry: territorial squabbles over whose domain of expertise best explains why this is a good thing will cause fissures in what should be a unified movement. Or that fear of association, say, with Luddites or conservatives or radicals or progressive activists or technophobes or any other epithet, will interfere with solidarity needed to get kids to run around and socialize more and doomscroll less.
Resistance to the unobjectionable changes Haidt proposes will likely take the form of attacks on his research, his justifications, or anyone else who shares his goals. These attacks shouldn’t be emboldened by friendly fire. And heavy, defeatist sighs that “it’s not enough” or that “it’s too late” will only prevent us from demonstrating that meaningful change is, in fact, possible.
Collective action grows when possibilities are demonstrated. This is a time for yes/and coalitions between schools of thought. If one person’s motivations are shot down, others must bolster their own reasons, not agree with the attacker and deflate the momentum!
Jonathan Haidt is doing his best to give children, en masse, a better chance at a normal, down-to-earth childhood in the most practical, down-to-earth way possible. And you and I want him to succeed, so we must help in our own way. After all, when it starts working, the nature of the fruits being delivered will inform our arguments about why it is working.
With this in mind, I’d like to constructively take apart his book to understand how he’s delivering this call to action, so that even as we challenge his premises or analyses, we don’t interfere with the ends they are set up to arrive at.
Follow the Wavy Line
Taken by itself, the authority of Haidt’s book rests entirely on his adherence to his profession, social psychology. It is a profession today largely of statistics and statistical analysis. Without outside support, then, it might be by the veracity of his research and data-gathering, his adherence to method, and the quality of his analysis of his data that he manages to convince a skeptic of his case.
His data is mostly hospital intake records and other sources measuring youth mental illness and suicide rates in various countries across the world.
In the book’s opening chapter, he lays out all the charts demonstrating radical upticks coinciding with the release of iPhone and Android smartphones featuring front-facing cameras onto the global markets in 2010.
This is a side-channel attack on the problem: not an analysis of the tech directly, but rather the measure of indicators coming from entirely outside the boundaries of that domain.
This accounts for the sleekness of his book. Fifteen years into the experiment of normalizing computers for virtually everyone, enough data has accrued to demonstrate that the effects are substantial and not merely coincidental.
As a scientist, however, he must be conservative to the extreme in his reading of the facts he has assembled. Since his charts all inflect around the year 2010, it is the technological turn at that point that he must focus on. So, in order to provide context on those impacted at that time, children, he recapitulates themes of childhood interrupted in the ‘80s from his 2018 book The Coddling of the American Mind, co-authored with Greg Lukianoff, before jumping to the dawn of smartphone apps and front-facing cameras.
Before this deluge of pocket-sized, always-online screens, Haidt must tell us, “There was little sign of an impending mental illness crisis among adolescents in the 2000s. Then, quite suddenly, in the early 2010s, things changed... What on earth happened to teens in the early 2010s?”
As I’ll soon demonstrate, there were very good reasons to anticipate such a crisis—just none reflected in his charts. All he can do is tell us what the data shows him.
By adherence to professional conduct, what his charts show necessarily contours the shape of the story he can tell.
What happened, he can say, was a rise in cases of internalizing disorders, “in which a person feels strong distress and experiences the symptoms inwardly; the person with an internalizing disorder feels emotions such as anxiety, fear, sadness, or hopelessness. They ruminate. They often withdraw from social engagement.”
At the same time, measures of externalizing disorders, where a person acts out or takes needless risks, took a historically unprecedented drop.
Of course, Marshall McLuhan would take the story much farther back, more so than we have space for here. However, consider the nature of internalizing vs. externalizing disorders caused by smartphones when reading this particular passage:
The visual approach enabled [the Westerner] to analyse and fragment every kind of operation and experience, that is, to mechanize it. The electric circuit, however, ended the visual path of mechanization by its instant feed-back and its organic embrace. It took us, as it were, Through the Looking Glass into a world of non-Euclidean inner space. The Western world has typically concerned itself with the ordering of outer space, and with the classification of people and things as they relate to the outer world.
“To lose touch with reality” even now implies that a person has gone on some sort of inner quest which has divorced him from the outer world. We still tend to think of space as a visual thing that is outside of us, regarding it as some sort of a container. These concepts will not bear up under electronic conditions, and the ancient quest for visual orientation in the outer world increasingly yields to the trip through the looking glass into the multitudinous inner spaces and experiences generated by the race. With the electric circuit we leave the age of the wheel which carried us forward, and begin that trip back into ourselves so strongly indicated by the feed-back loop of the electric circuit.
—Marshall McLuhan, The Future of Morality: The Inner Versus the Outer Quest, 1967
...already in progress.
The early criticism I’ve seen of Haidt’s turn to studying youth and “The Great Rewiring” takes place entirely within the empirical domain of meta-analyses on many studies regarding “screentime” and various measures of psychological well-being.
This seems to be the wrong direction to take the discussion of his book.
The absolutely last place the humble reader should go after reading The Anxious Generation is deeper into the weeds of statistics of more scientific papers and the arcane practices and categories of contemporary mental health professionals!
The preponderance of evidence that technology affects our identity, our sense of embodiment, and, as Haidt succinctly puts it, interrupts how the “child’s brain is ‘expecting’ to wire up in a three-dimensional, five-sense world of people and things” does not lie in line charts and longitudinal studies. Again, these are side-channel registers—mental health experts are not computer interface designers.
Instead, I’d propose the reader conduct a cursory study of the vast historical corpus of literature on computers and the social analyses of its effects. From the psychology of interface design to sociological interviews of early computer and internet adopters to theoretical criticism of hypertexts and cybernetics/systems theory, we have decades of texts to draw from.
What you’ll find, as I have, is a very conscious awareness of why and how “virtual reality” was intentionally manipulating the means and proportions by which humans relate to the world and each other.
Hijacking and exploiting our senses—the way we relate to and understand the material, outer world as embodied beings—has always been the explicit intention of computer systems, systems designed to augment or replace our knowledge and rational processes by externalization. That’s the entire point! Criticisms of tech companies have also always been very adamant on this explicit point.
The virtual class possesses a new body type modeled on the requirements of life in the age of the post-human. No longer the body human, but it is the virtual class as cybernauts who register in the flesh every twitch of techno-culture. Always hystericized because driven from within by feelings of hyper-anxiety over demands for the new in technoculture, and partly inferior before the technical momentum of the virtual reality machine, cybernauts are perfect nihilists. A technologically-steered class, they face outwards with an overwhelming sense of contempt, but interact with each other on the basis of real confusion and fear over their constantly changing status in the commercial command hierarchy of techno-culture.
—Arthur Kroker and Michael A. Weinstein, Data Trash: the theory of the virtual class, 1994
A Book for the Mass Reader
The primary value of Haidt’s analysis is not that he’s discovered some previously unrecognized causal relation between personal computing and anxiety.
There were more books from the ‘90s anticipating and detailing the precise reasons for widespread mental health issues resulting from computer usage than would fill a small library.
To all but the most narrow-minded or self-interested of specialists, the effects his statistical data-gathering aims to prove are foregone conclusions. But just knowing something is true does not, today, give us permission to argue that it is so. What makes The Anxious Generation important is that it speaks the exact language and trades in the correct currency of evidence preferred by today’s mass reader.
By mass reader, I mean the average educated consumer whose intellectual diet since graduation consists largely of pop-science books purchased at airport bookstores. They are short on time but wish to maintain their credibility to the world at large as both informed and science-minded. Their prudence, then, is expressed in only adopting stances that confer safety in numbers: staying power within top-ten lists and a sufficient series of successful media appearances by the author.
Given a book’s success, they will honor the establishment of its canonical status by adopting its language and trading in new coinages, which usually appear as the titles of such books.
In other words, Haidt is the right person (an already established and known public intellectual) with the right rhetoric (striking the chords of wise parenting and community leadership informed by the scientific authority of empirical data) to inspire a sufficiently large audience to feel confident arguing articulately for change in the world.
Let’s consider the pop-intellectual field within which his book must fit in order to see the benefits here.
The most recent book akin to Haidt’s that I can think of is Nicholas Carr’s The Shallows, first released fourteen years ago in 2010, the first year in which Haidt records the significant uptick in mental health issues among children.
Again, the same themes occur. (I’ve bolded a word which I’d like to refer back to later.)
In an article published in Science in early 2009, Patricia Greenfield, a prominent developmental psychologist who teaches at UCLA, reviewed more than fifty studies of the effects of different types of media on people’s intelligence and learning ability. She concluded that “every medium develops some cognitive skills at the expense of others.” Our growing use of the Net and other screen-based technologies has led to the “widespread and sophisticated development of visual-spatial skills.”
We can, for example, rotate objects in our minds better than we used to be able to. But our “new strengths in visual-spatial intelligence” go hand in hand with a weakening of our capacities for the kind of “deep processing” that underpins “mindful knowledge acquisition, inductive analysis, critical thinking, imagination, and reflection.” The Net is making us smarter, in other words, only if we define intelligence by the Net’s own standards. If we take a broader and more traditional view of intelligence—if we think about the depth of our thought rather than just its speed—we have to come to a different and considerably darker conclusion…
Whereas the infrequent multitaskers exhibited relatively strong “top-down attentional control,” the habitual multitaskers showed “a greater tendency for bottom-up attentional control,” suggesting that “they may be sacrificing performance on the primary task to let in other sources of information.” Intensive multitaskers are “suckers for irrelevancy,” commented Clifford Nass, the Stanford professor who led the research. “Everything distracts them.” Michael Merzenich offers an even bleaker assessment. As we multitask online, he says, we are “training our brains to pay attention to the crap.” The consequences for our intellectual lives may prove “deadly.”
—Nicholas Carr, The Shallows, 2010
The Shallows singles out the discovery of “neuroplasticity” as providing “the missing link” in explaining how technology usage affects our brains.
The book that popularized the term neuroplasticity, The Brain That Changes Itself by Norman Doidge, had come out to wide acclaim three years prior, in 2007. It’d be preposterous to assume, as one might, that neuroplasticity—the idea that the brain undergoes physical changes when broad swathes of neurons are culled or reinforced—was a new, unheard-of idea at the time of the book’s release.
What Doidge's book takes account of is the build-up of sufficient evidence to fully integrate this understanding deeply enough across fields so as to signify a basic change across the whole modern scientific paradigm—a cultural change among professionals. Doidge’s book, in turn, was the event that carried this scientific, cultural change into the wider intellectual culture.
The location of the chain wherein any “link” might have been “missing” was not in the science—it was in the wider, socially constructed narratives in popular culture. Among the intellectual culture, that is, who talks about (or even buys) airport books.
Top-ten best selling books sold at airports are not in the business of scientific paradigm shifts.
They enact and ride upon paradigm shifts in popular science. Science as it exists in the public mind. And the world of pop science is always welcoming new readers and always reinventing itself so as not to be too formidable to enter. That’s just the book market. Every new book must, in its way, either invent and teach its premises from scratch or ride in on the wave caused by some other book that was notable and memorable enough to support it.
And so Carr could summarize “neuroplasticity” in a few words, relying on the lingering notoriety of Doidge’s coinage to grant him authority.
Haidt, in turn, references Carr’s book, trusting that it has stayed in cultural memory since its publication.
It’s worth stating the obvious here: you likely remember The Shallows, as I do, as a relatively recent cultural event. But children born at the time of its publication are now entering high school. This, I suspect, is the maximum window for cultural memory on the latest and greatest science as taught by buzzwords and coinages from books most people have heard of, if not actually read.
I bring this up in detail because this phenomenon—the mass reader and the pop-intellectual—is a sociological phenomenon created by electronic media, inherited by computers, and pushed by the smartphone to the extremes witnessed today.
It is, I’ll claim, precisely what Haidt’s bar charts are measuring! And so, before moving on to a discussion of Haidt’s chapters on metamorphoses and ritual initiation, spirituality, and the harms and solutions of tech in part two, let’s wrap up part one by examining this phenomenon more carefully.
A Trojan Horse for Postmodern Theory
I am not qualified to say much more about the social psychology that opens Haidt’s book and grounds the arguments made in its subsequent chapters. I trust that he is a professional, and I take it for granted that he has done his work scrupulously. My strengths are computer history and the works of Marshall McLuhan.
From that ground, I’ll do my best to wade into the wider waters of books on computers and their effects on society and children.
So let’s again pick up the line I laid down earlier about how, in Haidt’s words, the “child’s brain is ‘expecting’ to wire up in a three-dimensional, five-sense world of people and things.”
Popular books on computers and cyberculture from the ‘90s are long forgotten today. There is no money in re-re-publishing them, and the language they use feels as quaint as the computer in your attic feels “ancient.”
Whereas we all know today to speak of “social media platforms” available as “apps” on “smartphones,” these books speak in the baroque, foreign language of “computer-mediated communities” on “websites” and “newsgroups” and “bulletin board systems” as “software” on “personal computers.” Who talks that way anymore? Who could win an argument in a bar or at a school board meeting debating the nature of 30-year-old technology?
The readers who remember these books are retired and carry no influence.
You’ll never find them in a mall bookstore, and they will not be reissued. They already were. They are on some university reading lists, but those readers know that the public has forgotten them. Everyone assumes that newer science is better than old science, that things have progressed, and that books from thirty years ago must be out-of-date in some way that renders them less optimal to read than some new book on the same subject.
These barriers are, I believe, substantial enough to prevent anybody from realizing that these books have all the answers to everything that the pop-science, mass-reading intellectual is desperate to answer today. And so, these cultural barriers must be broken in order to advance the discourse and inform our conversations today.
To read old books on this subject would introduce depth to the understanding of our technology. Historical depth and technical depth. And depth, rather than surface, is precisely what electronic technology, aided and abetted by easy-to-use computers, obliterated from our collective sense-making. In a post hoc, retrospective way, I’ll refer to this as post-modernism.
After all, this is what all the writers in the ‘90s were doing. Readers looking for answers to what postmodernism is are waylaid by theorists and art criticism—they don’t realize that all their questions are actually answered in old books about media ecology, cyberspace, and computer culture!
As I say, the effects are older than computers.
Take, for instance, this century-old testimony from Modernist artist Wyndham Lewis on the creation of a “one-a-day” world by mass media advertisements:
An obsession with the temporal scale, a feverish regard for the niceties of fashion, a sick anxiety directed to questions of time and place (that is, of fashion and milieu), appears to be the psychological concomitant of the possession of a time-theory that denies time its normal [linear] reality…
The world in which Advertisement dwells is a one-day world. It is necessarily a plane universe, without depth. Upon this Time lays down discontinuous entities, side by side; each day, each temporal entity, complete in itself, with no perspectives, no fundamental exterior reference at all. In this way, the structure of human life is entirely transformed, where and in so far as this intensive technique gets a psychologic ascendancy. The average man today is invited to slice his life into a series of one-day lives, regulated by the clock of fashion. The human being is no longer the unit. He becomes the containing frame for a generation or sequence of ephemerids, roughly organized into what he calls his ‘personality....’
The less reality you attach to time as a unity, the less you are able instinctively to abstract it; the more important concrete, individual, or personal time becomes… The less ‘individualist’ you will be in the ordinary political sense. You will have achieved a fanatical hegemony with your unique self-feeling.
—Wyndham Lewis, Time and Western Man, 1927
It’s strange how books from a hundred years ago can feel more familiar and relatable than those written by authors we are still in a generational rivalry against.
Haidt’s book, for its part, does the reader a great service by leading them right up to the threshold of understanding and entering into this world of ‘90s cyber-culture mass literature.
He leads the reader right up to the precipice of understanding the situation.
It is my job, I figure, to push them off. (Starting with you, dear reader.)
Haidt leads you to the threshold repeatedly with his recurring reference to “the virtual world.”
We use the terms “virtual” and “digital” so often now that they’ve lost any mystery to us—when, in fact, a moment’s contemplation will reveal the insanity of that comfort.
“Wait... there is a world? Inside my computer?”
So as to feel sophisticated, we pretend to know what that means and not be surprised. It doesn’t seem adult to wonder about the nature of objects we’ve had around us for forty years, to ask questions we think have been answered. We have to get over ourselves for the sake of a generation of kids who never received those answers when society was still naively asking questions.
When you read or talk about The Anxious Generation, point out the term. Count its occurrences.
“Virtual world.”
What does that even mean? What kind of world is it? Is it in this one? Is this one in it? Are we in it? Does it have depth? Or is it all surface? All shallow? Is it real, or is it an illusion?
Parents and school board directors aren’t going to read academic philosophical literature on the subject, of which there is plenty.
But they might be able to handle the sort of pop science that addressed and answered these questions in the ‘90s when the technology was new and exciting.
When the potential dangers were still only a projected potential concern, not a manifest reality.
When they came out, these books were pop culture books. Today, they are history books, and history books are exactly what we should be reading to find our place in time and space.
History is important—especially for an age as forgetful and lost in time as ours.
Old Books on Computers
Within their fields, the people designing computers wrote very technical papers and books explaining precisely how they were exploiting developmental psychology to make them “easy to use.”
Studies on the sensory manipulations of VR technology in the ‘90s brought terms like “proprioception,” “cyberspace,” “virtuality,” “avatars,” and “the digital domain” into the common technical lexicon.
This was empirical, lab-grown, old-fashioned Western science, as valid today as it was then. Computer interfaces hacked into our sense of how the real world works, and thirty years ago, we learned more or less everything we needed to know in order to create immersive, interactive illusions of space that our body thinks of as real.
This is not speculation. It was the most popular, well-known, and widely understood aspect of the science behind the world's most profitable industry: the computer industry.
The difference between then and today is that, back then, industrial designers like Don Norman, world-famous for writing The Design of Everyday Things (1988) and popularizing the term “affordances,” were calling for The Invisible Computer while working at Apple in 1998.
Today, we actually have it.
In the so-called “softer” fields of the humanities, writers read and interpreted these technical documents.
The study of these massive technological changes produced a proliferation of criticism and theory regarding the implications for human identity and our sense of embodiment. Popular books like Howard Rheingold’s 1991 book Virtual Reality or 1993’s The Virtual Community neatly bridged the gap between science and the arts by synthesizing both aspects into narratives of vast cultural changes dawning upon humanity.
Also in 1993, Douglas Rushkoff’s book Cyberia established the new cyber-culture as a resurrection of ‘60s psychedelia, marrying its aesthetics with the burgeoning, still-anarchic underground rave scenes. In 1996, with Media Virus, Rushkoff coined the “viral” metaphor of information in his analysis of the Rodney King tape. The language of viral information was long part of the marketer’s lexicon before entering into wider use with the “social media” of “Web 2.0.”
In the humanities departments, William Gibson’s Neuromancer is unanimously credited with founding the cultural bedrock for our understanding of “cyberspace.”
Feminism, given its concerns with the sidelining and marginalization of women’s bodies in a world designed by and for men, has long been the home for theories of divergent conceptions of embodiment. The world behind the screen turns us into cyborgs, said Donna Haraway in A Cyborg Manifesto in 1985.
From this foundational work, the post-humanities developed throughout the ‘90s, elaborating the condition of developing within a world of simulation, where computers had long penetrated into and buried their cybernetic feedback hooks into your body and soul. Rosi Braidotti builds upon the work of her teacher, Gilles Deleuze, with her triumphant account of overthrowing the Western “Vitruvian Man” of rationalism by merging with machines.
Today this discourse continues under the wider umbrellas of diversity studies, especially regarding gender and sexuality. What, after all, is more embodied than that?
Sherry Turkle writes in Life on the Screen (1995) about how her appreciation for French post-modernism—the subject of her first book—was only rendered concrete for her when she saw how the Macintosh interface created a new aesthetic of shallow, non-linear illusions:
In the late 1960s and early 1970s, I lived in a culture that taught that the self is constituted by and through language, that sexual congress is the exchange of signifiers, and that each of us is a multiplicity of parts, fragments, and desiring connections.
This was the hothouse of Paris intellectual culture whose gurus included Jacques Lacan, Michel Foucault, Gilles Deleuze, and Felix Guattari. But despite such ideal conditions for learning, my "French lessons" remained merely abstract exercises. These theorists of poststructuralism and what would come to be called postmodernism spoke words that addressed the relationship between mind and body but, from my point of view, had little or nothing to do with my own…
With computers we can simulate nature in a program or leave nature aside and build second natures limited only by our powers of imagination and abstraction. The objects on the screen have no simple physical referent. In this sense, life on the screen is without origins and foundation.
It is a place where signs taken for reality may substitute for the real.
Its aesthetic has to do with manipulation and recombination…
The notion of worlds without origins is close to the postmodern challenge to the traditional epistemologies of depth. These epistemologies are theories of knowledge where the manifest refers back to the latent, the signifier to the signified. In contrast, the postmodern is a world without depth, a world of surface.
If there is no underlying meaning, or a meaning we shall never know, postmodern theorists argue that the privileged way of knowing can only be through an exploration of surfaces. This makes social knowledge into something that we might navigate much as we explore the Macintosh screen and its multiple layers of files and applications. In recent years, computers have become the postmodern era's primary objects-to-think-with, not simply part of larger cultural movements but carriers of new ways of knowing…
Fredric Jameson wrote that in a postmodern world, the subject is not alienated but fragmented. He explained that the notion of alienation presumes a centralized, unitary self who could become lost to himself or herself. But if, as a postmodernist sees it, the self is decentered and multiple, the concept of alienation breaks down.
All that is left is an anxiety of identity. The personal computer culture began with small machines that captured a post-1960s utopian vision of transparent understanding.
Today, the personal computer culture's most compelling objects give people a way to think concretely about an identity crisis. In simulation, identity can be fluid and multiple, a signifier no longer clearly points to a thing that is signified, and understanding is less likely to proceed through analysis than by navigation through virtual space.
—Sherry Turkle, Life on the Screen, 1995
I could go on for many more pages of evidence of just how widespread cultural awareness regarding the effects of so-called cyberspace—or virtual reality, or digitization, or computerization—was in the ‘90s.
It wasn’t exactly the Stone Age. It’s not as though the science or discussions we have today have supplanted or obsolesced or disproven everything we knew then. All that’s happened is that we threw out the old language with the old desktop computer and needlessly started the conversation from scratch.
(This time, without the pesky concerns for privacy that would have killed the social media industry in its crib.)
If I were to advise you to read one book, among those I’ve mentioned, it would have to be—with a very strong caveat—Sherry Turkle’s Life on the Screen.
Once you understand her mistake of characterizing Microsoft as a bastion of freedom, as I spell out, it stands as the best entry into all the rest. Because once you move up from the ground of the computers themselves, the discussion gets more and more psychedelic.
So, too, fortunately, does Haidt’s flirting with the threshold.
There are many books in other fields. For now, consider this passage from Erik Davis, longtime scholar of the schizophrenic sci-fi author Philip K. Dick (you have seen many adaptations of his work) and a recent interviewee in the 2021 documentary A Glitch in the Matrix, from his 1998 book TechGnosis:
After convincing Congress to plow $30,000 into his project, Morse strung up a wire between Baltimore and Washington, D.C. The first official message careened along that Baltimore—D.C. line in 1844, and it was a strangely oracular pronouncement: “What hath God wrought!”
This bit of scripture was suggested by the daughter of the U.S. commissioner of patents, though Morse himself surely concurred with the sentiment; besides being the son of a staunch evangelist, he would later transfer a good portion of his considerable fortune to churches, seminaries, and missionary societies. Still, the first telegraphed message reads as much like an anxious question as a cry of glee, and today we know the answer: What God wrought, or rather, what men wrought in their God-aping mode, was the information age…
Writing about the telegraph in Understanding Media, Marshall McLuhan argued that “whereas all previous technology (save speech, itself) had, in effect, extended some part of our bodies, electricity may be said to have outered the central nervous system itself.”
For McLuhan, Morse’s electric ganglion was only the first in a series of media—radio, radar, telephone, phonograph, TV—that served to dissolve the logical and individualistic mindframe hammered out by the technologies of writing and especially the modern printing press.
Instead, the telegraph sparked the “electric retribalization of the West,” a long slide into an electronic sea of mythic participation and collective resonance, where the old animist dreams of oral cultures would be reborn among electromagnetic waves. But McLuhan also saw the collective “outering” caused by the telegraph as the technological root of the age of anxiety.
“To put one’s nerves outside,” he wrote, “is to initiate a situation—if not a concept—of dread.”
—Erik Davis, TechGnosis, 1998
McLuhan’s public work was largely an attempt to demystify the sensations of Gnosticism, which he knew could be exploited by cult leaders and malignant “artists.”
Davis is, then, entirely correct to include McLuhan’s analyses in his appraisal of modern gnosticism created by technology.
His strategy of going back to the telegraph is corroborated by Andrew Gaedtke’s research in his 2017 book Modernism and the Machinery of Madness. In it, Gaedtke examines the works of Samuel Beckett, Flann O’Brien, Evelyn Waugh, and Wyndham Lewis to observe how technology affected embodiment and the sense of reality for sensitive minds from the very start of the 20th century.
These matters have already brought mainstream culture down many strange avenues, such as the popularity of simulation theory.
This theory, that we are living inside a giant machine, was the subject of Glitch, which I reviewed for New Explorations after its release.
It is certainly not the only effect of cyberspace upon our sense of embodiment, and scrambling back to the literature that best meets us at the story’s true inflection point—‘90s computer culture—is a necessary step for gaining our bearings.
It is by entering the depths of history that we can stop skittering along the surface of our world of shiny screens and shallow, circular language for discussing the matter of technology.
We will consider this further while we consider Haidt’s chapters on ritual initiation, metamorphoses, and spirituality, and we will also supplement his calls to action on the part of parents and school administrators in part two of this review.
Clinton writes for Default Wisdom, and on his site, Concerned Netizen. He’s also the best McLuhan tutor around. If you’re interested in tutoring, just respond to this email.
Looking for Katherine Dee? Here’s some new stuff around the web:
How emo set the scene for Gen Z for The Blaze (a much, much, much longer emo piece coming here!)
Reddit’s saving grace for UnHerd
Digital community interviews are BACK, and the book club resumes with Blake Butler’s Molly on April 5 in Chicago, USA
OK, for perspective, I am an older GenXer. This will give context to my perspective.
Three main things:
1) I am extremely skeptical when someone blames a technology for ANYTHING. A hammer can build a house or bash in a skull.
2) Is it the phones, or is it social media? Is social media the medium that causes us to consume "too much screen" or "too much phone"?
3) The cure is going to upset some people, and send a bunch of sacred cows to be ground into White Castle sliders.
Over the last 40 years or so--really, as GenX moved into adolescence--the following ideas have become deeply rooted like weeds:
1) Adolescents and even 18-20YO young adults are incapable of making adult decisions. This has resulted in the progressive "legal ageing" of everything. Here's where the sacred cow goes to the slaughterhouse: young people have internalized this idea that they are incapable of "adulting", not just when it comes to vices but also the positive aspects of "growing up".
2) The world has become a progressively more dangerous place. This is utter B.S. and the statistics prove this out.
3) If harm befalls a kid, it's because of bad parenting, and perhaps the authorities should get involved. God bless Lenore Skenazy and her work.
The net-net result is a LOT less touching of grass, a lot less meeting in meatspace. Why deal with graduated driver's licenses and other restrictions and mall curfews when you can do your own thing online? My list of meatspace restrictions could go on for hours.
All of this needs to be rolled back, immediately. Yes, KIDS WILL DIE from injuries. Deal with it! Yeah, I said it. But it will help prevent kids dying from suicides and diabeetus and other metabolic issues down the road.
And all the safety whores (yeah, again I said it--no offense intended to sex workers, happy to come up with a better word) need to accept the blame for their misdeeds and "receive consequences". As usual, those in authority with the blood on their hands will escape blame.
I’m always excited to see when Clinton posts. And this review certainly delivered. We’re going to sound like old maids banging on about the history of technology, but it seems so crucial to open the historical window into the 20th century. Heck, even the 19th. Educating adults on the historical development and commentary on technology is crucial, but I wonder - how are schools tackling this subject? Where is the k - 12 curriculum covering “digital literacy”? Has it been mandated in any jurisdictions? This feels like a critical piece of education for any subject of a technological society. I suppose the lack of adult education in these subjects delays the implementation of education reform that targets digital literacy. Let’s hope Haidt’s book has the intended effect and gets the mass educated reader to take some action in that general direction.