“I believe that a desirable future depends on our deliberately choosing a life of action over a life of consumption, on our engendering a lifestyle which will enable us to be spontaneous, independent, yet related to each other, rather than maintaining a lifestyle which only allows to make and unmake, produce and consume – a style of life which is merely a way station on the road to the depletion and pollution of the environment. The future depends more upon our choice of institutions which support a life of action than on our developing new ideologies and technologies.”— Ivan Illich, Deschooling Society (1971)
Programming note: Glad so many of you seem to have found the audio version useful. You are now able to follow the newsletter as a podcast on iTunes, Spotify, and Stitcher, if by “podcast” we understand simply me reading the main essay in a less than animated manner. Just click “Listen in Podcast app” above. Hope that covers most of you who find that feature useful. And sure, feel free to leave a rating or review if you’re so inclined.
It’s been just about thirty years since sociologist James Hunter published Culture Wars: The Struggle to Define America. The titular phrase has since become a staple of the public’s discourse about itself, although, as all such phrases tend to do, it has floated free of its native context in Hunter’s argument about the then-novel rifts in American public life.
Lately, I’ve found myself having recourse to the phrase more than I ordinarily would. I’ve typically avoided it because the phrase itself seemed to encourage what it sought merely to describe. In other words, as an analytical frame, the phrase conditions us to think of cultural dynamics as warfare and thus locks us into the dysfunctions such a view entails. “Metaphors we live by” and whatnot. Despite my misgivings, however, the phrase is undoubtedly useful shorthand for much of what transpires in the public sphere. So, for example, when someone asks about the face mask fiasco here in the U.S., I’ve simply suggested, perhaps too glibly, that face coverings were regrettably enlisted into the culture wars. But here is another problem to consider: precisely because the phrase is so useful as shorthand, it may be deterring us from the work of thinking more carefully and more deeply about our situation. Concepts, after all, can both clarify and obscure.
But I’ve also been thinking about the “culture wars” frame because, whatever else we might say about such conflicts, they have not remained static during the nearly three decades since Hunter wrote his book. One key development, of course, is that those thirty years overlap almost exactly with the emergence of the digital public sphere. In fact, several weeks ago what seemed like a useful analogy occurred to me: The advent of digital media has been to the culture wars what the advent of industrialized weaponry was to conventional warfare.
With that thesis in mind, I thought it might be useful to revisit Hunter’s work to see if it can still shed some light on our present situation and, specifically, to explore what difference digital media has made to the conduct of the “culture wars.” For brevity and clarity’s sake, although I’m not sure I’ve succeeded on either front, I’ve chosen to outline a few key points for our consideration. So here we go.
1. The culture wars predate the rise of digital media
This may seem like an obvious point, but it’s worth emphasizing at the outset. One consequence of our immersion in the flow of digital media tends to be a heightened experience of presentism, involving both a foreshortening of our temporal horizons and the presumption that what we are experiencing must be novel or unprecedented. The danger in this case is that we mistake the culture wars for an effect of social media rather than understanding them as a longstanding feature of American public life with a complex, multi-faceted relationship to social media.
In 1991, Hunter, surveying familiar terrain—for example: the family, free speech, the arts, education, the Supreme Court, and electoral politics—amply documented most if not all of the features we tend to lament about the state of public discourse today: the polarization, the intractable and bitter nature of the debates, the underlying animosities, the absence of civility, the characterization of ideological opponents as enemies to be eradicated, and, notably given present debates, what Hunter called the “specter of intolerance” and the supposed “totalitarian ‘threat’” posed by political opponents.
Describing what he calls the “eclipse of the middle,” he also notes how a need to be “stirred and titillated” means that “public debate that is sensational is more likely to arouse and capture the attention of ordinary people than are methodical and reflective arguments.” “The net effect of loud, sensational clamor,” he adds, “is to mute more quiet and temperate voices.”
Hunter also commented on the high level of suspicion in public discourse: “In today’s climate of apprehension and distrust,” he wrote, “opinions that attempt to be distinctive and ameliorating tend to be classified with all others that do not affirm a loyalty to one’s own cause.” Hunter also offered an incisive analysis of the role media technology played in creating these conditions, but we’ll get to that in another point below.
Finally, while Hunter’s work popularized “culture war” talk, he did not coin the phrase, which, as he notes, dates back to the 1870s. The culture war concept has its origins in the German word Kulturkampf, or “cultural struggle,” which specifically designated a fight between Protestant and Catholic factions for control of the cultural and educational institutions of the newly unified German state.
As Hunter goes on to show, the origins of the American culture wars were also rooted in similar struggles between Protestants and Catholics and later between Christians and Jews in the 19th century. From our perspective, of course, these appear chiefly as intramural squabbles within a larger and decidedly western religious frame of reference. By the late 1980s, however, Hunter perceived an important shift in the nature of these conflicts running through American society and politics. Culture Wars was Hunter’s effort to better understand these developments and map their consequences.
It’s important to remind ourselves that culture war dynamics precede the emergence of social media because we should not fool ourselves into thinking that if only we managed to somehow rein in social media, a temperate and civil public sphere would emerge. Although, as I’ll discuss below, social media is obviously not helping matters.
2. The culture wars are rooted in competing sources of moral authority
It’s useful to recall, as Hunter insists, that the culture wars encompass deep and genuine disagreements about what is good, what is just, and what is beautiful. After opening with a series of dispatches from the front, as it were, Hunter invites us to consider the following: “What if these events are not just flashes of political madness but reveal honest concerns of different communities engaged in a deeply rooted cultural conflict?”
Hunter believed that such was indeed the case. “America,” he claimed, “is in the midst of a culture war that has had and will continue to have reverberations not only within public policy but within the lives of ordinary Americans everywhere.” This has undoubtedly proven to be the case, even as the nature of the culture wars has shifted once again.
Presently, however, we tend to speak of the “culture wars” in a more pejorative or dismissive sense and with more than a little exasperation. I understand the temptation to do so; indeed, I’ve used the term in this way on more than one occasion of late, and there’s good reason for it. Contemporary manifestations of the culture wars are often characterized by seemingly trivial grievances, regarding, for example, what constitutes an appropriate Christmas/holiday greeting. But even in their mannerist manifestations, they still betray something of the underlying moral concerns. It is also the case—and this, I think, is the more relevant consideration—that the culture wars are now often driven by what we might think of as mercenary grifters, who inflame and exploit whatever sincerely held moral principles initially animated the concerns of ordinary citizens.
So, while one consequence of the digitization of cultural warfare is a generalized presumption of bad faith, we should bear in mind that the culture wars are at some level still rooted in deeply held moral beliefs grounded in competing and irreconcilable sources of moral authority. This is not to say, of course, that all participants and positions are morally equivalent. This is obviously not the case. But, if with any given culture war skirmish we fail to see what all the fuss is about, that is likely because we inhabit a different moral order than those whose opinions we find so baffling or distasteful.
The key point here is simply this: we will misunderstand our cultural situation if we reduce culture war issues, especially those we care little about, to cases of mere posturing, bad faith politicization, or sincere rubes being duped by nihilist operators.
3. Digital media has transformed the conduct of the culture wars
So if we can’t blame the culture wars on digital media, how exactly should we understand the relationship between the two?
As I suggested above, I think it’s useful to approach the relationship by analogy to the transformation of conventional warfare by industrialization.
I’ll begin by noting that media technology plays an important role in Hunter’s analysis. “The media technology that makes public speech possible,” Hunter noted, “gives public discourse a life and logic of its own.” Here is how he put the matter in a lengthy statement that is italicized for emphasis in the text:
The polarization of contemporary public discussion is in fact intensified by and institutionalized through the very media by which that discussion takes place. It is through these media that public discourse acquires a life of its own; not only do the categories of public rhetoric become detached from the intentions of the speaker, they also overpower the subtleties of perspective and opinion of the vast majority of citizens who position themselves ‘somewhere in the middle’ of these debates.
Now, here comes the surprising bit. In a chapter titled “Technology and Public Discourse,” Hunter devotes the bulk of his discussion to … direct mail, you know, the countless postcards and fliers that arrive in your mailbox around election time. Hunter also talked about television and radio, but he argued that direct mail was the relatively novel media technology that was really stoking the culture wars. I don’t mean to belittle his discussion, far from it. It can be instructive to understand the effects of old media when they were new. And, what’s more, the choice of direct mail makes a striking point of comparison with the conditions of the digitized culture wars, in which direct mail finds its closest analog in targeted email messages and social media ads. The relative sophistication, personalization, frequency, and scale of the latter nicely illustrate the consequences of digitization for the culture wars.
So, let’s come back to my analogy to industrialized warfare. While historians tend to locate the origins of industrialized warfare in the late stages of the American Civil War, it is not until World War I that we see the full effects of industrial technology on the conduct of war. Notable features of industrialization applied to warfare include significant advances in weaponry (the machine gun, long-distance artillery, exploding shells, etc.), the development of steam-powered ironclad naval vessels, the deployment of armies by rail, instantaneous communication by telegraph, and, later, the advent of tanks, aircraft, and poison gas.
In short, the industrialization of warfare massively augmented the destructive capacity of modern armies by enhancing their speed, scale, and power. Additionally, industrialized war became total war, encompassing the whole of society and blurring the distinction between civilians and combatants. Finally, as all forms of mechanization tend to do, it further depersonalized the experience of battle by making it possible to kill effectively at great distances. As a consequence of these developments, the norms, tactics, strategies, psychology, and consequences of modern war changed markedly.
The logic of the analogy is, of course, straightforward. Digital media has dramatically enhanced the speed, scale, and power of the tools by which the culture wars are waged and thus transformed their norms, tactics, strategies, psychology, and consequences.
Culture war skirmishes now unfold at a moment’s notice. The lines of battle form quickly on social media platforms. Tens of thousands of participants, not all of them human, are mobilized and deployed to the front. They operate at scale, often in coordinated actions against particular individuals, working chiefly to discredit and discomfit, but also to confuse, incite, exhaust, and demoralize. The older, perhaps always idealistic aims of persuasion and refutation are no longer adequate to the new situation. Moreover, skirmishes that become pitched battles spill out indefinitely, becoming black holes of attention and massive drains of resources and energy.
Along these lines we can see that the power of digital media lies in their immediacy and scale, but also in their ability to widen the war. Direct mail may have targeted you, but social media involves you directly in the action. Take up your memes, comrades. We are no longer mere spectators of battles waged for our allegiance by elite warriors of the political and intellectual classes. On the digitized front, we are all armed and urged to join the fray. The elites themselves quickly become the victims of the very cultural warfare they had once stoked to their advantage.
Digitization also yields total culture war. No aspect of our experience goes untouched. This is a consequence both of the wide-scale distribution of the weapons of cultural warfare and of how these same tools erode the older, always tenuous divide between public and private life. Now, the culture wars are total in the sense that they are all-encompassing and unrelenting. It’s not so much that we’re always on the front lines, it’s that the front lines are always with us. And while it is true that the culture wars have always involved public debate of private matters, the digitized culture wars swallow up even more aspects of private life.
One way of thinking about this is along the lines of sociologist Erving Goffman’s old dramaturgical distinction between front stage social life and back stage private life. In the culture war setting, we might frame that distinction as social life on the front lines and private life that unfolds in the relative safety of the rear. To the same degree that digital media has blurred the front stage/back stage distinction and involved us in the work of perpetual impression management, so, too, has digital media blurred the distinction between the front line and the rear in the culture wars and made all aspects of our experience potential fodder. This also explains the frivolity of some of our culture war skirmishes. The logic of escalation precipitated by digital tools demands that more and more of civilian life be drawn into the fray, regardless of how seemingly trivial it may be.
This is also a useful way of framing the so-called “cancel culture” debate. The debate is spurred precisely by the digitization of the culture wars, which has made it necessary to negotiate a new understanding of how the wars ought to be conducted. Who is a legitimate target? What is a proportionate response? What actions and opinions ought legitimately to be drawn into the fight? The old analog rules no longer work, and we have not arrived at a new consensus.
So while it would be a mistake to believe that digital media has generated the culture wars, it would be equally mistaken to believe that we are now merely experiencing the same old culture wars. It is clear that the new digital battlefield has radically altered the nature of cultural conflict.
It should be clear, too, that the digitized culture wars give every indication of being interminable by nature and design. Because the culture wars are rooted in longstanding moral and ideological conflicts stemming from fundamentally irreconcilable sources of moral authority, they will not simply peter out. There is little incentive for deescalation (other than mere exhaustion), and it is hard to imagine what exactly a truce might look like, much less a genuine peace or reconciliation. To the degree that the platforms sustaining the digitized culture war stand to profit from its proliferation, and that the culture wars arise from and, however inordinately, answer the basic human need to take meaningful and morally consequential action—especially in a media-political regime that would otherwise render us morally anesthetized producers and consumers—to that same degree they will tend to persist unabated.
4. Digital media has realigned the culture war
While it’s important to understand how digital media has transformed the way the culture wars are conducted, what I tend to find most interesting and significant is how the lines of the culture war are being redrawn and alliances reconfigured. Again, Hunter’s work was especially useful thirty years ago in explaining how the culture wars were redrawing the lines of cultural conflict. We need a similar effort to understand why our old categories no longer work as a guide to the current socio-political field. But, while I think this is the more interesting and important terrain, I’m afraid that what I’ve got to offer feels far more tentative and speculative. That said, here are a few things to consider.
First, by way of background, Hunter recognized that the increasingly acrimonious cultural wars of the late 20th century differed from earlier instances because American society had witnessed both a proliferation of sources of moral authority and, consequently, a realignment of the traditional actors into new configurations. He recognized that the old lines separating Protestants from Catholics and varieties of Protestant denominations from each other no longer held firm. The new distribution of moral authority cut across the old institutional lines. Hunter identified two key groupings which he labeled the small-o orthodox and the small-p progressive. They were characterized chiefly by whether they located moral authority in external and traditional sources, as in the case of the orthodox, or in the determinations of the self or the deliverances of scientific rationalism, as in the case of the progressives. (It’s important to note that Hunter was using the terms orthodox and progressive in an idiosyncratic manner that overlaps with but is not equivalent to how the words were used then or now.) In the then-emerging culture war Hunter was mapping, orthodox Catholics, for example, were more likely to find common cause with orthodox Baptists and orthodox Jews than they were with their ostensible co-religionists, progressive Catholics. It is striking to recall that in the mid-1990s a well-known Catholic moral philosopher wrote a book titled Ecumenical Jihad urging those Hunter would call orthodox Christians, Jews, and Muslims to join forces on the culture war front. It’s hard to think of a better example of the kind of realignment Hunter was analyzing.
Second, while Hunter focused on the way that media technology was deployed to further the existing causes of the new culture war coalitions, I think it’s important to understand how the digitization of the culture wars is itself generating new configurations. Direct mailers, for example, targeted existing mailing lists. In some sense, direct email works the same way, even if it has enhanced capabilities. Targeted social media ads seem like an almost qualitatively different technique; at the least, they are less reliant on an existing mailing list. Critically, however, digital media itself generates new identities and groups in a way that postal technology did not. This latter consequence of digital media engenders profound changes in the configuration of the culture wars, scrambling what had been the traditional Left/Right spectrum in American politics and generating what would have seemed like bizarre partnerships and affinities across those old lines.
Third, in Hunter’s analysis, the orthodox/progressive divide arose when the dominance of the old religious/theological consensus was challenged by a new locus of moral-cultural authority in subjective experience and scientific rationality. I would suggest that one of the transmutations we are witnessing can be attributed to the splintering of that once-new locus of authority into its constituent parts: subjective experience now set, to some degree, against a modernist version of scientific rationality. So, New Atheist types who in another age bore the mantle of progressive resistance to traditional authority are now cast as conservative defenders of a traditional and oppressive morality.
In fact, to the degree that the older individualist spirit of scientific rationalism can be understood as the foundation of the modern moral order, what we are witnessing is precisely its displacement by a new, still-emerging moral order. But again, I would argue that its decline was not simply a function of an intellectual victory on the old terms. It was rather a product of the tacit challenge posed by the experience of digital media to both the primacy of the modern ideal of individualism and to the rules of rationalist discourse in the public sphere.
Fourth, while digital media facilitates the emergence of virtually constituted small scale groups of affinity, it tends to have a disintegrating effect on the cohesion of diverse, large scale bodies such as a nation state. Consequently, another flash point around which the lines of the culture war are redrawn may be understood in terms of the diminishing plausibility of the nation state as the product of a shared history and shared ideals and, thus, as a locus of identity. Once this shared history is contested and the shared ideals lose their hold on the public imagination, one might either seek to reground national identity along ethno-racial lines or else abandon or demote the ideal of national identity and patriotism.
Finally, another aspect of the emerging terrain may be described as the difference between those committed to technocratic modes of governance directed at the perpetuation of patterns of production and consumption, on the one hand, and, on the other, those animated by explicitly moral concerns about justice and equality, between those, in other words, who are determined to exercise authority without responsibility and those who desire the satisfactions of meaningful action toward the realization of justice and goodness as they understand it.
The deeper critique here may be to recognize that the culture wars, while rooted to some important degree in the genuine moral concerns of ordinary citizens, are themselves the product of the longstanding industrialization of politics and the triumph of technique. In the case of both the industrialization of politics and its capture by technique, the operations of the system become the system’s reason for being. Industrialized politics are politics scaled up to a level that precludes the possibility of genuine and ordinary human action and thus becomes increasingly unresponsive to human well-being. The culture wars are in this analysis a symptom of the breakdown of politics as the context within which fellow citizens navigate the challenges of a common life. In the place of such genuine politics, the culture wars offer us the often destructive illusion of politically significant action.
The Convivial Society is free to all and supported by the generosity of readers. If you find this work helpful and valuable consider becoming a paid subscriber.
News and Resources
Sean McDonald on what he has termed “technology theater”: “There’s a well-documented history of the tendency to hype distracting, potentially problematic technology during disaster response, so it’s concerning, if not surprising, to see governments turning again to new technologies as a policy response to crisis. Expert public debates about the nuances of technologies, for example, provide serious political cover; they are a form of theatre — ‘technology theatre.’ The term builds on security expert Bruce Schneier’s ‘security theatre,’ which refers to security measures that make people feel secure, without doing anything to protect their security.” And: “The ultimate vulnerability for democracy isn’t a specific technology, it’s when we stop governing together. The technological responses to the COVID-19 pandemic aren’t technologically remarkable. They are notable because they shed light on the power grabs by governments, technology companies and law enforcement. Even in the best of circumstances, very few digitally focused government interventions have transparently defined validation requirements, performed necessity analyses or coordinated policy protections to address predictable harms.”
Jackson Lears on “Quantifying Vitality: The Progressive Paradox”: “Our days became numbered long before the rise of Big Data and algorithmic governance. Indeed, the creation of statistical selves in the service of state and corporate bureaucracies was well underway by the early twentieth century, in the midst of what US historians still call the Progressive Era (in deference to the self-description of the reformers who dominated it). Eli Cook, Sarah Igo, Dan Bouk, and other gifted young historians have begun to explore sorting and categorizing institutions that branched out from their nineteenth-century predecessors, which had focused mainly on criminals and deviants. The new sorters were more catholic in their scope—life insurance actuaries quantifying the risks of insuring individual policyholders, pollsters using survey data in an attempt to construct a ‘majority man’ or ‘average American’—with their efforts culminating in the most ambitious tabulating scheme of all, the Social Security system, in 1935 …. The difference between Progressive Era biopolitics and contemporary biopolitics involves the intensification and acceleration of tendencies underway for more than a century—more powerful technology, but similar strategies for management and surveillance of the population.” This essay appeared in the latest issue of the Hedgehog Review given over to questioning the quantified life.
“The Atlas of Surveillance is a database of the surveillance technologies deployed by law enforcement in communities across the United States. This includes drones, body-worn cameras, automated license plate readers, facial recognition, and more.”
July 16th was the 75th anniversary of the Trinity test, otherwise known as the first successful detonation of a nuclear weapon. Here are two pieces on the subject: “What If the Trinity Test Had Failed?” / “A Bomb In the Desert.”
Nick Paumgarten writes a compelling essay (2008) about elevators with the story of Nicholas White, who in 1999 was trapped in one for nearly two days, as the frame: “Two things make tall buildings possible: the steel frame and the safety elevator. The elevator, underrated and overlooked, is to the city what paper is to reading and gunpowder is to war. Without the elevator, there would be no verticality, no density, and, without these, none of the urban advantages of energy efficiency, economic productivity, and cultural ferment.”
Zito Madu, drawing on film and literature, reflects on the question of justice and race: “Each time I engage in these recurring protests, I think about how absurd they are. Not the protests in themselves, but the fact that they have to exist. The demand seems so simple, like Souleiman asking for his wages, that needing to make it is degrading. It is begging for something that already belongs to you.”
Swiss police automated crime predictions but have little to show for it.
“I Am a Model and I Know That Artificial Intelligence Will Eventually Take My Job”
Call for papers from the International Journal of Illich Studies for their next issue: Conviviality for the Day After “Normal.”
Shortlist for best astronomy photographs of the year.
Cambridge University has digitized its archive relating to the excavation of the ancient city of Mycenae.
“Reading Station” by Charles Hindley & Co., circa 1890:
— “We live in a world where there is more and more information, and less and less meaning,” writes Jean Baudrillard in Simulacra and Simulation (1981). “Information devours its own content,” he adds, “It devours communication and the social.” More:
Rather than creating communication, [information] exhausts itself in the act of staging communication. Rather than producing meaning, it exhausts itself in the staging of meaning. A gigantic process of simulation that is very familiar. The nondirective interview, speech, listeners who call in, participation at every level, blackmail through speech: ‘You are concerned, you are the event, etc.’ More and more information is invaded by this kind of phantom content, this homeopathic grafting, this awakening dream of communication. A circular arrangement through which one stages the desire of the audience, the antitheater of communication, which, as one knows, is never anything but the recycling in the negative of the traditional institution, the integrated circuit of the negative. Immense energies are deployed to hold this simulacrum at bay, to avoid the brutal desimulation that would confront us in the face of the obvious reality of a radical loss of meaning.
— Mark Boyle writes about the “not so simple life,” that is a life without most modern technologies. “As Kirkpatrick Sale wrote in Human Scale,” Boyle explained, “my wish became ‘to complexify, not simplify.’” He’s made choices the majority of us will not and probably cannot make. But we may still learn something from his experience. There were several passages I could have excerpted. Here is one of them:
As I have no clock, my relationship with time has changed dramatically. Things do take longer. There is no electric kettle to make my tea in three minutes, no supermarket to pop into for bread and pizza. But here’s the odd bit: I find myself with more time. Writing with a pencil, I can’t get distracted by clickbait or advertising. Life has a more relaxed pace, with less stress. I feel in tune not only with seasonal rhythms but also with my own body’s rhythm. Instead of an alarm clock, I wake up to the sounds of birds, and I’ve never slept better. If I want to drop everything and go hiking, I can. I am finally learning to “be here now.” There’s more diversity, less repetition. Mindfulness is no longer a spiritual luxury, but an economic necessity. While this may not be the most profitable career path, it’s good for my own bottom line: happiness.
Folks, this was long and it took me a bit longer to compose. I hope you found the effort and delay worthwhile. As always, your feedback is welcome. Feel free to reply to this email. And, naturally, feel free and encouraged to share this newsletter as you see fit.
I hope you and yours are well.
This is a public episode. Get access to private episodes at theconvivialsociety.substack.com/subscribe
We are presently in the midst of another wave of free speech/cancellation discourse, this one prompted by an open letter published in Harper’s warning against a rising tide of illiberal constraints on free expression.
While debates about free speech are as old as the idea of free speech, a case could be made that they have taken on a different character in recent years. This may be a matter of frequency and intensity, but I suspect that the nature of the debate has shifted substantively as well. It seems that more recent clashes have less to do with specific applications of the principle than with the relative merits of the principle itself.
If so, it should not come as a surprise. When the technological infrastructure sustaining public speech is radically altered, so too is the experience and meaning of speech. Because this debate is framed by the conditions of the Database—the superabundant, practically infinite assemblage of data in our externalized collective memory, otherwise known as the internet—it is nearly impossible to navigate through every continuously unfolding aspect of even a seemingly narrow and contained instance like the Harper’s letter. So I won’t even make the attempt. Instead, I’m going to take a path that has been less frequently trod by examining a handful of underlying dynamics driving the controversy.
To be clear, I’m not suggesting that I can explain the causes of the debate; they are many and complex. Much less do I aim to settle the debate one way or the other. In fact, if I’m right, the debate can’t properly be settled at all. Rather, I aim to understand the deeper material conditions that generate the context for the debate.
My overarching thesis regarding free speech crisis discourse, including debates about “cancel culture,” can be put this way: this is what you get when the word is re-animated under the conditions of digital re-enchantment.
That’s a pretty jargon-heavy claim, so it obviously needs to be unpacked. I’ll start the process by distinguishing the two key theoretical components: the re-animated word, on the one hand, and digital re-enchantment on the other. These two distinct but related developments together generate the conditions driving our seemingly intractable and increasingly acrimonious free speech skirmishes.
By speaking of the re-animated word, I’m thinking in terms of media ecology and the basic premise is that we experience speech differently depending on the medium that bears it. Speech grounded in the face-to-face encounter is one thing. Speech inscribed in writing is another. And, more to the point for our purposes, print produces an experience of speech distinct from the experience of speech generated by digital media. It is in this difference that we find the root of our present re-litigation of the nature and value of free speech. Our previously regnant ideals regarding freedom of speech arose in the context of print culture and they are now, for better and for worse, floundering in the context of digital media.
In short, writing, and especially print, renders the word seemingly inert and thing-like. It tames the word in a very specific sense: by removing it from the potentially volatile and emotionally laden context of the face-to-face encounter. The difference has less to do with the content of the printed word than with its phenomenology, or how we experience it. It is absolutely true that you can find all manner of vitriolic and combative speech in print, as is evidenced, for example, in the political pamphleteering of the early republic. But, experientially, it is one thing to encounter this content in written form at a temporal and spatial remove from the author, whose very significance becomes dubious, and it is another to encounter these words directly and immediately from the mouth of the speaker, whose personal significance is unavoidable. In other words, even the plausibility of the claim that you should challenge the ideas and not the person, for example, is sustained by the conditions of print.
Let’s reflect on this a bit further. What is a word? This is not a trick question or a sophomoric dorm room provocation. When you think of a word, what do you think of? I’d be willing to bet that if you were asked to think of the word “cat,” for example, you would almost certainly think of the three letters C-A-T (or whatever the equivalent might be in your native tongue). Now ask yourself what would be the answer to that question in the era before writing was invented? Clearly not a set of symbols.
When you think of a word as a set of letters, you’re thinking of the word as an inert, lifeless thing. Before the introduction of writing, the word was not a thing but an event. It was powerful and effected irreversible change. Nothing better illustrates these different attitudes to the word than when modern readers encounter the biblical narrative of Isaac and his two sons, Jacob and Esau. When Jacob deceives his father into conferring his spoken blessing on him rather than Esau, the eldest son, a modern reader is likely to ask: well, why not just take it back? They’re just words. But when the word is an event rather than a thing, you can’t take it back, just as you can’t undo an event.
Not surprisingly, then, modern free speech ideals are historically correlated with the emergence and internalization of print culture. Print encourages the notion that the content of ideas can easily be, and indeed ought to be, distinguished from the one who presents them. Print abstracts the act of communication from the lived experience of communicators and thus fosters the sense that words alone can do no harm. The well-known proverb about sticks and stones, for example, is most plausible in the context of print culture (and, in fact, seems to arise precisely in this context). The time and space necessary to the labor of communicating in print itself has a diminishing effect on the felt intensity of communication.
Digital media changes all of this. It places the word back into the heated context of relative immediacy. It is true, of course, that most digital communication still happens at a physical remove, but the temporal remove is collapsed, renewing a measure of immediacy to the act of speaking in public. Moreover, the word is reanimated in the sense that it becomes newly active: active and ephemeral on the screen, enlivened by image and audio, and active in its intensified emotional consequences. To speak into the digital public sphere is to potentially invite an immediate and intense assault not simply upon your ideas but upon you and your livelihood and well-being because, after all, you and your ideas and your words are now more tightly bound together.
There is a paradox here, though, that is worth noting. In one respect, one might just as easily say that digital speech does nothing, transpiring as it does in a hyperreal context. If the word once again takes on the aspect of an event, it may be more like a pseudo-event. If so, then this dynamic, too, threatens to escalate and intensify the character of online speech. The more powerless it appears as speech, the greater the temptation to ratchet up its intensity and escalate its hostile character.
So, then, the reanimated word is a different beast than the printed word. Consequently, when we internalize its dynamics, we’re likely to begin with a different set of assumptions about freedom of speech than those fostered by print culture.
The matter of digital re-enchantment is a bit more complex, but I’ll try to keep this relatively brief.
In the sociological tradition, modernity is characterized by the disenchantment of the world. This is a matter of serious debate, which I’ll sidestep here, but, needless to say, I think there is a good case that can still be made for the theory. The general idea is that in the modern world, we’re less likely to think the forest is populated by fairies, that magical amulets can ward off disease, that a relic can protect us on a journey, or that evil spirits can bring harm upon us. The enchanted world was also a locus of meaning and significance, as opposed to the disenchanted modern world, which appears chiefly as raw material for our technological projects.
For my purposes, I’m especially interested in the way that philosopher Charles Taylor incorporates disenchantment theory into his account of modern selfhood. The enchanted world, in Taylor’s view, yielded the experience of a porous, and thus vulnerable, self. The disenchanted world yielded an experience of a buffered self, which was sealed off, as the term implies, from beneficent and malignant forces beyond its ken. The porous self depended upon the liturgical and ritual health of the social body for protection against such forces. Heresy was not merely an intellectual problem, but a ritual problem that compromised what we might think of, in these times, as herd immunity to magical and spiritual forces by introducing a dangerous contagion into the social body. The answer to this was not simply reasoned debate but expulsion or perhaps a fiery purgation.
Just as digital media reanimates the word, so too does it re-enchant the world, although in a very different specific sense. Taking Taylor’s model as a template, it reverses the conditions that sustained the plausibility of the buffered self. In the digitally re-enchanted world, as I wrote in a recent essay for The New Atlantis,
“we are newly aware of operating within a field of inscrutable forces over which we have little to no control. Though these forces may be benevolent, they are just as often malevolent, undermining our efforts and derailing our projects. We often experience digital technologies as determining our weal and woe, acting upon us independently of our control and without our understanding …
We are troubled not by spirits but by bots and opaque algorithmic processes, which alternately and capriciously curse or bless us. In the Digital City, individuals may be refused credit, passed over for job interviews, or denied welfare on the basis of systems built on digital data against which they have little to no recourse.
We are, in other words, vulnerable, and our autonomy is compromised by the lines of technologically distributed agency that intersect our will and desires.”
This means, then, that the experience of the self that emerges out of this technologically enchanted milieu more resembles the porous self of the previously enchanted world than the buffered self that corresponded to disenchanted modernity. And the newly porous self is more closely correlated to the virtues of communally regulated speech while the buffered self was more neatly aligned with the spirit of individualized free speech idealism.
There’s obviously a great deal more that could be said about free speech in digital contexts. All of the following deserve careful consideration: the scale and immediacy of consequences in terms of both actions elicited and retributions exacted, the shifting power differentials occasioned by digital media, the precarity of employment, the form of digital platforms designed to elicit passionate engagement and discourage thoughtful conversation, the presumption of bad faith engendered by the overtly performative character of communication on digital platforms, the collapsing of different communities with their distinctive codes for speech and conduct into one digital space, as well as the relative permanence of memory and lack of obscurity generated by searchable databases.
That said, I think the deeper undercurrents shaping how we experience the word in relation to the self that I’ve outlined here play an important role in setting the stage for our free speech travails.
I write The Convivial Society on a patronage model. The main offerings remain public, but I welcome the support of those who find value in the work. Paid subscribers also get access to our newly launched reading groups. In any case, I’m toying with the platform’s features, so here’s a discount offer for any of you who care to kick in for a one-year subscription.
If you’re reading this and you’re not already on the mailing list, by all means please do sign up for the free plan and you’ll still get pretty much everything I write on here.
In addition to my scribblings here and elsewhere, I occasionally give talks about the role technology plays in our private and social lives. If there’s a Q&A time afterwards, one of the questions I’m most likely to get will be about how parents should regulate their kids’ use of digital devices. Sometimes the underlying anxiety and frustration is palpable.
For a long time, I was hesitant to address these sorts of questions because I wasn’t a parent myself, and I had enough good sense to know that it was best not to opine on how to raise children if you didn’t have some firsthand experience. Having now been a parent for nearly five years, I feel a bit less sheepish about addressing some of these questions, and, of course, the questions have also taken on a more personal and urgent quality.
I don’t think I’ve got this business figured out, of course. Far from it. But I have a few thoughts on the matter that might be helpful. And, honestly, while these will be framed by the question of children and technology, I think you’ll find the underlying principles more broadly applicable.
First, let’s get a few preliminary observations out of the way. Raising children can be a challenging, thankless, anxiety-ridden affair. Most of us are doing our best, often with limited resources and support. The last thing any parent needs is to be made to feel bad about one more ostensible failure or shortcoming on their part. This is especially true during a pandemic, which has radically restructured household arrangements and routines for parents and children both. So, please, do not hear any of what follows as anything more than one parent, given his own circumstances and aspirations, thinking out loud about these questions in the event that it proves helpful to other parents thinking through these same issues.
The following considerations are generally ordered from those claims that I think are pretty solid and broadly useful to those that stem a bit more idiosyncratically from my own perspective. And, no, in case you’re wondering, I don’t live up to these in my own practice as a parent, but I still aspire to their fuller realization as the vagaries of life allow. Finally, these are, for me, certainly not rules to be followed, but ideals to be daily negotiated in the trenches. Comments are open to all, and I’d be happy to read your own thoughts on these matters.
Resist technocratic models of what it means to raise a child

In my experience, parents are almost always looking for concrete and practical advice to follow, which is the kind I’m least likely to offer. Not because I like to be facetious, but because I think it’s important to recognize how questions about how much screen time is too much, for example, actually hint at a more subtle consequence of the technological framing of the task of raising children. In other words, while we focus on specific devices in our children’s lives, we sometimes miss the technocratic spirit we are tempted to bring to the task of raising children.

This spirit was captured rather well a few years back by Alison Gopnik, who distinguished between two kinds of parents: carpenters and gardeners. Gopnik has a rather specific set of anxious middle-class parents in view, but the distinction she offers is useful nonetheless. In the carpenter model, parents tend to view raising children as an engineering problem in which the trick is to apply the right techniques in order to achieve the optimal results. In this view, “parenting” is something you do. It is work. And the point of the work is to manufacture a child to certain specifications as if the child herself were simply a bit of raw, unformed material. In the gardening model, parents do not conceive of their children as a lump of clay to be fashioned at will. The focus isn’t on “parenting” as an activity, but on being a parent as a relationship structured by love. While the carpenter by their skill achieves a level of mastery and control over the materials, the gardener recognizes that they cannot ultimately control what the seed will become; that much is given. They can only provide the conditions that will be most conducive to a plant’s flourishing.
Of course, any discussion that starts with “There are two kinds of x” will undoubtedly have its limitations, but I think it’s useful to remember that we do not make our children, we receive them as gifts. Naturally, this does not alleviate us of our responsibilities toward them. Far from it. But it does change how we experience those responsibilities, and it does relieve us of a particular set of anxieties that inevitably accompany any project aimed at the mastery of recalcitrant reality. Parents have enough to worry about without also accepting the anxieties that stem from the assumption that we can perfectly control who our children will become by the proper application of various techniques.
(The Little Prince)
Resist a reactionary approach to technology

In this arena, but maybe as a general rule, it’s better to let your choices flow from what you are for rather than what you are against. In other words, when thinking about something like children and smartphones, say, it’s better to imagine yourself working toward particular goods you would like to see materialize in your child’s life than simply proscribing the use of smartphones out of some justifiable but murky apprehension. Don’t get me wrong, it’s not that there’s no such thing as “too much time on a smartphone,” it’s just that figuring out what that means can’t happen in abstraction from a larger vision of what is good. “Too much” implies a relative standard. Relative to what, then? Is there also “too little”? What would “just right” look like? I don’t believe it’s possible to answer those questions, or questions like them, in the abstract. The point is to ask yourself what are the goods you desire for your children and your family. With those clearly in view, you can then think more deliberately about how certain tools and devices move you toward the good or undermine its realization. Of course, implicit in this is the assumption that we will have some fairly clear sense of what we’re for, as well as a decent grasp of how our tools can become morally and intellectually formative (or de-formative). Infants and toddlers won’t be able to deliberate about such matters with you, but my sense is that the sooner you’re able to bring children into some meaningful conversation about this kind of thing the better. Invite them to pursue the good and teach them by example to subject their use of any tool or device to that higher end. In this way we can inoculate them against one of the most pervasive disorders of a technological society, the temptation to make technology an end in itself.
Resist technologies that erode the space for childhood

I’m a fan of Neil Postman, but I tend to have a few more quibbles than usual with his 1982 book, The Disappearance of Childhood, in which Postman argues that childhood (and adulthood) as it had been imagined in modern western societies was tied to print culture. Consequently, Postman argued, as print culture faded in the face of electronic media, its attendant models of childhood (and adulthood) faded, too. Quibbles aside, I think there’s something to the claim that certain techno-social configurations generate different experiences of childhood. It also seems that the experiences of and boundaries separating childhood, adolescence, and adulthood have been in flux. I sometimes talk about this in terms of what I’ve called the professionalization of childhood and the infantilization of adulthood. The professionalization of childhood is related to the technocratic modes of parenting for safety and optimization discussed above. It’s evident in the amount of resources, time, and expertise that, in certain segments of society, is often brought to bear upon every aspect of a child’s life. So we do well to think about the qualities and experiences that constitute a desirable childhood, one that neither rushes children toward the responsibilities, pressures, and anxieties of adulthood nor fails to adequately prepare them for such. It’s a delicate balance to be sure, but it seems to me that children must be allowed to be children if they are then to grow into a reasonably mature and stable adulthood. Along these lines let me quote at length from Robert Pogue Harrison:
“It may appear as if the world now belongs mostly to the younger generations, with their idiosyncratic mindsets and technological gadgetry, yet in truth, the age as a whole, whether wittingly or not, deprives the young of what youth needs most if it hopes to flourish. It deprives them of idleness, shelter, and solitude, which are the generative sources of identity formation, not to mention the creative imagination. It deprives them of spontaneity, wonder, and the freedom to fail. It deprives them of the ability to form images with their eyes closed, hence to think beyond the sorcery of the movie, television, or computer screen. It deprives them of an expansive and embodied relation to nature, without which a sense of connection to the universe is impossible and life remains essentially meaningless. It deprives them of continuity with the past, whose future they will soon be called on to forge.”

I realize Harrison makes a number of sweeping claims in those few lines. I’m not suggesting we accept them at face value, but I am suggesting that they’re worth contemplating, especially with a view to the role of technology in these dynamics.
Resist technologically mediated liturgies of consumption

I probably take inordinate umbrage at the little carts with a “Shopper in Training” flag at Trader Joe’s. But, honestly, the “Shopper in Training” thing really irks me. Many of the challenges presented by digital technologies stem from their participation in already existing socio-economic patterns of endless consumption and the effort to initiate children into these same patterns. To whatever degree the use of a certain technology amounts to participation in a liturgy of endless consumption, I would think twice about adopting it. This one is tough, I admit. But at the very least, a balance of sorts ought to be struck between such consumption and activities and technologies of preservation and production, however simple or rudimentary.
Be skeptical of running unprecedented social experiments on children

While the social scientific data is still being gathered, analyzed, and debated, it is evident that we are running a society-wide experiment on our children by immersing them in a world of digital devices without any clear sense of the long-term consequences. Whether we’re talking about ubiquitous visual stimulation, unrelenting documentation, networks of monitoring and surveillance from infancy to adolescence, or offloading our care of children to AI assistants, for example, I’m not keen on thoughtlessly submitting children to this experiment. There’s no need to be alarmist here, although sometimes that may not be altogether unreasonable, but we should be judiciously skeptical and cautious. In practice this means being a bit suspicious about the panoply of devices and tools we introduce into our children’s experience, even from the earliest days of their life. And don’t forget to consider not only the tools that mediate your child’s experience, but also those that mediate your experience of being a parent.
Embrace limits

If you’ve been reading my stuff for any length of time, you know this is a principle that is near and dear to my own understanding of human flourishing. In short, I think we do well to respect certain limits implicit in our embodied status as creatures in a material world. I tend to think it is good for our minds and our bodies when we don’t flagrantly disregard foundational rhythms associated with our earth-bound existence. Chiefly, this amounts to finding ways to better order our experience of time and place and human relationships. In the modern world, of course, we tend to experience limits as taunts inviting their own transgression. This is, in my view, a destructive dead end. Better to see things as Wendell Berry puts it: “[O]ur human and earthly limits, properly understood, are not confinements but rather inducements to formal elaboration and elegance, to fullness of relationship and meaning. Perhaps our most serious cultural loss in recent centuries is the knowledge that some things, though limited, are inexhaustible. For example, an ecosystem, even that of a working forest or farm, so long as it remains ecologically intact, is inexhaustible. A small place, as I know from my own experience, can provide opportunities of work and learning, and a fund of beauty, solace, and pleasure — in addition to its difficulties — that cannot be exhausted in a lifetime or in generations.”

Consequently, I hope to both demonstrate and convey to my own children a way of being with technology which resists the temptations of a self-defeating pursuit of limitlessness and embodies a willingness to receive time as a gift rather than an enemy to be defeated.
Embrace convivial tools

Needless to say, none of this is about being anti-technology. Rather, it’s about being judicious in our introduction of technology to our children. So if we are thinking about what tools or technologies to invite into the life of a family, Ivan Illich’s concept of convivial tools gives us a good guide. “I choose the term ‘conviviality,’” Illich wrote, “to designate the opposite of industrial productivity. I intend it to mean autonomous and creative intercourse among persons, and the intercourse of persons with their environment; and this in contrast with the conditioned response of persons to the demands made upon them by others, and by a man-made environment. I consider conviviality to be individual freedom realized in personal interdependence and, as such, an intrinsic ethical value.” Elsewhere, Illich writes, “Convivial tools are those which give each person who uses them the greatest opportunity to enrich the environment with the fruits of his or her vision. Industrial tools deny this possibility to those who use them and they allow their designers to determine the meaning and expectations of others.”

Albert Borgmann’s focal things and focal practices would work just as well here. The point is to embrace tools that generate a deep, skillful, and satisfying engagement with the world, tools which also sustain a substantive experience of community, belonging, and membership.
Cultivate wonder

Wonder at the world is an indispensable feature of childhood that adults should fight to preserve. The best way I know to do this is simply to attend lovingly to the world on the assumption that it has something of value to disclose to us and a reservoir of beauty to enrich our lives. As I’ve mentioned recently, attention is one of our most precious resources, and we should do what we can to help our children become good stewards of this resource. So, I encourage myself and my children to look, to listen, to smell, to taste, to touch. I want them, just as I want myself, to cultivate a capacity for literally care-ful attention, an attentiveness that stems from a deep care for the world and those we share it with.
Tell stories, read poetry

Take this one as a kind of added bonus. Good stories and poems do more than convey “content.” By their form, they embody, sustain, elicit, and encourage the very habits and virtues discussed above. To go a step further, I’d add: memorize poetry.
Fin. I hope you found this useful. Again, I welcome your own thoughts, critical or otherwise, on these matters. I’m ready to learn from what you’ve discovered in your own experience.
“Never has the individual been so completely delivered up to a blind collectivity, and never have men been less capable, not only of subordinating their actions to their thoughts, but even of thinking. Such terms as oppressors and oppressed, the idea of classes–all that sort of thing is near to losing all meaning, so obvious are the impotence and distress of all men in face of the social machine, which has become a machine for breaking hearts and crushing spirits, a machine for manufacturing irresponsibility, stupidity, corruption, slackness and, above all, dizziness. The reason for this painful state of affairs is perfectly clear. We are living in a world in which nothing is made to man’s measure; there exists a monstrous discrepancy between man’s body, man’s mind and the things which at the present time constitute the elements of human existence; everything is in disequilibrium.”— Simone Weil, “Oppression and Liberty” (1955)
Programming note: On the recommendation of a friend, I’m experimenting with Substack’s podcast tool. You’ll note that nothing has changed with regards to the content, except that you now have the option to listen to the main essay should that prove more convenient. If you hit play above you’ll be taken to the webpage for the newsletter from which the audio will play. Nothing fancy, just me reading the text. If you have any thoughts on this, feel free to pass them along.
In the wake of the American failure to contain or manage COVID-19, I’ve begun to encounter the recurring refrain, “We’re going to have to learn how to live with this virus.” The tone may be indignant, exasperated, defiant, but the general point is the same: the virus is with us for the foreseeable future and people need to figure out how best to get on with their lives.
Regrettably, this is probably correct. A web of interconnected failures, stemming from the highest levels of government down to individual citizens, has more or less assured this outcome. We can hope for a vaccine to arrive sooner rather than later. We can hope for better treatment options. We can hope the virus unexpectedly fizzles out, “despite ourselves” as Zeynep Tufekci recently put it. But, as she added, hope is not a plan, and we’re more than likely stuck with COVID for at least another year.
But that’s not what I’m going to talk about here. Rather, I want to begin by discussing how this sentiment, “We’re going to have to learn how to live with this virus,” suddenly struck me as a useful way of framing an approach to the personal, social, and global challenges posed by the present configuration of digital society—challenges to the conduct of our everyday lives, to the fabric of our communities, and to political and economic order.
So here’s the thing, we’re going to have to learn how to live with digital technology. We can hope for legislative action and regulation. We can hope for a radical transformation of the industry stemming from a labor insurgency at tech companies. We can hope that a renewed focus on humane technology may bear fruit in the long run. We can hope that digital technology, despite ourselves, doesn’t (further) accelerate the corruption of the political and social order. But hope is not a plan, and we’re more than likely stuck with the existing techno-social configuration of digital technology for the foreseeable future.
Don’t get me wrong. Just as I support efforts to develop a vaccine, discover therapeutic options, or restore governmental leadership to manage COVID-19, so too do I find merit in the various efforts I mentioned above to better navigate the social consequences of digital technology. But in the same way that I cannot simply hope and do nothing with regards to COVID-19, so I cannot simply hope for these various measures regarding digital technology to materialize and do nothing myself in the meantime.
For one thing, I am personally ill-positioned to do very much of consequence with regards to efforts either to develop anti-viral therapies, for example, or to draft legislation to regulate the tech industry. It’s not that I can’t do anything at all, of course. I can donate to organizations supporting vaccine research and I can contact my representative. But, these actions will not help me today or tomorrow or next month.
Over the last few years, I found myself occasionally writing in defense of a multi-faceted response to the challenges of digital technology. Chiefly, this amounted to a defense of individualized efforts to address such challenges from those who insisted that such efforts were unnecessary, on the one hand, and, on the other, from those who believed them to be inadequate and perhaps even counter-productive. I readily granted that individualized action alone was insufficient to the full range and scope of the challenges in view, and I granted, too, that we should resist a consumerist framing of the problem in which better informed, ethical consumption would be the answer to our problems. But I was baffled by those who in their defense of collective and political action seemed bent on discrediting individualized or even localized action.
It now seems to me that COVID-19 presents an opportunity to make an instructive variation of the case I sought to make in these instances. The health threat is collective and it requires all manner of responses in order to be met, and some of those responses materialize at the level of individual or household choices. Precisely because of the interdependent nature of human society, not despite it, we are urged to act responsibly with a view not only to our own health but to the health of our neighbors and our community. Our membership in a community of mutual inter-dependencies does not diminish the need for personal responsibility; it heightens it.
Consider, too, how the same veiled distribution of consequences plagues our response to the virus and to the various manifestations of digital infrastructure. I must think of the virus not only as a threat to me, which I may be free to discount, but as a threat to others through me. Likewise, I must think of certain digital technologies in light of the unequally distributed consequences to which my personal choices may contribute. Perhaps I have no reason to fear any adverse effects from my adoption of a front porch Ring camera, but I must be able to imagine how the widespread adoption of this technology will have adverse effects for already marginalized members of the community and how it further depletes the fund of communal trust.
So here is the paradox: Certain digital technologies should be resisted not merely for their personal consequences, which may be negligible for certain individuals, but for their collective consequences. But for this reason, I should not simply wait for collective action; I should personally resist these tools in order to mitigate their deleterious consequences locally. Will my resistance alone solve the challenge posed by these tools? Obviously not. Should that keep me from doing what I can to confront the problem? Again, in my view, obviously not. Similarly, will my wearing a mask make the coronavirus disappear? No. But should that keep me from wearing one? No, again.
I’m reminded of Solzhenitsyn’s rule for the common citizen seeking to live with integrity in a repressive regime: “Let the lie come into the world, let it even triumph. But not through me.” He thought the artist could do more, but this much at least the average person could pledge to do.
Ivan Illich’s discussion of the question of public versus private ownership of industrial technology is also instructive. “It is equally distracting,” Illich wrote in Tools for Conviviality,
to suggest that the present frustration is primarily due to the private ownership of the means of production, and that the public ownership of these same factories under the tutelage of a planning board could protect the interest of the majority and lead society to an equally shared abundance. As long as Ford Motor Company can be condemned simply because it makes Ford rich, the illusion is bolstered that the same factory could make the public rich. As long as people believe that the public can profit from cars, they will not condemn Ford for making cars. The issue at hand is not the juridical ownership of tools, but rather the discovery of the characteristic of some tools which make it impossible for anybody to ‘own’ them. The concept of ownership cannot be applied to a tool that cannot be controlled.
Now substitute Mark Zuckerberg or Jack Dorsey or Jeff Bezos for Henry Ford. The illusion to be combatted is that the tool itself is not at least part of the problem and that if it were only managed more ethically or regulated more effectively we could retain the benefits it confers while sidestepping its ills. What may be harder to countenance is the possibility that the tool itself may be destructive and corrosive of society, that its ills are essential to its nature rather than accidental.
So where does this leave us? It leaves us in the position of having to figure out how we are going to live with digital technologies rather than simply waiting for resolutions we cannot effect and which may or may not materialize.
To put this another way, yes, the most important problems we face are far greater than you or me. Yes, they require ambitious collective action. But when that action is not forthcoming, then our response cannot be to do nothing at all.
So where to begin? There are any number of possible answers, and they will vary greatly depending on your own circumstances. But allow me to make a modest suggestion: begin with your attention, because it may be that everything else will flow from this.
Attention is something I’ve written about on numerous occasions, so I’m hesitant about taking up the theme again here. But it’s been a while, maybe two years, since last I wrote about it at any length, and I remain convinced that it’s a critical and urgent issue. So, as they say, hear me out.
I won’t comment on digital distractedness or social media platforms designed for compulsive engagement or the inability to get through a block of text without checking your smartphone 16 times or endless doomscrolling, as it’s now fashionable to call it (really just a new form of the old vice acedia), or our self-loathing tweets about the same. These matter only to the degree that we believe our attention ought to be directed toward something else, that in these instances it is somehow being misdirected or squandered. Attention, like freedom, is an instrumental and penultimate good, valuable to the degree that it unites us to a higher and substantive good. Perfect attention in the abstract, just as perfect freedom in the abstract, is at best mere potentiality. They are the conditions of human flourishing rather than its realization.
David Foster Wallace, who I realize has become a polarizing figure, was nonetheless right, in my view, to understand attention as constituting a form of freedom. “The really important kind of freedom,” Wallace claimed, “involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day.” “That is real freedom,” Wallace claimed, and I’m inclined to agree.
Freedom that’s worth a damn is the freedom to attend with care to what matters. “Effort is the currency of care,” as Evan Selinger so eloquently put it some time ago, and, I would add, the preeminent form such effort takes is attention. And, yes, of course, I’m going to quote Simone Weil again, “Attention is the rarest and purest form of generosity.” What is jeopardized when our capacity for attention is compromised and hijacked is not our ability to read through War and Peace but rather our ability to care for ourselves, our neighbors, and our world as we should.
This, then, at least gives us a useful heuristic by which we might think about attention. Does it feel to you as if you are free in the deployment of your attention throughout any given day? Allow me here to speak out of my own experience: I know that it often doesn’t feel that way to me. I frequently find myself attending to what I know I shouldn’t or unable to attend to what I should. This is not a function of external coercion, strictly speaking. I experience it chiefly as a failure of will, as a form of unfreedom stemming from a regime of conditioning to which I’ve submitted myself more or less willingly.
And I feel the loss. The loss of focus, yes. The loss of productivity, yes. But also the loss of the world and the loss of some version of myself to which I aspire.
I find myself needing constantly to ask, “What is worthy of my attention?” or, better, “What is worthy of my attention given what I claim to love, what I aim to accomplish, and who I hope to become?” If by our attention we grant the object of our attention some non-trivial power over the shape of our thoughts, feelings, and actions, then this may be one of the most important questions we can ask ourselves.
Several years ago, reflecting on this very matter, I wrote about the need for what I then called attentional austerity. Austerity is not a warm or appealing concept, of course. But once again, Illich can help us better frame the matter. “‘Austerity,’” he writes, “has also been degraded and has acquired a bitter taste, while for Aristotle or Aquinas it marked the foundation of friendship.” “In the Summa Theologica,” Illich continued, “Thomas deals with disciplined and creative playfulness … [defining] ‘austerity’ as a virtue which does not exclude all enjoyments, but only those which are distracting from or destructive of personal relatedness.” “For Thomas,” Illich concluded, “‘austerity’ is a complementary part of a more embracing virtue, which he calls friendship or joyfulness. It is the fruit of an apprehension that things or tools could destroy rather than enhance [graceful playfulness] in personal relations.”
From this perspective, then, austerity becomes not a deprivation but a virtue in service of a greater good and a higher joy, a virtue we do well to recover.
As we draw to a close, I want to add that it is not only a matter of consciously and austerely ordering my attention toward some greater good, of wresting it back from an environment that has become an elaborate Skinner box; it is also good for me to cultivate a form of expectant attentiveness to what is, a form of attention that commits itself to seeing the world before me.
The Polish poet Czeslaw Milosz once observed that “In ancient China and Japan subject and object were understood not as categories of opposition but of identification.” “This is probably the source,” he speculated, “of the profoundly respectful descriptions of what surround us, of flowers, trees, landscapes, for the things we can see are somehow a part of ourselves, but only by virtue of being themselves and preserving their suchness, to use a Zen Buddhist term.”
Further on in the same essay he wrote about the wonder that arises when, as he put it, “contemplating a tree or a rock or a man, we suddenly comprehend that it is, even though it might not have been.” This kind of wonder, a wonder at the givenness of things, the sheer gratuity of existence, is perhaps its own reward as well as the gateway to the love of wisdom, as the ancient philosophers believed.
I hear in Milosz’s words an invitation, an invitation to step away as I am able from the patterns of digitally mediated reality, which, while not without its modest if diminishing satisfactions, can overwhelm other crucial modes of perception and being.
The question of attention in the age of digital media may ultimately come down to the question of limits, the acceptance of which may be the condition of a more enduring joy and satisfying life. What digital media promises on the other hand is an experience of limitlessness exemplified by the infinite scroll. It tempts us to become gluttons of the hyperreal. There is always more, and much of it may even seem urgent and critical. But we cannot attend to it all, nor should we. I know this, of course, but I need to remind myself more frequently than I’d care to admit.
Pine Trees, Hasegawa Tōhaku, c. 1595
News and Resources
Ten years after publishing The Shallows, Nicholas Carr talks to Ezra Klein about the book and its enduring relevance.
In “How We Lost Our Attention,” Matthew Crawford explored how our understanding of attention was shaped by early modern philosophical polemics in epistemology and political theory.
A few years back, Alan Jacobs presented 79 theses with commentary on the subject of attention and digital technology. They will repay whatever time you can give them; there is much here to spur thought and reflection. I was delighted to be part of a colloquium in which Jacobs presented these reflections and to offer a response, which you may find here.
This essay probably could’ve been about half as long, but it includes some interesting reflections on the rise of closed group chats: “As Facebook, Twitter and Instagram become increasingly theatrical – every gesture geared to impress an audience or deflect criticism – WhatsApp has become a sanctuary from a confusing and untrustworthy world, where users can speak more frankly. As trust in groups grows, so it is withdrawn from public institutions and officials. A new common sense develops, founded on instinctive suspicion towards the world beyond the group.”
I recently stumbled on two essays by Bruno Maçães, who was Portugal’s secretary of state for European affairs from 2013 to 2015. The first, “The Attack of the Civilization State,” appeared in Noēma, a recently launched journal from the Berggruen Institute. The second, “The Great Pause Was an Economic Revolution,” appeared in Foreign Policy. I found both to be stimulating and I may have more to say about both of them in the future, although I confess they are outside my own areas of relative expertise. This passage in the former essay caught my attention: “Western civilization was to be a civilization like no other. Properly speaking, it was not to be a civilization at all but something closer to an operating system … Its principles were meant to be broad and formal, no more than an abstract framework within which different cultural possibilities could be explored … Tolerance and democracy do not tell you how to live — they establish procedures, according to which those big questions may later be decided.” This recalled some of what I attempted to articulate in a 2017 post: “I will put it this way: liberal democracy is a ‘machine’ for the adjudication of political differences and conflicts, independently of any faith, creed, or otherwise substantive account of the human good. It was machine-like in its promised objectivity and efficiency.”
A look back at the advent of the Walkman: “The Walkman instantly entrenched itself in daily life as a convenient personal music-delivery device; within a few years of its global launch, it emerged as a status symbol and fashion statement in and of itself. ‘We just got back from Paris and everybody’s wearing them,’ Andy Warhol enthused to the Post. Boutiques like Bloomingdale’s had months-long waiting lists of eager customers. Paul Simon ostentatiously wore his onstage at the 1981 Grammys; by Christmas, they were de rigueur celebrity gifts, with leading lights like Donna Summer dispensing them by the dozens. There had been popular electronic gadgets before, such as the pocket-sized transistor radios of the fifties, sixties, and seventies. But the Walkman was in another league.”
Not the usual sort of link here, but I appreciated this essay about the enduring insights of the ancient Roman historian, Tacitus—“Thrones Wreathed in Shadow: Tacitus and the Psychology of Authoritarianism”: “Shame, guilt, a lingering sense of powerlessness, and self-loathing: These are all emotions common to individuals living under tyranny. And, for all his literary brilliance and psychological acumen, Tacitus is no exception to this rule. In The Annals, when the historian describes the soul of a tyrant such as Tiberius, which he poetically envisions as crisscrossed with deep ‘lacerations’ and ‘wounds,’ he projects this state of invisible scarification onto Roman society as a whole. Indeed, the historian’s genius lies in his demonstration of how authoritarianism is, first and foremost, a collective malady — one that infects almost everyone, from the maniacal tyrant to the stolid local official, anonymous informer, or jeering spectator at the local theater. As Tacitus notes in a moving passage of The Annals, ‘the ties of our common humanity had been dissolved by the force of terror; and the rising surge of brutality drove compassion away.’”
— From a review of two recent books about walking, In Praise of Walking and In Praise of Paths:
I am a city walker, which is to say I walk to root myself. I define my neighborhood by walking, both its boundaries and my place within them, my connection to community. Even in the middle of a lockdown, I am out most mornings, to get exercise, yes, but also to remind myself of where I am. This is the hard part — to pay attention, to remain in the present, to look outward as well as inward, now from behind the forbidding filter of my face mask, while recognizing, as Torbjorn Ekelund reflects in “In Praise of Paths: Walking Through Time and Nature,” that “the path is order in chaos.”
— Talk of paths recalled an old post in which I reflected on the way of the tourist and the way of the pilgrim as paradigmatic modes of experience:
The way of the tourist is to consume; the way of the pilgrim is to be consumed.* To the tourist the journey is a means. The pilgrim understands that it is both a means and an end in itself. The tourist and the pilgrim experience time differently. For the former, time is the foe that gives consumption its urgency. For the latter, time is a gift in which the possibility of the journey is actualized. Or better, for the pilgrim time is already surrendered to the journey that, sooner or later, will come to its end. The tourist bends the place to the shape of the self. The pilgrim is bent to the shape of the journey.
— From Richard Thomas’s “From Porch to Patio” (1975). As you read this consider what an interesting coda the advent of Ring makes to this argument:
“In this transition from porch to patio there is an irony. Nineteenth-century families were expected to be public and fought to achieve their privacy. Part of the sense of community that often characterized the nineteenth-century village resulted from the forms of social interaction that the porch facilitated. Twentieth-century man has achieved the sense of privacy in his patio, but in doing so he has lost part of his public nature which is essential to strong attachments and a deep sense of belonging or feelings of community.”
We’re half-way through the year now. What a year. I could not have imagined, for one thing, that I was going to spend so much time commenting on a virus. But I also did not foresee this newsletter growing quite the way it has or that so many of you would support the work. So thanks. Thanks for reading. Thanks for letting others know about the newsletter.
P.S. Recent subscribers may be interested in a collection of essays I put together at the end of last year. It’s free to download, though you can pay something for it if you like.