Future Political Trends: A Study
For roughly the past six months, I have been repeatedly mentioning, in my posts on this blog, my intention to write up something about current and future political trends on a global level. I have not done so for a number of reasons, including (in no particular order) disinterest, boredom, anger, disgust, Lent, Easter, the death of Pope Francis, the election of Pope Leo, my desire to write short stories, a school field trip, my birthday, and the onset of spring. Central to my delays, however, has been the fundamental grimness of the topic itself.
Another thing that happened in the interval, however, was Easter; which is, properly considered, the only thing that has ever really happened. It struck me, on Easter night, that Easter is, perhaps, the best standpoint from which to consider present political realities. It is certainly the best standpoint from which to consider the sweep of human history and human life as a whole.
In any case, I firmly believe that eternal novelties like Easter are a much better means of understanding than the faded abstractions of political and economic ideology that dominate so much of discourse.
As Chesterton said in the Daily Herald, quite rightly, political ideology and analysis nearly always lag at least a half-century behind actual political systems. In the 1910s, he pointed out how profoundly unsuited the 18th and 19th century categories of Capitalism, Socialism, Democracy, and the like were for the era of syndicalism and great strikes and great states and secret societies and global warfare. In a similar vein, but even more so, the categories that we use for unraveling the tangled events of our time are practically all hoary 20th century abstractions such as Fascism, Nazism, Communism, Totalitarianism, Authoritarianism, and the like--when they are not the same, still more faded 19th century abstractions such as Capitalism, Socialism, Democracy, Liberalism, and so forth.
I would suggest that one of the greatest threats to our political life today, a threat that has again and again allowed evils to burgeon and flourish undetected, is simply the enormous gap between reality and our ability to analyze it. This may seem a rather distant and abstract threat, but is in reality among the most practical causes of the practical evils of our time. The year 2025 does not lack for crimes and tyrants--but it does profoundly, I am tempted to say unprecedentedly, lack for both practical recognition of these evils and practical efforts to counter them. And a foremost reason for this lack, I increasingly think, is simply that people cannot understand these evils, cannot recognize them, frequently do not even seem to notice them, because they happen to fall into gaps in their abstract, categorical understanding of such things.
For some bizarre reason, the real estate mogul, media figure, and brand icon Donald Trump continues to be analyzed, again and again and at ever greater length and with ever greater portentous seriousness by ever more prestigious intellectuals, entirely by comparison with a mid-20th century Italian movement of ex-socialist, WW1-veteran-populated paramilitary squads turned revanchist dictatorship. Like any historical comparison, there are certainly truths to be drawn from this one--but the gap between reality and analytical abstraction is, nevertheless, so vast that nearly the whole of Trump's actual ideology and program, and even his genuine crimes, can be, and have been, and continue to be buried within it.
Nevertheless, in carrying out an analysis of present trends, and their likely future results, I would like to be absolutely clear about what I am doing, and why. I am not a historicist, let alone a historical fatalist: I do not believe in memetics, or Hegelian dialectics, or progress. When I speak of trends, I am speaking ultimately of either ideas or habits residing in the actual intellects and wills of actual people: ideas and habits which exercise great power over those people's actions, but never fully determine them.
People can and do reject ideas they have held, especially when they are ideas that they have never consciously understood, but only passively absorbed from their environments. People can and do change their habits, including habits that have become deeply ingrained in their minds and hearts and wills over many years.
On the most abstract level, I consider history to be first and foremost the study of human actions and the motivations behind them; so that the fundamental historical question is not merely the positivistic query of "What happened?," but the much more intrusive demands "What did they do?" and "Why did they do it?"
What is true for historical actions writ large is even more true for the subset of human actions that make up political systems past and present. Governing, particularly in the modern world, is a highly complex and technical set of actions attempting to shape and respond to constantly shifting conditions. Still, it always depends first and foremost on conscious, considered human action; and conscious, considered human action depends first and foremost on rational ideas and goals.
Yet people are not always, or perhaps even often, aware of the ideas and goals underlying their own actions, let alone the broader social conditions and trends to which they are responding. It is for this reason, above all, that this kind of analysis is useful. As anyone knows who has ever tried to change a deeply ingrained idea or habit, one of the most important steps is often merely recognizing the actual ideas one unconsciously holds, and the actual habits that one unconsciously possesses. Only then, as a rule, can one set out to change them.
Hence, while I am engaged in this essay in modestly claiming to understand contemporary trends and their likely future impacts, I am not engaged in actually trying to predict the future. To do so would be to fall under the curse of Chesterton's game of Cheat the Prophet: the game whereby smart people predict the future by extending current trends indefinitely, and the human race thwarts them by the simple expedient of going and doing something else. In this post, I am quite self-consciously teeing up to play a round of this game with the human race, providing them with a helpful listing of the trends they will need to know about in order to defy them and cheat me. In this, I heartily encourage the human race to cheat me: nay, I demand it. That is, in fact, the entire point of this exercise. If all my predictions are vindicated, I will be deeply, profoundly disappointed in you all.
Of course, the trends I discuss below are not uniformly positive or negative. Some are in my judgment evil, some few are good, some are, in themselves, merely neutral. Nonetheless, my modest claim is merely that if we wish to exercise some control over our collective destinies, it is helpful to know what is happening: only then can we choose to aid what is good, to resist what is evil, and, hopefully and above all, to repent and seek the good. This is my exhortation.
The New Narrative
About three years ago now, on the cusp of the Pandemic, I made a series of predictions about what the next years would most probably be like: predictions that, I can now modestly say, have been vindicated in both broad strokes and particulars.
Most crucially, I predicted that the aftermath of a true, nearly universal-scale Event, an event that affected nearly everyone in extremely direct and personal ways, would lead to a period of political and social instability spurred ultimately by what I referred to as the twinned dynamics of extreme risk-averse and extreme risk-prone behaviors. Put more simply, I posited that people's basic, psychological sense of the world would shift from a very stable and optimistic model to a very unstable and pessimistic model; and hence, they would be more likely both to engage in behavior aimed at avoiding all possible danger and to risk everything on extreme chances. In all this, I believe, with all due modesty, that I was absolutely, positively correct and have been proven right again and again by nearly every event to occur.
What I did not see as clearly at the time, but have seen since, is the degree to which this psychological shift has led to a much more fundamental shift in political and social worldview, a shift so great that it has the potential to constitute, in and of itself, a fundamental change from one era and epoch to another. Put simply, progressive historical narratives, in both micro and macro dimensions, have almost overnight lost nearly all their compelling force for the vast majority of people across the world; and decline narratives have gained an immediate, intuitive, and pervasive force in people's view of the world as a whole, the fortunes of nations and classes, and their own individual lives.
This was brought home to me most forcefully by an interaction with students, in which I repeated the hoary Latin-teacher monologue that I have used too many times, beginning in the year 2016, whenever teaching the word novus. The monologue runs as follows: while, for us, the word "new" has positive connotations, based on our general view of history as a progress toward higher things and our accompanying assumption that new things are better than old things, for the ancient Romans decline narratives were far more frequent and compelling, old things were generally assumed to be better than new things, and consequently the word "new" was frequently used in pejorative senses: "strange," "unsettling," or even politically "revolutionary." Delivering this monologue most recently, however, I was struck by the sudden suspicion that it might no longer be true at all; and I was immediately confirmed in this by my students, who assured me gravely that the world was in fact getting worse and worse every minute and that their fondest wish was merely to arrest this decline and stay at the current level of badness.
What they told me I have since had confirmed for me by many interactions with younger people, with older people, and increasingly, with people of my own generation, trained more than any other in a firm belief in historical progress. Hence, I have come to believe that nearly everyone today, like the ancient Romans, now assumes that things, by their own nature, are bound to get worse and worse, that nearly everyone now views new things and new trends and new technologies with suspicion, as likely to be harmful and unsettling rather than beneficial, and hence that nearly everyone now situates their own goal or vision in the past rather than the future.
It is impossible to overstate the potential impact of this shift in fundamental worldview on nearly everything in our society, emphatically including politics and governance. Nearly all the basic dynamics of the liberal governance of the post-WW2 20th and 21st centuries, as I recently argued in this space, were premised upon a vast, unstated faith in some immanentized divinity or another, whether that was the free market, technological development, the Geist, or history itself. It is this, more than anything else, that explains the bizarre mixture of stability and disruption, of extreme interventionism and extreme indifference, that has constituted governance over the past decades.
Here I am thinking less of such governments' day to day actions (which have in many cases not differed greatly from previous types of governments' actions) than their, and our, basic sense of what they are, and what they are for: not the protection or preservation of any good, or the fighting of any evil, but rather the enabling and regulation of processes of change and liberation.
While Medieval governments generally saw themselves as charged with actively protecting and preserving and fighting for and vindicating the existing, long-standing culture and customs and traditions and families and clans and towns and cities and guilds and religious entities of their society, liberal governments have seen it as their sacred duty to strip away every protection of law and custom from all such things, to passively but self-consciously permit or even actively enable their breakdown at the hands of external and internal forces. Likewise, while Medieval governments saw themselves as fundamentally tasked with ensuring justice, which meant first and foremost the equitable distribution of resources and wealth and power within society, liberal governments have generally seen it as their equally sacred duty to not interfere with the sacred rights of property and trade and individual preference and the inscrutable operations of the Invisible Hand of the Free Market.
Both singularities are ultimately only comprehensible given a background assumption of the immanently progressive nature of history, where all the processes that, like bacteria, break down and destroy human institutions and cultures and the wealth and happiness of families and individuals are assumed to be natural and inevitable and blessed and always in the process of producing even better things from their corpses. Given such an immanent deity in our midst, embodied in rust and mold and in the buying and selling of bodies and souls, it is eminently rational that rulers should do whatever it might take to break down every opposition to this deity's beneficent operations, every law and tradition and custom that might stand in its way, secure in the knowledge that the deity would make it all worthwhile, while at the same time making sure that neither they nor anyone else in society ever interfered with its operations or sought to impose their own petty ethical values or rational ideologies on it.
It is, I confess, impossible for me to see how this view of governance can be coherently maintained in the absence of this immanent deity; and if this is so, then the fundamental concept of a government that merely presides over and regulates social and technological processes by aiding in the liberation of latent forces may be well on the way to obsolescence. And that gives me pause.
For my part, I think that the basic assumption of a progressive immanentized divinity is both false and harmful; and so are nearly all the concepts of governance that have emerged from it. Nonetheless, the widely varying historical dynamics and values that we now call, very clumsily, by the name of liberalism contain not merely lies and violence, but a great deal of universal human tradition and a large amount of inherited prudence and caution about the real dangers of unconsidered cultural custom and reckless governmental action alike. The danger whenever any tradition of human life or governance suddenly becomes meaningless is that this will lead to the elimination, not of its perversions and singularities, but of the entire practical human traditions now bound up with them.
In any event, it is not the case that the unconscious assumption of an unstable, violent world where things are always getting worse necessarily leads to wise and prudent politics either: quite the contrary! This assumption and its opposite are in themselves ideas that operate above the level of typical political action, and which are compatible with any number of positive or negative political trends. Given that most people in the ancient world held to a broad assumption of decline over time, this assumption could lead as naturally to an Athens or a Sparta, a Rome or an Assyria.
Still, if this is in fact the baseline assumption of our emerging 21st century politics, then it should be possible to make certain predictions about the shape and scope and dynamics of such politics: predictions that may well seem extreme from the standpoint of the 1990s, but which are, in my judgment, quite likely not to be nearly extreme enough from the standpoint of the future.
If we assume, for the purposes of argument, that the world is in fact defined by powerful, perhaps even inevitable, processes that are changing things continually for the worse, then we, as human beings living in such a world, would naturally tend to look for certain things from our government. First, to the degree that it is possible to slow these processes or even reverse them, we would be looking for leaders capable of doing so: which is to say, leaders whose actions are not bureaucratic or managerial or regulative, but heroic and conflictual and violent, focused on reversing trends, fighting the forces of time and change and history, defying gods and men, and in the process recreating some past state of affairs real or imagined. Secondly, to the degree that it is not possible to retard these processes, we would want a government that would at least ensure that we could in some measure profit from these processes, harness them for purposes that are, by necessity, not good in the abstract or for most people or for humanity as a whole, but at least might be good for us. And thirdly, to the degree that neither of these things is possible, we would want a government that would, at least, enable us to live as well as possible for as long as possible, to find and maintain at least some temporary, ultimately illusory comfort or security or victory or satisfaction before our inevitable failure and defeat.
When I look around me at the political scene of the Year of Our Lord 2025, at the most successful and powerful leaders not just in the West but around the world, I confess that this is already largely what I see. I see leaders with no vision at all for a better future, who promise not so much to turn back the tides as to slow them down a little, or maybe make a buck off of them. I see leaders presiding over the demographic decline and the eventual extinction of their nations seeking, not to reverse these trends, but to achieve a comfortable and managed decline or find some temporary, violent victory elsewhere. I see leaders presiding over increasingly violent, divided, poor, and unhappy societies seeking, not general peace or unity or prosperity, but rather distraction and profit for themselves and their friends.
As I said, political discourse is nearly always well behind the times; and it is quite likely that what I am describing as the future is in fact the present, or even the recent past.
The New Nostalgia
One feature of contemporary politics and discourse closely connected with the above, I think, is the degree to which both are defined by backward-looking nostalgia for progressive, anti-nostalgic periods. In this, we have achieved a fundamental contradiction, unique in my experience: something that may be described, according to its various flavors, either as a regressive progressivism or as a progressive regressivism.
American politics have never been quite so incoherent as at the present moment; though this incoherency is in part due to the unavoidable coexistence of old and new elements during a period of drastic political and social realignment. On both right and left, older, more managerial forms of politics coexist uneasily with more heroic and messianic brands of politics; politics centered on trust in institutions and the government and the military coexist with politics based around pervasive suspicion of the powerful and elites; politics based around technological advancement coexist with politics based on suspicion of all technology; and so on and so forth.
And these political realignments are far from over: it is, presently, as difficult to imagine the future of the Republican party post-Trump as it is to imagine the future of the Democratic party post-Biden. Much will be decided by internal political battles in the years to come; and much, too, will fluctuate with the movements of fashion and mass-media and simple personal corruption. We are very far from the era of the great 20th century ideologies; politics is less based around ideas, and more around personalities, than it has ever been.
Still, over the past few years, particularly since the Pandemic, I have been struck more and more by the realization that the most coherent rational core of each emerging political brand centers on an intense nostalgia for a particular past era; that to the degree it is even possible to understand, say, Donald Trump's politics, or Joe Biden's politics, or Tucker Carlson's politics, or Jon Stewart's politics, or Ross Douthat's politics, or even Alex Jones' or Richard Spencer's or Bronze Age Pervert's politics, the most coherent approach to each one is to simply attempt to discern what past political era they see themselves as trying to reconstitute in the present day.
The MAGA movement is many things, most of them silly; but as even their slogan indicates, at the very core of their brand identity is the idea of reconstituting a past era when America was "great." What precisely this era is or was has differed over time: but when asked about it, Donald Trump has generally given some combination of the immediate post-war era, the 1940s and 1950s, and the turn of the 20th century, the time of his hero William McKinley. When one sees this, one immediately sees a certain degree of sense, as it were, behind the nonsense: or rather, one sees what connects aspects of Trump's politics and policies otherwise nearly impossible to understand. For instance, in both of these periods America was the industrial powerhouse of the world, with an overwhelming trade balance maintained in large part by strong tariff regimes and an economically assertive state that cooperated with and directed private financial power for the purpose of ensuring mass industrial production. Also in both of these periods (though more in the era of McKinley), America was an expansionist power, taking and holding new territory in explicitly self-interested ways, to secure resources, to project military force, or merely to increase prestige. In both of these eras, too, America laid the foundations of the modern policing and security state.
Seen from this point of view, what is perhaps most striking about the present moment is the fact that a large and growing element on the American Left more or less agrees with Donald Trump that the thing to do is, as much as possible, to reconstitute the industrial, expansionist America of the '40s, '50s, and '60s. In these discourses on the Left, more emphasis is sometimes placed on the need to undo the financialization of the American economy in the '70s and '80s; but more frequently, the accent is on the very MAGA-compatible issues of economic globalization and government regulation. While the specific analysis often differs, the basic recipe is more or less the same: use governmental policy to re-industrialize America.
Other elements on the American Left, of course, have rather different recipes: with the now-largely-discredited faction of Biden and Hillary Clinton pushing merely for the recreation of the socially-progressive, globalized, militarily interventionist, democratically-eschatological, securitized, thoroughly financialized world of the '90s and the 2000s. These are matched, of course, by very similar groups on the Right, groups that still control a large proportion of Congressional power and continue to push for global interventionism and financial protection at every opportunity. At present writing, when large groups on both the Right and the Left are pushing for war with Iran, the symmetries are rather obvious.
Then there are the purported members of the New Right, who at their most coherent gesture at the New Deal as embodying their vision of a new economic and political order that will take care of the ordinary Americans and redistribute wealth downwards. Many members of the New Left have more or less the same vision.
The so-called Alt-Right, to the degree it still exists as a politically coherent viewpoint, has long since crystallized around a nostalgic desire to recreate the 19th century, when racialized Imperial politics were at least the rising thing, if not always the norm. Even in such circles, profound ruptures and conflicts are the order of the day: which can perhaps be most coherently understood as a debate between 19th century racial-purity and cultural-purity ethno-nationalism on the one hand, and the Imperialist goal of the domination and tutelage of "white" Empires over other races on the other hand. This was a fervent debate already in the 19th century, and continues to be carried out, in less rational and considered ways, on the Internet today.
Even then, much that passes for "Alt-Right" in popular discourse is in essence entirely different in its vision: and consists of various political postures centered, if anywhere, on nostalgia for the political and personal and economic liberalizations and deregulations of the 18th century and/or the era of Ronald Reagan. These may be complemented by the increasingly uncommon, genuine Leftisms centered on nostalgia for the great eras of the Great Strikes and Labor Union politics: or, more commonly, by mere hazy memories of the Soviet Union and of Leftist politics from the 1970s.
Of course, to say that the most coherent part of each of these groupings is their nostalgia for past eras is not to say that these nostalgias reflect accurate, let alone detailed and comprehensive, views of those eras. Nor is it to say that their applications of the lessons purportedly learned from these eras to the present day are not frequently novel. Rather, it is in part to recognize the fact that we no longer live in an era of ideas and ideologies and abstract analyses: and that therefore coherency must be sought most frequently, not in ideas or policies, but in visions: and that these visions are, almost without exception now, visions not of the future but of the past. And that itself is perhaps the thing most worthy of all to note, and to take into account, for the political era to come.
The past, after all, is a very big place, full of very different things: and if, so far, Americans' political horizons have been more or less limited to various eras of American power since the 18th century, there is no reason why they need stay there. Certainly on a global basis, imagination, if not genuine politics, has already far surpassed the limits of the modern. India's Hindu Nationalists combine hyper-modern political networking with a profoundly kitschy and vague vision of the past glories of Hindu civilization; and, increasingly, the Chinese government is engaged in very similar, if somewhat better-funded, efforts. Russia is largely engaged in reconstituting the Soviet Union as a nationalist Empire; but has already learned to appeal to the Tsars as well as to Stalin. As doubt about the progressive arc of history sets in, I highly suspect that such pre-modern appeals will grow more and more common, even in the West, even, perhaps, in America, which properly speaking has no pre-modernity to look back to.
Of course, as I have already suggested, when a hyper-modern technological society tries to nostalgically hearken back to a past era, the results are usually something more hyper-modern than modernity, a simulacrum of the past by way of technology and raw power. One can certainly see this with Trumpism, which even in attempting to recreate the industrial and tariff policies of the 20th century has mostly produced somnambulistic hypermodern theater, Presidents holding large cards with AI-generated numbers and engaging in Internet-speed brinksmanship via Twitter while passing financialized tax cuts and gutting the state capacity necessary for actual large-scale reindustrialization. The same basic features, however, can be seen on the Left as well: the hilarious, social-progressivism-powered internal purity wars, and the accompanying practical disintegration of nearly every American Left-wing organization over the past ten years, are another obvious example.
As I said, though, for the moment I am less interested in fully describing the consequences of this change than in merely pointing out that it is occurring. To say that our politics is becoming more and more nostalgia-driven is not to make any particular prediction about what ideological or practical shape it will actually take--particularly since the past is a very large place indeed, and since the visions of the past on offer are almost ubiquitously inaccurate, shaped by the concerns of the present, and normatively driven by hypermodern technological representations, the Internet and mass media rather than genuine historiography. Almost anything can be justified as the means to recreate a past golden era, just as almost anything can be justified as a progressive pathway into the future.
Still, the change in itself is notable; and recognizing it is at least potentially a major aid to understanding. As people are driven farther apart and more incomprehensible to each other by mass media, it is well to look to the past: and to ask what era people wish to return to.
The New Information
Considerations of art, of media, and how these things fit into human life, society, and governance are as old as art itself. The Book of Deuteronomy is as much a reflection on what it means to have a written law code as a listing of actual laws; the central, internal debate of Homer's Iliad is over how worthwhile epic poetry is; and so on.
Nor is it the case that we are the first to take the products of media seriously, to make them central to our lives and social and political structures and worldviews, or even to polemically contest this centrality and its deleterious effects. Certainly societies that practiced regular cultus to images can hardly be accused of treating images as trivial; and those who critiqued and polemically assaulted such worship can hardly be made out as naive about the effects of art on human psychology, society, and politics.
Hence, though it is tempting to treat the new centrality of media as defining for modernity and post-modernity alike, there are obvious ways in which this is merely a continuation of an ancient human principle, if not a return to an ancient standard.
Still, there can be no question that in my lifetime, the everyday, daily consumption rate of media, by most members of the human race, has been drastically increased by the influence of first the Internet and then smartphones. At present writing, the number of smartphone users in the world is well over six billion, and the proportion of cell phone numbers to population in Nepal stands at approximately 134%. While the number of words consumed by most people likely peaked, in Western societies, some time ago, there can be little doubt that the quantity of images and media in general consumed by most people in the world has never been higher.
If media consumption is particularly high at the moment, so too is concern about media consumption. In the last five years or so, anti-smartphone discourse has proliferated at a rate surpassing nearly every other genre of writing; it has become perhaps the single social concern universally shared across all classes and parties and political ideologies. Nearly everyone agrees that smartphones are at least a, if not the, root cause of nearly all social ills, from radical politics to liberalism to illiberalism to anorexia to racism to violence of every sort.
This is, in itself, a stunning reversal of the assumptions of the immediately preceding period of American and global society, in which nearly everyone assumed nearly all the time that nearly all problems were caused by ignorance and would be inevitably cured by more accessible information. Given the basic falsity of this worldview, there can be little question that the new emphasis on the dangers of information technology constitutes an improvement over the previous status quo.
It would be highly premature, however, to say that our society, or even our intellectuals, have turned against information as such, or even against the more fundamental idea that the root problem of human life is ignorance. Rather, as with any eschatological belief threatened by failed predictions, we have been forced to craft ever more elaborate narratives to explain the failure, and coin ever newer and harsher terms for the shadow counterpart(s) of our gods: conspiracy theories, hate speech, misinformation, discredited claims, fake news, cringe, AI slop, brainrot.
What sets all these terms apart from earlier historical terminology involving speech or writing or media, however, is that they are not based around any concept of truth as such. Conspiracy theories are, as everyone knows by now, sometimes true; and hate speech is censured, not for its content, but for its emotional context. Misinformation may at first glance appear to be a term defined by a false relationship between information and reality: but in practice it is almost universally adjudicated with reference to procedure and authority, by the presence or lack of sanction by one or more of our informational priesthood(s). "Discredited claim" makes this basic complex even clearer, "bad" information defined by a maledictory action by an implied authoritative or priestly agent. Fake news designates first and foremost a defective product of consumption, and, as such, is measured with reference to the desires and preferences of the consumer.
All the above terms originated with our informational priesthood(s), and were designed to function first and foremost as tools of control; but increasingly have become freestanding weapons used and useable by all sides of informational conflicts, majority and minority, left and right, expert and ignoramus, priestly and lay. They stand clearly apart, however, from the increasingly common popular terms for negative information, which, as in most cases, are far clearer both in basic conception and in deployment. They all designate essentially the same thing: information considered apart from any relation to truth as such.
Here, though, is the fundamental issue: that information severed from any relationship to truth has been, for the past century at least, the defining cultural product of our society. I have called this fundamental generic form advertising-pornography; and though the scope covered by this term as I use it is much broader than those terms considered separately and narrowly, this name, I think, captures the fundamental features of this form of information better than most.
Considered philosophically, information, far from constituting a fundamental analyzable feature of reality like existence, is rather one particular form of the more fundamental relation of truth or intellect: a relation by which one entity possesses the form of another without being it. Considered as such, information is always and everywhere defined by truth, which is to say, the correspondence between mind and reality, between knower and known.
Put simply, advertising-pornography is a form of information that aims not at conveying truth, or producing a correspondence between mind and reality, but at reliably producing an extrinsic effect in a subject. Advertising may include true statements and representations, or (more commonly) false statements and representations; but the purpose of the statements is not to convey truth, but rather to get us to buy a car. We all intuitively know this; which is why the idea of fact-checking claims or representations made in an advertisement seems to most of us, most of the time, simply an absurdity, a superfluity, a category error, like critiquing the text in a manual for its lack of euphony. To point out that buying a car will not, in fact, allow us to joyride around a large city racing a beautiful woman who immediately falls in love with us, is an infallible sign of media illiteracy, or at least, culturally speaking, in decidedly poor taste.
This has all been true for some time; and is as much the case for most technical writing and journalism as for literal pornography and advertising. The health communications during the COVID pandemic, as has now become clear, were not intended to inform the public, but merely to get them to do certain things; and the same holds true for nearly all political communications, ranging from ads to policies to State of the Union addresses.
Nevertheless, the rise of smartphones and accompanying algorithmic technologies, including but hardly limited to LLMs and other "AI"-branded technologies (the vague, overly broad, theoretically nonsensical, but catchy term AI being itself a wonderful example of advertising-pornography), does seem to presage a fundamental shift in our relationship with information as such.
The 20th century was above all marked by the twin ideas and practices of information science and information technology, beginning with the newspaper and continuing almost without a hitch through TikTok. The mid-20th century in particular saw a massive effort to theoretically and practically refine informational methods of control, an effort carried out indistinguishably and interconnectedly in government offices of propaganda and corporate advertising centers. In both cases, the explicit goal was to get people to do things: to believe, to obey, to buy, to fight, to die.
If the goal was to get people to act, however, then a great deal of attention had to be paid to human motivation as such, to the things that make people feel good, inspire them, tempt them, entice them, get them to take extraordinary risks and make extraordinary sacrifices. Put simply, this kind of informational technology, while perverse in its ends, was in its means necessarily human-focused, anthropocentric.
Accompanying this growth in informational science, however, was a parallel growth in what may be called techniques of informational pacification. In contrast to advertising and propaganda, these forms of media aimed less at getting people to do things than at getting them not to do things: getting them to let go of and forget about efforts and ideas and beliefs and motivations and actions.
In the beginning, these two forms of information were deeply intertwined and frequently indistinguishable, since governments and businesses alike were deeply concerned about the potential for dangerous, revolutionary action, action possibly even inflamed by their own efforts at propaganda and advertising, and worked very hard indeed to find ways to forestall this. A central concept that emerged from these efforts was entertainment, a novel idea that gradually was embraced as the overriding goal of all art and literature. This idea would have surprised more or less every prior society ever to make use of art (which is to say, all of them): societies which, when they engaged in theorizing about said art, nearly always appealed to some ritual or religious purpose, some sort of social or public function of unity or catharsis, some moral and ethical effect, and/or some communication of truth to justify and explain art.
Entertainment, though, expresses well the essential pacifying and finalizing purposes of information technology and science over the past decades. Rather than getting people to act by proposing an end or goal beyond them, entertainment has generally functioned as a kind of faux end in itself, a facsimile of fulfillment and happiness increasingly taken for the thing itself. There is no need to go to war: rather, one can get the excitement and heroic feelings and fulfillment of war by watching a war movie. There is no need to get married, or even have sex: one can get the excitement and erotic and amatory feelings by watching pornography or reading romance novels. Entertainment in this sense is essentially aimed at closing off desire and transcendence and the goods sought through action, replacing them with a closed, finalized feeling of wellbeing.
It is hardly too much to say that our society, and even our governance, has increasingly come to depend upon this faux-finality for its basic stability as a society. Fewer and fewer people now achieve the ends for which people once were expected to act: community and family and honor and success and money and status and power. All things being equal, when people lack these things, they inevitably act to get them: and in our society, such action would have devastating consequences. It is hardly too much to say that if our society did not have at its disposal a potent and powerful false finality to give to people, society would quickly be shattered to pieces by human efforts to achieve the goods denied to them by our society. Entertainment in this sense is the great tranquilizing drug, aimed at forestalling action as such.
In the beginning, as I said, these two techniques were developed in tandem, and worked together to preserve the brutal, disciplined, industrialized and militarized societies built up during the long 19th century. The early to mid 20th century saw governments faced with extraordinary crises and engaging in extraordinary efforts, efforts that required the action of nearly everyone in society. The period immediately after WW2 saw these efforts only gradually relaxed, redirected into building and trade and business and science on a scale never before seen, and likely never to be seen again. In this era, governments desperately provided jobs and women and money and amenities, prospects for people to do and have the goods they wanted, out of a desperate and largely justified fear of what would happen if they did not.
Over time, however, the balance between propaganda and entertainment, between information aimed at getting people to act and information aimed at getting them not to act, shifted: gradually at first, but eventually definitively and overwhelmingly. This change is often associated with the cultural and generational shift from the World War generations to the Boomers, but more truthfully reflects changes that took place across the breadth of the period, and which were still taking place in the '90s and 2000s. Even such things as the hippy movement and the peace movements of the 1960s were based, ultimately, on taking action; and produced art that, looking back, reads by contemporary standards as remarkably unsophisticated and ham-fisted and propagandistic.
Even the sexual revolution, though driven by and leading to pornographic art, was in its beginnings a matter of action more than entertainment. People did leave their spouses and have sex with their bosses and abuse their children; and they were encouraged to take these actions by media that directly and propagandistically told them to do it. Even arts movements like rock and roll and folk presented themselves in a propagandistic light, with musicians putting themselves forward not as mere entertainers, but as messianic beings reading the signs of the times, pointing out political and social and religious pathways to the future, and driving people towards them. "The times they are a-changin'."
Sometime in this period, though, the times did change, and the balance between entertainment and propaganda altered forever. The age of action and collective effort gave way, as if by magic, to the age of entertainment and collective relaxation. Looking back on the '60s and '70s, older Boomers and Gen X people alike redefined nearly everything that had happened as a matter of art and entertainment for its own sake; yes, they might not have made the Revolution, but they did make some good albums, didn't they? The depressive, pornographic '70s, the morning after the orgy, gave way to the energetic, fantabulizing '80s, then the tranquillized '90s.
In the midst of all this, information technology was reimagined, not as a means for producing human action, but as the perpetual, infinite seeking of connection and information for its own sake: a feverish effort that culminated in the creation of the smartphone, one of the strangest things any society has ever made, something that only our particular society, out of all those in human history, ever would have made. Pacification of human desires and action by way of faux finalities became, not a discrete goal of science or technology, but the only goal of science and technology and the economy and governance and human life itself. Everyone, at all times, must be entertained; and lo, they were. An object was put into every human hand whereby every human person, at all times, night or day, might look, and immediately behold an infinite variety of entertaining algorithmic products, and eat, and be satisfied. Amen!
This is all, as I indicated, the past; what the future will bring is, at the moment, uncertain. This, though, can be said: that while the technologies and sciences of human action produced by the 20th century were inevitably anthropocentric at least in means, entertainment has in our time been enabled to pass beyond human limitations in both means and ends.
After all, to get someone to act, one has to know something about him, about what he wants, about what he is afraid of, at least about what he will respond to. There is a necessary and intrinsic interiority about human action; and the art of propaganda and advertising alike was defined by terrifying intrusions into that interiority.
Getting someone not to act, however, is a far easier task, and one much more amenable to merely extrinsic factors. Efforts initially focused on producing false versions of human happiness and fulfillment considered broadly, facsimiles of human action and its ends: heroic quests, exciting romances, victorious struggles, suffering giving way to triumph, death and resurrection. There are, however, it turns out, much easier ways to pacify people: including various ways of shocking people, stimulating them, frightening them, entrancing them, upsetting them, intriguing them, exciting them, disgusting them, occupying them, distracting them, and/or confusing them. In my lifetime, I have seen art, first by way of television and then by way of the Internet, expand vastly to fill all these niches and more.
In the process, art has moved more and more towards a simple attention-economy approach, where even the production of specific feelings in people is rather beside the point. Negative feelings, ugliness and disgust and violence, have rather noticeably and pervasively replaced positive ones, even in the domains of politics and pornography; and, increasingly, mere attention has replaced even negative feelings. Art does not even seek to cause us pleasure, let alone catharsis; it merely distracts us, occupies our attention for a brief moment before passing us on to the next algorithmic nugget.
Considered most properly, this approach represents the finest and highest refinement of the essential information-technology science of the 20th century; the most effective approach imaginable for preventing people from taking any action whatsoever. Infinitely and perpetually distracted, divided from their own desires and their own feelings, many people have lost the moment-to-moment continuity and self-consciousness necessary to take even vicious action, let alone virtuous action. Teen pregnancy, as they say, is down.
In other respects, however, the present moment shows the beginnings of what may be a fundamental shift in information science and technology itself, one that threatens to undo the bases of everything our society has built over the past centuries. All of our information-technological science, all of our advertising and pornography, has in the final balance functioned only because human beings are fundamentally rational beings, beings defined by the act of knowing, which means necessarily by a desire and action for truth and goodness, for a good that is objective and outside of themselves, which transcends and goes beyond them. People learned how to motivate us to take action, people learned even how to make us not take action, only, in the final balance, by producing facsimiles of real, objective things that we wanted, or at least that we were afraid of, or at least that would interest us, or at least that would successfully occupy our minds for a moment.
There is a legitimate sense in which human desire is infinite, aimed at God: but the infinity which it seeks is necessarily and properly transcendent, and cannot be reached by any mere indefinite accumulation of momentary things. In this sense, as Jean Baudrillard put it quite correctly in 2005, the human ability to be entertained is in fact finite; and its limits fall far, far short of our technological and economic ability to produce entertainment. The first diverting algorithmic nodule may legitimately occupy our attention; it is far less likely that the ten-thousandth will do so. At a certain point in the procession of fundamentally similar, fundamentally uninteresting bits of truth-independent media nothingness, the mind becomes inevitably sated; at a further point, it might become overwhelmed; but at some point, it will simply become bored.
Hence, over the past decades, the mere profusion of entertainment content, its penetration into every area of human life and every moment, has inevitably threatened the entire enterprise of entertainment: and the rise of algorithmic and AI content has now produced a veritable crisis.
This crisis is measured well in the popular rise of terms like AI slop and brainrot: terms that designate the increasingly common, and what is more, increasingly recognized status of art and media that fundamentally fails to fulfill its function, even its advertising-pornographic function, even its entertainment function, even its occupying-our-mind function.
Increasingly, our society produces media that is not centered on human minds, but rather on machines: computers making images and text for other computers. A great deal of the Internet, at this precise moment, consists of AI bots producing content that is interacted with only by other AI bots: and this proportion is growing with each day. To encounter these dark places of information is to encounter strange, new worlds: worlds, however, that are not at all interesting, that do not succeed in, and often could not succeed in, even briefly occupying any human mind.
Revolutionary shifts in human society are often presaged by small symptoms: and are often centered on simple losses of skills and competence across generations. Technology as such, as Plato recognized long ago, is among other things the direct and necessary cause of loss of human competence: as people simply forget how to do things that were once done by human beings, but which are now done by machines. Most human effort is not a matter of knowledge, of gnosis that can be easily transcribed and then easily learned from transcriptions; it is rather a matter of technique, of techne: which is to say, of skills and habits applicable to vast, indefinite domains, that must be not only learned but skillfully applied by a master, and that are lost unless they are constantly practiced. The number of human skills lost to technology over the past centuries is vast: and it is growing every minute. Considered in the long run of history, the era of the growth of technology has also been the era of the loss of human competence.
The late 20th and early 21st century, I am convinced, will appear in retrospect as the great era of human mass media skill and accomplishment: an era where more people were more skilled in producing media than at any other time in human history. Already, however, one can see these skills fading.
Art, as everyone complains constantly, has gotten worse: which is to say, increasingly we as a society find that we can no longer produce the kind and degree of entertainment that we once could. Millions of craftsmen and artists found that they could not produce a decent Star Wars movie. Thousands of craftsmen and artists and actors and writers tried to make a new Star Trek series, and produced bizarre nonsense. The television genre, as it once existed, has ceased to exist: virtually no one now, I suspect, could produce a budgeted, twenty-some-episode season of television in real time. AAA video games are increasingly rare, now that fewer people can appreciate the effort and skill of elaborately-polished games with their own scores and voice acting, and even fewer people see the point in investing in them, and even fewer people, I suspect, still know how to produce them. Even where people can and do still produce the kind and degree of entertainments that marked society decades ago, they are increasingly produced by older people, older hands, with smaller budgets, and for smaller and more niche audiences.
Increasingly, the hallmark of our society is boredom, dissatisfaction, and ennui: people complaining about the lack of entertainment that interests them. That effect, of course, comes less from any actual dearth of good art and entertainment than from the actual proliferation of bad art and AI slop. Good, masterful music is being made, but is increasingly difficult to find when everyone listens to the same AI-generated Spotify playlists increasingly full of music specially generated to attract selection for AI-generated playlists. Good, even masterful television is being made, but has to compete with ambient wallpaper shows about Mormons redesigning houses to be more monochrome, Youtube videos of Mr Beast building and sinking the Titanic in thirty seconds, and Emily in Paris. Good, masterful books are written, but fewer people have the attention span to read them when they cannot stop scrolling their phones every day and hour. Good visual art is produced, but no one sees it when Instagram is packed full of AI-bots following each other and posting and re-posting AI-generated kitsch.
Most fundamentally, as with any drug, the restless, addictive mindset produced in people by the constant pacification effect of smartphone use is in the long run simply destructive of any ability to be actually entertained, actually informed, actually satisfied, with anything, fake or real.
As AI--which is, in the final balance, merely a technology for automating the production of bad and ineffective art and making it worse in the process--takes over more and more of our society, I suspect all these trends will only accelerate.
But what will happen when our society no longer can tranquillize its people from action? What will happen when it can no longer suppress human desire and human energies with an infinite array of faux finalities? What will happen when generations of dissatisfied smartphone addicts the world over decide to take action to seize the real, scarce goods they desire, that they feel they were denied? What will happen when we see through it all?
That is perhaps the most important question of our times: but it is a question that even I cannot begin to answer.
The New Economy
"It's the economy, stupid," a man said once upon a time; and he had a point. If there is one consistent, overriding concern in American politics across my lifetime, it is The Economy; and particularly the idea that it is the job of those with power to make The Economy good, and their fault if The Economy is bad.
Back in the day, this responsibility was imagined mostly in liberal-managerial terms, as either a question of Professional Competence or a matter of opposing theories of How to Make the Economy Good.
Were tax cuts good for the economy? People mostly agreed that yes, they were; though some of the time people questioned if Tax Cuts for the Rich might perhaps be not so good for the economy as Tax Cuts for Working Americans. And then, of course, there was the question of how to balance important Values like Tax Cuts with other Values that we had, like Balancing the Budget, a consideration that was very important because it would ultimately Make the Economy Good too; though some people, confusingly, argued that Not Balancing the Budget would Make the Economy Good even more.
Even these considerations, of course, had to be balanced against other Values like making sure Less Privileged Americans worked hard and got jobs and contributed to the Economy, a goal best achieved through some perfect, managerial combination of Welfare and Government Programs and Education and Welfare Reform involving Work Requirements. Of course, all this, people argued, would ultimately Boost Productivity and Lower the Unemployment Rate and Increase Workforce Participation and thus Make the Economy Good as well. Really, there were very few good things one could do that did not ultimately benefit the Economy; and one might be forgiven for thinking that the definition of a good government action or policy was simply one that Made the Economy Good and raised GDP. Still, all this was very complicated, and best left to Experts and Elected Officials, the only real concern of Voters being to Inform Themselves and ensure that they in fact voted for the person with the best track record on the Economy.
These were fierce enough debates, in their own way; and we have not yet precisely left the era they embody. Donald Trump clearly won the 2024 election in large part based on the idea that he would for some reason be better for the economy than Kamala Harris; since, after all, when he had been in office the Economy was Good, and under Joe Biden the Economy had been Not So Good, and therefore he must have some kind of expertise or skill or perhaps merely magical charisma ensuring that the Economy would be Good again for any given four-year period when he was in office. As of this writing, Trump has just passed a massive Tax Cut, which his opponents denounce as merely a Tax Cut for the Rich; and numerous serious people are criticizing not only his tariffs, but also his immigration policies, for their dangerous potential to Make the Economy Bad.
As I have argued at greater length elsewhere, the Economy (also known as the Free Market, the Invisible Hand, and by other less savory terms) is one of the great, immanentized deities of modernity: and it will no doubt be with us for quite a while longer. Still, there can be no doubt that the past decades have seen a growing loss of faith in this deity; a loss of faith less pervasive and definite, but in its own way nearly as great as the one that marked the 1930s. In the long run, though, this loss of faith may well prove even greater: particularly if the shift from an overall positive and progressive to an overall negative and nostalgic perspective on the world remains and deepens.
Whether or not this faith remains, however, the entity in which people put their faith has itself changed in drastic and perhaps irreversible ways: ways that cannot help but have great effects in the long run. The connection between the financial sector and the actual productive economy has always been more distant than usually acknowledged; and it has been a very long time indeed since most of what went on on Wall Street was people giving money to companies that made things to sell to consumers in the hopes that they would make a profit.
Still, over the past decades, this gap has grown increasingly large: and in the aftermath of the Great Recession, it was deliberately made bigger in an effort to bolster financial markets, insulate them from the shocks to the productive economy, and convey a sense of public economic recovery despite the absence of such recovery in reality. A trusted friend who works in and reports on finance informs me that a smaller proportion of money in the financial markets is bound up in economic production of any sort than ever before, with increasing amounts of money bound up in ever more abstract financial instruments ever less connected to anything outside of financial markets themselves. He highly suspects that the actual productive capacity of the American economy is less than in the past, and has been declining for some time; a reality that can at best be gestured at, but which can be seen historically in the visibly lessening power of the American economy or government to build anything at scale.
The 20th century was the era of collective effort and central planning; characteristics as visible in the Interstate Highway system as in the World Wars, made incarnate as much in strip malls as in giant dam projects. It does not take much effort to notice that the actual quality of commercial building, both for corporate uses and for housing, has drastically declined over the past few decades: a trend associated as well with the proliferation of national or global shared corporate cookie-cutter building plans and an increasing dearth of working architects. The breakdown of infrastructure, at least in the US, has already become proverbial; and Biden's purported Build Back Better infrastructure financing recently provided a universally popular facade for corruption, graft, and electric vehicle incentives.
On a cultural level, however, this trend has been accompanied by the massive, runaway success of various embodiments of what may be called the grift economy: economic activity that by its very nature produces nothing real. Most striking, perhaps, both in itself and by its correspondence with the increasingly abstract nature of financial markets, has been the astonishing growth of legalized gambling industries over the past decade. American sports are now economically a subsidiary of sports-betting organizations; and this is only the visible portion of a much larger iceberg of online betting on nearly everything imaginable. Meanwhile, the new frontier of what was once called e-commerce has, after the Amazon boom, long since become a stable sector of influencers--which is to say, advertisers--which is to say, con-men and pornographers. The most marked financial trend of the past decade or so has been the rise of Cryptocurrency and NFTs; which is to say, financial pyramid schemes and mini-markets that by their very nature are detached from actual production and add little or nothing of any value to the world. These are only the most visible manifestations of a much more pervasive shift in American and Western culture as such; where more and more people are focused, both personally and as measured by self-help and financial media, not on producing anything or adding anything to the world, but merely on getting paid by tricking or exploiting others.
In any case, I am less interested, once again, in present realities than in overall trends: and as current trends have emerged over the past decades, they have led to a recognizable picture of an allegedly booming economy, as measured by financial markets and public numbers, presiding over a populace of unemployed and chronically underemployed workers living in cheap housing amid failing institutions and crumbling infrastructure. The more this trend continues, the less reason anyone will have to pay attention to the financial markets, let alone to take their success as indicative of the health of The Economy.
One of the most difficult things to predict is what will follow a loss of faith: which nearly always means a putting of faith into something else. Faith in the economy of the 19th and early 20th centuries failed in the 1930s, and was in the short term largely redirected toward faith in central planners, in governments, in leaders, and, at times, in dictators. Perhaps the most important question of the emerging century is where 21st century faith will go.
One answer, of course, already visible in the already emerging magical-grifting economy, is that faith in the abstract economy will decline, but not faith in money and its power to shape the world. From being a beneficent god, however, it will appear more like a genie, or even a demon; one to be courted with magical rites, or even propitiated by blood sacrifices. Getting rich will appear, as it often has, less like the inevitable result of financial policy or the inevitable reward of a life well-lived, and more like a desperate act of theurgy, a desperate deception, a desperate theft, or even a desperate murder.
Viewed in this light, one can already see the beginnings of a new type of financial and economic figure, one defined as much by their media position and personal charisma as by wealth as such. Elon Musk is certainly one such figure; a businessman who has made his money less from creating companies that make things that make profits, and more from charming venture capitalists and financial markets and government incentives with a public persona and a brand and at times even an ideology. In an earlier generation, Donald Trump himself was a founding example of this genre; though one preceded and shadowed by the more general growth of celebrity culture.
In all these examples, however, what is most notable is the tethering of financial power to a kind of personal charisma or magical power: the belief that The Economy is a god who plays favorites, who is capricious, who may be propitiated and who may even be controlled, but can never be trusted.
So far, then, the loss of faith in The Economy as an immanentized divinity has been partial at best, and has tracked very closely with both the rise of the grift economy and the rise of charismatic financial figures. If current trends continue, I suspect we will see more of the same; though it is also likely that once a certain threshold of faith is crossed, we will begin to see entirely different, and quite possibly much better, things. Forms of economic power that do not depend on The Economy as we know it, that are even opposed to it, may be on the horizon as they were in the 20th century. And that is harder to predict with confidence.
The New Monarchy
I have been saying for almost fifteen years now that modern politics and society and culture, and especially mass media, are by their very nature more suited to monarchical than to democratic or parliamentary governance; and I have been only confirmed in this belief by each further political development.
At the present moment, I would say, more of the world's population lives under a substantively monarchical government than at any point since the 19th century. Of course, these monarchical governments rarely, if ever, call themselves monarchies; but that is hardly a significant break. Hitler, Mussolini, and Stalin, three of the greatest monarchs in human history, all referred to themselves by more or less equivocal titles; and, to this day, entirely powerless figureheads the world over refer to themselves by the ancient, revered titles of monarchy.
Without definitions, there is no point in talking about anything, politics especially. By monarchy here I do not refer to the property of wearing pretty clothes, or a shiny crown, or being called "king," or any such thing; rather, I refer, according to the Greek etymology, to the actual "rule," "leadership," or "precedence in causation" of one, and only one, person.
In the real world, of course, monarchy has always been at best an abstraction; since no one person can constitute, or ever has constituted, the one and only source of precedence, rule, or leadership in any human community. In measuring monarchical tendencies, then, it is essential to look at the actual, practical role(s) played by people and institutions in the actual functioning of governments. Considered in this light, we may simply notice that there have been many governments in human history where one person could and did make important decisions and initiate important government actions as a single and sole person; and also many governments where one person could never do such a thing. The former tendency we may call monarchical, though there are other words for it.
When the President of the United States signs a piece of legislation into law, he is performing a function that he alone can perform, and a function that he must perform alone; and hence he is acting monarchically in a pure and perfect sense. On the other hand, when a Late Antique Christian elected a bishop in assembly by acclamation, he was performing an action that he could not in any sense perform alone, since it was neither an individual voter, nor even a majority of individual voters, who had the power of electing, but rather the assembly taken as a single unit and expressing itself solely through collective actions such as shouting, chanting, etc. Of course, taken in this sense, many governments have features of monarchy without in fact being monarchies overall--even in ancient Athens, there were actions that could only be performed by individual magistrates, and not by the assembly as a whole.
Perhaps the best example of a political figure who is certainly not in any sense and to any degree a monarch is the present king of England; a person who, as Chesterton pointed out long ago, has been systematically and uniquely singled out and excluded from ever initiating any political act or expressing any political opinion or making any political decision whatsoever. On the other hand, General George Washington, as the leader of an army, certainly made many consequential decisions, including crossing the Delaware River and stepping down from office, as a single and sole person.
The question of who in fact originates, or even performs, a governmental action may not be quite as simple as the above example. All governments feature quasi-fictions by which an action in fact performed by one person or group of people is treated as in some sense performed by someone else: sometimes in the sense that the action is authorized by the former, sometimes in the sense that it is performed by their more or less permanent delegate, and sometimes via actual legal or public pretenses and deceptions. Constitutionally speaking, the President of the United States simply is the executive branch; and all the executive functions performed by the millions of people working for the federal government are treated legally as carried out by him via people he has delegated.
Still, even here one can distinguish monarchical tendencies based on the degree to which the monarch is the actual origin ("arche") of the decision, even if carried out by someone else, its actual first cause, even if directly effected by someone else. In practice, we can judge this by the degree to which a single person has the practical power to initiate governmental actions and determine their content and shape and execution.
I apologize if this comes off as dry; but without some theoretical and definitional grounding, there is little point in discussing the question at all. It is in this sense, of the practical power of a single individual to make and influence governmental actions, in particular, that I mean that current trends are friendly to monarchy, and that governments in our world have been growing more monarchical, and are likely to grow even more so.
Part of the reason for the growth in monarchy, I would argue, is in fact the trend that, considered from another perspective, might be seen as making it less monarchical. As I have argued elsewhere on this blog, the hallmark of modern governments, particularly in the 20th century, has been the growth of complex, technical action and hence a growing communication gap between what governments actually do and their ability to convey what they do to their subjects: a trend that I summed up under the (deliberately inflammatory) title "conspiracy."
Almost as important, however, as the gap between what governments do and what they can communicate to their subjects has been the gap between what governments do and what official governmental office-holders do. Insofar as governments have been engaged in complex, technical tasks like regulating the lead content of plastic toys and building dams and managing trade policy and developing new weapons and planning for military campaigns, they have been engaged in tasks that no one person, still less any disparate or unorganized or untrained group of people, could possibly perform, and which, indeed, they could to a great extent not even understand in detail, let alone meaningfully regulate. This has been even more true, of course, in systems where leaders are popularly elected and taken from the populace or traditional elites, rather than from the technical professions.
In a real sense, this gap between the increasing rise of technocracy and the continuing prominence of elective and elite leadership constituted the crisis of governance in the 20th century. Joseph Stalin was certainly dictator of the Soviet Union; but even Joseph Stalin could not possibly keep up with the actual operations of the Communist Party organizations under his command, and hence was famous among Soviet leadership for his caginess, vagueness, centrism, and frequent self-contradiction in official meetings where such things were discussed and regulated. What was true in the oligarchical cadres of the Soviet Union was even more true in official democracies such as the US, where the complete defiance by the CIA and other intelligence agencies of all the official rules of their existence, and nearly all their oversight by elected officials, has recently been laid bare in startling terms by the release of the so-called "JFK files." These, though, are only exceptional cases in a more basic, systemic clash between the public and official on the one hand, and the technical and private and conspiratorial on the other, visible in the day-to-day operations of nearly every government over the past century or so.
Still, the most relevant thing to me now is the degree to which this increasing privacy of government has begun to augment the actual power and status of monarchs in increasingly extreme and ever more unpredictable ways. Put simply, technocracies and private conspiracies certainly have power--in certain senses, more power than ever before. What they entirely lack is authority; and it is this that is the fundamental and indispensable political concept, not power. Indeed, in the proper sense, power is not a political concept at all. A gun and a bottle of whiskey can both make someone do something they would otherwise not; but neither ever engages in governance.
As I have discussed multiple times and at length in this space, politics depends essentially upon communication: both on the communication of needs, desires, wishes, ends, and goods from subject to ruler, and on the communication of intentions, goals, and commands from ruler to subject. The CIA may have the power to kill people--but so, after all, do organized crime outfits and private corporations. Indeed, the history of the last century has seen a precipitous and continual decline in any real difference between the methods and organization of corporations, organized crime, and conspiratorial government agencies. None of these, though, regularly engages in the communicative efforts necessary to clearly convey intentions and commands to the public at large. And so none are governments, or can possess political legitimacy, which is in the final balance simply the ability to give actions moral force, to have them viewed as good or binding or obligatory by those who carry them out.
As government has gotten more and more technical and more and more conspiratorial, then, the public and communicative force of governance has quite naturally been stably attached to fewer and fewer people. The CIA is a small group of people by comparison to the public at large, but still has more than 20,000 employees; of those, however, only the CIA Director and Deputy Director are publicly known and could ever conceivably communicate anything to the public, and so wield authority in their own persons.
Of course, even where officials are publicly known, they frequently wield no actual authority with the public. The irony of governance in the modern era is that, while in one sense the publicity of government has vastly increased--with social media posts, laws, decrees, policies, comments, press releases, recommendations, complaints, and numerous and sundry forms of advertising all posted onto the Internet daily and hourly--the actual authority reliably wielded by government officials and acts has vastly, precipitously declined.
The truth is, of course, that the two trends are to a large extent functions of each other. Herodotus tells a famous story in which the Persian King increases his authority by gradually withdrawing from public life, appearing ever more rarely and in ever more attenuated form, covering his body, veiling his face, appearing behind curtains, and so forth: thus granting his presence, voice, actions, and words a special, divine charisma entirely beyond that of the ordinary mortal who first assumed the kingship. Rulership, after all, does not consist in the mere prevalence of communication, but in the force those communicative acts possess. Yet to possess such moral force, the ruler's official communicative acts must be sufficiently rare, sufficiently clear, and sufficiently distinct from ordinary communicative acts to be reliably recognized, heard, and obeyed.
This is, to say the least, no longer the case in the age of the Internet. In the Year of Our Lord 2025, headline news stories frequently consist of no more than glosses on the rambling personal statements, written in an idiosyncratic personal style and posted on a personal social media page, of the President of the United States.
This is, though, only the tip of a massive and ever-growing iceberg: one perhaps most visible during the pandemic, when public health authorities flooded the Internet and the airwaves and the walls and windows of the world with a shifting, indistinguishable mélange of ever-changing and frequently contradictory laws and decrees and recommendations and suggestions and threats and stories and fables and generalizations and facts and models and lies and simplifications and projections and predictions and guesses and advertisements. Though in the short term, buttressed by the panic generated by a genuine crisis, this led to an enormous and unchecked growth in the authority wielded by such officials, the utterly predictable result of this communicative strategy has been the near-total loss of authority by all public health authorities everywhere, a crisis of legitimacy that has increasingly spread to regulative bureaucracies of every type, academic and intellectual institutions, and organized science itself.
The pandemic, though, was a genuine global crisis, one whose effects are frequently noted by intellectuals and scholars and will continue to be. That this crisis of legitimacy is in the deepest sense merely one symptom of a much broader crisis, with its roots in technology and mass media, is much less often noted.
I lived through the era when public and political figures began taking to social media en masse, creating both private and public accounts and joining group chats and communicating as never before with ordinary people. This strategy at first seemed an unmitigated bonanza: allowing numerous people to build their own political movements and spread their political ideas directly to rulers, and allowing rulers to influence their people more directly than ever before. In this, Barack Obama and Joe Rogan are merely two conspicuous examples from a class of millions.
Very quickly, however, things soured: for the simple reason that very few public figures retained their charism and specialness while posting daily and hourly on social media. I, like many other people, was surprised to find that Harvard Professors, governors, Senators, public intellectuals, journalists, and even Presidents on social media all behaved more or less the same: which is to say, more or less like whining, cringing, bullying, gossiping, trend-chasing middle-schoolers. To anyone who got online during this period, the disillusionment was more or less instant and more or less complete, with decades of careful civic education and centuries of political rituals wiped out in a matter of minutes.
To see the Emperor only via statues and images on coins and an occasional glimpse of a strangely still, veiled figure covered in gemstones surrounded by purple dragons and armed soldiers in a public procession was, most of the time, for most people, to have little doubt about his charisma, his right to rule, or even his divinity; to hear the Emperor's every grumbling private thought is to see little difference between him and you, or even between him and your middle school bully. And this shift, an epochal one in the conceptualization of political authority as such, is now more or less complete. Rulers are merely people with social media accounts engaged, like all other people with social media accounts, in personal promotion, corporate messaging, pornography, and advertising.
What, then, is the difference between the President and me? Increasingly, it is merely that he has a better brand, with more skilled messaging and more followers, than I. Social media offers its own forms of charisma, to be sure, from rapid-cut editing to talking in funny ways to being really hot; but thus far, no mass media register of power has been developed that would or could clearly set apart rulers from ruled.
Hence, the last few decades have seen an increasingly bizarre arms race among rulers to develop media methods of projecting and demonstrating their power, setting themselves apart from their subjects, and communicating their ideologies and commands and wishes. In India, Narendra Modi has worked hard to build a personal and political and cultural brand through everything from slogans to masks to cheap entertainments to History-Channel-style conspiracies to live-streamed lynchings to AI deepfakes; in Russia, Putin has developed his own brand using everything from Tsarist nostalgia to American music to Internet pornography to mob-film bluster to Orthodox fantasia to Soviet kitsch. Donald Trump has experimented boldly and endlessly with different varieties of American patriotic kitsch, nouveau-riche wealth-display gaucherie, pornographic obscenity, filmic camp, reality-TV manufactured drama, grade-school-bully putdowns, meme-magic, crypto-currency grifting, and so on: and in the process has developed a recognizable brand and become an epochal figure in American politics. Still, in all this it does not seem to me that anyone has yet developed a single, stable method of asserting power capable of being reliably utilized by all officeholders.
Rather, what all these figures have done, through all their infinite symbolic efforts, has been less to develop a language of power applicable to any ruler than to turn themselves into brands. It is impossible to say precisely what Trump stands for, or even how he communicates his power to others--both are, by necessity and to an extent by design, constantly changing to keep abreast of the waves of culture. Trumpist politics have by this point run the gamut from race-baiting to racial utopianism, from anti-interventionism to regime change, from reindustrialization to refinancialization. The only thing that has stayed constant and increased in power through all this has been Trump himself: which is to say, an individual person: which is to say, a monarch.
Here, though, there can easily be a great deal of misunderstanding about what makes a figure like Trump a monarch. It is not primarily that he has broken or gone beyond law in his official acts--or at least, not drastically more than previous Presidents. Private conspiratorial technocratic governance is much more likely, by its very nature, to transgress public law, and has frequently if not normatively done so over the past century. Similarly, as partisanship has increased and the speed of global economic and political and media events has increased, the 18th century institution of Congress has proven itself both unable and unwilling to function in its constitutionally-mandated role of the maker and initiator of laws for the United States. Hence, it has long since transferred nearly all its practical powers to the President and the executive bureaucracy presided over by him--retaining merely the partisan power to ratify, fail to ratify, or attempt to block the President's initiation of legislation. Even this power has, in recent decades, grown weaker and weaker as Presidents have come to rely rather on decrees, known euphemistically as Executive Orders, to initiate all but the most far-reaching and complex legal reforms. Still, through this change the United States has retained an effective two-branch government, with courts taking up the pervasive role of reviewer and approver of all legal actions and even gaining the power to initiate legislation on their own account. In none of this has Trump made a significant difference other than to reinforce the existing system.
In nearly every instance, the difference in what Trump has done has been not so much substance as style: the crucial fact that what previous Presidents did privately and technocratically, Trump has done publicly and communicated to everyone. In immigration policy, Biden and Obama deported just as many or possibly more people, working the system to its maximum capacity, but doing it bureaucratically and secretively and technocratically and on the down-low: Trump, though, has rendered deportation a truly public act, and increasingly a public spectacle, instructing ICE agents to dress up and pose for the cameras, spreading entirely fake images of deportations, making examples of prominent students at universities, engaging in performative and public cruelty and boasting about it to everyone. Similarly, previous Presidents, especially Biden, used their bureaucratic power rather frequently to promote and enrich their favored causes and institutions and to undercut institutions and groups they disliked: Trump, though, has actively handed out money and power to people he likes, publicly boasted about taking down Harvard University, and publicly picked fights with the Governor of California.
How this actually affects Trump's power is to a degree complex. To a large degree, the publicity of Trump's monarchy has actually decreased his effective power, as courts have jumped in again and again to block his actions and institutions and authorities and ordinary people alike have rallied against him. I highly suspect that when all is said and done, most of the marquee actions of Trump's Presidency thus far, from cutting government spending to breaking Harvard to deporting a million people, will have proven failures, precisely because of the publicity he deliberately courted and his own lack of institutional and technocratic skill.
Still, this should not make us underrate the true importance of Donald Trump as an epochal figure for both America and the world. Everything that Trump has touched, including the most secretive and technocratic policies, has by magic become public knowledge; everything he says has become a decree, read and listened to and pored over by millions both in America and around the world. Inasmuch as law is what is actually communicated to and obeyed by the public, the average Trump Truth Social post possesses more legal authority than the average Congressional law. In this, he has helped create the shape of the new monarchy of the future.
As Chesterton pointed out, as people have known from the beginning of human civilization, the great advantage of monarchy has always been its communicative simplicity and force: the fact that there is in the final balance no symbol or image or emblem as immediately recognizable and powerful as a particular human person. A flag, in contrast, is merely a rag on a pole: and I, for one, can attest to my complete inability to remember or tell apart the different combinations of colored stripes that make up most of the flags of the world.
Both popular assemblies and representative governments are, or ought to be, embodiments or representations of the people as a whole; and indeed, there have been cultures, sociable and institutional ones as a rule, where people genuinely thrilled at the sights and sounds and smells of the demos, of the ordered gathering and deliberation and decision of mass, collective entities. Modernity, alas, has all but destroyed the sociability and tolerance that would allow that vision to have any force with nearly anyone. The isolated, fearful neurotics that make up modern American society cannot handle the sounds of crying babies in public, cannot handle family gatherings, are afraid of crowds, afraid of strangers, afraid of other people, and so naturally cannot bear the sight of a vast group of people deciding together. Modern political theory, meanwhile, has demonized collectivism and collective action and exalted the individual so thoroughly as to make democracy in any meaningful sense theoretically offensive to most people; while mass media has for the last fifty years presented again and again only stories of heroic, Nietzschean individuals defying the popular will.
Meanwhile, modern mass media has flooded people's lives with advertising and pornography; and nearly all public authorities have either embraced technocratic privacy or else simply joined themselves indistinguishably to that same, indelibly private flood of advertising and pornography. The inevitable result of all these trends to the degree they continue--and I mean this entirely literally--is to render republicanism and representative government and democracy in any sense impossible, and hence to render monarchy the only possible form of governance.
Or, put a different way: insofar as modern society is based on the pervasive individualization of people and the pervasive destruction of public communication, the final and sole thing that can be truly public, and therefore truly political, is an individual person. Hence, for such a society, monarchy is the only conceivable form of governance.
This is the theoretical statement. The practical statement would be that, over the past decades, one can simply note that the number of public decisions, about law and policy and war and peace and morality and social change alike, made or initiated or authorized by a single person has increased exponentially--and that this trend must continue so long as the broader trends it depends on do so.
I am not, personally, a monarchist--but nor am I an anti-monarchist in a strong sense. In basic perspective, I share the Catholic doctrine of Leo XIII, namely, that while government depends essentially on justice and the natural and divine law for its force, the choice of different governmental forms is an exercise of prudence that differs legitimately for different groups of people at different times and places. In this prudential decision, there are certain basic advantages to monarchy, as well as certain basic disadvantages.
While I personally favor more genuinely democratic institutions and forms of governance, since they are most effective at achieving and retaining the social goods of communal harmony and unity that I think ultimately most valuable, very little in the modern world writ large appears to me particularly suited for them. The pressures towards monarchy discussed above are not entirely negative--they are in part the result of genuine reactions towards essential political goods such as publicity, responsibility, and justice. Politics must be public and common to have any meaning; political actions must be considered human actions for which certain people publicly take responsibility and for which they are in fact held responsible. If the only practical way for political actions in the contemporary world to be public and responsible is for them to be authorized by a single person known to all, then so be it.
More importantly, however, monarchy is itself an equivocal term covering very different forms of government that differ enormously among themselves for good or ill. A public, responsible monarchy validated by democratic votes and checked by a powerful judiciary is very different from a violently partisan monarchy embodying powerful minority interests against the interests of other partisan minorities and/or the majority; both are, in real senses, the reality of American governance in 2025. The future of the 21st century is likely to depend less on whether or not we avoid monarchy tout court as on which form(s) of monarchy are ultimately chosen and reinforced in different governments around the world.
Or, of course, there may be in different degrees genuine revivals of democratic institutions and sentiments: trends that I, for one, would heartily welcome.
The New War
The long-term trend towards monarchy can be in different degrees positive and negative. The present crisis, however--a crisis that is not only imminent, but actually here, nearly fully formed, and by every indication set to only get worse over the next decade regardless of who is in power around the world--is the crisis of war: what it is, how it is carried out, and how it relates to public and governmental power.
The power to initiate and make war has always sat uneasily alongside the public responsibility of rulers for the common good of their people. While the necessity of defending that common good from external threats, of establishing and retaining basic social peace, and of enforcing laws has always been theoretically and practically central to the role of governments, it is far from the case that war, or even force more generally, always or even often serves those purposes.
Unless carefully controlled, men in arms kill and plunder their own people as much as foreigners, turning quickly into bandits or worse; and while war with foreign states can be an economic bonanza, resulting in plunder and slaves and establishing trade networks, it can also destroy trade, prevent regular productive labor, and invite foreign devastation and plunder and slave-taking in turn. Insofar as rulers have been soldiers, committed to warfare as a basic way of life, their track record has historically been rather poor, with little distinguishing bandits from barons in the 11th century or juntas from colonial plunderers in the 20th. Yet the potential of warfare, in terms of wealth, power, trade, glory, and even art and culture, is frequently so enormous that it has, as if by magic, drawn all the dreams of rulers even in peaceful societies towards itself.
Still, for every successful Empire built on warfare, there have been thousands of failed warlords and failed states, reminding us that, both theoretically and practically, governance aims at peace and the common good, the real substance of social and political life, and not at war or conflict, which are always at best means and more frequently obstacles to this end.
If there is one practical lesson that can and should be drawn from all this, it is that war is and can only ever be justified via a public, political, and responsible ordering to the common good. Whenever and wherever societies have accepted a privatization or de-politicization of warfare--whether via castle-building barons in the 11th century, or mercenary companies in the 16th, or corporate plundering in the 20th, or private terrorism in the 21st--the predictable result has been devastation. There are many reasons for this, but the most fundamental is that once war is untethered from common goods, it naturally attaches itself to private goods: which are, by their very nature, as numerous as the number of individual people, and frequently in competition with each other. If I fight for my country, that country's good unites me with all others in that country, and prevents conflict with at least these people; if I fight for my own wealth, however, then I am at war with all others. Hence, to accept the privatization of war is to proclaim a universal war.
This is another theoretical proem to a very immediate and practical problem: which is that as governance has gotten harder and more complex over the past decades, warfare has only gotten easier and easier. We stand now on the verge of what may be a new era of barbarism, one made possible by the increasingly widespread acceptance of the privatization of warfare.
The 20th century saw the World Wars, the greatest outlay of collective human effort in human history, encompassing the world and embracing every aspect of human life. If any events in human history could be said to be public and political, it is surely these: events that nearly everyone participated in, consented to, and sacrificed for. The paradox is that the very scale and universality of warfare in the 20th century also produced the conditions for warfare's privatization.
The aftermath of the World Wars saw the promise of a new era of peace, and a near-global commitment to "no more war," buttressed by the new reality of the H-bomb, a weapon so powerful that it could not be used without entirely destroying the peace, common good, wealth, and very existence of the state that had used it. The existence of this weapon, and the reality of Mutually Assured Destruction, necessarily led to a complete rethinking of the nature of warfare and its relationship with governance: a rethinking that could have led in a number of different directions.
The most rational direction this rethinking could have taken would have been the universal decision never to use these weapons, since their usage was contradictory to any rational ends of warfare and governance alike, and some kind of mutual pact to destroy all such weapons everywhere and never create their like again. This, of course, did not happen. Nor did another possible alternative happen: namely, an escalating attempt to find every possible use case for these weapons short of total global annihilation.
Rather, the world chose perhaps the strangest possible option: the removal of the use of nuclear weapons entirely from the realm of public, political responsibility and action and their establishment as the bedrock foundation of a new form of global sovereignty. The decision to develop, possess, and use nuclear weapons was never, in fact, allowed to exist within the realm of public political debate and decision-making; it originated in the realm of scientific and technical experts, then was gradually removed to the secretive structures of the military as a whole.
Considered soberly, the choice to drop a bomb from a plane in WW2 already involved little human input into the decision to take a particular human life as opposed to others. In the so-called "strategic bombing" campaigns of that war, the decision was made to bomb indiscriminately not only in an effort to terrorize civilian populations, but also because targeting methods of the time rarely allowed for the effective hitting of military targets. Such bombing campaigns, although quickly proven mostly ineffective for any strategic use, continued and escalated throughout the war in part due to bureaucratic and technocratic inertia, with the USAAF wanting to demonstrate its own importance and use the ordnance built for it and shipped at great expense to the frontlines.
This basic logic, at first admitted as an exception by Allied governments, became by the end of the war, and with the development of the Atomic Bomb, a new public rule of warfare. In particular, the dropping of the Atom Bomb on Japan, with its total lack of discrimination between military and civilians, guilty and innocent, and even between one group of people and another, was greeted with approbation and praise by nearly everyone in America. It remains, to this day, a kind of public impiety for anyone to suggest that this act of mass-murder was anything but moral and responsible.
A little-known aspect of these acts, however, is the fact that neither bombing was in fact directly decided upon or authorized by the President of the United States: instead, Truman merely gave his general consent, after the success of the Trinity Test, that new Atomic Bombs be dropped at the discretion of the military as they became available. This in itself was nothing unusual, as so-called "strategic bombing" campaigns had already killed hundreds of thousands of civilians in Japan without much in the way of public or political input, and it was at first assumed that Atom Bombs would be no different. It was in fact the public acclaim produced by the dropping of the first bomb, which Truman was desperate to claim for himself, combined with the surprise of the rapid second bombing of Nagasaki, which had been rushed into operation by local military officials despite technical problems and poor conditions, that produced the eventual policy of the President taking personal and public responsibility for the decision to use nuclear weapons in the future.
In a paradox, then, the very scope and power of nuclear weapons produced a kind of reaction away from the general trend of military affairs being handled logistically and technically, and a demand that at least the decision to kill hundreds of thousands with a single bomb, and (with the development of the H-Bomb) to possibly end the human race, be one for which someone somewhere had taken personal and public and political responsibility.
In America today, then, the decision to use nuclear weapons is invested theoretically in the President alone, but in practice is distributed throughout the military chain of command based on a complex, reactive decision chain designed to ensure that any nuclear strike would be met with a massive nuclear response and the likely destruction of the human race as a whole. Similar plans were made in the Soviet Union, and ultimately by the other powers that acquired nuclear weapons, and they remain in place to this day.
This decision had vast effects, which are still proliferating to the present day. A small number of heads of state were publicly invested with a new, indelibly monarchical power: the sole authority to at any moment, for any reason, kill millions and destroy the human race. This power and its accompanying authority was and is entirely new, and bears no relationship with any previous conception of either governance or warfare.
This power was and is invested publicly, was and is communicated to the people at large, and hence constitutes a genuine form of authority: however, this form of authority is by its very nature tyrannical, since it is impossible to justify in any rational terms, ethical or political.
At the same time, however, the same practical power was invested, privately and conspiratorially, in the government and military at large: not invested in any person as part of their public responsibilities, but rather invested in the system as a whole, in terms of operations that are entirely secret and kept entirely from the public. Many low-level personnel had and have the power to initiate nuclear war, to respond to a nuclear strike, and hence to kill millions and likely destroy the human race; what restrains them is merely the governmental system itself, its professionalization and discipline and organizational charts and contingency plans and complex, secretive operations. There is absolutely no public or political check on these operations; nor has any large-scale effort ever been made along these lines.
The universal acceptance, in Western societies, of this state of affairs, has in my judgment had impacts that it is impossible to overstate. One of these impacts has been to gradually but pervasively change the nature of even "conventional" warfare and its relationship with public political authority, making this relationship more and more correspond, not to any historical rational or ethical principles, but rather to the irrational principles that have underlain the creation, authorization, and use of nuclear weapons.
In the present day, in the United States of America, the public authority to authorize warfare and acts of war, ranging from ground campaigns to bombing missions to drone strikes, now rests solely with the person who also has the authority to use nuclear weapons: namely, the President. At the same time, the practical power to carry out most acts of warfare, ranging from bombing missions to special forces attacks to drone strikes to arming of militias to active direction of campaigns to sabotage to assassination to mass murder, has been pervasively devolved to ever more technocratic and private realms, where acts of warfare are carried out by ever smaller and more distributed and more secretive groups without any public or political oversight or consequences.
Once upon a time, warfare was treated as a public act which the public as a whole should choose and for which all people in a society must accept responsibility. This conception was embodied both in formal, limited declarations of war made by political authorities and in more spontaneous and popular demonstrations. As Chesterton said, few democratic movements in British history ever equaled the spontaneous mass volunteering for the military in the early months of WW1; and even with the proliferation of brutal mass-conscription systems, the public enthusiasm and pressure for prosecuting the World Wars cannot be doubted throughout much of Europe and (at least in regards to the Pacific War) the US.
All that has changed. The very idea of a formal declaration of war is now, in the year 2025, legally obsolete and practically absurd. Even when people suggest bringing back the practice, it is generally on the basis of archaizing commitments to ritual and tradition, not any particular public responsibility. What is more troubling, the amount of public interest in and responsibility for acts of war undertaken by governments has never, I think, been lower, both in the US and elsewhere in the world.
The United States of America was formally at peace in the year 1999, even as American and NATO forces engaged in a two-month bombardment of Serbian territory. The United States was formally at peace in 2011, even as a force of 11 ships, numerous warplanes and bombers, and undisclosed numbers of CIA agents bombed and bombarded Libya and ultimately brought down its government. The United States of America is formally at peace in the year 2025, while it provides weapons and ammunition and technical assistance and training and active campaign oversight for the Ukrainian army against Russia, and even as it provides weapons and ammunition and technical assistance and training and active campaign oversight for Israel in its wars against Iran and the people of Gaza, and even as it has in recent days actively bombed Iranian nuclear sites.
The point of these examples is not to debate the justice or prudence of any of these particular interventions, but merely to point out that all of them constitute warfare by any reasonable definition and yet none of them were in fact decided upon or proclaimed by regular public and political means. To the extent that any political or public decision was taken in these cases, and any political or public responsibility accepted, it was taken solely by the monarchical President of the United States; yet in all of these cases, precious little such responsibility was ever in fact taken, considering the consequences, and the wars themselves were largely treated, both by the government and the public at large, not as public or political matters at all, but rather as the private, technical concerns of the government and military. In the general public, few people chose to get involved, to enlist or object or protest or obstruct, few people cared, and life went on much as before. The limited proportion of people who voted in elections voted not based on such distant "foreign policy" considerations, but based on domestic culture-war concerns or The Economy.
In retrospect, what sets the Presidency of George W. Bush, and his Wars in Iraq and Afghanistan, apart from earlier and later such interventions, is merely the fact that George W. Bush actively and publicly took responsibility for his decision to initiate these wars, and actively solicited public responsibility and investment in the wars by the rest of the government and the public at large. This move, though it benefited him in certain ways in the short term, was correctly seen as a huge political mistake, and has not been repeated.
The Syrian Civil War, in which the United States was an active participant from beginning to end, arming numerous groups with American weapons and imposing sanctions and engaging in targeted military actions, has resulted in one of the largest refugee and migration crises in history, at nearly 14 million, vastly more than the impact of the Iraq War or similar confrontations; yet no one in particular in America has ever taken responsibility for it, or ever will.
In practice, American and European public life has decided to arbitrarily draw a distinction between, on the one hand, operations that involve "boots on the ground" (which always means regular soldiers, and not, say, CIA agents or special forces or funded terrorist groups or paid mercenaries) and, on the other hand, all other acts of warfare, with the former constituting wars that presumptively someone somewhere must take public responsibility for at some point, and the latter merely constituting security actions that no one has to justify to anyone ever. This distinction, though, is not principled, but entirely arbitrary, and is unlikely to last. As of this writing, Russia has been engaged in a massive war with Ukraine, one involving mass conscription and ground campaigns on a scale not seen since WW2, but has nonetheless generally chosen to label the conflict not a war but merely a "special military operation." Meanwhile, Russian mercenary groups continue to fight throughout Africa without any particular declaration of war being considered necessary by anyone; and, as Pope Francis pointed out repeatedly throughout his pontificate, arms continue to flood every region of the world almost unchecked, allowing almost any group of people to sign up with some multinational provider or another and become a militia fighting in an undeclared war.
Again, though, the point here is not primarily to morally evaluate these actions. The entire point, rather, is to point out that the trend of the past decades has been for warfare to be entirely removed from any realm where its acts could be evaluated ethically.
Governance is a matter of people who wield public authority, and hence take on moral responsibility for their political acts and accompanying consequences. If acts of warfare are merely things that happen, carried out by systems rather than people according to abstract considerations, without anyone in particular publicly authorizing them, proclaiming their purposes, or noting their results, then they are de facto removed from the domain of the ethical as such.
What is withdrawn from the ethical, however, is necessarily also withdrawn from the realm of conscious, rational human action. Hence, in the present day, even the public justifications for acts of warfare have grown ever thinner, from the complicated humanitarian and ideological and intelligence cases made by Clinton and Bush to ever thinner appeals to emotion and passion and narrative and, increasingly, mere caprice. When Donald Trump discussed bombing Iran in recent days, he provided little in the way of reasoning, pro or con, and ended by noting, mysteriously, that no one knew what he was going to do. Acts of war were ultimately decided upon not based on law, or policy, or even ideology, but merely on the personal whims of a grandstanding reality-TV star.
This would be of concern at any time--but is particularly perilous in the present day, as new technology seems poised to fundamentally change not only the goals but even the means of warfare.
Put simply, autonomous weaponry is here. Drones, of course, have been used by militaries for some time now, operated remotely to kill and destroy at a great distance by people at an even greater distance. As such, they have increased the ease of committing acts of warfare to levels greater than ever seen before, and allowed for the greatest split yet between the realm of the public and the political, on the one hand, and the realm of technocratic warfare, on the other. The American military has long since decided that blowing up someone with a drone in a country with which America is not formally at war is not something that requires even a public press release, let alone a declaration of war. As such, the "drone operator" is not a political or military figure, but a kind of technical apparatus for a broader technocratic system engaged as a matter of course in decisions of life or death.
Over time, even that person, and even that technocratic mechanism, have become less essential for actual acts of warfare, as fewer expensive drones have given way to larger and larger numbers of cheap drones travelling only short distances. The war between Russia and Ukraine has been the testbed for this tactic, showing that cheap ordnance attached to cheap drones can be very effective at destroying expensive, human-controlled machinery. In May of this year, the Ukrainian military innovated the tactic of launching numerous cheap, small drones at short range, smuggled into Russian territory, and using them to destroy expensive warplanes with little possibility of interception. Meanwhile, other conflicts--particularly the genocidal campaigns of the Azerbaijani government against Armenia--have already involved increasing use of so-called "loitering munitions," drones left in an area to autonomously attack targets at their own choice.
All this pales in comparison to the technologies already developed by the American and Chinese governments, and already being manufactured in large numbers for a possible confrontation over Taiwan. This new model of warfare, which required incredible technical sophistication to develop, is that of drone swarms: huge masses of cheap autonomous vehicles that operate together as a distributed collective, attacking targets and denying areas via huge numbers of reactive, AI-coordinated actions. While the military use of such ordnance should be obvious--allowing air and infantry defenses to be simply overwhelmed, countering jamming and other technical countermeasures, reducing human casualties, and so forth--the ethical dangers should be equally visible.
Concerns of course exist about the use of such technologies on both sides--concerns that have led and are leading at least the US Military to develop complex protocols for deploying such ordnance in ways that continue to normatively involve supervision and direction by humans. Still, even in a best-case scenario, such ordnance reduces the human input into decisions, including decisions to kill and destroy, significantly--with one operator directing a swarm rather than a single drone--and in a worst case scenario could conceivably lead to a complete human withdrawal from the tactical realm.
In any case, I am, again, less concerned with the immediate potentials than with the overall trend: which is to remove military decisions to destroy and take life more and more from any public or political realm of ethical responsibility and more and more into a purely abstract, technical realm that is de facto private and so necessarily irresponsible and irrational. Within that realm, the question of whether a decision to kill is made by an abstracted technical operator or an autonomous weapons system becomes less and less significant. What is significant is that the act and its consequences need never be decided upon by anyone according to any conscious, rational consideration, that no person at any level need take any ethical responsibility either for the act or the consequences.
The real danger of the growing proliferation of autonomous weaponry, then, is not that this technology in itself will sever the relationship between human responsibility and warfare: it is that it will help produce political and public and social conditions whereby this severing will be accepted and put into practice more than ever before. The sheer destructiveness of the Atomic Bombings of Hiroshima and Nagasaki, despite decades of government cover-up to disguise its extent, has always been so obvious as to be impossible to ignore; and that is even more true of the projected destruction of the human race by nuclear missiles in the decades following. These horrors produced a genuine reaction towards at least the bare fiction of some kind of public responsibility.
A drone, though, is a small thing; and what is more, visibly and publicly a technological thing, a manifestation of technical power. One can simply focus on the technological skill and sophistication, the complexity of operations: one need not, most of the time, think of what the drone is doing, or who it is doing it to. Even swarms of drones great enough to darken the sky and kill millions, one suspects, would not produce the kind of public reaction that greeted nuclear weapons. The very existence of AI-driven precision in targeting and operations--so unlike the clumsiness of WW2-era bombs--can easily act as a disguise for acts of moral abnegation vastly greater than any in history.
For what matters most, in the long run, is not what is done but the basic relationship between human acts and human reason and moral responsibility. And the real danger of the coming century will be the possibility that we will be able to easily make war, on whatever scale we choose for as long as we wish, and never even consider the question of whether it is right or wrong or what its goals are or who is responsible. Considered soberly, autonomous warfare technology aims at and enables and embodies this goal better than any means yet conceived of.
The New Tyranny
The future is uncertain. A reader of this essay will have noted that lying behind all the above trends and concerns rests an essential conflict between different visions of the political as such: and the history of the 21st century, I believe, will depend more than anything else on which of these visions prevails.
On the one hand, there is a basic human vision of the political as a matter of rulers publicly taking considered, rational action for the common good and accepting public responsibility for their acts; on the other hand is a vision of the accelerating privatization and individualization and technologization of power and of the complete abnegation of responsibility by those who wield it. Genuine, responsible politics can take many forms, from the more progressive and managerial to the more reactionary and heroic, from the more democratic to the more monarchical. What is absolutely essential, though, what constitutes the political as such, is the ethical responsibility of those with power to the public and the common good.
I cannot help seeing our age as to a large extent one of tyranny, not merely accidentally, but pervasively and essentially. This is not due to the presence or absence of one particular political form or another, but to a much more pervasive ethos of power and responsibility, one found in ordinary people as much as elites, in corporate managers as much as government bureaucrats, in CEOs and parents and teachers and cops as much as in presidents and kings.
By the basic moral sanity of human reason, power and responsibility are everywhere correlatives: one is morally responsible for what one has the power to do, and for no more and no less. According to the prevailing ethos of our society, however, power is constituted precisely by the ability to avoid any personal or ethical or practical responsibility for one's acts, the potential to foist all responsibility and all consequences onto those without power. To be without power is to be subject to responsibility and blame and consequences; until one gains power, and can make others suffer as one has suffered. Parents do not take responsibility for their children; rather, they force their children to accept responsibility for their own and their parents' actions and the consequences thereof. Rulers do not take responsibility for their laws and policies; rather, they demonize those who suffer because of them, immigrants and the poor, and make them suffer the consequences. Soldiers bomb civilians, and blame them for it; tech mavens force smartphones on the poor, and then decry smartphone addiction and send their children to tech-free schools; billionaires spend millions on nothing, and then discuss the lack of thrift and industry among employees; poor people get lucky, and become rich, and then blame the poor for not doing as they did; CEOs blame managers, and managers in turn blame employees.
Daily and hourly, one can go on social media or read the news and watch people with power refusing to accept responsibility for any decision they have made; daily and hourly, one can go out into America, and watch rulers blaming the ruled, the rich blaming the poor, parents blaming their children, teachers blaming their students, cops blaming victims of police brutality, murderers blaming victims of mass murder, the housed blaming the homeless, the powerful blaming the powerless. Certainly, no one embodies this ethos so well as Donald Trump, who has publicly refused to take responsibility for any decision he has ever made, for any power he has wielded, for any negative consequences for anything he has ever done or said.
There is a name for this ethos, and it is a very old one: tyranny.
There is also a more modern name for the trend: security.
Security in the etymological sense is freedom from cura, from care, concern, anxiety, or responsibility, and is not a term that has any prominent place in any pre-modern political lexicon. This is because it should be obvious that, while the powerful can to a limited degree provide security to those without, those with power can and should never be free from cura, should never be secure and without concern for those they have power over. And yet, in our present age, security has become perhaps the most foundational and pervasive concern of politics, and is presented daily and hourly and minute to minute by nearly everyone as the goal of their political efforts and systems. And what is most striking, to me, about our usage of the term is that it is applied always and exclusively to those with power, and directed against those without.
When discussing security, one never speaks of ways to provide security to the poor and weak and oppressed by freeing them from their pressing cares, but always and only of preserving the security of the powerful by freeing them from all responsibility, care, or concern for the powerless. One does not speak of providing security for homeless people by freeing them from their perpetual, grinding stress and anxiety to find food and shelter for the day, but only ever of preserving the security of business owners and middle-class trippers and rich businessmen and politicians by freeing them from any responsibility to provide housing or food or money, or, increasingly, of freeing them from the slight moral concern and anxiety produced by even seeing or hearing about the homeless. One does not speak of providing security for displaced and dispossessed refugees, for the stateless and the exiled, but only of providing security for the securely enfranchised, for natives and citizens and rulers, by delivering them from suffering any consequences or inconveniences from the existence of the former categories of people. One does not speak of providing security for the brutalized and oppressed and murdered, but only for their brutalizers and oppressors and murderers. In its modern usage as much as the original Latin, securitas might be legitimately translated as "irresponsibility."
Though it may seem strange to say in our post-Pandemic world, cura (care, concern, and responsibility) is not in itself a negative thing or a negative feeling. One cannot love anyone or anything without experiencing a sense of concern and responsibility for them: one cannot take any action without taking some care in doing it, and accepting some responsibility for its consequences. Indeed, one cannot exist at all as a human being, cannot act, cannot think, cannot love, cannot fear, cannot see or hear or think of any human person, cannot relate to any reality positive or negative, without experiencing some care or concern or responsibility.
Hence, the true telos of our current obsession with security is always and everywhere some combination of retreat and irresponsibility and physical or spiritual murder. This, alas, is the psychological and personal lesson far too many people learned from the Pandemic: that a world where one could be infected and die is too frightening to live in, that other people who might infect one are too frightening to deal with, that taking any action that might infect others or invite infection is too frightening to do and take responsibility for. Even where one cannot help but act, help but love others, help but live, one must do it surreptitiously, secretly, unconsciously, privately, and never take responsibility for it or its consequences.
As always with modernity, one finds, at the heart of it all, the same spiritual sin of despair. All one can really say to the contrary is that human life is really not so unlivable as that: that it is perfectly possible to take action, to love others, to live a life, even to have a politics where people take public action for the common good, to take responsibility for one's own acts and powers, and experience care and concern, and be happy: indeed, that this is in the final balance the only way to be fully and truly human, in private or in public, and so the only way to be happy. We were not made to be prisoners of inevitable historical trends positive or negative, or to be entertained and anesthetized for our own powerlessness. We were not made to serve lawless kings, or to do murder and wage war in private. We were not made to be secure.
Easter in an Age of Tyranny
I do not want to end on so grim a note as the last section. I do not, really, think that our political trends all lead in the direction of tyranny. If there are trends pointing in the direction of private and lawless security, there are also trends pushing in the opposite direction, towards greater publicity for public acts and a revival of public accounting for those with power. Even Donald Trump, as much as he has personally embodied the tyrannical mindset, has at the same time been a figure of publicity, whose acts have been publicly proclaimed and publicly known, and who has thus been held to account for them much more than more bureaucratic and privatized politicians. And around the world, outside of the US, outside even of the West, the Catholic Church on earth endures, and at the present is engaged in raising many peoples to political and social responsibility and maturity.
I began this essay with the reminder of Easter, and all that it represents: I will end it the same way. If there is a lesson to be learned from Easter, it is certainly not that we ought to put our faith in any particular historical trend, positive or negative: rather, it is that there are forces beyond and outside of history that offer hope to us all. Whatever the historical and political trends and events of the 21st century, we are not bound to them: we will rise from the dead, and go elsewhere.
If there is a second lesson, though, it is that when things happen, even in history, they happen quite often all at once, and involve a complete reversal of all that has gone before. When a man is tortured to death by the sovereign power, particularly through the very lengthy and torturous method of crucifixion, it is a long process, a long series of events and acts and decisions, that from its very beginning points to and can end in only one way: the permanent death of the man crucified. Resurrection, though, when it occurs, takes no notice of the direction of history, positive or negative, pays no attention to the political and social trends embodied in public execution, and shows no concern for the sunk costs of the act of crucifixion: it merely undoes what has been done, because what was done was unjust, and begins again from the beginning. Repentance works very much the same way.
As I said at the beginning of this essay, I will be very happy if my predictions all fail. My hope is not that I will be proven right by history, but that we will all have the consideration and the wisdom to recognize the trends, in our societies and our own lives alike, to judge them according to eternal standards, to defy what is evil, and to do what is good.
Prove me wrong! Consider this essay a request, an exhortation, nay, a provocation to do so.