Robin Berjon

Back to having a future

Retrofuturism

Abstract art looking a bit like a maze

As I’ve mentioned before, I’m an avid reader of L.M. Sacasas’s Convivial Society which I feel offers some of the best thinking on technology while always remaining clear and accessible, with no sensationalism and an abiding humanity. So when Mike wrote in the latest instalment (III.9) that he was offering seven theses “less cautiously formulated” with the thought that they might elicit some responses, I found jumping in hard to resist.

The way I approached this was by reproducing each original thesis (they feel right even though I have different reasons for seeing them as such) and then heading off on my own personal tangent. Obviously, I strongly encourage you to read the original We Are Not Living in a Simulation, We Are Living In the Past first!

1. On the internet, we are always living in the past.

The future cannot be controlled, but it can be shaped. Our predicament is not a question of living in a simulation versus living in the past, but rather of living in a reality designed through past simulations of the future. The mechanics of this permanent state of retrofuturism are simple: if you have access to detailed data about the behaviour of people, editorial control over what information people receive (as social or search recommendation engines do), and the means to nudge people using designed affordances, then you have the power to shape people’s behaviour. The process is then to use data from past behaviour to predict future outcomes given editorial and design interventions.

Rather than being freewheeling random walks over rich landscapes of potential, our lives inside a digital world designed this way are statistically-controlled decision trees built by those who used their knowledge of our past to simulate a forest of our futures and made the ones they preferred more likely. Living on the Internet is a process of ceaseless foreclosure of our futures, it is the continuous pruning of possibles by forces unnoticed.
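The “pruning of possibles” has a simple mechanical core. As a purely illustrative sketch (every name and rate below is invented, not a real system), a ranking function fed on past engagement will only ever surface the futures that the past rewarded:

```python
# Toy model: each "future" is a piece of content the platform could surface.
# The scores are invented stand-ins for engagement predicted from past logs.
past_behaviour = {
    "outrage_post": 0.31,   # historically high engagement
    "friend_update": 0.12,
    "nuanced_essay": 0.04,
    "local_event": 0.02,
}

def prune_futures(candidates, keep=2):
    """Surface only the futures the past predicts will perform best."""
    ranked = sorted(candidates, key=candidates.get, reverse=True)
    return ranked[:keep]

feed = prune_futures(past_behaviour)
# The forest of possible futures collapses to whatever the past rewarded:
# everything below the cut-off simply never happens.
```

Nothing in this loop understands the content; the cut merely replays yesterday’s statistics as tomorrow’s reality.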

2. On the internet, all actions are inscriptions.

There is a time dimension to the digital remanence of action but also a spatial dimension, or at least the Internet’s equivalent of space which is context. If remanence were only in time then you could relatively easily move on to the next two-server town and start afresh. But because every last one of our utterances is not only archived and searchable but also has a direct, public URL there is no other town. Instead of building rooted, meaningful communities what we have is a worldwide eternal mosh pit. You can log out any time you like — but you can never leave.

The past need not be totalising. We can break it into parts. As I said in this blog’s inaugural post, “Not that I intend to make this new blog any less embarrassing, just that I plan it to be so in entirely new ways.” Identity is contextual and, if we are to live, breathe, and grow, it has to remain contextual. The Internet of the “authentic self” — a loathsome, aberrant idea if there ever was one — is an exercise in slowly getting strangled by your past selves.

3. On the internet, there is no present, only variously organized fragments of the past.

“Both algorithms and traditional bureaucracy can be thought of as machines for making categories and applying them.” (Henry Farrell & Marion Fourcade, High-Tech Modernism) Our digital environment has become a massive bureaucracy, classifying people and directing their experiences accordingly. Bureaucracies are not strong on meaningful narrative — just ask Kafka — because they are meant to be rational agents, and under this ideological commitment rational decisions don’t have a history, they just are. This creates the sense of a world without trajectory, one that is just variously organised fragments, and therefore arbitrary and without a future.

The entrenchment that computers can bring to bureaucratic processes isn’t a new idea; Weizenbaum noted it in 1976: “I think the computer has from the beginning been a fundamentally conservative force. It has made possible the saving of institutions pretty much as they were, which otherwise might have had to be changed. Superficially, it looks as if [things have] been revolutionised by the computer. But only very superficially.” Bad systems that are automated enough persist because of the automation rather than because they work best, leaving frozen ideas scattered across the landscape.

Another feature of bureaucracies is the optimisation of arbitrary metrics (that were probably not arbitrary at the time they were picked). This recent Reddit post about the state of ML research is a great example: a 0.03% improvement achieved by throwing massive compute at a problem isn’t an achievement, it isn’t invention, it’s just what happens in the mindless pursuit of any given metric. Again, this leads to static incentives in a dynamic world, and a sense of mindless repetition.
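That dynamic is easy to parody in code. The sketch below (an invented stand-in function with invented numbers, not a real benchmark) “improves” a score purely by evaluating more random seeds of a noisy plateau and reporting the best one, which is essentially the compute-for-decimals trade described above:

```python
import random

def benchmark_score(seed):
    """Stand-in for an expensive training run: the score varies slightly with the seed."""
    random.seed(seed)
    return 92.0 + random.uniform(-0.05, 0.05)  # noise around a plateau

# "Throwing compute at it": evaluate many seeds, report only the best.
baseline = benchmark_score(0)
best = max(benchmark_score(s) for s in range(1000))  # includes seed 0
improvement = best - baseline  # a sliver of a point, from noise alone
```

No understanding is produced; the metric moves because the dice were rolled more times, not because anything was invented.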

4. On the internet, fighting about what has happened is far easier than imagining what could happen.

Hell isn’t so much other people as it is seeing the whole of everyone across time and context; it is living, and having to live, with one’s forever authentic self. I can only construct my self if I can play, explore, experiment. But if my experiments are forever, I risk looking bad in front of those whom I will want as friends, so the safest bet is to align further with people with whom I already find myself aligned on the topics that matter to me. The result is sorting: the increasing alignment of everyone into homogenous groups along all dimensions.

And since you can also see the forever authentic selves of everyone else, it won’t be long before you notice that there are pretty much only two sides you can pick on any issue, and that the same people are on the same sides of each — a destruction of intellectual intersectionality. That’s life in the big mosh pit. And once we are sorted, there is nowhere for us to grow and change.

5. On the internet, action doesn’t build the future, it only feeds the digital archives of the past.

One of the philosophical tools that I find the most useful in understanding digital technology is that of domestication. The idea evokes the harvesting of humans in sci-fi pods, but it is more subtle. Domestication (be it of plants, of animals, or as per James C. Scott’s Against the Grain of ourselves by ourselves) is a process of iteratively offering convenience in exchange for pliability. It isn’t long before what was convenience becomes necessity.

Domestication offers an angle of its own. Farms are total institutions. A marketing funnel is a game-driving system, eventually a corral, and much of the systematic practice of seeking increased lift through A/B testing was pioneered in seeking to increase wood yields in scientific forestry. As in traditional agrarian practice, the statistical optimisation of products isn’t a “thick” theory: it rarely seeks to produce an actual understanding of why something works. Its goal is to detect local optima and establish how to reproduce them. These local optima then become traditions, rituals of sorts, frozen structures. Many of the most important systems that corral behaviour on the Internet today are there mostly because they worked well for some ancestor of the current system, even if they no longer work so well for us.
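The “thin” theory point is worth making concrete. Here is a minimal, hypothetical sketch of the A/B-testing loop (the function and numbers are my own illustration): compute relative lift, ship the winner, and learn nothing about why it won:

```python
def lift(control_conversions, control_n, variant_conversions, variant_n):
    """Relative lift of the variant over the control: no theory of why, only that."""
    control_rate = control_conversions / control_n
    variant_rate = variant_conversions / variant_n
    return (variant_rate - control_rate) / control_rate

# Invented numbers: variant B converts slightly better than control A.
result = lift(200, 10_000, 216, 10_000)  # +8% relative lift
# The winner ships and hardens into tradition; the "why" is never asked.
```

The local optimum this detects then outlives whatever made it an optimum in the first place.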

If you know to look, you can feel the difference between software crafted with care for its users and systems of vacuous tradition that just happen to be good at producing the vapid fodder of convenience.

Digital technology differs in its power from pre-digital technology because it can dynamically react based on data: it features what Brett Frischmann and Evan Selinger call “intelligence-enabled control” (in Re-Engineering Humanity, which could be mushed into Seeing Like A State as “legibility-enabled control”). Analog technology cannot change faster than people and cannot be as opaque; it can be repurposed and resisted in ways that a total digital institution cannot. That is what makes digital domestication particularly powerful, and why the digital world became mired in ritual and tradition so quickly.

6. Because on the internet we live in the past, the future is not lived, it is programmed.

“The Internet increasingly structures our lives, but who structures the Internet?” (Corine Cath, Changing Minds & Machines) Structuration1 is a process of reciprocal interaction between human actors and the structural features of organisations, including organisations defined by technology. Human actions are both enabled and constrained by organisational structures; and those structures are the result of previous interactions.

The degree to which human actions are enabled rather than constrained, however, as well as how much people can act to change the structure of tomorrow, can vary enormously. In some places, agency can dominate over structure and we can partake in choosing rules for our existence. In others, agency is weak, we have no say in the rules that guide us, structure dominates.

On the Internet, the nudgeocracy isn’t very invested in your agency. You will never be forced to flow with the structure, mind you, just exhausted into compliance. The future is not programmed per se in the sense of deterministically forcing an outcome, but the stats are stacked against you. And should the outcome displease you, only you can be blamed for going with the nudge or for enduring the loss of convenience that comes with refusing it. A key effect of nudging is to effectively externalise accountability: the choice architect decides how most people will react, but people remain accountable for their own choices.

Over time, the outcome of previous nudges feeds into the next ones as models and data. The structures accumulate and dominate — and nudging increasingly shifts towards programming, today decided by yesterday. If Luddism is simply the preference for agency over structures designed and coded by others, we’re going to need a bigger Ludd.
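The accumulation described above can be sketched as a toy feedback loop (all parameters invented for illustration): each round the system nudges toward whatever the data currently says, and behaviour recorded under the nudge becomes the next round’s data:

```python
def run_feedback_loop(initial_preference=0.5, compliance=0.7, rounds=10):
    """Each round nudges toward the current leader; the nudged behaviour
    then becomes the next round's 'data'."""
    preference = initial_preference  # observed share choosing option A
    history = [preference]
    for _ in range(rounds):
        nudge = 1.0 if preference >= 0.5 else 0.0  # promote the leader
        # Observed behaviour blends prior preference with the nudge's pull.
        preference = (1 - compliance) * preference + compliance * nudge
        history.append(preference)
    return history

trajectory = run_feedback_loop()
# The observed share drifts toward 1.0: yesterday's nudges decide
# today's "data", which justifies tomorrow's nudges.
```

A coin-flip starting point hardens into near-unanimity in a few rounds, with no change in anyone’s underlying preferences: structure accumulating over agency.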

7. On the internet, the past is a black hole sucking the future into itself.

As we have seen in the previous theses, our digital environment:

  1. Regulates our lives towards a smaller number of paths purposely designed by others rather than trails more fortuitous and exploratory.
  2. Builds up a monolithic authentic self rather than a lush set of mutually-enriching contextual identities.
  3. Is heavily focused on categorising people, which is inherently a bureaucratic task, with all that implies in repetitive pursuit of metrics that have long lost their meaning.
  4. Is a world in which everyone’s past is forever and for everyone to see, and in which the only hope for friendship is a process of social sorting that limits interaction between intellectually diverse populations and collapses everyone into two polarised camps.
  5. Has all the characteristics of a total institution working on the domestication of people by people.
  6. Develops a nudgeocratic environment that reinforces itself over time.

These trends work together to keep us stuck in a permanent past because they make it harder to follow different trajectories, to see meaning, to grow as people. It’s a world that rewards predicting over inventing.

It’s also unsustainable. The simplification of interaction trajectories and the uniformisation of editorial recommendations create a runaway process that eliminates complexity, leading to a loss of noodiversity that eventually causes the intelligence and computational power of society to collapse.

We can shake this up; it’s only like this because we built it this way. With values, with privacy and the maintenance of contextual integrity, with cooperative approaches to control, we can progressively build a livelier, future-oriented digital world. The internet has become a ménagerie of monoliths which we would gain from opening up (e.g. just making search and social backend services atop which anyone can, and does, build UI and recommendations would already address a lot).

The fact that there are big powerful companies is not new or specific to the digital world, but the Internet monopolies are each entrenched at the local optimum from fifteen years ago and endlessly reiterating uninventive options. Unfortunately, technologists are poorly positioned to understand the situation in part because they rarely are the ones involved in nudging, in part because this is all lathered in an ideology of unavoidably triumphant science in which technology is at the same time neutral and necessarily leads to a post-scarcity Star Trek universe, and in part because us computer nerds can always escape much of this domination and trade our mild-mannered alter egos for the Unix philosophy of power tools. But it wouldn’t be the dumbest thing to build for the future again.

Footnotes