On Machine-Mediated Monoculture


Updated:

This article is a bit of a Ship of Theseus - I keep rewriting bits of it, and it's been drifting ever farther from the original analysis. The subject matter is so broad and complex that it was a mistake to even hope to capture it in one short blog post. But I figured I'd treat it as a living document - I'd get back to it every once in a while, cringe at the naivety of my past self, and rewrite the parts that now seem unbearably misguided.

I'd like to start with a set of non-controversial observations:

  1. Our relationship with technology is best described by a quote often misattributed to Marshall McLuhan, which nonetheless summarises his work well: "We shape our tools, and thereafter our tools shape us". Erik Davis, in Techgnosis, puts it more directly: culture is technoculture.

  2. Throughout history, technological progress has been accelerating. Furthermore, technology, and by extension culture, has been growing more complex. This has been rigorously studied by Geoffrey West, particularly in his lecture The Simplicity, Complexity & Unity of Life from the Biosphere to the Anthroposphere (also check out Sean Carroll's meditation on it on his Mindscape Podcast).

  3. The reality we navigate daily is not the raw physical world but the shared cultural narrative that overlays it. Nations, currencies, careers, and markets - these are collective fictions invented to manage the complexity of large-scale human cooperation.

  4. Humanity's primary tendency is generating coherent narratives about ourselves and our place in the world.

If we accept the above premises, it follows that constructing new coherent narratives is becoming increasingly difficult due to the volume of information we need to synthesize and the pace of change we need to react to. One measurable effect of this narrative breakdown may be our diminishing ability to forecast. After all, predictions are just forward-facing stories: if narrative coherence about the present breaks down, so too does our capacity to imagine plausible tomorrows. A scholar in 1525 may not have foreseen the full consequences of the Reformation or the encroaching scientific revolution, but they could reasonably expect that the rhythms of daily life - in agriculture, labor, communication, and governance - would remain broadly familiar a century later. By contrast, the speed and breadth of change from 2000 to 2025 have rendered long-term cultural forecasting quite challenging.

Another symptom is cultural stagnation, evident in nostalgia fetishism and the endless cycles of remakes and reboots in pop culture - a phenomenon studied in great detail by Mark Fisher in Capitalist Realism. He points out the waning of historicity and our apparent entrapment in a cultural time loop. Fisher was a member of the Cybernetic Culture Research Unit (CCRU), a group of rogue philosophers who laid the theoretical foundations of what we now call "accelerationism". What the CCRU described, though, wasn't naive techno-utopianism: they posited that capitalism is effectively a cyberpositive feedback loop - economic growth requires technological innovation, and new tech enables faster growth. Cybernetics teaches us that positive feedback loops result in explosions. Capitalism's accelerating trajectory will therefore result in a metaphorical "explosion" we call the Technological Singularity - not a utopia, but a decoupling of techno-capital from humans. Humanity's well-being is of no interest to this cybernetic process. To me, Fisher's diagnosis is indicative of the singularity's encroachment - we're no longer feeling the passage of time because we're increasingly observers of time passing by.
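The cybernetic claim above - that positive feedback explodes while negative feedback stabilizes - can be illustrated with a toy simulation. This is my own minimal sketch, not anything from the CCRU's work; the growth model and parameter values are arbitrary assumptions chosen to make the dynamics visible:

```python
def positive_feedback(x0: float, gain: float, steps: int) -> float:
    """Each step feeds a fraction of the output back into the input,
    so growth is proportional to current size -> exponential runaway."""
    x = x0
    for _ in range(steps):
        x += gain * x
    return x

def negative_feedback(x0: float, setpoint: float, gain: float, steps: int) -> float:
    """Each step applies a correction opposing the deviation from a
    setpoint -> the system settles instead of exploding."""
    x = x0
    for _ in range(steps):
        x -= gain * (x - setpoint)
    return x

# Same gain, same number of steps, radically different fates:
print(positive_feedback(1.0, 0.1, 100))          # ~13780: runaway growth
print(negative_feedback(100.0, 10.0, 0.1, 100))  # ~10: converges to setpoint
```

The point is only the shape of the curves: a system whose output amplifies its own input has no internal brake, which is the sense in which the CCRU read capitalism's growth-innovation loop as heading toward a blow-up.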

When we can no longer keep up with the change, we increasingly rely on machines - to the point that machines no longer serve us but quite the opposite. Consider these recent examples:

Paradoxically, interacting with technology has become taxing. So we've invented a cognitive prosthesis in the form of generative AI: large language models (LLMs), and most recently "thinking" LLMs. Playing with multiple frontier AI offerings, I can't help but notice that while their outputs differ stylistically, they often converge substantively. This may be because they all represent the same thing: a compressed archive of human symbolic output (or at least its digitized subset). And if we outsource our creative thinking to machines, will it flatten the multiplicity of human thought into a machine-powered singularity? I constantly vacillate on this. On the one hand, the Internet and social media created information bubbles and small online communities - the opposite of monoculture - and AI may intensify this. On the other hand, the meteoric rise of agentic software development and tools like OpenClaw tells us that the axiomatics of the market are moving to minimize human involvement in the process. The assembly line enabled mass production of physical goods, but those goods became more standardized and uniform. Agentic AI may enable mass production of knowledge and culture, with similar effects.