Attention labor & symbolic engineering for preserving agency
Jac Mullen, posted in: Notable Articles, ai, tech, ai-created-labor, culture, ad tech, memory, symbolic engineering and The Long Next.
~1,484 words, about an 8 min read.

This article, an interview with Jac Mullen conducted by Peter Schmidt, is extremely good. It's long, but you should read it.
The core concept, which I find extremely useful in explaining the world, is that AI is an external attention technology: in its current configuration it presents as if it will expand our capacity to pay attention, but it instead narrows, pre-empts, and makes more predictable our attention, and therefore our actions.
first and foremost, AI has externalized attention, in the same sense that writing previously externalized memory.
To the extent that writing creates a form of non-biological memory — an external system for storing symbolic information — to roughly the same extent, I think, many forms of AI constitute forms of non-biological attention, external systems for selecting, ranking, filtering, and reweaving fields of information around what's salient or important.
When dealing with ad tech and the broader attention / surveillance economy, this idea is, in my mind, core to understanding what is going on. A major goal of algorithmic personalization, and Mullen argues of AI systems as well, is to render users more predictable, and therefore cheaper and more profitable to interact with.
AI is being deployed by a small elite to rewire us at scale for certain forms of exploitation and extraction — through consumer technologies like smartphones and social media.
[...]
Similarly today, a new elite is using a new information technology to make people legible in new ways and to extract from them a new form of surplus. As the old elite hoarded its new memory technology, the new elite now hoards its attention technology, and the emerging power structure is characterized by a profound informational asymmetry.
It's no mistake that this ties back to ad personalization systems as early as 2003. Capitalism's intersection with modern technology aims to make us currency to be traded on predictive markets. That is one of the core concepts from Shoshana Zuboff's The Age of Surveillance Capitalism, and one I agree with.
Big Tech was the first to invent looms—the first true “external attention system,” I’d argue, was achieved when Google added a quality score mechanism to its AdWords pipeline around 2003. Tech companies used them primarily towards the creation of predictive products—products which use machine learning systems to predict our behaviors, generating data to sell to clients. Their revenue derives from the accuracy of the predictive products they sell. To increase predictive accuracy of any model, you really have two options: improve the model, or simplify the system you are modeling—literally make the system more predictable. This is the ultimate purpose of the small range of gestures, the flattening effect our devices have on our range of behaviors, both cognitively and physically: swiping, staring, dissociative absorption, thumbing, whatever. It is the narrowing of possibility, to make us more predictable.
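That "improve the model, or simplify the system" framing is easy to make concrete. Here is a toy sketch of my own (the numbers, names, and behaviors are invented for illustration, not taken from the interview): a predictor that simply guesses a user's most likely next action gets dramatically more accurate as the range of behaviors narrows, with no change to the model at all.

```python
# A toy "predictive product": always guess the user's modal action.
# Shrinking the behavioral repertoire raises accuracy even though the
# model itself never improves. All numbers here are made up.
rich_behavior = {"read": 0.25, "write": 0.20, "walk": 0.20, "call": 0.20, "scroll": 0.15}
flattened     = {"scroll": 0.70, "swipe": 0.20, "stare": 0.10}

def modal_accuracy(behavior: dict[str, float]) -> float:
    """Expected accuracy of always predicting the most likely action."""
    return max(behavior.values())

print(modal_accuracy(rich_behavior))  # 0.25
print(modal_accuracy(flattened))      # 0.7
```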
This is where I see a key difference from Zuboff's work. It isn't just that attention economies transform us into currency; they force us into particular types of labor. We are not merely reduced to a resource, but manipulated into doing the attention work of reducing ourselves.
If the point of the attention economy is to render us into resources, then that feels like a process that is inescapable at the individual level. But if the algorithmic systems of disinterest, simplification, and oppression instead aim to make us the tools that render our own lives into prediction products, then there is a greater possibility of resistance, one executable by smaller groups in smaller ways.
If the surveillance economy's real trick is forcing us to render ourselves into product through UX dark patterns, monopolization, and personalization-based barriers, then that is something we can combat effectively.
Machine attention has special properties too. These properties enable surveillance capitalists to hack and exploit weaknesses in the biological attention and memory systems of their users, converting customers into reliable hubs of resource extraction.
However, if access to these external attention systems were democratized, I think we could use them to defend against precisely these sorts of intrusions which, for over a decade, have cognitively re-engineered us against our will. We could learn to see ourselves more robustly, and even learn to red-team our forms of self-knowing against the intrusions of persuasive technology.
One of the things that has only become clearer is that erasing the past and its context is a key technique for attacking the stability of our current social and political order (and for preventing us from improving the state of the world). Archiving is a radical act, however tiny. Resisting the push to exist only in the present is going to be important for all of us.
There will only be vibes and feedback loops in a permanent ahistorical present. This will sometimes include the past, but not in a familiar way. More like how a diffusion model includes the past, paints with the past, impressionistically.
[...]
The politicians of old thought of memory’s personifications, History and Posterity: how would they be remembered? Trump thinks about attention’s personification: how will he be treated by the Algorithm? Trump lies and lies because he does not need to carry the past with him: he is a creature of the attention world, not the memory world.
Mullen makes an excellent point about how these systems, which push us to flatten our responses and engage in a limited set of choices, are a threat to our agency.
Now, our choices are increasingly pre-empted before they arise. Through techniques like tuning (changing the choice architecture in an environment), herding (group-level orchestration), and conditioning (habitual reinforcement through operant feedback), predictive systems intervene on our behavior directly. And as these systems advance, the cognitive ecological foundations of agency itself are quietly degraded.
The suggestion is to take the tools of attention systems and put them under our control. When forced by surveillance capitalism, we render ourselves into something smaller than our selves; with these same tools, we can instead actively shape ourselves in resistance to the systems that seek to flatten both our identities and our attention.
So the core challenge, as I see it, is to use external attention in a way that allows us to see ourselves as deeply, as completely, as these external systems presently see us, and in this way overcome the corrosive and pre-empting effect they have on our own agency. This is one major sense in which I understand what it means for external attention to be democratized.
To devise the means for this “exteriority” — this is a challenge of symbolic engineering. On the one hand, I take it to mean decomposing attention into a set of primitives valid for any biological or non-biological system (a conceptual challenge) and operationalizing them in a non-extractive way (a technical challenge), in which folks, wielding a sort of exploratory tool, would be enabled to recombine and compare and, in theory, apply the filters or salience policies of any attention system to any data set.
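To give one reading of what such primitives might look like in practice, here is a minimal sketch in Python. Everything in it (the `attend` function, the `SaliencePolicy` type, the toy policies and corpus) is my own hypothetical illustration of "apply the filters or salience policies of any attention system to any data set", not anything proposed in the interview.

```python
from typing import Callable, Iterable, TypeVar

T = TypeVar("T")

# A salience policy is just a scoring function over items. Keeping the
# policy separate from the data is the point: any policy can, in
# principle, be applied to any data set, and policies can be swapped
# and compared by the person doing the looking.
SaliencePolicy = Callable[[T], float]

def attend(items: Iterable[T], policy: SaliencePolicy, k: int = 10) -> list[T]:
    """Select, rank, and filter a field of information around what
    the given policy treats as salient."""
    return sorted(items, key=policy, reverse=True)[:k]

# Two interchangeable policies over a tiny hypothetical journal corpus,
# where each entry carries a timestamp and a count of prior surfacings:
recency = lambda entry: entry["timestamp"]
novelty = lambda entry: -entry["seen_count"]

corpus = [
    {"timestamp": 1700000000, "seen_count": 3},
    {"timestamp": 1700400000, "seen_count": 0},
]
print(attend(corpus, recency, k=1))  # most recent entry
print(attend(corpus, novelty, k=1))  # least-surfaced entry
```

The design choice worth noticing is that the user, not the platform, chooses which policy is applied; that separation is what I take "democratized external attention" to require at minimum.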
Mullen's framing feels like a solid conclusion: we must have systems that allow us to retain our memories and the symbols of ourselves, and to turn them into flexible, multimodal selves that we control, through which we can realize agency over our identity, our attention and, through those two, the world around us.
I don't think it is an accident that modern-day fascism is so interested in eliminating identities outside its own narrow definition. Though fascist projects have always sought to depersonalize, the modern attention-economy variant of fascism is equally interested in reshaping people through the very logic of limited choices that attention economics forces on us daily.
Resistance is not just keeping memory, but keeping the memory of self. We must describe ourselves boldly and in detail. That seems like one of the most effective ways to oppose forces that would reduce individuals to broad statistical averages. I like how close this feels to the idea of E.N.T.E.R.
I would invite the symbolic engineers of today to create systems allowing people to ingather the fragments of externalized memory — journals, biometric data, etc. — through external attention systems in order to render some choosable section of “self” glanceable in an instant: the self through time, the self through space. This is what we will need, genuinely, if we are to resist complete auto-determination by external forces in the world which are emerging around us everywhere at once.
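Read as a systems brief, that paragraph almost specifies a data model. Here is a minimal sketch of one possible shape for it, with every name (`Fragment`, `SelfArchive`, `ingather`, `glance`) my own invention rather than Mullen's:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Fragment:
    """One ingathered piece of externalized memory: a journal entry,
    a biometric sample, a photo caption."""
    when: datetime
    source: str   # e.g. "journal", "biometric", "photos"
    place: str
    text: str

@dataclass
class SelfArchive:
    """A user-controlled store meant to render a choosable section of
    'self' glanceable: the self through time, the self through space."""
    fragments: list[Fragment] = field(default_factory=list)

    def ingather(self, fragment: Fragment) -> None:
        self.fragments.append(fragment)

    def glance(self, start: datetime, end: datetime) -> list[Fragment]:
        # The self through time: everything recorded in a chosen window,
        # in chronological order.
        return sorted(
            (f for f in self.fragments if start <= f.when <= end),
            key=lambda f: f.when,
        )
```

The essential property is that the archive and its filters live on the user's side of the boundary, so the "glance" is composed by the person being described rather than by a platform.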
If we leave this symbolic engineering to the platforms, then the only people with real agency will be those who own the filters. Everyone else will be a training datapoint. This is not a future we should consent to.
(Cover image "open-door" used from veronikadrechslerova under CC-BY 2.0)
— Via Jac Mullen, Attention Machines and Future Politics