Gone from social media since last year (notwithstanding the two locked accounts to which I do not post but use to contact various companies’ support channels, or to keep up with a handful of cats, goats, and zoos), I nonetheless keep an eye out for concepts that help me explain why. Enter Sally Davies’s Aeon dissection of predictive processing.
Predictive processing casts the brain as a ‘prediction engine’ – something that’s constantly attempting to predict the sensory signals it encounters in the world, and to minimise the discrepancy (called the ‘prediction error’) between those predictions and the incoming signal. Over time, such systems build up a ‘generative model’, a structured understanding of the statistical regularities in our environment that’s used to generate predictions. This generative model is essentially a mental model of our world, including both immediate, task-specific information, as well as longer-term information that constitutes our narrative sense of self. […]
According to the emerging picture from predictive processing, cognition and affect are tightly interwoven aspects of the same predictive system. Prediction errors aren’t merely data points within a computational system. Rather, rising prediction errors feel bad to us, while resolving errors in line with expectation feels good. This means that, as predictive organisms, we actively seek out waves of manageable prediction error – manageable uncertainty – because resolving it results in our feeling good.
Davies goes deep into the reward system at play here, but it seems to me that predictive processing also offers a simpler explanation for my troubles with what I’ve called the cognitive violence of the feed, in and of itself, as an organizing principle.
Whether we are talking about an algorithmic feed that makes judgments about what it thinks you will most want to see, or simply the context collapse of a single feed into which all manner of people and subject matter are dumped willy-nilly, we’re talking about a profound loss of any sort of predictability for one’s own actions or one’s own thinking.
That can be especially problematic for an autistic, for whom the need for a predictable environment can be almost a sort of prime directive. The greater the rise in prediction errors, the greater the anxiety, if not a general sense of overwhelm.
Predictive processing also gets a bit at the distinction between a database and a narrative.
Predictions in the offline world mostly follow a basic course of cause and effect: in other words, a narrative course. Algorithmic feeds and context collapse thwart any real sense of narrative; without clear cause and effect, without an obvious causal relationship between a first thing, a second thing, and a third thing, cognitive struggles abound.