I started to draft this post several days ago, a day or two after Channel 4 News first broke the Cambridge Analytica / Facebook story. As they followed up that story and its Trump / Brexit implications, the world latched onto the scale of it all… and the outrage. Oh, the outrage!
For a take on the outrage it’s worth listening to @TheSimonEvans in the closing minutes of this week’s This Week – a great take on the “I can’t come to bed yet, there is something wrong on the internet” meme:
In fact, given the week and the topic, that whole edition – the @JamieJBartlett piece specifically, and all the regular contributors – provides a means to capture most of what needs elaborating.
Initially I was linking this post to this BBCR4 Digital Human “Social Media Vortex” piece, broadcast coincidentally(?) about the same time the C4 story broke:
I disagreed with a lot of what this programme suggested about the nature of the problem, but it provided an opportunity to recognise the issues, which I raise below.
[Before diving in, let’s not forget it was Channel 4 News and The Guardian and an international team of collaborating journalists that researched the CA/FB political story over many months, and we’re only seeing the tip of an iceberg. This kind of proper journalism deserves our ongoing support.]
Worth starting with the outrage. Sure, it was “big” news – something many people didn’t seem to know about, or were in ignorant denial of, on a grand scale. Plenty to concern us, but if we’re going to be outraged, we’d better understand where things went wrong and pursue the right solutions.
Whether it’s CA’s cynical services, or Russian bots, the national and international political manipulation is clearly the top of the pile of concerns. The main ignorance (ie news) seems to be in the extent to which our contract with the likes of FB makes aggregated personal data available for public use and private targeting. Marketing campaigns, commercial and political, have always used – and guarded their valuable use of – personal data. It’s a psychological game in which we all participate at every level. Privacy or transparency, which is it? Neither. It’s a game, always has been and, as an entirely natural process, it probably always needs to be.
All that technology has done is up the scale and speed of possibilities. What many of us have been warning (for decades) is that how we interact with those accelerating, concentrating, reinforcing forces is key to our future human progress. For me the feeling is one of relief rather than outrage. “I told you so” counts for nothing, but at least fewer people are now blindsided by the issue – so long as they address it beyond the initial outrage. At last! I say.
Let’s leave aside several topics for now: the details of Facebook’s business model with us (free use in return for use of our data); the knotty ethical boundaries of psychological tactics that exploit guilt and fear as well as beliefs and desires for marketing purposes; the fact that, apart from the evolving technology, little if anything here is new. The national and international political concerns are simply the highest-profile layer of my own agenda: human decision-making – cybernetics – using “knowledge” to act. That is, the psychology of how we understand and apply our own “rationality”, individually and in groups of any size.
It’s about polarisation memetics. JP O’Malley posted four very quick takes, to which I responded:
Good quickie. A long-standing topic for me. Big but not new – agreed. How “we” learn to deal with the *natural phenomenon* that bad (polarising) data crowds out good (nuanced) stuff in days w so much data is freely processed so fast. (Conservative moderation counter-intuitively)
– Ian Glendinning (@psybertron) March 19, 2018
And in the main dialogue:
Exclusive? Eternal truth from time immemorial of course. Problem is scale and speed of electronic media content – and nefarious choices enabled for the unscrupulous.
– Ian Glendinning (@psybertron) March 20, 2018
And another thread:
[Thread] EXACTLY:
Counter-intuitively “social” media is most naturally suited to “division and destruction” driving polarisation and populism (been saying this like a cracked record for over 15 years) UNLESS we actively work to moderate & conserve.
ie REDUCE (!) freedoms. https://t.co/01N7brzmjn
– Ian Glendinning (@psybertron) March 21, 2018
So what are people missing?
Social media use is as good as the motives (and understanding) of people using it. So harmless in that sense, (except for general polarising feedback problem) BUT FB problem is their business model and core functionality is *specifically designed* to support manipulation. @skdh
– Ian Glendinning (@psybertron) March 20, 2018
That was followed by a five- or six-post thread that captured some of what I’m elaborating here. Its conclusion: legislating in advance for every possible future use, misuse or abuse of evolving technology is the wrong strategy. We only ever know the specifics of the future with hindsight. We need to focus on human behaviour.
Apart from “Plus ça change / ‘Twas ever thus”, the key point is that what’s missing is moderation – think control rods in an otherwise runaway nuclear reactor. Think conservatism – we have to provide it, institutionally and individually. The way to protect freedoms is to conserve them. Understand fidelity and fecundity in evolution.
This was only four or five years ago, but already people were failing to notice the reality. Using increasingly available data to target social media was only ever leading the wrong way. Ask the Labour Party about Momentum, for example:
We really need to distinguish between well-meaning wishful-thinking – political statements by public faces are always about desired changes to reality – and actual expertise which understands how reality works. @JamieJBartlett (Enough of us have been warning for decades.)
– Ian Glendinning (@psybertron) March 24, 2018
The thing which leads to the polarising conflict is the error of applying expert standards of truth to the political values side of the equation. We end up with “expert dossiers” to justify political vision (what could possibly go wrong?) when what we really need is dialogue.
– Ian Glendinning (@psybertron) March 24, 2018
Critical / dialectical argument should be reserved for controlled spaces – like moderated debates, editorially managed media or scientific discourse – NOT free-for-all public media. #digihuman
– Ian Glendinning (@psybertron) March 19, 2018
To round off. Persuasion, whether it be about choices or seemingly objective knowledge, is a game and games have rules. In the game of life, those rules are mostly implicit and inevitably “gamed”. Gaming evolves rules by creativity. The last thing we want is rules cast in statute in advance of technological possibility. Careful what we wish for.
And finally, at the national / international governance level, democratic electoral reforms must focus on systems that are more tolerant of – less dependent on – polarising effects, because future technology can only ever reinforce these effects. It’s binary choice that’s killing us. And worse, applying the rules of objective knowledge to both sides of any subjective choice invariably deepens the polarisation. We need objective knowledge automated by algorithms like a hole in the head.
You would tell me if – after careful consideration – you thought I was mad?
=====
[Post Note: Jamie Bartlett’s one tweet summary:
My one tweet summary of Cambridge Analytica & Facebook:
Improper use of Facebook data by CA: let’s see.
Use of psychometrics: prob less important than ppl think.
CA & FB helping Trump campaign w/ targeting: made a real difference.
Everyone else doing it: still mostly ignored.
– Jamie Bartlett (@JamieJBartlett) March 23, 2018
As I said, leaving many detailed issues aside, this is important. Targeting is, by definition, a small but real effect. The real influence on any specific decision can be small, and the rationale doubtful, but nevertheless crucial. What is real is the inexorably divisive targeting: the deepening of polarisation must not be ignored; in fact it’s the thing we must actively moderate by our own efforts.]
[Post Note: And given the Grauniad was part of the team that broke the story, along with C4 News, this made @afneil chuckle:
Oh. The Guardian app harvests your personal data from your Facebook page and also the data of all of your friends: https://t.co/PnFw8bZ1vc
– Alex Wickham (@WikiGuido) March 29, 2018
Me too, but it reinforces the point that sharing and accessing personal data is NOT the problem. The issue is limits to ethical use of shared data.]