All things in moderation?
When I say social media, I mean all media, since in the 21st century all media, even the long-form kind, depend on eyeballs and clicks for audience engagement. The issue I’m focussing on is the rapid and ubiquitous transmission of information, independent of its “honesty” or its content quality in general. (I call that memetics, but what I call it doesn’t really matter. It’s an ancient observation that bad information travels faster and wider than good information: the rabbits are running before common sense has got its pants on. If it bleeds it leads, you name it. Fake News is the catch-all term for the problem.)
Similarly, I’ve often suggested that what we need in this frenzied information free-for-all is “moderation”: some management and control of information flows. As soon as I do, most people point out why they don’t like censorship and how it’s the thin end of a slippery slope to totalitarianism. Yes, we know; none of us likes censorship.
And often it’s the same people who “wish” that political messages, and information communication generally (*), were all honest.
The slippery slope is no argument of course, since information flows have always been managed. In professional journalism it’s what journos and editors do. In any board or committee, the minutes of a meeting selectively publish what is considered significant and appropriate. As well as control arrangements on the publishing side covering appropriateness, public interest, decency, incitement and so on, there are public legal sanctions that transgressors might face. The slippery-slope fear is that control (censorship) falls into the wrong hands, those of bad or state actors. But all social controls, not just those over informational media, are managed by our governance arrangements: the checks and balances of a functioning multi-layered democracy of some sort, we hope.
The management arrangements matter; we care what they are, how they are organised, and in which bodies the responsibilities are vested.
If you’re going to reject all forms of moderation on the publishing side, you are relying only on legal redress, so far as I can see. Slow, after-the-event processes will never be the solution to problems of rapid, universal “social” media. (See rabbits and pants and stable doors, and … nothing new under the sun.)
And it’s not just the speed mismatch that makes it fundamentally impractical; there are well-founded arguments why the law is the wrong solution to political and social behaviour. Recall that even Boris’ claim of £350m/week for the NHS on the side of a bus (ie a clear political lie) was thrown out of court. Read Kenan Malik on this (on Fake News generally, and specifically on the Boris / £350m case and the ills of legal interference in politics), or more generally listen to this year’s BBC Reith Lectures by former Supreme Court justice Jonathan Sumption on legal interference in political life.
I have my own ideas on moderation, expressed in various places, avoiding institutional censorship except as a last resort in the process. Moderation by the platforms themselves is clearly inadequate. Detail aside, I’ve yet to hear anyone suggest another solution that doesn’t involve management and control arrangements on the publishing side itself and/or legal redress after the event.
Shout up if you have alternative ideas, OR would like to explore the next level of detail from the above.
[Like so many things aimed at improving democratic management of society, information moderation is another topic for Citizens’ Assemblies.]
(*) ALL information flows are political – communicated for a reason.