I left my review of Paul Mason at his revelation that Karl Marx is to be our saviour.
Not a bombshell that Mason has serious left leanings. Even as a journalist he’s always worn his heart on his sleeve in support of the financially underpowered – most recently in doggedly sticking up for the Greeks against the EU, ECB and IMF.
Interesting then that the Marx he recommends is not the Marx of Das Kapital.
Marx’s outline draft papers, the Grundrisse, contain his Fragment on Machines. In it he saw, fundamentally, that knowledge was the valuable aspect of production – the how-to of labour inputs – even as the physical aspects of work became automated, with labour increasingly reduced to supervising the machines. And echoing Mason’s earlier reference to universal knowledge in the info-tech wave – not so much polymathic as genuinely generic and conceptual understanding – Marx gave us The General Intellect: the ability to know how; the understanding of how things work independent of the specific economic production or technology context. Mason gives us real examples from pre- and early info-tech days (telegraph operators) and the fact that such knowledge involves understanding the psychology of the fellow humans in the wider system, not just the immediate technology involved.
After showing how this view fits with the labour-theory of value, Mason brings us to an interesting question: why does a pseudoscience like economics need a theory at all? What’s wrong with actual documented effects, pure empirical evidence?
After a reminder of the time axis in labour-theory – data points are not static but represent different stages in different process instances, so they can’t simply be manipulated arithmetically – we get an Einsteinism that encapsulates the focus on the conceptual “general intellect”. Labour-theory, or any worthwhile economic theory …
… belongs to a class of ideas that Einstein described as “principle theories”: theories whose aim is to capture the essence of reality in a simple proposition, which may be removed from everyday experience. Einstein wrote that the aim of science is to capture the connection between “all experiential data in their totality” – and to do this “by use of a minimum of primary concepts and relations”.
The more clear and logically unified these primary concepts were, the more divorced they would be from data. The truth of a theory is, for certain, borne out by whether it successfully predicts experience, but the relationship between the theory and the experience can only be grasped intuitively.
Mainstream economics evolved into a pseudo-science that can only allow for statements obtained through crunching the data. The result is a neat set of textbooks, which are internally coherent but which continually fail to predict and describe reality.
Fascinating. My whole agenda in there. Autistic economics for sure, but also the weirdness of causation in general. The myth of empirical objectivity, when it comes to static data and predictive theory. In the quest to appear “scientific” – a word that monopolises credibility – any socio-politico-economic theory misses what science itself fails to notice. Reality is not scientific.
This much is given. But there are a couple more contentious points.
Information IS immaterial, as Wiener (and Dennett) say. But Landauer (IBM) was right: its representation is always physical, whether in silicon or in synapses. It’s these physical representations that may tend to zero marginal cost. The information content, however, remains independent of its physical embodiment.
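An aside of my own, not something Mason spells out: Landauer’s principle quantifies just how physical that representation is. Erasing a single bit of information dissipates, at minimum,

E_{\min} = k_B T \ln 2

where k_B is Boltzmann’s constant and T the absolute temperature – roughly 3 \times 10^{-21} joules at room temperature. Vanishingly small, but never zero: every representation of information carries a physical cost, even as the marginal cost of copying tends toward nothing.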
Secondly, information is NOT knowledge. Data and information may be freely networked – without any intrinsic hierarchy – but know-how and wisdom do come in layers. Mason himself is using the universal-knowledge / general-intellect ideas. These are free to the human that has them and, whilst their distribution is not controlled by any pricing model and their information content can be “shared” democratically, the intellectual capability to understand them varies hierarchically from person to person.
He’s right about one thing, however. Once such increased knowledge can be embodied in machine processes, it can be standardised across all machines. Standards (my life for the last two decades) are information products that can be freely distributed. Generic intellect is always ahead of this game – unless machines can also think. Appropriately, AI is Mason’s next topic.
Unfortunately, he doesn’t pursue this beyond increased automation, whereby not only the labour inputs but the machines themselves – real or virtual – also tend to become free through “repeated applications of info-tech”. It’s the repeated algorithmic application of information patterns upon information patterns. There’s no AI, no machine thinking. But if both labour and machines (capital) lose any marginal (financial) value, effectively becoming free, what is capitalism left with?
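To put that squeeze into symbols – my sketch, not Mason’s formulation – take an information good with first-copy cost F and per-copy cost c. Total and average costs over n copies are

C(n) = F + nc, \qquad \frac{C(n)}{n} = \frac{F}{n} + c

The marginal cost is just c; as info-tech drives c \to 0, competition drives price toward zero with it, and even the fixed cost F is diluted away as n grows. A price mechanism with nothing left to price is precisely the predicament behind that closing question.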