I had this list in a recent post, reacting to Steven Pinker’s definition of rationality being so broad that it was indistinguishable from any number of other “good things.”
He says:
“Rationality is using knowledge to attain [human] goals.”
I said:
Using knowledge to attain (human) goals is Rationality?
Using knowledge to attain (human) goals is Politics
Using knowledge to attain (human) goals is Cybernetics
Using knowledge to attain (human) goals is Game Theory
Etc., etc.
And now I add, from this morning’s Reith Lecture on AI by Stuart Russell:
Using information to attain goals is Artificial Intelligence?
Using information to attain (human) goals is Intelligence
One land-grab after another. Unashamed political interest.
Definitions so broad they are meaningless and useless.
If that’s Pinker’s unqualified definition, it’s a poor one. It could easily include eugenics or other doubtful “human goals,” such as trying to be richer than everybody else, or enforcing societal peace through state surveillance, or winning wars with extreme efficiency. An essential part of being rational is choosing the right goal: it’s about making decisions that are “measured.” That may involve knowledge, such as how to split the atom. But it also involves judgement, such as when and why to split the atom.
Yes, that was basically my point in the original (linked) post.
(He does qualify this by acknowledging that rationality of this kind cannot decide your goals.)