I used to know a young lady whose principal attributes were an absolutely ravishing smile and a gorgeous pair of tits. She probably had a brain too. But her mind was virgin territory. Belinda, as I shall call her, had very strong opinions on everything, which was rather unfortunate because she knew next to nothing about anything. Her ‘knowledge’ went no further than
“I heard it on TV,” or “It’s obvious, silly,” or “Everyone knows…”: a mysterious and incoherent amalgam of factoids and knee-jerk reactions. Anything she didn’t already know ran off her like water off a duck’s back, and attempts to get anything resembling a new idea into her head inevitably resulted in such acrimony that I ended up limiting my attention to other, more responsive parts of the lady’s person. As Brassens put it:
… pour l’amour on ne demande pas
Aux filles d’avoir inventé la poudre.
(… for love, we don’t ask girls to have invented gunpowder.)
You’ll probably be meeting her again in these pages because, for reasons that will become clear, she has since become one of my standard reference points. I mention her today because this piece about the NSA revelations by Henry Porter in the Observer (which I shall look at more closely in another post) has me thinking, not for the first time, about the way we form opinions and why some are worth more than others.
An opinion, it seems to me, is the product of three elements: a knowledge base, a thought process and a value filter. Facts alone do not constitute an opinion, but you cannot form a valid opinion without having a certain number of facts at your disposal; and the more facts you have acquired (i.e. the bigger your knowledge base) the greater the potential value of your opinion. Acquiring raw facts is relatively easy – it’s called rote learning – but forming an opinion is about establishing relations between facts, or making patterns if you prefer. To do this, it’s necessary to bear one fact in mind long enough to compare it with others. This (alas, poor Belinda) is where raw intelligence comes in: the more variables you can juggle at the same time, the richer the resulting opinion. Most of us can manage four or five variables; genius goes to seven or eight; Belinda on a good day could reach one and a half before the blinds came down.
The third element – the value filter – is something of a mystery to me. In the press and on TV the pundits go at it hammer and tongs; they all have considerable mastery of the facts and they’ve all thought about them a lot; so why do opinions on, say, immigration or taxation or privatisation differ so widely? Why could my brother-in-law and I never agree on anything? I’m sure he thought me a woolly-minded, left-leaning liberal, just as I saw in him the worst kind of blinkered, selfish cashocrat. Human nature? Can it really be that simple?
As far as I can see, a value filter is a sort of second-level opinion. We process facts to form opinions, and a collection of related opinions then forms a filter, which we can think of as one facet of our world-view. Once established, a filter… filters. It does this in two ways: it filters out facts that don’t fit, and it spares us wasting time on information we deem irrelevant. Either way it tends to be self-reinforcing.
Basically, excluding uncomfortable facts is not a good idea: if a filter isn’t sufficiently robust to accommodate new information then it needs to be adjusted. If that adjustment results in a significant modification, so be it: this is called changing your mind and it’s a sign of healthy critical thinking. Being able and willing to change your mind matters more now than it ever has, simply because new facts are coming to light more frequently than ever before. However, changing your mind is reputed to be a sign of weakness and those who do it come in for some stick; this is unfortunate because it does nothing for the quality of public debate. George Monbiot, for example, has been roundly and most unreasonably jeered for changing his mind over the use of nuclear energy in the light of emerging knowledge about climate change.
As for using a filter to avoid wasting time on stuff that’s irrelevant, this is fine – provided you’re aware that it’s happening and you keep an eye open for genuine nuggets among all the rejects. In my case, for example, several years of careful reading and fact checking have established a climate-change-is-a-threat-to-civilisation-as-we-know-it filter, which enables me to ignore all the repetitive, mendacious rubbish trundled forth by politicians and other deniers. I prefer instead to devote my time and energy to presenting the facts about climate change to the people of my local community.
To sum up then, we form opinions by acquiring facts and thinking about them. The more knowledge we have, the greater the potential value of our opinions, subject to the quality of the thinking we apply to them. Opinions grow into super-opinions or value filters, which must be open to the impact of new ideas if our world-view is to remain valid.
I find this a useful way of thinking – about thinking. It’s a handy framework to bear in mind these days, when the latest mega-tweet can have a real, if short-lived, impact on that amorphous amalgam we call public opinion. However, there is a vast and important area I’ve not dealt with here, namely moral value filters: the process by which we label things not merely as true or untrue, but as intrinsically good or bad. I suspect that has to do with widespread ignorance of the fact that believing and thinking are not, or ought not to be, synonymous. If I’m feeling brave – or foolish – enough, I’ll try to deal with that in another post.
In the meantime, I hope it goes without saying that any feedback would be heartily welcome.