Aug 25, 2017

When I arrived at the Drayton Arms, he was already there. He had contacted me a few days earlier and we had arranged to meet for a drink. He worked for a headhunting firm, focused only – he was keen to specify – on investment management. After the introductory chit-chat, I made it clear that I was not interested in a job offer, and he made it clear that his purpose was to present his services for any hiring needs my firm might have. With that out of the way, the conversation moved on amiably, flowing from market conditions to value investing, Brexit and other world affairs.

Until at one point – I can’t remember how and why – we veered towards terrorism, and from there to 9/11. “Of course” – said Sandeep, with the casual air of someone sharing the obvious among world-savvy, knowledgeable people, “it was clearly an inside job”.

“What? What do you mean?” – I looked him straight in the eye.

“What? You don’t think so?” – Sandeep was genuinely taken aback by my sudden change of tone. Which, I agree, requires some explaining.

I have a Spinozan tolerance for freedom of opinion. It is the essence of Bayes: different priors, different information, or different interpretations of the same information can give rise to different conclusions. This is obvious, and there is nothing wrong with it. But of course it doesn’t mean that anything goes. It means that, even when I have a strong view, I hold on to Cromwell’s rule and remain open to the possibility that, however high the probability in my mind that I am right, I may be mistaken. As we know, hypothesis testing is a tug of war between confirmative and disconfirmative evidence, which accumulates multiplicatively, leaving open the possibility that, however overwhelming the evidence may be on one side, it may be annihilated by even one piece of conclusive evidence on the other. Another consequence of this framework is that, while I strive for certainty, I am comfortable with uncertainty: if neither side is strong enough to win the tug of war, there is nothing wrong with accepting that a hypothesis is only probably right, and therefore also probably wrong.
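To make the multiplicative tug of war concrete, here is a minimal sketch in Python (not part of the original argument; the functions and numbers are purely illustrative) of Bayes’ rule in odds form: each piece of evidence contributes a likelihood ratio, the ratios multiply, and a single near-conclusive piece on one side can annihilate an otherwise overwhelming pile on the other.

```python
# A toy illustration of the evidential tug of war in odds form.
# Posterior odds = prior odds x product of likelihood ratios (Bayes factors).
# All figures below are made up for illustration.

def posterior_odds(prior_odds, bayes_factors):
    """Multiply the prior odds by the likelihood ratio of each piece of evidence."""
    odds = prior_odds
    for bf in bayes_factors:
        odds *= bf
    return odds

def to_probability(odds):
    """Convert odds back into a probability."""
    return odds / (1.0 + odds)

# Ten pieces of confirmative evidence, each five times more likely if the
# hypothesis is true: the probability becomes overwhelming.
confirming = [5.0] * 10
print(to_probability(posterior_odds(1.0, confirming)))           # ~0.9999999

# One piece of (near-)conclusive disconfirmative evidence, with a likelihood
# ratio close to zero, annihilates all of it. Cromwell's rule is why the ratio
# is only close to zero, never exactly zero.
print(to_probability(posterior_odds(1.0, confirming + [1e-9])))  # ~0.0097
```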

It is important to remember, however, that this only works insofar as one makes sure that evidence accumulation is as thorough as possible on both sides. This is easy to understand: there is no point gathering a lot of evidence on one side while neglecting to do so on the other. One side will win nothing but a rigged game. But it is far from easy to do in practice, as it requires fighting our natural tendency to succumb to the Confirmation Bias. The more easily one side seems to be winning, the stronger should be our urge to reinforce the other. It is by winning an ever tougher tug of war that we can aim to approach certainty.

This is an aptitude I have learned to nurture. The more I am convinced about something, the more I like to explore the other side, trying to distil its best arguments. If this succeeds in lowering my confidence, so be it: I feel richer, not poorer. And if it doesn’t, I am richer anyway, as I have built a clearer picture of what the other side stands on. This, after all, is what understanding means – distinct from justifying and, more so, from agreeing. The better one understands an argument, the easier it becomes to dismantle it and, perhaps, convince people on the other side to change their mind.

This is where I sometimes fail to keep my composure: when I face a conviction based on a pile of one-sided arguments, typically soaked in hyperbolic language, which blatantly misrepresents, disregards or belittles the other side. But what really gets on my nerves is a dirtier trick: when the balance of evidence is overwhelmingly on one side, the only way to overturn the verdict is to find – or, failing that, make up – a conclusive piece of evidence on the other side. This is the standard trick employed by conspiracy theorists: I call them Conclusionists, and the pit they fall into the conclusive evidence trap.

That’s what happened with Sandeep.

“Of course I don’t think so!” I replied. “How can you say such a … thing?” I asked, trying to resist my own adjectival outpouring. He looked at me with candid disbelief. How could I be so naïve? The web is full of information about it – he said. And when I asked him to give me an example, he explained: “Of course it is not in the usual places. You need to know where to look”.

Oh my God. One tends to imagine Conclusionists as showing some exterior signs of dimwittedness. But there he was, a perfectly nice, bright-looking guy, splattering such shocking bullshit. As he excused himself to the men’s room, I tried to collect myself. But failed miserably. “So, Sandeep,” I asked as he came back, even before he could regain his seat, “who killed JFK? And what about those moon landings? And the Illuminati? It’s all down to Queen Elizabeth, eh?” I deserved a sonorous expletive. But Sandeep was a gentleman, and perhaps he had regretted his own condescension over his micturating interval. “I see your point”, he smiled, “I’m not saying that everything you find on the internet is true. But…” At which point I grabbed the two-second void and, after mumbling some sort of apology myself, cleared the air with a liberating “Anyway…” followed by a question about salaries, as if the whole interlude had never happened. The conversation resumed its cordial tone and carried on for a while, until it was time to go. We parted with the inevitable “Let’s keep in touch”. I have not heard from him since.

