Ode to Ignorance

“Seek First to Understand, Then to be Understood”

The 7 Habits of Highly Effective People, Stephen R. Covey

It is increasingly common to consider people with different views as “not listening to reason”. When such an encounter happens, it has become acceptable to dismiss the other side and not bother discussing at all. For example, people who voted for Brexit or Trump may simply be written off as “not knowing what they did”, without any attempt to understand their position. True, the fact that the future US president does not believe in global warming and the future US vice-president believes in creationism makes it off-putting to engage in intelligent conversation, but that is no excuse for not even trying.

I think we are missing the greater problem that hides behind these small disagreements:

It has become increasingly difficult to independently verify findings.

Let me give you an example. Assume you wanted to convince me that the Earth is round. How would you do that if I did not trust you? You could show me pictures from outer space, but then I could argue that you faked them. You could call me on Skype from the US to tell me that it is daylight there while it is night in Sweden, but I would need to trust that you did not put a poster on your window. We could perform the Bedford Level experiment, but even that is tricky to get right.

It is also tempting to just claim that people “did not do their research”. I mean, it should be easy to find a report that exhaustively lists the arguments for showing that the Earth is round. Unfortunately, truth fabrication has become such a great enterprise that reports can nowadays be found for any viewpoint. Climate change is fake? There are documentaries on that! Climate change is real? There is a report on that! And the list goes on.

“Well, you simply need to check whether the authors are credible.” Now you have two choices: (a) you trust the authors – you’re done – or (b) you check the references and data used in the report. The latter path is recursive, terminating either when (i) you find some authors that you trust, or (ii) you reach some axioms that you agree are true and/or make the measurements yourself. Depending on your level of distrust – in other words, how many levels of recursion you go down – you might end up spending a lot of time. Essentially, you might end up redoing all the science that generations of researchers have done, which is impractical for mortals.
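The recursive verification procedure above can be sketched in code. Everything here is hypothetical and invented for illustration – the `Report` structure, the author names, and the trust base are not real data sources, and actual fact-checking is far messier – but the sketch shows how the recursion bottoms out either in a trusted author, an accepted axiom, or exhaustion:

```python
# A minimal sketch of the recursive verification process described above.
# The Report structure and the trust base are hypothetical, invented for
# illustration; real fact-checking is of course far messier.

from dataclasses import dataclass, field

@dataclass
class Report:
    author: str
    claim: str
    references: list = field(default_factory=list)  # supporting Reports

def verify(report, trusted_authors, accepted_axioms, depth=0, max_depth=3):
    """Return True if the report can be traced back to our trust base."""
    # Case (a): we trust the author outright.
    if report.author in trusted_authors:
        return True
    # Case (b)(ii): the claim itself is something we accept as an axiom
    # (or could measure ourselves).
    if report.claim in accepted_axioms:
        return True
    # Give up once we have recursed too deep: redoing all of science
    # is impractical for mortals.
    if depth >= max_depth or not report.references:
        return False
    # Case (b)(i): recursively check every reference the report rests on.
    return all(
        verify(ref, trusted_authors, accepted_axioms, depth + 1, max_depth)
        for ref in report.references
    )

# Example: a second-hand report that cites a trusted primary source.
primary = Report(author="Eratosthenes", claim="shadow lengths differ with latitude")
secondary = Report(author="unknown blogger", claim="the Earth is round",
                   references=[primary])

print(verify(secondary, trusted_authors={"Eratosthenes"}, accepted_axioms=set()))
# prints True: the chain bottoms out at an author we trust
```

Note how the answer depends entirely on the trust base passed in: with an empty `trusted_authors` set, the very same report chain verifies as `False`.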

What I am trying to argue is that you need a “trust base”, a set of axioms and authors that you trust, to make it practical for you to verify facts within your lifetime. But what if my “trust base” is completely different from your “trust base”, not because of ignorance, but because we both independently tried to build a system for quickly verifying facts? Worse still, both our “trust bases” might have been strengthened by past experience. Your “trust base” allowed you to understand your jet lag, whereas my “trust base” allowed me to get through difficult situations in life. We ended up speaking the same math, but assuming completely different axioms. It is like Euclid teaching geometry to a non-Euclidean.

I hope I have managed to convince you that agreeing with a viewpoint is not only a matter of reason, but also a matter of trust. It feels sad, but so many topics have had trust problems in the past (GMOs, vaccines, global warming, food), whether driven by profit or fame, that it no longer surprises me when people simply shut down and prefer to pick their facts (or “axioms”) based on feelings or intuition.

But let us not stop at complaining; let us get actionable about it. Here are some items I would start with: