Decision Theory for Planners #111: Popping the Filter Bubble

One of the key reasons the Nazis lost the Battle of Britain was the inability of their fighters to protect the bombers for more than about 20 minutes, because of their small fuel tanks. Field Marshal (and later Air Inspector General) Erhard Milch had recommended, months before the battle, that cheap drop tanks be developed in preparation. Why did the programme never go ahead? Because it questioned the conventional wisdom of the organisation and of his superiors, who tended to discount information from those lower down the hierarchy. This is what is known as hierarchical incompetence.

The theory is that in any organisation people get promoted if they are competent and do not if they are not. The consequence, of course, is the Peter Principle: people get promoted to the level of their own incompetence. The employee's incompetence is not necessarily a result of the higher-ranking position being more difficult; that job may simply be crucially different from the job in which the employee previously excelled, and thus require skills the employee does not possess, such as management rather than technical competency.

Hitler's skill was as a party leader and rabble-rouser; his incompetence was as a commander-in-chief, never willing to make a tactical retreat or listen to the ideas of others. The film Das Bunker should be read as an allegory of how not to make decisions in any large organisation.

The relevance paradox occurs because people seek out only information that they perceive as relevant to them. There may be information that is not perceived as relevant because the person does not already have it, and which becomes relevant only once they have it.

Let's give an example. Google and the like may seem like a completely open way of seeking information: we are in command. But website operators have worked out that people who hold a particular view are only likely to seek out information confirming that view, so they optimise their websites to capture those people in search results. It's called search engine optimisation, or SEO for short, and it's a multi-billion-dollar business. Say you wished to set up a baby-clothes website. You need to capture those people with babies who might be searching the net for things about babies. Suppose there is a health scare: you set up a magazine with lots of articles on the baby health scare, but at the side you put links and adverts for baby clothes. Likewise, if you were a climate change denier and searched for 'global warming disproof', you might convince yourself that all the evidence disproved it, but you would be looking in the wrong place. Eli Pariser has called this the Filter Bubble; it is another illustration of confirmation bias.
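
To see how crude this capture can be, here is a toy sketch in Python of a naive keyword-counting ranker. The pages, the query and the scoring are all invented for illustration; real search engines are vastly more sophisticated.

```python
# Toy illustration of SEO keyword capture: a naive search engine that
# ranks pages by how often the query's words occur in the page text.
# All page content here is invented purely for illustration.

pages = {
    "hospital advice page": "baby health scare symptoms advice",
    "clothes shop article": ("baby health scare advice "
                             "baby clothes offers buy baby clothes"),
    "news report": "health scare report",
}

def score(query, text):
    """Count how often each query word occurs in the page text."""
    words = text.split()
    return sum(words.count(term) for term in query.split())

query = "baby health scare"
for name, text in sorted(pages.items(), key=lambda kv: -score(query, kv[1])):
    print(score(query, text), name)
# -> 5 clothes shop article
#    3 hospital advice page
#    2 news report
```

The clothes shop tops the ranking simply by repeating the searcher's own words back at them. Meeting people inside the bubble they are already searching from is the crude heart of SEO.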

Poor information is at the root of why decisions go wrong. So how can we improve information gathering and flows?

One tactic is to stress-test the foundations of a belief: why is it correct? Town planners, for example, hold lots of foundational beliefs they take to be true. They are unaware that many people disagree with these beliefs, because they have never sought those people out, and they are shocked when, once in positions of power, they are confronted with anti-planning arguments based on concepts they have never heard of. Deliberately seeking out such contrary information, for instance through role playing, can tell you whether your foundational arguments need correction.

Organisations possess considerable tacit knowledge: knowledge that people are often not aware they possess, or not aware could be valuable to others. Communicating tacit knowledge generally requires extensive personal contact and trust, and overly hierarchical organisations restrict the flow of such knowledge.

Dave Andrews of the Open University has developed many ideas on how to improve the flow of information across organisations. In the late 1970s, while looking at energy, he found that energy and raw materials could be saved, and pollution avoided, by interlinking industrial processes.

Most professionals and policy makers were not interested in these opportunities because they were unaware of them (the waste heat from power stations, for example), which is an example of both hierarchical incompetence and the relevance paradox. Andrews called this the need to find interlock: as no individual or group can ever understand anything fully on their own, there is a need to interlock with the knowledge and ideas of others.

Writing in 1984, he urged the urgent development of “lateral media”: a system in which a PC in every home would be linked by modem over the telephone network and equipped with software enabling messages, news and inquiries to be forwarded selectively, creating a cloud of lateral communications hopping from computer to computer (similar to the way a flock of birds or a shoal of fish communicates in order to stay a coherent whole). He had unknowingly invented social media.
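
Andrews's description maps closely onto what would now be called a gossip protocol. The sketch below is a loose interpretation rather than anything from the book: each node forwards a message only to neighbours interested in its topic, so it hops across the network until its hop budget runs out. All the names and numbers are invented.

```python
# A minimal gossip-style sketch of "lateral media": each node forwards a
# message selectively to neighbours who share an interest in its topic,
# so it hops from computer to computer across direct links.

class Node:
    def __init__(self, name, interests):
        self.name = name
        self.interests = set(interests)
        self.neighbours = []   # direct modem/phone links
        self.seen = set()      # message ids already handled

def link(a, b):
    """Create a two-way link between nodes a and b."""
    a.neighbours.append(b)
    b.neighbours.append(a)

def spread(node, msg_id, topic, hops_left):
    """Forward a message selectively: only to neighbours who care."""
    if msg_id in node.seen or hops_left == 0:
        return
    node.seen.add(msg_id)
    for peer in node.neighbours:
        if topic in peer.interests:        # selective forwarding
            spread(peer, msg_id, topic, hops_left - 1)

# Build a small network and inject one message.
alice = Node("alice", ["energy", "planning"])
bob = Node("bob", ["energy"])
carol = Node("carol", ["planning"])
dave = Node("dave", ["energy", "waste heat"])
link(alice, bob)
link(bob, dave)
link(alice, carol)

spread(alice, msg_id=1, topic="energy", hops_left=3)
print([n.name for n in (alice, bob, carol, dave) if 1 in n.seen])
# -> ['alice', 'bob', 'dave']
```

Notice that carol, whose interests do not include the topic, never receives the message at all. The selectivity that makes lateral media efficient is the same mechanism that, taken too far, seals the filter bubble.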

The problem with social media, of course, arises when we lock ourselves inside the filter bubble and so restrict the potential benefits of interlocking knowledge.

An interlock diagram is a way of bringing together expert knowledge to pierce the filter bubble.

For example, in energy policy a working group might consist of people expert in several spheres of energy.

If you then map the limits of the potential of each sector, you expose the tacit knowledge and throw up opportunities for interconnection. It's a way of putting expert knowledge at the fingertips of the decision maker.
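
As a rough sketch of how such a mapping might work, suppose each sector is described by what it produces as a by-product and what it needs as an input; the opportunities for interconnection then fall out mechanically. The sectors and resources below are invented for illustration, not taken from Andrews's diagrams.

```python
# Toy interlock diagram: sectors described by by-products and inputs.
# The sector data is invented purely for illustration.
sectors = {
    "power station":    {"produces": {"waste heat"}, "needs": {"fuel"}},
    "district heating": {"produces": set(),          "needs": {"waste heat"}},
    "greenhouses":      {"produces": {"biomass"},    "needs": {"waste heat"}},
    "biogas plant":     {"produces": {"fuel"},       "needs": {"biomass"}},
}

# An interconnection opportunity exists wherever one sector's by-product
# matches another sector's need.
for a, info_a in sectors.items():
    for b, info_b in sectors.items():
        if a == b:
            continue
        for resource in info_a["produces"] & info_b["needs"]:
            print(f"{a} could supply {resource} to {b}")
# -> power station could supply waste heat to district heating
#    power station could supply waste heat to greenhouses
#    greenhouses could supply biomass to biogas plant
#    biogas plant could supply fuel to power station
```

The first line printed, power-station waste heat supplying district heating, is precisely the kind of opportunity Andrews found professionals were unaware of.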

Andrews, David: The IRG Solution – Hierarchical Incompetence and How to Overcome It. Souvenir Press, London, 1984 (available online here).
