The omniscience bias: Why the internet spreads superstition


There has been a paradigm shift in the way we inform ourselves. Eight out of ten people in the developed world use the internet. That is a larger share of the population than most elections draw to the ballot box. Roughly one in four internet users relies predominantly on social networks as an information source. And people still use search engines. While Microsoft's Bing handles an estimated 200 million searches in 24 hours, Google fields about 3.5 billion searches in the same period. There are around one billion websites available today, with roughly five new ones published every second.

What effect does this abundance of information have on us? With the rise of the internet, information sits right at our fingertips, regardless of its quality. Search algorithms, the automated categorisation of data by software, ultimately shape what information we consume. For example, search engines use your location to present results close to you. Or did you think that little bakery around the corner was world-renowned? In their original paper, written at Stanford University, two students described how to create a search engine that ranks a page by "an objective measure of its citation importance that corresponds well with people's subjective idea of importance". Again: "people's subjective idea of importance." PageRank. That is Google's initial white paper from 1997. The rank of a webpage is determined fundamentally by the backlinks from other webpages, put simply, by the attention a site is already getting on the internet. What seems like a catch-22 for a newborn website is nevertheless one of the fundamental algorithms (amongst others) that influence our perception of the world.
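To make the mechanism concrete, here is a minimal sketch of the idea behind PageRank in Python. The toy link graph, the damping factor and the site names are illustrative assumptions, not Google's actual data or production system; the point is only that a page's score is fed by the pages linking to it.

```python
# Minimal sketch of the idea behind PageRank (power iteration on a toy graph).
# The link graph, damping factor and site names are illustrative assumptions.

links = {
    "big-portal": ["news-site", "blog"],
    "news-site": ["big-portal"],
    "blog": ["big-portal", "news-site"],
    "new-bakery": ["big-portal"],  # links out, but nobody links back to it
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            targets = outgoing or pages  # dangling pages spread their rank evenly
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda x: -x[1]):
    print(f"{page}: {score:.3f}")
```

In such a model, the page nobody links to keeps only the baseline score no matter how good its content is, which is exactly the catch-22 for new websites described above.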

Algorithms are developed by humans and are therefore not flawless. They expect a certain (human) behaviour, but they cannot anticipate the full abundance of possibilities. Every piece of missing information is a rough little edge that grows exponentially once machines apply the algorithm to big data. Data collection gives developers insights with which to adapt their algorithms. One major issue with current search engines like Google is that they rely on "honest attribution": the ranking of a web document does not take into account the quality of the information it contains. With search engines, the ranking of information becomes a popularity contest.

Internet users generate the popularity of information through attention. Humans tend to believe information from a source they perceive as popular to a greater extent than information from one they do not. Nine out of ten people pick only from the first page of results that Google offers for a search term. Our own behaviour is a key issue because it ranks pages and news according to our (popular) liking. The internet user ultimately determines the truth. The online business with fake likes and fake comments has taken off in a big way. People can buy likes and comments on social networks to create the impression that they have an abundance of followers, which in turn generates more visits and interactions by real people. Political parties, companies and even governments are in that game. Real people, "trolls", are paid to spread misinformation, while bots, automated software agents, run social profiles and generate content that is indistinguishable from content produced by real humans. On the internet, followers are the asset, traffic is the turnover and real-world influence is the revenue.

Popularity generates popularity. With the help of the internet, this happens explosively. There is little time to think issues over before they appear settled by popular agreement. More than half of Australian internet users say they can trust "most of the news that [they] use most of the time".
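One way to see why popularity compounds like this is a toy "rich get richer" simulation (a preferential-attachment model). It is offered purely as an illustration; the story names and numbers are made up.

```python
# Toy "rich get richer" (preferential attachment) simulation: each new reader
# clicks a story with probability proportional to the clicks it already has.
# Story names and numbers are illustrative only.
import random

random.seed(1)
clicks = {"story-a": 1, "story-b": 1, "story-c": 1}  # everyone starts equal

for _ in range(10_000):
    r = random.uniform(0, sum(clicks.values()))
    cumulative = 0.0
    for story, count in clicks.items():
        cumulative += count
        if r <= cumulative:
            clicks[story] += 1
            break

print(clicks)  # typically one story runs away with most of the attention
```

Small, essentially random early leads snowball into a dominant share of attention, long before anyone has had time to weigh the content itself.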

The Omniscience Bias

The internet user's contribution to the echo chamber of social popularity is not just clicking the links on the first page of search results. Social media came to the rescue and gave people a digital voice. We can share information easily, and we do. Nearly a million pieces of content hit Facebook's servers every minute, and most of them are consumed by us. In 2015, Facebook overtook Google as the main source of information. Only 10% of content gets shared, but 80% of it gets liked; a like is visible to a user's network of friends, and so the content spreads anyway.

Internet users source and relay. On social media such as Twitter, people can tailor their information stream; on Facebook, users can curate their news feed. Ultimately, we filter information by relaying it or not. This affects the way people reflect on the world and, ultimately, their actions. But in what way?

There is an observable bias in people's attitudes that has developed with the use of the internet, called the "omniscience bias". Google itself offers insights on results page one, for instance from the British magazine New Statesman: "Omniscience bias: how the internet makes us think we already know everything". Psychologists have replicated this effect in different ways and published the results in the Journal of Experimental Psychology: we tend to be over-confident that we already have the information we need to form opinions and make judgements. The modern internet feeds this tendency by luring us into the belief that everything we need to know is a click away or about to arrive in a nearby feed. People flick through Google searches or social-media timelines using external knowledge and fall victim to the illusion that this knowledge is their own. We start to "mistake access to information for our own personal understanding of the information".

On social media, where information spreads most actively, the omniscience bias melts thoughts together by aligning them through popularity. This feedback loop becomes the famous echo chamber every internet user is sitting in. Since Facebook and Twitter have overtaken Google in popularity as news sources, this affects Google's (and Bing's) search algorithms as well. The blog Search Engine Land investigated whether socially generated content ("social signals") affects the ranking of information in search engines. Google plainly answered "yes", while Microsoft added, "We do look at the social authority of a user". Who does? Human beings at a corporate entity or machine-run algorithms? Neither option seems pleasant.
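Neither company discloses how such signals are weighted. Purely as a hypothetical sketch, a ranking function that blends a backlink score with social signals could look like the following; the weights, field names and URLs are invented for illustration and are not Google's or Bing's actual formula.

```python
# Hypothetical sketch only: blending "social signals" into a ranking score.
# Weights, fields and URLs are invented; this is not any search engine's formula.
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    link_score: float  # e.g. a PageRank-style backlink score
    shares: int        # social signal: shares on social networks
    likes: int         # social signal: likes/reactions

def ranking_score(page: Page, w_links: float = 0.7, w_social: float = 0.3) -> float:
    social = (2 * page.shares + page.likes) ** 0.5  # dampen raw popularity counts
    return w_links * page.link_score + w_social * social

pages = [
    Page("https://example.org/in-depth-analysis", link_score=5.0, shares=10, likes=40),
    Page("https://example.org/viral-listicle", link_score=1.0, shares=900, likes=5000),
]
for page in sorted(pages, key=ranking_score, reverse=True):
    print(f"{page.url}: {ranking_score(page):.2f}")
```

Once social signals carry any meaningful weight, sheer popularity can outrank substance.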

Combined with social signals such as shares and likes, Google and Bing display the ultimate selection of what people want to know about, not what they need to know about. And who is to determine what people need to know about?

The Uncertainty Effect

In the 1960s, a series of experiments suggested that people are biased towards confirming their existing beliefs when processing information. The effect has been labelled "confirmation bias". Warren Buffett nails the issue: "What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact." The human brain simply cannot operate without a collection of experiences, and these experiences are the frame we think in, something the philosopher Immanuel Kant stated over two centuries ago. The internet enables us to dig out information that we ultimately incorporate as our own experience. But instead of critically reflecting on challenging ideas or facts, people tend to collect fragments of information that fortify their own beliefs rather than challenge them.

The magazine Scientific American published a research article that "examined a slippery way by which people get away from facts that contradict their beliefs". The issue of same-sex marriage was presented to people "who supported or opposed same-sex marriage with (supposed) scientific facts that supported or disputed their position. When the facts opposed their views, [their] participants — on both sides of the issue — were more likely to state that same-sex marriage isn’t actually about facts, it’s more a question of moral opinion. But, when the facts were on their side, they more often stated that their opinions were fact-based and much less about morals."

A challenge forces the brain, figuratively speaking, to leave its comfort zone and enter a terrain of uncertainty. It exposes a lack of information supporting one's view on an issue and leaves gaps in the brain's structure of knowledge. This "uncertainty effect" was examined scientifically in a 2006 study at MIT and led to disturbing conclusions: the brain replaces missing information with an inexplicable fright, an "irrational by-product of not knowing — that keeps us from focusing on the possibility of future rewards". Put plainly: people want to know what they are already inclined to believe.

With the internet enabling people to submerge those islands of fear quickly, it might even have a detrimental effect on our sanity. The brain forms hypotheses about how the world should be and matches them against its sensory input. If the hypotheses are confirmed, cognition happens after very short processing times. If they do not match, the brain needs to correct its hypotheses and processing times are prolonged. In most cases, according to Singer, the act of cognition is confined to the affirmation of hypotheses that have already been formulated.

Today we have developed a dependency on technology that seems to have eclipsed our reliance on logic, critical thinking and common sense by sparing us the burden of critical reflection. In other words: the internet helps superstition spread.

People indulge in headlines and internet facts, weaving them into meaningful patterns to prove their point of view. This is not the discourse of an "extended mind" leading to knowledge, as Andy Clark and David J. Chalmers of Washington University elaborated; it is merely building two versions of a house of cards out of popular information fragments that ultimately all look the same. Public opinion culminates in a dualism of contrary views, as the nuances of differing positions are overridden by a more general idea. As long as parts of a concept fit into a person's structure of thinking, it will be adopted, leading to a polarisation of attitudes.

The European refugee "crisis" of 2015/2016 made the internet's echo-chamber problem visible. A so-called attitude polarisation spread quickly, with labels hurled from either side that were meant as insults: "Gutmensch" (good human) vs. "Nazi". Certainly there are people who share some views with either extreme, which does not necessarily make them part of those groups. Yet any discourse and critical discussion seems to be overruled by popular selection. In this case, the internet became a tool, an intellectual gap-filler and a solidifier of already-inclined mindsets. And every human being is in imminent danger of aligning with a major stream.

As long as the uncertainty effect is an observable fact of our brain function, there will be bias. The internet collects and reflects our human attitudes, which have the power to quickly and violently create polarised views and their manifestations in the real world. For what it's worth, the net will open up deeper insights into social issues.

Since internet facts do not always bear the truth, or are at the very least biased by selection, incorporating misinformation into one's own world view has led to an era some people describe as "post-factual". The internet itself is not the reason or the source of this narrowing of mindsets, but an (overrated) amplifier and catalyst. Finding valid information, i.e. the truth, is a personal challenge and responsibility. Spreading information has never been so simple, while consuming true facts has never been so hard. While humans behave according to their nature, the internet is learning what that nature looks like. The web consists of the people who feed it information, and big data is capable of distilling a society's reality down to its essence. This offers a chance for social evolution, if we learn from it. For a start, the internet is beginning to make fundamental behavioural patterns of our brains visible. From a social perspective, it exposes the real-world divide between critically thinking people and plain, frightened followers.

