Wednesday, July 22, 2015

Why it is useless to try and change the minds of those with extreme opinions

http://www.livemint.com/Consumer/dJyyhAwTsg4PXLSViUg6cJ/Using-Bayes-Theorem-to-understand-extreme-opinions.html

Using Bayes' Theorem to understand extreme opinions
The basic principle of Bayes' Theorem is to take a set of 'prior beliefs' and see how they change in the face of given evidence
Karthik Shashidhar
Dilip D'Souza, writing in his column A Matter of Numbers, gave a great introduction to Bayes' Theorem, and used it to help us better analyse the Aarushi Talwar murder case. Analysing murder cases is but one use of this legendary theorem; it has applications in pretty much every aspect of human life. In this piece, we will use Bayes' Theorem to analyse why people continue to hold extreme opinions.
The last couple of years have seen a significant polarisation of the discourse on social media. It is not uncommon to see people hold on to rather unreasonable opinions even in the face of strong contrary evidence. This inability to change opinions leads to slanging matches and other unsavoury behaviour, which end up serving no purpose apart from annoying more reasonable people.
The basic principle of Bayes' Theorem is to take a set of "prior beliefs" and see how they change in the face of given evidence. As all such explanations in mainstream newspapers go, let us start with a toy example and then move on to a more serious one. And then we can see why extreme opinions are hard to shake off.
This is probably not the best time for a cricket example, but the Ashes are ongoing, so it's not a bad idea. Let's say a cricket match starts off and the batting team scores 6, 7 and 4 runs respectively in the first three overs. The problem is to determine whether it is a 'batting pitch' or a 'bowling pitch'. It is known that a batting pitch produces an average of six runs per over while a bowling pitch produces an average of four runs per over. Let's assume that in both cases the standard deviation is one and a half runs per over, and the distribution is normal. What can we tell about the pitch after the first three overs?
Firstly, if it were a batting pitch, the probability of 6, 7 and 4 runs in the first three overs is 0.61% (the probabilities of the individual scores are 26%, 21% and 11% respectively, and we multiply them to get 0.61%). If it were a bowling pitch, the probability of this scoring pattern is 0.11% (11%, 4% and 26% respectively). Given that one number is six times the other, can we conclusively say that it is a batting pitch?
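For readers who want to verify the arithmetic, here is a minimal Python sketch of that likelihood calculation (the function name normal_pdf and the code itself are mine, not the columnist's; like the article, it treats the normal density values at each score as 'probabilities'):

```python
import math

def normal_pdf(x, mean, sd=1.5):
    # Normal density with the article's standard deviation of 1.5 runs per over
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

overs = [6, 7, 4]  # runs scored in the first three overs

# Likelihood of this exact scoring pattern under each hypothesis
lik_batting = math.prod(normal_pdf(x, mean=6) for x in overs)
lik_bowling = math.prod(normal_pdf(x, mean=4) for x in overs)

print(f"batting pitch: {lik_batting:.2%}")  # ~0.6%, the article's 0.61%
print(f"bowling pitch: {lik_bowling:.2%}")  # ~0.1%, the article's 0.11%
```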
Not so fast, for Bayes' Theorem is one of 'conditional probabilities', and we need to know the 'prior beliefs' before we make our decision. Let's assume Ramu believed before the match that there was equal likelihood of batting and bowling pitches. Shamu, on the other hand, thought there was a 90% probability of it being a bowling pitch and only 10% probability of a batting pitch. How do they react to the given information?
Having observed the scoring pattern in the first three overs, Ramu 'updates' the probability of it being a batting pitch as 50% x 0.61% / (50% x 0.61% + 50% x 0.11%) = 85% (this is the all-important Bayes' formula for conditional probabilities). In other words, having started with a prior belief that batting and bowling pitches were equally likely, Ramu has 85% confidence after three overs that it is a batting pitch.
Shamu is not convinced. Given his prior beliefs, the probability of it being a batting pitch after observing three overs comes out to only 38% (10% x 0.61% / (10% x 0.61% + 90% x 0.11%)). In other words, having observed the same evidence as Ramu, Shamu believes it is more likely a bowling rather than a batting pitch. The difference occurs because of their respective 'prior beliefs'.
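Here is the same update as a short sketch (posterior_batting is my own name for Bayes' formula, and the likelihood values are the article's rounded figures):

```python
LIK_BATTING = 0.0061  # P(6, 7, 4 runs | batting pitch), from above
LIK_BOWLING = 0.0011  # P(6, 7, 4 runs | bowling pitch), from above

def posterior_batting(prior):
    # Bayes' formula: P(batting | scores) =
    #   P(batting) * P(scores | batting) / P(scores)
    numerator = prior * LIK_BATTING
    return numerator / (numerator + (1 - prior) * LIK_BOWLING)

print(f"Ramu  (prior 50%): {posterior_batting(0.50):.0%}")  # 85%
print(f"Shamu (prior 10%): {posterior_batting(0.10):.0%}")  # 38%
```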
Changing domains, let us assume that a government can be classified as either 'corrupt' or 'not corrupt'. A 'corrupt' government indulges in an average of six scams a year while a non-corrupt government indulges in an average of four scams a year (counting scams is a different problem in itself and we will save that for another day; let's assume we can count easily here). In both cases, let us assume a standard deviation of one and a half scams and a normal distribution (note how cunningly I've chosen the numbers to be identical to the cricket problem).
Let us assume that we observe six, seven and four scams respectively in the first three years of government. What do you think of the government's corruption after this? As you might expect, it depends upon your prior belief!
If Ramu thought that the government had an equal chance of being 'corrupt' and 'non-corrupt' when it came to power, based on the evidence of the first three years, he is now '85% confident' that the government is corrupt. If Shamu had voted in the government and believed there was a 90% chance that it was not corrupt when it came to power, he now gives it only a 38% chance of being corrupt.
I might be repeating myself, but your prior beliefs affect how you perceive evidence. This might give you an indication of why people don't change opinion even in the face of overwhelming evidence. Let us introduce two new characters, Bheemu and Somu. When the government came to power, Bheemu thought there was only a 1% chance that it would be corrupt. Somu went further. Having campaigned for the ruling party, he was convinced that there was a 0% chance it would be corrupt (given our definition of corruption above).
How do Bheemu and Somu change their minds after three years of 6, 7 and 4 scams respectively? I'll spare you the calculations, but Bheemu updates his probability of the government being corrupt to 5%. Notice that he hasn't moved very far from his prior belief of 1%. Perhaps a few more scam-ridden years might convince him to change his mind, though it will still take time and strong evidence. He is 'moving', albeit slowly.
What of Somu? Given his prior beliefs, he updates the probability of the government being corrupt after three years to (wait for it) zero! No matter how strong the evidence against the government gets, his beliefs will remain absolutely unchanged! Somu is some kind of 'true believer' in the non-corruption of the government.
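A sketch along the same lines makes Somu's predicament concrete: with a prior of exactly zero, the numerator of Bayes' formula is zero no matter what the evidence says, so the posterior is stuck at zero forever (again, the function name and the rounded likelihoods are my own):

```python
LIK_CORRUPT = 0.0061  # P(6, 7, 4 scams | corrupt); same numbers as the cricket case
LIK_CLEAN = 0.0011    # P(6, 7, 4 scams | not corrupt)

def posterior_corrupt(prior):
    numerator = prior * LIK_CORRUPT
    return numerator / (numerator + (1 - prior) * LIK_CLEAN)

for name, prior in [("Bheemu", 0.01), ("Somu", 0.00)]:
    print(f"{name}: prior {prior:.0%} -> posterior {posterior_corrupt(prior):.1%}")
# Bheemu: prior 1% -> posterior 5.3% (the article's ~5%)
# Somu:   prior 0% -> posterior 0.0%, unchanged by any evidence
```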
This, perhaps, helps explain people with unwavering opinions. If your prior belief is extreme, there is no evidence that can make you change your mind. Even if your prior belief is tilted heavily towards one side, it takes overwhelmingly strong evidence for you to move your position significantly. And that is what seems to be happening in the discourse on social media.
This dependence on prior probabilities is what makes Bayes' Theorem controversial, and it meant a wait of a couple of centuries before the theorem was accepted. Even now, debates between 'frequentists' and 'Bayesians' abound, and they are unlikely to go anywhere for the reason we saw above: extremely strong priors.
When people start off with beliefs so strong that no evidence is enough for them to change their opinions, no amount of debating is going to have an impact. When two people with strong prior beliefs at opposite ends of the spectrum meet, there is absolutely no way either of them can influence the other. Now think about those prime time discussions on news channels.
Postscript: Readers who are interested to know more about this wonderful theorem are encouraged to read Sharon Bertsch McGrayne's The Theory That Would Not Die, a wonderfully written history of Bayes' Theorem.
