I recently read Enlightenment Now by Harvard cognitive psychologist Steven Pinker (formerly of MIT), which really is an extraordinary book. In chapter 21, Pinker explains that many of the most contentious political beliefs are grounded not in reason or in depth of knowledge about an issue, but in following the party line and belonging to the group. In politics, as in sports, tribal instincts override rationality. I found his argument very persuasive, and it helped me understand why Trump was elected and why his supporters can accept some of the nonsense and lies he keeps uttering.

Here is an example regarding climate change.

On page 357:

"Believers in human-made climate change scored no better on tests of climate science, or of science literacy in general, than deniers. Many believers think, for example, that global warming is caused by a hole in the ozone layer and that it can be mitigated by cleaning up toxic waste dumps. What predicts the denial of human-made climate change is not scientific illiteracy but political ideology."

"The principal reason people disagree about climate change science is not that it has been communicated to them in forms they cannot understand. Rather, it is that positions on climate change convey values—communal concern versus individual self-reliance; prudent self-abnegation versus the heroic pursuit of reward; humility versus ingenuity; harmony with nature versus mastery over it—that divide them along cultural lines. The values that divide people are also defined by which demons are blamed for society’s misfortunes: greedy corporations, out-of-touch elites, meddling bureaucrats, lying politicians, ignorant rednecks, or, all too often, ethnic minorities. Kahan notes that people’s tendency to treat their beliefs as oaths of allegiance rather than disinterested appraisals is, in one sense, rational. With the exception of a tiny number of movers, shakers, and deciders, a person’s opinions on climate change or evolution are astronomically unlikely to make a difference to the world at large. But they make an enormous difference to the respect the person commands in his or her social circle. To express the wrong opinion on a politicized issue can make one an oddball at best—someone who “doesn’t get it”—and a traitor at worst. The pressure to conform becomes all the greater as people live and work with others who are like them and as academic, business, or religious cliques brand themselves with left-wing or right-wing causes. For pundits and politicians with a reputation for championing their faction, coming out on the wrong side of an issue would be career suicide. Given these payoffs, endorsing a belief that hasn’t passed muster with science and fact-checking isn’t so irrational after all—at least, not by the criterion of the immediate effects on the believer. The effects on the society and planet are another matter. The atmosphere doesn’t care what people think about it, and if it in fact warms by 4° Celsius, billions of people will suffer, no matter how many of them had been esteemed in their peer groups for holding the locally fashionable opinion on climate change along the way."

On page 358:

"The perverse incentives behind “expressive rationality” or “identity-protective cognition” help explain the paradox of 21st-century irrationality. During the 2016 presidential campaign, many political observers were incredulous at opinions expressed by Trump supporters (and in many cases by Trump himself), such as that Hillary Clinton had multiple sclerosis and was concealing it with a body double, or that Barack Obama must have had a role in 9/11 because he was never in the Oval Office around that time (Obama, of course, was not the president in 2001). As Amanda Marcotte put it, “These folks clearly are competent enough to dress themselves, read the address of the rally and show up on time, and somehow they continue to believe stuff that’s so crazy and so false that it’s impossible to believe anyone that isn’t barking mad could believe it. What’s going on?” What’s going on is that these people are sharing blue lies. A white lie is told for the benefit of the hearer; a blue lie is told for the benefit of an in-group (originally, fellow police officers). While some of the conspiracy theorists may be genuinely misinformed, most express these beliefs for the purpose of performance rather than truth: they are trying to antagonize liberals and display solidarity with their blood brothers. The anthropologist John Tooby adds that preposterous beliefs are more effective signals of coalitional loyalty than reasonable ones."

Another interesting point is that, however rational people usually are, when it comes to political discussion they suddenly abandon objectivity and rationality in favour of partisan emotion.

On pp. 359-360:

"Psychologists have long known that the human brain is infected with motivated reasoning (directing an argument toward a favored conclusion, rather than following it where it leads), biased evaluation (finding fault with evidence that disconfirms a favored position and giving a pass to evidence that supports it), and a My-Side bias (self-explanatory). In a classic experiment from 1954, the psychologists Al Hastorf and Hadley Cantril quizzed Dartmouth and Princeton students about a film of a recent bone-crushing, penalty-filled football game between the two schools, and found that each set of students saw more infractions by the other team. We know today that political partisanship is like sports fandom: testosterone levels rise or fall on election night just as they do on Super Bowl Sunday. And so it should not be surprising that political partisans—which include most of us—always see more infractions by the other team. In another classic study, the psychologists Charles Lord, Lee Ross, and Mark Lepper presented proponents and opponents of the death penalty with a pair of studies, one suggesting that capital punishment deterred homicide (murder rates went down the year after states adopted it), the other that it failed to do so (murder rates were higher in states that had capital punishment than in neighboring states that didn’t). The studies were fake but realistic, and the experimenters flipped the outcomes for half the participants just in case any of them found comparisons across time more convincing than comparisons across space or vice versa.
The experimenters found that each group was momentarily swayed by the result they had just learned, but as soon as they had had a chance to read the details, they picked nits in whichever study was uncongenial to their starting position, saying things like “The evidence is meaningless without data about how the overall crime rate went up in those years,” or “There might be different circumstances between the two states even though they shared a border.” Thanks to this selective prosecution, the participants were more polarized after they had all been exposed to the same evidence than before: the antis were more anti, the pros more pro. Engagement with politics is like sports fandom in another way: people seek and consume news to enhance the fan experience, not to make their opinions more accurate. That explains another of Kahan’s findings: the better informed a person is about climate change, the more polarized his or her opinion. Indeed, people needn’t even have a prior opinion to be polarized by the facts."

"If these studies aren’t sobering enough, consider this one, described by one magazine as “The Most Depressing Discovery About the Brain, Ever.” Kahan recruited a thousand Americans from all walks of life, assessed their politics and numeracy with standard questionnaires, and asked them to look at some data to evaluate the effectiveness of a new treatment for an ailment. The respondents were told that they had to pay close attention to the numbers, because the treatment was not expected to work a hundred percent of the time and might even make things worse, while sometimes the ailment got better on its own, without any treatment. The numbers had been jiggered so that one answer popped out (the treatment worked, because a larger number of treated people showed an improvement) but the other answer was correct (the treatment didn’t work, because a smaller proportion of the treated people showed an improvement). The knee-jerk answer could be overridden by a smidgen of mental math, namely eyeballing the ratios. In one version, the respondents were told that the ailment was a rash and the treatment was a skin cream. Here are the numbers they were shown:

               Improved   Got Worse
Treatment         223         75
No Treatment      107         21

The data implied that the skin cream did more harm than good: the people who used it improved at a ratio of around three to one, while those not using it improved at a ratio of around five to one. (With half the respondents, the rows were flipped, implying that the skin cream did work.) The more innumerate respondents were seduced by the larger absolute number of treated people who got better (223 versus 107) and picked the wrong answer. The highly numerate respondents zoomed in on the difference between the two ratios (3:1 versus 5:1) and picked the right one. The numerate respondents, of course, were not biased for or against skin cream: whichever way the data went, they spotted the difference.
And contrary to liberal Democrats’ and conservative Republicans’ worst suspicions about each other’s intelligence, neither faction did substantially better than the other. But all this changed in a version of the experiment in which the treatment was switched from boring skin cream to incendiary gun control (a law banning citizens from carrying concealed handguns in public), and the outcome was switched from rashes to crime rates. Now the highly numerate respondents diverged from each other according to their politics. When the data suggested that the gun-control measure lowered crime, all the liberal numerates spotted it, and most of the conservative numerates missed it—they did a bit better than the conservative innumerates, but were still wrong more often than they were right. When the data showed that gun control increased crime, this time most of the conservative numerates spotted it, but the liberal numerates missed it; in fact, they did no better than the liberal innumerates. So we can’t blame human irrationality on our lizard brains: it was the sophisticated respondents who were most blinded by their politics. As two other magazines summarized the results: “Science Confirms: Politics Wrecks Your Ability to Do Math” and “How Politics Makes Us Stupid.”"
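The "smidgen of mental math" Kahan expected of his numerate respondents can be checked in a few lines. This is just a sketch of the arithmetic using the numbers quoted above; the function name is mine, not Kahan's or Pinker's:

```python
def improvement_ratio(improved, got_worse):
    """Ratio of people who improved to people who got worse."""
    return improved / got_worse

# The numbers from the skin-cream version of the experiment.
treatment = improvement_ratio(223, 75)      # about 2.97, roughly 3:1
no_treatment = improvement_ratio(107, 21)   # about 5.10, roughly 5:1

# The treated group improved at the *lower* ratio, so the cream did more
# harm than good, despite the larger absolute count of improvers (223 > 107).
print(treatment < no_treatment)  # True
```

The seductive wrong answer comes from comparing the absolute counts (223 versus 107); the correct one comes from comparing the two ratios, which point the other way.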