Ricky Gervais appeared on The Late Show a few months ago. Gervais, an actor-comedian and outspoken advocate for secular humanism, had a frank conversation about religion with host Stephen Colbert, a practicing Roman Catholic. I watched, fascinated, as the clip circulated on Facebook and my friends weighed in. My Christian friends thought Colbert had stumped Gervais, revealing the shallowness of his worldview. My atheist friends thought Gervais represented their perspective well, to the extent that Colbert let him get a word in edgewise. And I just scratched my head and wondered if everyone had really watched the same footage.
I might as well be asking why the fans see a strike when the umpire calls a ball. The pitch was either in the strike zone or it wasn’t, and yet different people watching the same event, sometimes even the same instant replay angle, can arrive at different answers. We know some issues are subjective; when two people watch the same movie and have different reactions, we chalk that up to personal taste. But how do we account for disagreements on apparently objective matters? Beyond balls and strikes or talk show victories, how do Gervais and Colbert read the same book and arrive at such different conclusions? These aren’t new questions either; why did even the earliest followers of Jesus have different interpretations of his teaching?
One answer is bias. Social scientists have cataloged numerous cognitive biases—interpretive slants belonging to a message’s recipient rather than its author. These biases at least partly explain how people can arrive at different interpretations of the exact same data on matters of objective truth, such as whether a given pitch was inside the strike zone. Furthermore, these biases often operate at an unconscious level. We may not intend to prefer information that confirms our beliefs, and we are likely unaware of all the ways we act on that preference, but we are all susceptible to confirmation bias nevertheless. Thus, we may blithely curate bubbles of information that fail to challenge our beliefs, and may even fall prey to stories that seem plausible because they are consistent with how we view the world despite having no verifiable connection to reality.
Our era resounds with echo chambers, and fake news tops social media feeds. We no longer all read the same paper of record or watch the same handful of primetime news broadcasts. We can readily access a variety of news sources, and the ones we choose say a lot about our politics, including, as the Pew Research Center found, who we voted for in the last presidential election. Fragmented audiences make targeted messaging more viable, which may explain why Pew discovered that nearly three in four Americans believe news organizations favor one party over others in their political and social reporting.
If we want to recommend the Christian faith to skeptics or be influential in the world, it’d serve us well to understand why many of our neighbors see things so differently. We may also want to turn that lens inward to search for our own blind spots. Biases require us to work harder to find the truth.
Cognitive bias may explain why I saw seemingly conflicting interpretations of the Gervais/Colbert conversation. Both atheists and Christians confirmed their prior expectations and favored the beliefs of whichever group each person already identified with. In fact, sharing and commenting on the video within a social network may be a way of reinforcing those group beliefs and even signaling one’s own adherence to them. Our groups—from political parties and religious communities to sports and entertainment fandoms—form part of our identity. We prefer that our beliefs align with those groups and their members so that we can maintain those connections and our sense of self. This desire for what psychologists call belief consonance can unconsciously influence how we process information, leading us to prefer interpretations consistent with those corporate beliefs. It can also influence our choice of groups, so that we prefer groups where our beliefs will not be continuously challenged. For example, political scientist John Alford and his colleagues found that spouses share political attitudes more frequently than any other trait; Stanford University psychology professor Geoffrey Cohen observed that students were more likely to base their support of a policy on the position they believed their party held; and political scientists Shanto Iyengar and Sean Westwood reported that study participants assessed fellow party members more positively than nonmembers.