This is part three of a four-part series. Here are links to parts one, two, and four.
Humans have certain tendencies that probably serve us in an evolutionary sense, but which are amplified in troubling ways in our online selves. There it becomes a vicious cycle, as the purveyors of news and social media optimize their offerings to exploit some of these quirks of our nature. In the first and second installments of this series that I’ve cornered myself into writing, I posited that we find some genuine gratification in the status quo (of online discourse and news media), so in this third installment, let’s dive into a laundry list of human tendencies and behaviors implicated here.
The Fear and Anger Click Machine
The media we consume and the social media spaces we inhabit overwhelmingly rely on ad revenue to survive, and ad revenue relies on our attention in the form of impressions (i.e. instances when an ad is shown to us) and clicks. These businesses have an existential imperative to attract and hold our attention, to keep us in “the room.” And they don’t just demand our attention, they harvest it, measure it, and sell it in various ways. In short, our attention is a digital commodity, the fuel that powers their machine.
Given that these platforms need our attention, they have built sophisticated algorithms to capture it, and it turns out that what draws our attention best of all is content that triggers fear or anger in us. Anger travels faster and farther than any other emotion on social media. We are drawn to words that connote moral outrage. We are activated by outgroup animosity. And so on.
We are highly tuned to fear and anger because they have helped us survive as a species. But they’re the bane of our online existence.
Confirmation Bias and Avoiding Cognitive Dissonance
We like ideas that align with our beliefs, and we don’t like ideas that challenge them – as in, the first delivers pleasure and the second causes psychological pain. It takes energy and often conscious effort to hold two conflicting ideas, and we are not inclined to expend energy if we don’t have to. Confirmation bias is one way we avoid such cognitive dissonance and the pain and effort that come with it.
Not only does it feel better, but our brains are better at retaining information when it conforms with our beliefs, and we have natural defenses against antithetical ideas. This is especially true when we have an emotional attachment to a particular belief, but the same tendencies are present even when we don’t.
And we want to be right. We engage in both challenge-avoidance and reinforcement-seeking behaviors to protect ourselves from the possibility that we’re wrong, and to bolster our comfort that we are right.
Confirmation bias propels a lot of what we do online, and much of what we consume, and it’s also a great driver of our prejudices.
Binary Thinking and Certainty
Another way people avoid the pain and effort associated with cognitive dissonance is through binary thinking. Many times when I express an opinion to someone about some person or thing, I’ll get a frustrating response along the lines of, “well that’s because you love/hate [other thing],” even though I didn’t mention [other thing] at all.
Some people seem unable to process the idea that a critique of A does not automatically mean an endorsement of B, or vice versa: support for A is not an implicit rejection of B. Some people seem unable to think in terms of degrees or a spectrum, where A and B exist independently, and where it’s possible to like and dislike aspects of each, or to like them both but to different degrees, or even to have an opinion about A but no opinion at all about B. Binary thinkers not only can’t process reality in terms of degrees, they sometimes seem to equate nuanced thinking with moral weakness, as an inability or unwillingness to judge right versus wrong.
Binary thinking is another safety mechanism we employ to protect our psyches, but it is perfectly suited to spaces where discourse is constrained to 280 characters, punchy headlines, and 30-second video clips.
Ideologies and Thought-Terminating Clichés
So many online exchanges devolve into clichés, down to specific phrases and labels we’ve seen a million times. We fall back on ideologies when we’re too lazy or too short on time to engage our critical thinking capacities. We recite established answers to questions instead of actually examining evidence or stepping through logical arguments.
Does the prevalence of assault rifles contribute to the number of mass shooting deaths in the US? Is man-made climate change real? Is gender nonbinary? How does the US healthcare system compare to systems in European countries?
These are all empirical questions, and many people are satisfied to respond to them with ready-made answers (or memes!) they’ve never really thought about. We attach ourselves to certain ideologies that we invoke reflexively instead of approaching such questions empirically.
Freed from ideology, we can happily answer “I don’t know. Let’s look into it.” But that doesn’t get the clicks and likes.
Fallacious Heuristics and Proxy Wars
In an episode of This American Life called A Little Bit of Knowledge, the prelude features a group of friends who keep each other in check when it comes to speaking without authority. When one of them starts to pontificate about the effects of gluten in the body, another interrupts with “Oh yeah! I think I read about that recently in Modern Jackass!” It’s their friend-group code for “shut up, you have no idea what you’re talking about.”
We constantly lean on experts and heuristics, like the overwhelming consensus of scientists. We have to use these kinds of heuristics, because we can’t be subject matter experts in everything that impacts or interests us.
But then sometimes we end up in proxy wars with someone where we point to our champion (heuristic), and the other person says “corrupt!” And then they point to their champion, to which we say “debunked!” Some heuristics are better than others, and some are downright fallacious, but it’s pretty much impossible to resolve those questions through a proxy war.
Just like we do with ideologies, we can too easily lean on heuristics to avoid the hard work of thinking. In any case, it’s important to be aware of the areas where we have offloaded our thinking to someone else.
Dunning-Kruger and Arrogance vs. Humility
I don’t know if I need to explain what the Dunning-Kruger effect is, but briefly it’s:
when a person’s lack of knowledge and skills in a certain area cause them to overestimate their own competence. By contrast, this effect also causes those who excel in a given area to … underestimate their relative abilities as well. – via The Decision Lab
People who overestimate their own competence are louder and more prolific participants in discourse. Mansplaining is one way this manifests. Conversely, bona fide subject matter experts tend to be quieter and more prone to equivocation and nuance.
Online, this means that confident ignoramuses tend to drown out people who are truly knowledgeable about any given topic. Not a good recipe.
Straw Men and Arguing from Anecdote
I mention these two logical fallacies only because I find them particularly maddening and all too common online. Wild fringe anecdotes travel fast and stick better than run-of-the-mill reality, though they’re not representative of anything or useful for anything beyond scoring cheap points and feeding the fear and anger click machine.
As for straw men, did you know that conservatives wanted to cause maximum COVID deaths? And leftists wanted mask mandates forever? Endless battles over positions that no real person actually holds.
It’s exhausting.
Gratuitous Cruelty
The last thing I want to mention is the tendency for people to be cruel online, to lash out from the safety of their bedrooms and basements. It’s this stuff more than anything else that lays me low and despairing about the state of mankind – people punching down, mobbing the vulnerable, cheering violence.
I try to remember that it’s probably a tiny percentage of trolls who are able to make an outsized impact. I sometimes think about a study the city of Richmond, CA conducted years ago when it was battling an unprecedented wave of violent crime. Using network analysis techniques and data mining, they determined that just a few dozen people seemed to be at the center of a huge percentage of the crime. They focused some controversial new interventions on just those people and saw a dramatic drop in violent crime.
The other thing I try to remember is the incidents when people who posted truly appalling things online were confronted with the targets of their vitriol, face-to-face. Almost invariably, they apologized. They said, “this is not who I am.”
That’s a pretty good bridge to the last installment I plan to write.
Wrapping this one up…
Much has been written on all of these things, and so my purpose here is not to explain them generally but to highlight how these tendencies are amplified in our isolated but extremely online, media-consuming selves.
The platforms are engineered in ways that catalyze our bad behaviors and then exploit them to power the fear and anger click machine. And worse, our bad behaviors metastasize and infect our brains in ways that persist outside the machine.
Changing the trajectory of all this requires a very simple shift, and that’s what my fourth and final installment will focus on.