A couple of months ago, I watched a documentary on Netflix called “The Social Dilemma.” If you haven’t had a chance to see it yet, let me first say that I highly recommend this documentary. In fact, it may be the most important thing you watch all year. The documentary touches on a number of incredibly important and interesting topics, but for me, one in particular stood out.
If you’re like me, you’ve probably been horrified by the level of polarization in our nation. Particularly in the months leading up to the 2020 Election, it was truly shocking how divided our country seemed to be. It wasn’t just like we weren’t on the same page as a country… we weren’t even in the same library. I know more than one person (myself included) who described having a conversation with their political counterparts as “like speaking a different language.” And things don’t seem to be getting any better.
This leads us to ask some obvious follow-up questions: How did we get SO divided? How did it get THIS BAD?
I’m confident there are a myriad of correct answers to these questions (some of which I will address in future posts). But after watching The Social Dilemma, I’m convinced that one of the largest contributing factors is the rise of social media, search engines, and the algorithms that drive our interactions with them. As the documentary explains, these platforms were designed for one primary purpose: to keep us tethered to our devices. They do this by collecting massive amounts of data about each of us, and then use that data to push content that keeps us scrolling… and scrolling… and scrolling.
These platforms are incredibly addictive by design. Their primary purpose is not to be helpful, but to be profitable. The more hours we spend on Facebook, Instagram, and YouTube (among others), the more advertising these companies can sell. So the algorithms were designed to keep us online as long and as often as possible, and they do this by intentionally exploiting certain aspects of our psychology and brain chemistry.
One example of this can be understood through the concept of confirmation bias. If you’re unfamiliar with the term, confirmation bias is defined as:
"the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values."
Studies have shown that humans generally prefer to hold on to their existing beliefs rather than be confronted with opposing beliefs or information. (Basically, we all really like being right.) And the algorithms use this basic human reality to their advantage. By primarily suggesting content we already agree with, they make us more likely to engage with that content and stay connected to (and profitable for) the platform. Conversely, we are more likely to disengage from content that conflicts with our existing beliefs, so the algorithms often avoid exposing us to opposing points of view.
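To make the mechanism concrete, here is a minimal, hypothetical sketch of an engagement-maximizing recommender. Everything in it is illustrative: the idea that engagement tracks agreement, the -1 to 1 "leaning" scale, and all the item names are my own assumptions for the sake of the example, not any real platform's code.

```python
# Hypothetical sketch: a naive engagement-maximizing recommender.
# All names, scales, and numbers are illustrative assumptions.

def predicted_engagement(user_leaning, item_leaning):
    """Assume engagement peaks when an item's leaning matches the
    user's (confirmation bias). Leanings run from -1 to 1."""
    return 1.0 - abs(user_leaning - item_leaning) / 2.0

def recommend(user_leaning, items, k=3):
    """Rank items purely by predicted engagement; return the top k."""
    return sorted(
        items,
        key=lambda item: predicted_engagement(user_leaning, item["leaning"]),
        reverse=True,
    )[:k]

# A user leaning one way (-0.6) gets a feed of mostly agreeable content;
# the strongly opposing item never makes the cut.
items = [
    {"title": "strongly left",  "leaning": -0.9},
    {"title": "center",         "leaning":  0.0},
    {"title": "strongly right", "leaning":  0.9},
    {"title": "mildly left",    "leaning": -0.4},
]
feed = recommend(-0.6, items)
```

Even in this toy version, the feedback loop is visible: optimizing a single engagement score, with no term for diversity of viewpoints, systematically filters out opposing content.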
The inevitable result is that our social media and search engines become giant echo chambers, reinforcing and intensifying our pre-existing beliefs. We are rarely exposed to opposing points of view, or if we are, the content is often filtered through biased sources that interpret things for us in favorable ways.
For me, this helps explain so much of our present reality. Haven’t you talked with someone recently, or seen a post from a friend or family member, and felt like they must be living in a different reality? Well, in many ways, they are. We all are. We are all experiencing the reality that we (along with those all-too-helpful algorithms) have carefully and methodically curated for ourselves over the past decade. This also helps explain how people can become radicalized to particular causes or conspiracy theories, and the same dynamic has played out in countries outside the United States (particularly other democracies).
So where do we go from here?
It’s a difficult question, and one we all should be wrestling with, both as individuals and as societies. Here are some initial steps that I’m planning to take moving forward:
Go out of my way to expose myself to diverse points of view and sources of information. F**k with the algorithms a bit. Don’t become predictable.
Take breaks from social media, long ones if necessary.
TALK to people. Like... in person (or at least on Zoom). It’s much more difficult to dehumanize and degrade a person face to face, and far too easy to do it online.
If you have thoughts on this topic, I would love to hear from you in the comments below. Let’s go out of our way to embrace the person across the aisle, and be very wary of the enemy lurking in our technology.
Watch the documentary:
Center for Humane Technology:
Tools for creating constructive dialogue: