The Social Dilemma
27 Oct 2020
I just finished watching The Social Dilemma, and here’s my hot take: it’s an emotive, accessible introduction to problems that, without exaggeration, pose an existential threat to life as we know it. If you can, watch it.
Having read Technopoly by Neil Postman, I didn’t find The Social Dilemma revelatory in pointing out that technologies have unintended side effects. What sets it apart is how accessible it is and how it highlights real, present, concrete problems caused by Facebook, Twitter, Instagram, and the like. (I think if Neil Postman had been alive to witness Facebook and Twitter, he would have been scared silly.)
Social media is perhaps unique in that no technology so complex has ever demanded so little effort or expertise from the end user. Prior to the invention of the personal computer, the most complex piece of technology a lay person used was a car. Even though a car is relatively easy to drive, it still requires some knowledge of its inner workings: knowing when to change the oil or coolant, or when to get the tires replaced or the brakes checked, is part of operating it successfully.
Not so with social media. All you need to know is how to type your name and scroll. The complexity behind Facebook’s implementation is orders of magnitude greater than that of any car. Yet Facebook is so easy to use that even the most technologically inept person can figure it out. Therein lies some of the danger: Facebook, Google, Instagram, etc. all employ technologies¹ to influence your behavior, and end users are entirely oblivious to their workings and effects.
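To make that concrete, here is a deliberately oversimplified, purely hypothetical sketch of an engagement-driven feed ranker. Every name and number in it (predicted_click_prob, outrage_score, the scoring weights) is invented for illustration; no platform’s real system is anywhere near this simple, but the basic objective, rank whatever is predicted to keep you engaged, is the kind of mechanism I mean.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    author_is_friend: bool
    predicted_click_prob: float     # hypothetical model output: will the user tap this?
    predicted_dwell_seconds: float  # hypothetical model output: how long will they linger?
    outrage_score: float            # hypothetical signal: emotionally charged content spreads

def engagement_score(post: Post) -> float:
    # Invented weights: the point is only that the objective is attention,
    # not accuracy, well-being, or civility.
    return (
        2.0 * post.predicted_click_prob
        + 0.1 * post.predicted_dwell_seconds
        + 1.5 * post.outrage_score
        + (0.5 if post.author_is_friend else 0.0)
    )

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Show the user whatever is predicted to keep them scrolling longest.
    return sorted(candidates, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("calm-news", True, 0.10, 20.0, 0.1),
        Post("outrage-bait", False, 0.45, 60.0, 0.9),
    ])
    print([p.post_id for p in feed])  # the outrage-bait post ranks first
```

Notice that nothing in that objective cares whether a post is true or good for you, only whether you will engage with it, and none of it is visible to the person doing the scrolling.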
So what do we do?
At the end of The Social Dilemma, many of the experts and creators of these technologies suggested that we need governmental regulation. I’m persuaded that our best bet lies in some kind of regulation, because market forces do nothing to incentivize these companies to fix the problems of polarization and misinformation. Exactly how that regulation should be enacted is an open question. Perhaps social media should be regulated like alcohol or tobacco: for example, minimum-age limits on when someone can start using it, and prominent warnings about its psychological effects. Removing or severely curbing political advertisements, especially when paid for by parties other than the candidate in question, might help as well.
As for myself, I’ve gotten off of Facebook entirely. I’ve taken conscious steps to make it more difficult for me to get on Twitter. When I sleep, my phone is in a different room. Whenever I notice myself reaching for an app when I have nothing to do, I turn it off, delete it, or otherwise force myself to be more deliberate about my technology habits.
When I have kids, I intend to acquaint them with the workings of computers and algorithms. While I don’t expect (or want) all of my kids to become computer scientists, I want them to be aware of the mechanisms at play so they can be on guard against addictive and manipulative technologies like social media.
I am deeply worried about where things are going, and I do not think the answer to misinformation and polarization is more technology, AI, or some other technical fix. It cannot be. Technology can help if employed correctly, but it cannot be the cure. I think the only viable solutions will come from better regulation, to shift the incentives of these massive technology companies, and better education, so that people can be more mindful, more deliberate, and less vulnerable to addiction and deception.
-
¹ Note that “technology” means more than a physical device. Technology includes things like writing or a mathematical formula. See my post on Technopoly for a little more detail on this subject.