Before getting to the recommendations, some framing thoughts:
- It is impossible to tell how Facebook affected the election. There is too much causal density in who people vote for to point to one cause and say: that’s the thing that mattered.
- I think there’s a major danger in saying: “people voted for Trump because of fake news” instead of acknowledging that people voted for Trump because of a combination of cultural and policy beliefs.
- Liberals have plenty of fake news of their own. I’ve spent the past 5 years dealing with a constant stream of unscientific articles about New Orleans education reform in papers such as the New York Times.
- The research on how people form and solidify opinions is messy, and I think social media gives a chance to run a lot of new experiments rather than assume we know how opinion formation occurs.
- I think censorship is almost always the wrong answer, and the line between editorial scrubbing and censorship is very blurry when it comes to a communication platform like Facebook.
- Since the advent of advertising-driven media, every medium has had to figure out how to balance revenue and legitimacy. Modern fake news has been a problem since the creation of the penny newspaper.
- In considering Facebook, I think the first solution set should be to try and figure out how to harness the power of social media rather than curtail it.
- Ultimately, it’s not Facebook’s job to make us better; it’s our job.
Recommendation #1: An Opt-In Intellectual Diversity Function
Facebook should create an opt-in intellectual diversity function that harnesses its algorithm to populate a user’s feed with diverse intellectual opinions. This could be done throughout the newsfeed, or opposing viewpoints could be tagged to specific articles.
Over time, Facebook could track what types of opposing articles are clicked and viewed – and tweak its algorithm to place the most effective type of opposing arguments for each type of person or issue.
Recommendation #2: An Opt-In Share My Story Function
From what I understand, personal posts garner much more engagement than shared articles. And my hunch is that they are more effective at getting people to explore other opinions.
Facebook should create a Share My Story function that allows a user to give Facebook permission to share a personal story with strangers who have opted in to the intellectual diversity feature.
Facebook could then share powerful personal stories that provide different viewpoints to its users, as well as track what types of stories resonate most with people of different intellectual viewpoints.
Recommendation #3: An Intellectual Diversity Rating
Facebook should provide the opportunity for each user to see a feed diversity rating score that gives the user some (imperfect) estimation of the intellectual diversity of her feed.
Recommendation #4: Alternate View Feed Day
Facebook should have one day a year where users can opt in to receiving a typical daily feed of a user who holds opposite political views.
So for a whole day a user would see the news / articles / etc. that a user from the opposite end of the political spectrum would usually see.
Recommendation #5: Livestream Beer Summits
Occasionally, Facebook could livestream a summit of two people with different political viewpoints engaging in a discussion / visiting each other’s homes / going to a bar / etc.
It could provide a model for what we should all be doing more of: talking to each other.
I don’t know if the above recommendations would work or not. Perhaps they would backfire and create more belief anchoring and division.
My major point is that instead of trying to censor social media we should run a bunch of experiments to try and figure out how it might make us better.
Lastly, like most posts, this post was born out of an exchange with a friend: thanks to Mike Goldstein for inspiring this post by coming up with recommendations #1 and #5.