The news is full of stories about platforms taking down misinformation and banning key accounts. But these are band-aids on a deeper issue: the misinformation problem is one of our own design. Some of the core elements on which social media platforms are built inadvertently increase polarization and spread misinformation.
If we could go back and rebuild social media platforms such as Facebook, Twitter, and TikTok from the ground up, with the goal of reducing misinformation and conspiracy theories from the outset… what would they look like?
This is not just an academic exercise. Understanding these root causes can help us develop better preventive measures for current and future platforms.
As one of the Valley’s leading behavioral science firms, we have helped brands such as Google and Lyft understand human decision-making as it relates to product design. We recently collaborated with TikTok to design a new series of prompts (launched this week) to help stop the spread of potential misinformation on its platform.
The intervention successfully reduced shares of flagged content by up to 24%. While the work was specific to TikTok, the lessons we learned have helped shape our ideas about what a social media redux might look like.
Create an opt-out
We can take much bigger swings at reducing the spread of flagged content than a label or a warning signal.
In the experiment we ran with TikTok, people watched an average of 1.5 flagged videos over a period of two weeks. Yet in our qualitative research, many users stated that they were on TikTok for fun; they did not want to see any flagged videos at all. In a recent earnings call, Mark Zuckerberg likewise spoke of Facebook users’ fatigue with hyperpartisan content.
We suggest giving people the option to “opt out of flagged content,” removing this content from their feed entirely. To make it a real choice, this opt-out needs to be prominent, not buried somewhere users have to hunt for it. We suggest putting it directly into the sign-up flow for new users and adding an in-app prompt for existing users.
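To make the idea concrete, here is a minimal sketch of how a feed could honor such an opt-out. Everything here is illustrative: the `Video`, `UserPrefs`, and `build_feed` names are hypothetical, and real platforms would apply this inside a much larger ranking pipeline.

```python
from dataclasses import dataclass


@dataclass
class Video:
    id: str
    flagged: bool  # marked by the platform's misinformation review process


@dataclass
class UserPrefs:
    # The proposed opt-out, surfaced prominently in the sign-up flow
    hide_flagged: bool = False


def build_feed(candidates: list, prefs: UserPrefs) -> list:
    """For users who opted out, drop flagged videos entirely
    rather than merely attaching a label to them."""
    if prefs.hide_flagged:
        return [v for v in candidates if not v.flagged]
    return candidates
```

The key design choice is that opting out removes the content, rather than wrapping it in a warning the user still has to scroll past.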
Shift the business model
False news spreads six times faster than true news on social media: information that is controversial, dramatic, or polarizing is more likely to grab our attention. And because algorithms are designed to maximize engagement and time spent in an app, this kind of content is favored over more thoughtful, deliberate content.
The core problem is the advertising-based business model; it is what makes progress on misinformation and polarization so difficult. An internal Facebook team that looked into the issue found that “our algorithms exploit the human brain’s attraction to divisiveness.” But the proposed work to address the issue was nixed by senior executives.
Essentially, this is a classic incentive problem. If the business metrics that define “success” no longer depend on maximizing engagement and time on site, everything changes. Polarizing material would no longer need to be favored, and more thoughtful discourse could rise to the surface.
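The incentive shift can be sketched as a change in the ranking objective. This is purely illustrative: the `rank_score` function, its inputs, and all weights are hypothetical, standing in for whatever signals a platform actually measures.

```python
def rank_score(engagement: float, polarization: float, quality: float,
               w_eng: float = 0.3, w_pol: float = 0.4, w_qual: float = 0.7) -> float:
    """Hypothetical feed-ranking objective. Instead of scoring purely on
    predicted engagement, it down-weights engagement, penalizes polarizing
    content, and rewards signals of thoughtful discourse. All weights are
    illustrative, not tuned values from any real platform."""
    return w_eng * engagement - w_pol * polarization + w_qual * quality
```

Under a pure-engagement objective the most provocative post always wins; under a blended objective like this one, a less viral but more substantive post can outrank it.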
A key driver of misinformation’s spread is feeling marginalized and alone. Humans are fundamentally social beings who seek to be part of a group, and partisan groups often provide a sense of acceptance and recognition.
So we should make it easier for people to find their authentic tribes and communities in other ways (versus ones bound together by conspiracy theories).
Mark Zuckerberg says his ultimate goal with Facebook is to connect people. To be fair, Facebook has done so in many ways, at least at a surface level. But we must go deeper. Here are some ways:
We can design for more active one-on-one communication, which has been shown to increase wellbeing. We can also design for offline connection. Imagine two friends interacting on Facebook Messenger or through comments on a post. If they live in the same city, what about a prompt to meet up in person (after COVID, of course)? Or a nudge to hop on a call or video chat if they are not in the same city?
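A nudge like this could be triggered by simple heuristics. The sketch below is hypothetical in every detail: the `should_nudge` function, the interaction threshold, and the nudge labels are invented for illustration.

```python
def should_nudge(city_a: str, city_b: str, recent_interactions: int,
                 threshold: int = 5):
    """Suggest an offline (or richer) connection for two users who
    interact frequently. Returns a nudge type, or None if the pair
    does not interact enough to warrant a prompt. The threshold of
    5 recent interactions is an arbitrary illustrative value."""
    if recent_interactions < threshold:
        return None
    if city_a == city_b:
        return "meet_in_person"   # same city: prompt an in-person meetup
    return "suggest_video_call"   # different cities: nudge toward video/call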
In a scenario where two people are not friends and the conversation is more contentious, the platform could play a part in exposing the other person’s humanity, including what the two of them share in common. Imagine a prompt showing that, even as you are “shouting” at someone online, you have things in common with that person.
Platforms should also reject anonymous accounts, or at least encourage the use of real names. Clubhouse does good norm-setting here: its onboarding flow says, “We use real names here.” Connection is built on the idea that we are interacting with a real human. Anonymity undermines that.
Finally, help people reset
We should make it easier for people to climb out of an algorithmic rabbit hole. YouTube has come under fire for its rabbit holes, but this is a challenge across all social media platforms. Once you click a video, you are shown similar videos. Sometimes this helps (one “how-to” video often calls for another), but for misinformation it is a death march: one flat-Earth video leads to another, along with other conspiracy theories. We need to help people escape their algorithmic fate.
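One way to loosen the rabbit hole is to stop letting the last-watched topic dominate recommendations. Here is a minimal sketch of that idea; the `recommend` function, its `explore_frac` parameter, and the topic catalog structure are all hypothetical simplifications of a real recommender.

```python
import random


def recommend(watch_history: list, catalog: dict, explore_frac: float = 0.3,
              k: int = 10, rng=None):
    """Mix similar-topic picks with off-topic 'explore' picks so that one
    conspiracy video does not lock the feed onto that topic.

    catalog maps topic -> list of video ids; explore_frac (illustrative)
    is the share of the feed reserved for other topics."""
    rng = rng or random.Random(0)  # seeded for a reproducible sketch
    last_topic = watch_history[-1] if watch_history else None
    similar = catalog.get(last_topic, [])
    other = [v for t, vids in catalog.items() if t != last_topic for v in vids]
    n_explore = int(k * explore_frac)
    picks = similar[: k - n_explore] + rng.sample(other, min(n_explore, len(other)))
    return picks[:k]
```

The same mechanism supports an explicit “reset”: setting `explore_frac` to 1.0 (or clearing the history signal) would rebuild the feed from scratch, giving users a one-tap exit from a spiral.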
With great power comes great responsibility
More and more people now get their news from social media, and those who do are less likely to be correctly informed about important issues. This trend of relying on social media as an information source is likely to continue.
Social media companies are thus in a unique position of power and have a responsibility to think deeply about their role in reducing the spread of misinformation. They should continue to run experiments and trial research-informed solutions, as we did with the TikTok team.
This task is not easy. We knew that going in, but after working with the TikTok team we have an even deeper appreciation for this fact. There are many smart, well-intentioned people who want to solve for the greater good. We are deeply hopeful about our collective opportunity to think more creatively about reducing misinformation, fostering connection, and strengthening our collective humanity all at the same time.