Let’s Stop Pretending Facebook and Twitter’s CEOs Can’t Fix This Mess
It’s not often you hear some of the richest, most powerful men in the world described as naive, but it’s becoming pretty commonplace. Mark Zuckerberg, after a dark period for Facebook, has been called naive more than once. Jack Dorsey, meanwhile, has admitted having to rethink fundamental aspects of Twitter.
But I struggle to believe that these brilliant product CEOs, who have created social media services used by millions of people worldwide, are actually naive. It’s far more likely that they simply don’t care: not about their users, and not about how their platforms harm many of them. Because they don’t care, they never bother to understand the interactions and amplification that result.
They’ve been trained not to care.
The core problem is that these CEOs are making perfectly logical decisions every step of the way. Capitalism, which drives the markets, investors, venture capitalists, and board members, demands a certain approach to growth and expansion, one that values particular metrics. So social media companies and the leaders who run them are rewarded for reach and engagement, not for positive impact or for protecting subsets of users from harm. They’re rewarded for keeping costs down, which encourages the free-for-all, anything-goes approach misleadingly labeled “free speech.” If they don’t have to monitor their platforms, they don’t have to come up with real policies, and they avoid paying for all the people and tools required to implement them.
But the way investors, companies, and even many users value engagement has a fundamental flaw, one ruthlessly exposed over the past couple of years. CEOs and VCs don’t generally track whether the user interactions that fuel these platforms are positive or negative; they just look at overall metrics around users, views, likes, and shares. Many users, meanwhile, want attention, often regardless of whether the likes and shares are positive or negative; they quickly learn that the more outrageous and angry their tweets and posts get, the bigger the engagement response. Everyone’s holding hands on the road to hell.
In the earliest days, it wasn’t always obvious what these platforms were doing and what they would become—even to insiders. But at a certain point, it became clear that money was the driving factor, and dopamine- or rage-induced interactions meant more money. They didn’t care about creating a positive experience for users—which would at a minimum require physical and mental safety. Every single one of them knew what was going to happen. And we saw it time and time again.
At Reddit, where I was CEO, we saw the dangers of unfettered free speech and engagement addiction firsthand. When we allowed linking to stolen naked celebrity photos, that content took over the site and defined what the site became. People flocked to Reddit to find stolen nude photos, and the commentary was neither enlightening nor humanizing; it was invasive in so many ways for the subjects involved. Good conversations were pushed off the site by the overwhelming demand to see nude photos. Social gaming companies like Zynga admitted that they were hunting the “whales”—high-spending users who could drive financial results—and that meant encouraging unhealthy behavior. We’ve already seen Gamergate, the Boston Marathon witch hunt, and the Myanmar riots.
Mark Zuckerberg isn’t naive: He knew what he was doing. Jack Dorsey knows what having Donald Trump and Infowars on his platform means. These CEOs, and the board members and investors who back them, choose not to know the details. Because they don’t want to know.
We have enough evidence. We know that when harassment goes unchecked, you end up with a site filled with the loudest, meanest voices and the worst content. We know that harassment silences other voices. We know that the dark corners will take over the entire site.
Here’s what I think we should do. We must use this evidence as our lens the next time a CEO claims ignorance. We must hold these leaders accountable, and we must stop contributing to their cycles of dangerous behavior.
Companies can address harassment without hurting their platforms. Taking down shitty content works, and research supports it. When we took down unauthorized nude photos and revenge porn at Reddit, nothing bad happened. The site continued to function, and all the other major sites followed with similar policies. A few months later, we banned the five most harassing subreddits. And we saw right away that if we kept taking down the replacement subreddits, they would eventually disappear. University researchers who studied the impact of the ban report that it successfully shut down the content and changed bad behavior on the site over time—without making other sites worse.
If you’re a CEO and someone dies because of harassment or false information on your platform—even if your platform isn’t alone in the harassment—your company should face some consequences. That could mean civil or criminal court proceedings, depending on the circumstances. Or it could mean advertisers take a stand, or your business takes a hit.
Today, I don’t see a single CEO or even board member who is willing (or perhaps able) to step up and say: “Enough. I’m willing to focus on quality and user experience. I am willing to take a hit on quantity to create a real place for meaningful conversation and to end harassment, misinformation, and the goal of engagement at any cost.” We need to fill this vacuum of leadership.
That’s where today’s users are stepping in to fill the void, because, ultimately, companies and their leaders are even more addicted to engagement than they want their users to be. Employees, advertisers, and users are trying to hold them accountable to their stated policies and values. Vimeo staff forced the company to take down Infowars. At Reddit, staff didn’t want to work for a company best known for hosting stolen nude photos, and their response helped fuel the change in policies.
Many of us have stopped posting new content to Facebook and, more recently, Twitter to protect ourselves—and because contributing doesn’t feel consistent with our values. I didn’t download Twitter on my latest phone, and Facebook hasn’t been on my last two phones. On August 8, I stopped contributing to Twitter altogether. I still need to use it for work, but my personal view is that any content I contribute helps prop up a platform that promotes white supremacy and harassment. Others are taking a stand too, like the #GrabYourWallet, #DeactiDay, and #BlockParty500 movements on Twitter, or the Sleeping Giants campaign pressuring Breitbart’s advertisers.
It often feels like a game of whack-a-mole, one we can’t win without CEOs making change a priority. CEOs should stop hiding behind “naivete” and “free speech,” and instead remind themselves that they can take actions that will meaningfully change the direction of the future. The first step is acknowledging the problem. The second is learning more about it: who is affected, how interactions become engagement, and how to distinguish positive engagement from negative. You’ve solved for increasing engagement; now it’s time to make real, positive interactions the priority.