Increasing sharing friction, trust, and safety spending may be key Facebook fixes

Gelo Gonzales

Are 35,000 moderators enough for a platform with 2.7 billion users?

Facebook is designed in a way that keeps us posting, sharing, and interacting with others. This activity fuels Facebook and other competing digital platforms.

The platform records this activity and turns it into behavioral data, which allows Facebook and its customers to microtarget us with ads and, more sinisterly, nudge our behavior in the directions they want, under the economic imperative we now know as surveillance capitalism.

This design has been detrimental to democracy, catalyzing the spread of dangerous, false information and hate – because hate spreads faster – and it persists today because of a lack of judicious action from Facebook's leaders. Columbia University professor Anya Schiffrin, for example, asks: why stop political ads in the US only a week before the elections, and why not stop them entirely, for good?

Tristan Harris and Aza Raskin, founders of the nonprofit Center for Humane Technology, also offered interesting ideas that could potentially make Facebook better for society, in a 2019 interview with Rappler executive editor and CEO Maria Ressa.

Harris and Raskin are also featured, along with other experts, in the new Netflix documentary The Social Dilemma, which explores technology addiction, social engineering, and surveillance capitalism.

One key idea is that Facebook desperately needs to increase its spending on content moderation. The two lay down the facts: Facebook has about 35,000 moderators and 2.7 billion users. Even at a glance, the odds are stacked against the moderators, especially considering how fast material spreads on Facebook. Facebook spends about 6.5% of its revenue on trust and safety.

Harris compares that to the city of Los Angeles, which he says spends about 25% of its budget on trust, safety, and security – roughly four times Facebook's share. It's not an apples-to-apples comparison, but the basic numbers suggest Facebook is not allocating enough resources to the problem.

“So they’re spending one fourth on safety compared to the city of Los Angeles. And if they’re running a 2 billion person city, is that enough? The one simple thing Facebook could do is why aren’t they quadrupling the size of their trust and safety budget?” says Harris. (READ: Zuckerberg ‘stubborn,’ ‘very dug in’ says ‘Facebook: The Inside Story’ author Steven Levy)

Harris also believes that Facebook should turn off custom audiences, or microtargeting, for political ads. Raskin, on the other hand, suggests that, design-wise, sharing content should have more friction.

On the first point: Facebook offers advertisers look-alike models, in which it pinpoints a group of users and then finds other groups of users similar to it. It's “a dangerous tool,” Harris says, one that leaves people vulnerable to manipulation.

And since Facebook serves ads for political campaigns, politicians can easily see which groups to target and how to target them effectively. This is something that's done in the offline world too – knowing one's voter base and how to appeal to them. But Facebook's scale, its lack of gatekeeping, and its unprecedented information on individuals make it too dangerous and powerful.

On Raskin's point: the current design, he says, makes it too easy to share, and therefore too easy for content – especially content that preys on emotions – to spread. Raskin argues that more friction would give the “human brain the chance to catch up to this impulse.” If it weren't a one-click design, we'd have time to think about whether a piece of content is truly worth spreading.

One way Facebook has tried to slow the spread of messages is by limiting the number of people to whom a message can be forwarded on Messenger and WhatsApp. It's hateful, usually false content that spreads fastest, so there is some merit to forward limits.

Harris adds that one-click sharing could be changed with “two lines of JavaScript code.” It would be an easy but impactful change – yet one Facebook hasn't talked about. From a surveillance capitalist's standpoint, any friction means less behavioral data and, therefore, less profit.
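To illustrate the idea – and only as a rough sketch, not Facebook's actual code – a share button's click handler could pause for an explicit confirmation before posting instead of sharing instantly. The element and function names below are hypothetical:

```javascript
// Rough illustrative sketch only – hypothetical names, not Facebook's actual code.
// Instead of sharing on a single click, the handler asks the user to confirm first,
// adding a moment of friction before the content spreads.
const shareButton = document.querySelector("#share-button");

shareButton.addEventListener("click", (event) => {
  event.preventDefault(); // block the instant, one-click share

  const confirmed = window.confirm(
    "Share this post? Take a second to consider whether it's worth spreading."
  );

  if (confirmed) {
    submitShare(); // hypothetical function that performs the actual share
  }
});

// Hypothetical stand-in for whatever code actually submits the share.
function submitShare() {
  console.log("Post shared.");
}
```

Even a pause this small is the kind of friction Raskin describes: it gives the brain a chance to catch up before the impulse becomes a share.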

“I think that the tech platforms have to realize that the public sphere is now theirs to protect, to grow, to pollute, and that if they jump in now, it is still salvageable. There are many things that can be done. If a little journalist, if a little group in the Philippines like us can try to do this, with no resources, they can,” Ressa says.

Before everything went digital, there was an information ecosystem that people could trust – one that placed accountability on the people who write, publish, and distribute important information: the journalists.

Then Facebook came along and became what is essentially the world's biggest distributor of information – minus the guardrails of the old information ecosystem that assured certain unassailable facts, and with a design that ultimately favors hateful content and content that preys on emotions, all for the sake of harvesting more behavioral data. – Rappler.com


Gelo Gonzales

Gelo Gonzales is Rappler’s technology editor. He covers consumer electronics, social media, emerging tech, and video games.