Facebook executive Nick Clegg also said the company would “nudge” teens away from material in its apps that “may not be conducive to their well-being,” though he didn’t provide specifics for this new approach. He did suggest that Facebook’s algorithms should be “held to account,” including by regulation if needed, to ensure real-world results match intentions.
The new methods might address some of Haugen’s concerns. She claimed Facebook knew its algorithms were destructive, steering children toward harmful material while removing only a fraction of hate speech. Haugen also argued that Congress should reform the Communications Decency Act’s Section 230 to increase Facebook’s liability for algorithm-chosen content, and that Facebook should add friction to reduce virality and prompt users to think about posts rather than share them reflexively.
At the same time, this might not satisfy Haugen and fellow critics. Breaks and nudges may reduce exposure to harmful content, but they won’t remove the content in question. Clegg’s statements also reflect a familiar strategy at Facebook. It likes to invite regulation, but only the regulation it’s comfortable with. While the proposed changes could help, politicians may demand more — in part to prevent Facebook from dictating its own regulation.