Facebook has responded to the testimony of whistleblower Frances Haugen by trying to reframe the narrative on hate speech. VP of Integrity Guy Rosen has published a defense of the social network's anti-hate efforts, arguing that the declining visibility of hate speech matters more than the mere existence of the content. The "prevalence" (that is, visibility) of hate speech on Facebook has dropped by nearly 50 percent over the last three quarters, to 0.05 percent of content viewed, Rosen said, or about five views out of every 10,000.
The executive argued it's "wrong" to focus on content removal as the only metric. There are other ways to fight hate speech, Rosen said, and Facebook needs to be "confident" before removing any material. That means erring on the side of caution to avoid taking down content by mistake, and instead limiting the reach of people, groups and pages that may be violating policy.
There's a degree of truth to this. Facebook has occasionally run into trouble for mistaking influential content for hate speech, and a more aggressive takedown system could lead to further incidents. Likewise, hate speech will have only limited impact if few people ever see a given post.
However, there's little doubt Facebook is engaging in spin. Haugen asserted in her testimony that Facebook catches only a "very small minority" of offending material – that's still a problem if true, even if only a small fraction of users ever sees that material. Rosen's response also didn't address Haugen's allegations that Facebook refused to deploy safer algorithms and other measures to minimize hate and divisive interactions. Facebook may have made significant strides in limiting hate speech, but that's not Haugen's point – it's that the social media company isn't doing enough.