Facebook cannot monitor all hateful content on its platforms
The article includes the viewpoint of Lucy Handley, the reporter, who believes that “there are no easy answers for how this can be achieved, especially since the content revolves around complex issues such as politics, misinformation, and freedom of speech” (01:00-01:08). The video also shows that in 2013, after reports of hateful content posted on the platform, Facebook representatives admitted that the company’s “systems to identify and remove hate speech have failed to work as effectively” (01:34-01:37).
The video also shows how Facebook failed to take action during the 2020 Black Lives Matter protests, when then-President Donald Trump posted extremist content on both Facebook and Twitter. While Twitter responded by hiding the content, stating that it violated the platform’s guidelines, Facebook “declined to act” (02:14).
Currently, Facebook claims that it is looking to “review and update its policies” (02:35), which implies that the company is aware of its problematic policies: “We know we have more work to do, and we’ll continue to work with civil rights groups, GARM [the Global Alliance for Responsible Media], and other experts to develop even more tools, technology and policies to continue this fight” (02:25-02:46). Despite these claims, Eric Levy describes Mark Zuckerberg, Facebook’s CEO, as “reluctant to censor all but the most heinous sorts of speech” (03:24-03:29).