
More Content Moderation Is Not Always Better


As companies develop more kinds of technology for finding and removing content in different ways, an expectation grows that they should use it. "Can moderate" starts to imply "ought to moderate." After all, once a tool is in use, it is hard to put it back in the box. But content moderation is snowballing right now, and the collateral damage in its path is too often overlooked.

Now is the chance for some careful thought about the way forward. With Trump's social media accounts and the election in the rearview mirror, content moderation is no longer a constant front-page story. Perhaps this suggests that the real source of most of the anger was the politics, not the platforms. But there is, or should be, some lingering discomfort with the extraordinary display of power that saw a handful of company executives flip the off switch on the accounts of the leader of the free world.

The chaos of 2020 shatters any notion that there is a clear category of harmful "misinformation" that a few of Silicon Valley's most powerful people should dispel, or even that there is a clean way to separate health from politics. Just last week, for example, Facebook reversed its policy and said it will no longer take down posts claiming that COVID-19 was man-made or manufactured. Only a few months ago, the New York Times had cited belief in that "baseless" theory as evidence that social media had contributed to an ongoing "reality crisis." There was a similar back-and-forth over masks. At the start of the pandemic, Facebook banned ads for them on the site. That lasted until June, when the WHO finally changed its guidance to advise wearing masks, despite many experts having recommended it long before. The good news, I suppose, is that Facebook was not especially effective at enforcing the ban in the first place. (At the time, however, this was not seen as good news.)


The more that emerges about what officials got wrong during the pandemic, or about instances where politics rather than expertise set the narrative, the more skepticism is warranted about trusting them, or private platforms, to decide when to shut down a conversation. Issuing public health guidance for a particular moment is not the same as declaring the appropriate boundaries of debate.

There is also a geopolitical cost to the constant calls for more takedowns. Authoritarian and repressive governments around the world have pointed to the rhetoric of liberal democracies to justify their own censorship. The comparison is obviously specious: shutting down criticism of a government's handling of a public health emergency, as the Indian government is doing, is as clear an affront to freedom of expression as it gets. But there is some tension in yelling at platforms to take more down over here while telling them to resist takedowns over there. So far, Western governments have largely refused to address this, leaving platforms to restrain the global rise of digital authoritarianism on their own, and the platforms are losing. Governments need to be able to talk about platform regulation and free speech at the same time if they want to stand up for the rights of the many users outside their borders.

There are other trade-offs. Because content moderation at massive scale will never be perfect, the question is always which side of the line to err on when applying a rule. Enforcing rules more aggressively inevitably means more false positives: that is, more valuable speech taken down. The problem is compounded by the growing reliance on automated moderation to remove content at scale: these tools are blunt and stupid. If asked to remove more material, the algorithms will not think twice. They cannot evaluate context or distinguish between material glorifying violence and material documenting evidence of human rights abuses. The toll of such an approach has become apparent during the past few weeks of the Israeli-Palestinian conflict, as Facebook repeatedly removed essential content from and about Palestinians. This is not a one-off. More moderation is not always better, especially when we know these errors disproportionately fall on already marginalized and vulnerable communities.

