22 August 2020

If You See Something, Say Something?

Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in.
Find more case studies here on Techdirt and on the TSF website.
_________________________________________________________________________
 
from the everything-is-politics dept
Summary: Nextdoor is a “neighborhood-focused” social network that enables hyper-local communication among residents of the same neighborhood. The system relies on volunteer moderators from each neighborhood, known as “leads.” For many years, Nextdoor has faced accusations of perpetuating racial stereotyping, as some users have used the platform to report sightings of black men and women in their neighborhoods as somehow “suspicious.”
. . .
Decisions to be made by Nextdoor:
  • During national conversations around movements like Black Lives Matter, when is it appropriate to take a public stand? How will that stand be perceived by users and by local moderators?
  • If the company is taking a stand on an issue like Black Lives Matter, should it then make it clear that related content should be kept on the platform -- even if some moderators believe it violates other guidelines?
  • How much leeway and power should local, volunteer moderators have regarding what content is on the platform?
  • How much communication and transparency should there be with those local moderators?
  • How involved should the company get in addressing implicit or unconscious bias that may come from non-employee, volunteer moderators?
  • Is it feasible to have a rule that local communities should not serve as a platform for discussing state or national political issues? How does that rule play out when those “national or state” issues also involve local activism?
Questions and policy implications to consider:
  • When issues of national importance, such as civil rights, come to the forefront of public discussion, they often become politically divisive. When is it important to take a stand despite this, and how should any messaging be handled -- especially in cases where some staff or volunteers may feel otherwise?
  • Issues of race are particularly controversial to some, and yet vitally important. How should companies handle these questions and debates?
  • Using volunteer moderators to help moderate targeted niche communities has obvious benefits, but how might historical bias and prejudice manifest itself in doing so?
Resolution: Nextdoor has continued to support the Black Lives Matter movement. Gordon Strause, the company’s director of community, went onto the forum where some leads were complaining to explain the company’s position and why it supported Black Lives Matter, to push back against those who argued that the movement itself was discriminatory, and to highlight that there was a variety of perspectives and value in learning about other viewpoints.
_________________________________________________________________________
USER-GENERATED CONTENT
from the too-big-of-a-problem-to-tackle-alone dept
Ten thousand moderators at YouTube. Fifteen thousand moderators at Facebook. Billions of users, millions of decisions a day. These are the kinds of numbers that dominate most discussions of content moderation today. But we should also be talking about 10, 5, or even 1: the numbers of moderators at sites like Automattic (WordPress), Pinterest, Medium, and JustPasteIt—sites that host millions of user-generated posts but have far fewer resources than the social media giants.
There is a plethora of smaller services on the web that host videos, images, blogs, discussion fora, product reviews, comments sections, and private file storage. And they face many of the same difficult decisions about the user-generated content (UGC) they . . .
