December 30, 2019

SASB to Research Content Moderation on Internet Platforms

Greg Waters
Analyst, Sector Lead – Technology & Communications, SASB

During its Dec. 4 meeting, the Sustainability Accounting Standards Board (SASB) announced a research project that will explore the topic of content moderation on Internet platforms. The project will involve research, analysis, and engagement intended to help the Standards Board determine whether establishing standardized accounting metrics in this area may be warranted.

Internet platforms, such as social media, have connected billions of people around the world, facilitating improved human communication and information sharing. In recent years, however, sustained media attention has been paid to objectionable content posted by users of technology products, and to the ways companies are attempting to manage and moderate this content.

Areas of media focus include:

  • Weaponization of social media platforms to spread disinformation before elections
  • Instances of governments and other actors inciting violence against vulnerable groups through hate speech and misinformation/fake news
  • Abuse and harassment of users
  • Terrorist organizations using platforms as recruiting tools
  • The sharing of content containing child exploitation over the Internet, including cloud-based hosting and encrypted messaging services
  • The posting and live streaming of self-harm, mass shootings, and other violence
  • Secondary trauma experienced by the people (generally contractors) hired by companies to review and remove graphic and illegal content
  • Company policies regarding political ads and fact-checking
  • Perceived censorship by Internet platforms based on internal political bias

The result has been increased public scrutiny of technology companies and questions surrounding the nature of their responsibility to manage what appears on their platforms. SASB’s initial research, which includes analysis of company filings and earnings calls at Facebook, YouTube parent Alphabet, and Twitter, suggests that content moderation themes are a key area of focus for companies and investors. Underscoring investor interest in this subject, the Financial Times recently reported that New Zealand’s sovereign wealth fund has assembled a coalition of over 100 investors with more than $13 trillion in assets that will engage with social media companies to push for strengthened controls to prevent “objectionable content.” (Many of the participating investors are also members of SASB’s Investor Advisory Group, which provides feedback and guidance on SASB’s work and demonstrates support for a market standard for investor-focused sustainability disclosure.)

It is also clear that content moderation is incredibly challenging for companies, which face the unenviable choice between leaving objectionable content on their platforms and being perceived as censoring users. The sheer volume of content produced and uploaded to these platforms also means that even a 99 percent success rate in blocking the most harmful content will still result in significant user exposure.

In this research project, SASB staff will:

  • Define various content moderation themes and the amount of distinction/overlap between them (hate speech, misinformation, secondary trauma experienced by content reviewers, etc.) through the lens of SASB’s General Issue Categories (customer welfare, product safety, employee health & safety, etc.)
  • Build a preliminary list of companies/industries that are dealing with these issues
  • Establish an initial evidence-based view on whether and how content moderation themes have a financial impact on relevant companies
  • Provide a recommendation to the Standards Board on whether to proceed to standard-setting

Note that the initiation of a research project does not constitute standard-setting, but is rather aimed at determining whether standard-setting is needed, as described in the standard-setting process overview.

The research project will include an initial evidence-gathering phase, as well as structured outreach to relevant stakeholders. SASB staff welcomes feedback from content moderation practitioners, investors, and subject matter experts on this topic, and encourages interested parties to reach out at [email protected]. Project status and updates can be found on the Current Projects page of SASB’s website.