Are social media platforms good at content moderation?
Yixin Zou wins the NSF-DFG collaborative grant to investigate transparency of content moderation on social media platforms
The U.S. National Science Foundation (NSF) and the Deutsche Forschungsgemeinschaft e.V. (DFG, German Research Foundation) jointly fund collaborative projects between scientists in the United States and Germany through the Secure and Trustworthy Cyberspace (SaTC) Program. MPI-SP faculty member Yixin Zou is part of a team of researchers awarded SaTC funding to investigate the transparency of content moderation on social media platforms.
Image-Based Sexual Abuse (IBSA) refers to the non-consensual sharing, or threat of sharing, of a person's intimate images online. Such experiences have harmful consequences for a survivor's mental health, physical well-being, reputation, and job security. "Additionally, there is a lack of transparency from social media platforms when it comes to taking down reported sensitive content," says Yixin Zou. "Because platforms fail to notify them, survivors are re-traumatized every time they have to check whether the compromising content has been removed."
The team of researchers from the US and Germany will use IBSA as a case study for their project. Their work aims to develop new tracking systems that scrutinize the removal of sensitive content from social media platforms. By grounding their work in current German and European Union laws on data access for research, they aim to make content moderation data available and transparent, empowering researchers to better audit content moderation. "Ultimately, our research will enable us to [..] hold platforms accountable for providing users with a safer environment," the team concludes in their proposal.