TY - GEN
T1 - Automated Transparency: A Legal and Empirical Analysis of the Digital Services Act Transparency Database
T2 - 2024 ACM Conference on Fairness, Accountability, and Transparency, FAccT 2024
AU - Kaushal, Rishabh
AU - Van de Kerkhof, Jacob
AU - Goanta, Catalina
AU - Spanakis, Gerasimos
AU - Iamnitchi, Adriana
N1 - Funding Information:
CG is supported by the ERC Starting Grant research project HUMANads (ERC-2021-StG No 101041824) and the Spinoza grant of the Dutch Research Council (NWO), awarded in 2021 to José van Dijck, Professor of Media and Digital Society at Utrecht University.
Publisher Copyright:
© 2024 Owner/Author.
PY - 2024/6/3
Y1 - 2024/6/3
N2 - The Digital Services Act (DSA) is a much-awaited platform liability reform in the European Union, adopted on 1 November 2022 with the ambition to set a global example in terms of accountability and transparency. Among other obligations, the DSA emphasizes the need for online platforms to report on their content moderation decisions ('statements of reasons', SoRs), a novel transparency mechanism we refer to as automated transparency in this study. SoRs are currently made available in the DSA Transparency Database, launched by the European Commission in September 2023. The DSA Transparency Database marks a historic achievement in platform governance and allows investigations of the actual transparency gains, both at the level of the database structure and at the level of platform compliance. This study aims to understand whether the Transparency Database helps the DSA live up to its transparency promises. We use legal and empirical arguments to show that while there are some transparency gains, compliance remains problematic, as the current database structure leaves platforms considerable discretion in their transparency practices. In our empirical study, we analyze a representative sample of the Transparency Database (131 million SoRs) submitted in November 2023 to characterize and evaluate platform content moderation practices.
AB - The Digital Services Act (DSA) is a much-awaited platform liability reform in the European Union, adopted on 1 November 2022 with the ambition to set a global example in terms of accountability and transparency. Among other obligations, the DSA emphasizes the need for online platforms to report on their content moderation decisions ('statements of reasons', SoRs), a novel transparency mechanism we refer to as automated transparency in this study. SoRs are currently made available in the DSA Transparency Database, launched by the European Commission in September 2023. The DSA Transparency Database marks a historic achievement in platform governance and allows investigations of the actual transparency gains, both at the level of the database structure and at the level of platform compliance. This study aims to understand whether the Transparency Database helps the DSA live up to its transparency promises. We use legal and empirical arguments to show that while there are some transparency gains, compliance remains problematic, as the current database structure leaves platforms considerable discretion in their transparency practices. In our empirical study, we analyze a representative sample of the Transparency Database (131 million SoRs) submitted in November 2023 to characterize and evaluate platform content moderation practices.
KW - Computational Compliance
KW - Digital Services Act
KW - Transparency
U2 - 10.1145/3630106.3658960
DO - 10.1145/3630106.3658960
M3 - Conference article in proceedings
T3 - Proceedings of the ACM Conference on Fairness, Accountability, and Transparency, FAccT
SP - 1121
EP - 1132
BT - 2024 ACM Conference on Fairness, Accountability, and Transparency, FAccT 2024
PB - Association for Computing Machinery
Y2 - 3 June 2024 through 6 June 2024
ER -