
The EU Should Improve the Transparency of the Digital Services Act

Information Technology | 2025-10-20 | ITIF

ASH JOHNSON AND PUJA ROY | OCTOBER 2025

The implementation of the Digital Services Act's transparency obligations fails to provide meaningful insight into online platforms' content moderation decisions, the extraterritorial effects of the act, and its effects on online speech.

KEY TAKEAWAYS

- The Digital Services Act (DSA) is an EU law that limits online platforms' liability for third-party content and imposes transparency and safety obligations.
- Article 17 of the DSA requires platforms to provide a statement of reasons for content moderation decisions. These statements make up the DSA Transparency Database.
- Platforms may choose to apply certain DSA standards globally to maintain consistency and simplify compliance, which risks imposing EU content rules in other countries.
- The EU's implementation of the DSA Transparency Database obscures the potential extraterritorial effects of the DSA and its effects on online free speech.
- Online platforms remove content containing illegal or harmful speech 98 percent of the time, rather than disabling or otherwise restricting it.
- The Transparency Database reveals a large variation of content moderation practices between platforms, which benefits users and fosters healthy competition.
- Online platforms in the EU proactively and voluntarily moderate most content, rather than acting only in response to takedown notices and government orders.

CONTENTS

- Key Takeaways
- Introduction
- Content Moderation and Transparency
- The Digital Services Act
- Transparency Requirements
- Global Impact
- Methodology
- Results
- Analysis
- Recommendations
- Conclusion
- Endnotes

INTRODUCTION

Regulating media to protect consumers while respecting free expression has always been a delicate balancing act. From the invention of the printing press, which made books accessible to the masses, to the advent of radio and television, which brought live entertainment into people's homes, policymakers and industry stakeholders have grappled with how to facilitate the dissemination of information and entertainment while limiting the spread of illegal and harmful content.

To date, no media technology has posed a more complex challenge for both government and industry than social media. Social media allows billions of people around the world to publish information and communicate with others. This unprecedented communications channel has kick-started important social and political movements, connected users with distant friends and family, facilitated the spread of knowledge and ideas, and brought endless entertainment to anyone with an Internet-connected device. However, while social media has undoubtedly transformed society in many positive ways, it has also exacerbated existing challenges and introduced new ones by making it easier than ever to create and spread harmful or illegal content. Moreover, a lack of agreement about what content is harmful makes content moderation decisions politically charged.

Thus, content moderation is one of the key challenges social media platforms have to grapple with in today's globally interconnected online ecosystem. Complicating this further, different governments have taken different approaches to regulating online content, and platforms that offer their services in multiple jurisdictions (i.e., most online services) must comply with all these different regulations. Businesses then must make difficult trade-offs between tailoring their services to each jurisdiction's rules and standardizing their practices as much as possible across jurisdictions, leading the strictest set of rules to take precedence.

The EU's DSA is one of the most ambitious content moderation regulatory