Legal Opinion on How the European Commission can Improve its Content Moderation Regulations to Best Mitigate Harms Created by ‘False Information’ on Social Media
Caitlin Hemminga
This paper analyses oversight and content moderation beyond the national level, taking Facebook and the Oversight Board as an example. It also discusses the Digital Services Act and proposes changes to ensure that content moderation is effective across platforms and truly creates a safe online space.
Accordingly, the author concludes that collaboration is urgently needed to ensure that the borderless Internet is met with an equally international approach to content moderation, with relevant stakeholders, NGOs, society, and regulators at its core to protect users.
Legal Opinion on the Best Possible Means and Digital Tools for the Department for Digital, Culture, Media and Sport to Review and Implement in Order to Successfully Tackle the Spread of Conspiracy Theories on Social Media
Emma Mackenzie Nation
This paper provides insight into the current spread of conspiracy theories in the digital environment. It offers a hopeful vision for the future, suggesting that the solution to the spread of conspiracy theories online is multifaceted. It proposes a holistic approach that situates conspiracy theories within their social and political context, recognising that conspiracy thinking frequently takes hold among vulnerable people in turbulent times. It ultimately argues that government should use social media regulation alongside a host of measures across society, from education policy and mental health reform to a re-examination of current incentives in the journalism and advertising industries.
Green Paper on the Digital Services Act and the Need for Due Process in Algorithmic Content Moderation
Emanuele Salviati
This paper discusses how the Digital Services Act proposal can ensure that algorithmic content moderation systems comply with fundamental rights, notably by upholding the liability regime of the E-Commerce Directive and by introducing tailored due process obligations for online platforms. The author concludes by suggesting several amendments to overcome the proposal’s existing shortcomings, including clarification of the circumstances in which intermediary providers’ liability arises and measures to strengthen algorithmic transparency and accountability.
Digital Legitimacy: A Model for Internet Intermediaries
Tom Middleton
The paper analyses how the current legal and private systems for notice and takedown fail to encourage legitimate decision-making by digital platforms, to the detriment of end-users. It then argues for a more legitimate notice and takedown decision-making process through a global, uniform code imposing more specific obligations on intermediaries, in line with the legitimacy principles of accountability, transparency, due process and proportionality.
The Digital Afterlife on Social Media – Legal Opinion
Valentin Freiermuth
The paper highlights shortcomings in the current policies of Facebook, Instagram, YouTube, Twitter and TikTok and proposes policy and legislative improvements. The author suggests that governments need to strive for innovative legislation governing the transferability of digital assets and the protection of the personal data of the deceased. However, the paper argues that legislative action alone is insufficient: social media platforms, as the first point of contact for the deceased’s personal data and intellectual property, must be innovative too.
Has the CJEU been successful in balancing and consistently applying the economic right of “communication to the public”?
Carl Emil Bull-Berg
The paper argues that the Court of Justice of the European Union (CJEU) has been unsuccessful in balancing and consistently applying the economic right of “communication to the public” afforded to rightsholders within the copyright regime.
The Need for Due Process in Notice-and-Takedown-based Content Moderation on Social Media Platforms
Eva Knoepfel
This paper critically analyses Facebook, Twitter and YouTube’s use of notice-and-takedown (NTD) mechanisms in content moderation and its effect on the rights of social media users. These platforms’ inconsistent enforcement of their governing rules and their lack of transparency exacerbate systemic flaws of NTD to the detriment of users, society and the platforms themselves. The paper concludes that the adoption of due process principles is necessary to mitigate deficiencies in content moderation processes, and is appropriate given social media platforms’ role as modern public fora and regulators of online speech.
Extent of Liability for Intermediaries in EU, USA and India – a Stakeholder Perspective
Lakshmi Srinivasan
Given the EU’s policy on intermediary liability under the current DSM Directive, it is time to refresh our perspectives on this area of law, starting with the stakeholders. Viewed from the perspectives of rightsholders, intermediaries and, importantly, users, this paper analyses the policies the EU, USA, India and South Korea apply when ascertaining liability for online copyright infringement.
A comparison of the legitimacy of recent legislative and non-legislative pressures exerted on the EU intermediary liability framework
Roosa Tarkiainen
This paper analyses the impact of EU law reform and voluntary measures on the existing intermediary liability framework in the intellectual property enforcement context. The paper concludes that changes to the intermediary liability framework are inevitable. However, in many instances, the changes proposed by the EU law reform are already in line with the existing acquis, whereas voluntary measures in their many forms exacerbate fragmentation of liability across the EU and should be avoided as a means of bypassing the established legal framework.
A green paper addressing the role of social media in the fight against counterfeiting: Is the current legal framework regarding intermediary liability sufficient to enforce intellectual property rights on social media platforms in the EU, US and China?
Claudia Kieser
This paper outlines the framework needed to set up a task force under the umbrella of the World Intellectual Property Organization (WIPO) to tackle counterfeiting on social media platforms (SMPs) in the EU, US and China. It maintains that SMPs, rights holders and users must unite in bringing their own perspectives to these issues. It begins by evaluating the current legal framework, then suggests the use of technology, and concludes that a task force may be worth pursuing, with the objective of disrupting the supply of and demand for counterfeits on SMPs. It proposes the innovative use of blockchain by rights holders and a shared onus of proof between rights holders and SMPs, while recognising the obstacles posed by the fundamental rights to freedom of expression and privacy.