Google, Meta, Apple on the radar in EU’s online content rules
- April 27, 2023
- Posted by: OptimizeIAS Team
- Category: DPN Topics
Subject: Science and Technology
Section: Awareness of IT
Why in News?
The European Union (EU) has confirmed the names of 19 platforms that will be subject to its landmark online content rules.
News In Brief:
- The rules, notified under the Digital Services Act (DSA), overhaul the EU's social media and e-commerce rules and tightly regulate the way big technology platforms moderate user content. Five subsidiaries of Google's parent Alphabet, two Meta units, two Microsoft businesses, Apple's App Store, Twitter, and Alibaba's AliExpress are among the entities the EU has identified.
Key features of the Digital Services Act (DSA)
Faster removals and provisions to challenge: Social media companies will have to add “new procedures for faster removal” of content deemed illegal or harmful. They will also have to explain to users how their content takedown policy works. The DSA allows users to challenge takedown decisions made by platforms, and to seek out-of-court settlements.
Bigger platforms have greater responsibility: The legislation does not take a one-size-fits-all approach, and places greater accountability on Big Tech companies. Under the DSA, 'Very Large Online Platforms' (VLOPs) and 'Very Large Online Search Engines' (VLOSEs), that is, platforms with more than 45 million users in the EU, will face more stringent requirements.
Direct supervision by the European Commission: These requirements and their enforcement will be centrally supervised by the European Commission itself — an important way to ensure that companies do not sidestep the legislation at the member-state level.
More transparency on how algorithms work: VLOPs and VLOSEs will face transparency measures and scrutiny of how their algorithms work, and will be required to conduct systemic risk analysis and mitigation to drive accountability for the societal impacts of their products. VLOPs must allow regulators to access their data to assess compliance, and allow researchers to access their data to identify systemic risks of illegal or harmful content.
Clearer identifiers for ads and who’s paying for them: Online platforms must ensure that users can easily identify advertisements and understand who presents or pays for the advertisement. They must not display personalised advertising directed towards minors or based on sensitive personal data.
India’s online laws similar to EU’s DSA
- In February 2021, India notified extensive changes to its social media regulations in the form of the Information Technology Rules, 2021 (IT Rules), which placed significant due-diligence requirements on large social media platforms such as Meta and Twitter.
- These included appointing key personnel to handle law enforcement requests and user grievances, enabling identification of the first originator of the information on its platform under certain conditions, and deploying technology-based measures on a best-effort basis to identify certain types of content.
- Social media companies have objected to some of the provisions in the IT Rules, and WhatsApp has filed a case against the provision requiring it to trace the first originator of a message. One reason a platform may be required to trace the originator is that a user may share child sexual abuse material on its platform.
- However, WhatsApp has alleged that the requirement will dilute the encryption security on its platform and could compromise personal messages of millions of Indians.
Earlier, in 2022, with a view to making the Internet "open, safe and trusted, and accountable", the Ministry of Electronics and IT notified amendments to the IT Rules, 2021, aimed at protecting the rights of Digital Nagriks.
Key changes effected in the IT Rules 2021 are as under:
- Currently, intermediaries are only required to inform users about the categories of harmful/unlawful content they must not upload. The amendments impose a legal obligation on intermediaries to make reasonable efforts to prevent users from uploading such content, ensuring that the intermediary's obligation is not a mere formality.
- To ensure that an intermediary's rules and regulations are communicated effectively, they must also be made available in regional Indian languages.
- The grounds in rule 3(1)(b)(ii) have been rationalized by removing the words ‘defamatory’ and ‘libellous’. Whether any content is defamatory or libellous will be determined through judicial review.
- Some of the content categories in rule 3(1)(b) have been rephrased to deal particularly with misinformation, and content that could incite violence between different religious/caste groups.
- The amendment requires intermediaries to respect the rights guaranteed to users under the Constitution, including a reasonable expectation of due diligence, privacy and transparency. The rules also make it explicit that intermediaries must respect the rights accorded to citizens of India under Articles 14, 19 and 21 of the Constitution.
- Grievance Appellate Committee(s) will be established to allow users to appeal against the inaction of, or decisions taken by, intermediaries on user complaints. These committees would have the authority to review and revoke content moderation decisions taken by large tech platforms. However, users will always retain the right to approach the courts for any remedy.