Digital Services Act (DSA)
- The European Union’s groundbreaking Digital Services Act (DSA) has come into effect. It overhauls the EU’s social media and e-commerce rules and tightly regulates the way intermediaries, especially large platforms such as Google, Meta, Twitter, and YouTube, moderate user content.
- This new regulation aims to contribute to the proper functioning of the EU’s internal market for online intermediary services by setting out harmonized rules for a safe, predictable, and trusted online environment.
- The DSA seeks to regulate how companies such as Google, Facebook, and Amazon target advertisements at users based on parameters such as religion or race, and requires them to adopt stricter measures to curb hate speech and child abuse material on their platforms.
What was the need for DSA?
- Companies such as Google and Facebook already voluntarily publish transparency reports on how they respond to government requests for data removal on their platforms around the world. However, many industry observers have labeled these self-regulatory efforts inadequate.
Observer Research Foundation (ORF):
Self-regulation has so far “allowed the big tech to cherry-pick what is to be acted on and what is to be ignored, effectively making it the arbiter of permissible speech.”
Key features of the Digital Services Act
Faster removals, opportunity to challenge
- Social media companies are required to add “new procedures for faster removal” of content deemed illegal or harmful. They must explain to users how their content takedown policy works.
- Users can challenge takedown decisions, and seek out-of-court settlements.
Bigger platforms have greater responsibility:
- The legislation has junked the one-size-fits-all approach and put a greater burden of accountability on the big tech companies.
- Under the DSA, ‘Very Large Online Platforms’ (VLOPs) and ‘Very Large Online Search Engines’ (VLOSEs), that is, platforms with more than 45 million monthly active users in the EU, face more stringent requirements.
Direct supervision by the European Commission:
- These requirements and their enforcement will be centrally supervised by the European Commission itself, ensuring that companies are not able to sidestep the legislation at the member-state level.
More transparency on how algorithms work:
- VLOPs and VLOSEs will face transparency measures and scrutiny of how their algorithms work, and will be required to conduct systemic risk analyses and take risk-mitigation measures, driving accountability for the societal impacts of their products.
- VLOPs must allow regulators and researchers to access their data to assess compliance and identify systemic risks of illegal or harmful content.
Clearer identifiers for ads and who’s paying for them:
- Online platforms must ensure that users can easily identify advertisements and understand who presents or pays for the ads. They must not display personalized ads directed toward minors or based on sensitive personal data.
Changes big tech has been forced to make
- The DSA imposes heavy penalties for non-compliance, of up to 6 percent of a company’s global annual turnover. Companies that refuse to abide by the rules cannot operate within the EU.
- Due to the harsh repercussions and the threat of losing a market of around 450 million users, major social media companies have fallen in line and announced they will allow more freedom to users in the way they interact with their platforms.
- Meta: The company that operates Facebook and Instagram has said it will introduce non-personalized digital feeds.
- Google: It has said it will increase how much information it provides about ads targeted at users in the EU. It will also expand data access to third-party researchers studying systemic content risks in the region.
DSA validates India’s Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
- Some of the rules set by the DSA are similar to those in India’s Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, also known as the IT Rules, 2021. India has already sought “more responsible social media” through these rules, a position that is now easier to push for on a global scale because of the DSA.
- The DSA is both a precedent and validation of India’s own rules governing social media intermediaries and limiting their safe harbor protection.
- In India, the IT Rules, 2021 made it mandatory for ‘significant’ social media intermediaries, that is, online platforms with more than 5 million registered users in India, to appoint a grievance redressal officer, who must acknowledge a user’s grievance within 24 hours of receipt of the complaint and offer a resolution within the next 15 days. The DSA seeks to establish a similar practice: users of very large online platforms (VLOPs), which have more than 45 million EU users, will be able to challenge content moderation decisions made by the platforms, and will also be able to take legal recourse, through courts, against decisions made by such tech platforms.
- Recommendations on India’s upcoming Data Protection Bill stated that online platforms would be required to ensure transparency and ‘fairness’ of algorithms that are used to process personal user data. The DSA seeks to establish transparency measures so that platforms have to explain decisions where algorithms are used for “recommending content or products to users".
- The DSA can thus help set the pace for other countries in “codifying accountability and liability of social media platforms”.
Q. The Digital Services Act (DSA) is both a precedent and validation of India’s own rules governing social media intermediaries and limiting their safe harbor protection. It can thus help set the pace for other countries in codifying accountability and liability of social media platforms. Substantiate.