
Digital Safety Rules Everyone Must Know

By shagun, 30 July 2025
Intermediary Guidelines and Digital Media Ethics Code Rules, 2021

The Intermediary Guidelines and Digital Media Ethics Code Rules, 2021 were introduced to regulate online content and ensure accountability of digital platforms such as WhatsApp, Facebook, and Twitter, as well as online news and OTT platforms. These rules aim to protect users from harmful content, fake news, and online abuse while promoting transparency.

The Rules say that social media platforms must follow due diligence — like removing illegal or harmful content within a specific time after a complaint, appointing officers to handle complaints and ensure compliance, and helping law enforcement when needed. For OTT platforms and digital news, the Rules set up a three-tier grievance redressal system and require content to follow a self-regulation code and age-based classifications.

Key Highlights:

  • Social media platforms must remove flagged content quickly.
  • Big platforms (with over 50 lakh users) must appoint grievance officers in India.
  • Online news and OTT platforms must follow a code of ethics.
  • Viewers can file complaints about online content.
  • The government can issue directions to ensure public order and safety.

FAQs – Intermediary Guidelines and Digital Media Ethics Code Rules, 2021

1. What do these rules cover?

The rules apply to social media platforms, OTT apps, and digital news publishers, to ensure that online content is safe, legal, and accountable to users.

2. Do these rules also apply to the social media I use?

Yes. After a complaint, your content can be removed, but only if it violates the rules or is illegal.

3. What is an intermediary?

An intermediary is a platform like WhatsApp, Facebook, Instagram, or Twitter that lets users post or share content.

4. What happens after I file a complaint about illegal content?

The platform must review the complaint and, if the content is found to violate the rules or the law, remove it, typically within 24–72 hours.

5. Who handles user complaints on platforms now?

Every significant social media intermediary must appoint a Grievance Officer to receive and resolve user complaints.

6. What if my personal data is misused?

Report the misuse to the platform or its Grievance Officer. The platform must act on your complaint, and whoever misused your personal data can face penalties under the law.

7. Are OTT platforms like Netflix and Amazon required to rate shows or movies?

They must rate shows/movies (like 13+, 16+, etc.), provide parental controls, and have a clear complaint system.

8. Is there any government control on digital content?

Yes, but it starts with self-regulation by the platforms. If a complaint is not resolved there, it can be escalated to a government-level oversight mechanism.

9. Can platforms be punished under these rules?

Yes. If they don’t follow the rules, they may lose their protection from being sued for what users post.

10. Do I need to verify my account now?

Not mandatory, but platforms may offer voluntary verification options for users (like a blue tick).

11. Will my private messages be read?

No, but if required by law (in serious crimes), platforms may be asked to trace the origin of a message—not its content.

12. What is a significant social media intermediary?

Any platform with over 50 lakh users in India. They have more responsibilities under these rules.

13. What are the content classifications on OTT?

They include U (Universal), U/A 7+, U/A 13+, U/A 16+, and A (Adult). Platforms must display this before playing content.

14. What if a child sees inappropriate content on OTT?

OTT apps must offer parental controls to restrict such content.

15. Can digital news websites publish anything they want?

No. They must follow journalistic ethics and respond to complaints through a proper redressal mechanism.

16. Who resolves complaints if the platform doesn’t?

There’s a three-tier system—first the platform, then a self-regulatory body, and finally the government.

17. Do these rules apply to YouTube channels?

Yes, especially if the channel is offering news or current affairs or reaches a large audience.

18. Are memes or jokes also monitored?

Only if they’re harmful, offensive, or violate laws (e.g., hate speech, obscenity).

19. Do these rules apply to WhatsApp forwards?

Yes, especially if the forward is harmful or illegal. WhatsApp may be required to trace where the message originated.

20. Can I take action if my complaint is ignored?

Yes. You can escalate to the next level (self-regulatory body or government) depending on the platform type.
