Centre Issues Warning to Social Media Platforms Over Obscene Content, Threatens Legal Action
Digital Desk
The central government has issued a stern warning to social media platforms, instructing them to immediately block obscene, vulgar and pornographic content, as well as material involving the sexual exploitation of children. The advisory, issued by the Ministry of Electronics and Information Technology (MeitY) on Monday, cautioned that failure to comply would invite legal action under the Information Technology Act and other applicable laws.
The advisory comes amid growing concerns over the proliferation of illegal content on internet platforms, including content harmful to children. MeitY emphasised that intermediaries must review and strengthen their compliance frameworks in line with the IT Act and the IT Rules, 2021. Platforms that fail to exercise due diligence risk losing their exemption from liability for third-party content under Section 79 of the IT Act.
“Social media intermediaries and other intermediaries are legally bound under Section 79 of the IT Act. To avail exemption from liability regarding third-party content uploaded or transmitted on their platforms, due diligence must be exercised,” the advisory stated.
Key directives include ensuring that platforms do not host, display, upload, modify, publish, transmit, store, update, or share content that is obscene, pornographic, related to child sexual abuse, or otherwise illegal. Legal consequences for non-compliance may include cases filed under the IT Act, the Bharatiya Nyaya Sanhita (BNS), and other relevant criminal statutes.
According to MeitY, many intermediaries have been inconsistent in monitoring and removing prohibited content, particularly material deemed obscene or illegal. The ministry called for a uniform and rigorous approach across all platforms.
The advisory follows several high-profile incidents and judicial interventions highlighting the risks posed by unregulated online content. Earlier this year, the Supreme Court stressed the responsibility of platforms to prevent the circulation of obscene material and the need to hold intermediaries accountable.
Industry experts note that the move aligns with a global trend of governments increasingly holding social media companies accountable for illegal content. Platforms may now need to strengthen moderation systems, deploy AI-based detection tools, and expand human oversight to ensure compliance.
The advisory signals a stricter regulatory environment for digital intermediaries and underscores the government’s commitment to safeguarding children and the public from harmful online content. With prosecution and penalties awaiting platforms that fail to comply, it marks a crucial turning point in India’s approach to online safety and digital governance.
