Social media intermediaries such as Facebook, Twitter and Instagram will continue to operate in India despite the expiry of a government-set deadline for the platforms to comply with a new set of norms.
The central government had asked the social media platform providers to comply with the Information Technology (Guidelines for Intermediaries and Digital Media Ethics Code) Rules, 2021, by May 25.
Social media has been abuzz with rumours that the major platforms would withdraw from India because the rules, announced on February 25, would make them responsible for content posted by users.
Experts opined that the intermediaries may not face a ban or restrictions in India, but they would lose their immunity, or safe harbor protection, from legal liability over user content. The central guidelines do not mention banning them for non-compliance.
While notifying the rules on February 25, the Union government gave social media platform providers three months to comply with the new norms. Reports said that, so far, only the Koo app has complied with the norms.
Koo (previously known as Ku Koo Ku) is an Indian microblogging and social networking service, similar to Twitter, developed by two Bengaluru-based developers.
Meanwhile, social media intermediaries have sought more time to comply with the rules. Facebook said it would abide by the new set of rules.
Facebook-owned WhatsApp, however, cited privacy rights and moved the High Court of Delhi on Tuesday, challenging a traceability clause (see below) in the guidelines, terming it unconstitutional. It also sought the court’s intervention to insulate its employees against criminal liability for non-compliance.
What is safe harbor?
Facebook, Twitter, Instagram, etc., are merely platforms for communication, and the providers do not have control over the content. Safe harbor is a protective mechanism, in place in several countries, that shields platform providers from being held culpable for users' content.
Section 79 of the Information Technology Act, 2000, extends the same protection to intermediaries in India. Under the new rules, service providers could also face legal action if a case is registered over content posted by a user, said Advocate Prasanth Sugathan, legal director of the Software Freedom Law Centre.
What is new?
• Appoint a resident of India as chief compliance officer
• Appoint a nodal officer to coordinate with law enforcement agencies
• Appoint a grievance officer to address complaints
• Prepare a monthly action-taken report, covering the number of complaints received, the action taken, the content removed, etc.
• Comply with a traceability clause: messaging service providers such as WhatsApp should help investigators and prosecutors trace the source of content (identify the first originator of information) in serious cases, such as those pertaining to national security
• Remove illegal content based on a court order or government instruction
Challenges
The social media intermediaries have sought more time to implement the norms in a way that does not threaten the end-to-end encryption now in place.
The firms also foresee that the India-based officers appointed under the guidelines may face criminal cases.
In 2018, Facebook and WhatsApp could not find suitable candidates to head their country divisions, since most candidates stayed away fearing the possibility of criminal charges.