In December 2018, the government proposed major changes to the rules governing online platforms and tech companies in India. These changes, which were introduced to control the misuse of social media and the spread of fake news online, threatened to disrupt the ease of doing business for all internet-based businesses in India. As a result, they were met with strong push-back from companies and civil society organisations alike. The government is now revising these rules in an attempt to balance the competing privacy and security interests.
The government’s proposed rules have the potential to increase the legal risks of doing business in India for tech companies, and so industry experts have urged the government and legislators to consider workable alternatives while tackling fake news and misinformation on the internet.
How Are Tech Companies Affected By India’s New Rules?
To enable their smooth functioning, the Information Technology Act, 2000 (IT Act) grants online platforms (or ‘intermediaries’) a ‘safe harbour’ that protects them from liability for acts of third parties on their platforms. This ensures that internet service providers are not held liable for their subscribers’ unlawful acts on the internet and that social media platforms are not punished for defamatory content posted by their users. However, this immunity is not absolute. To be eligible for the safe harbour protection, companies must comply with the due diligence guidelines issued under the IT Act. It is these guidelines that the government proposed to amend in December 2018.
The new rules proposed by the government dilute the ‘safe harbour’ protection by increasing the obligations that tech companies must comply with. Companies that fail to meet the stricter rules will lose their safe harbour protection and become liable for the acts of their users on their platforms, a significant legal risk. It is therefore important to understand the implications of the government’s proposals.
Three key features of the changes proposed by the government are:
- The local incorporation requirement;
- The strict 24-hour timeline for the takedown of content; and
- The proactive content monitoring requirement.
Each of these requirements significantly impacts the way internet-based businesses operate and requires considerable time, money and resources to implement. In some cases, these requirements are simply impossible to implement technically.
If these requirements are introduced in their current form into the due diligence guidelines, most tech companies in the country will find it extremely difficult to retain their safe harbour protection. This will discourage the growth of tech businesses and stifle innovation in the IT sector. It could also affect investor interest in Indian tech players. This, in turn, would affect the quality of products and services available to Indian consumers.
The Local Incorporation Requirement
Take the local incorporation requirement, for example. If introduced, all online platforms with over 50 lakh (5 million) users in India would be required to incorporate a company with a permanent registered office in India. Currently, there is no such requirement under the due diligence guidelines for intermediaries in India. This requirement ignores the fact that many tech companies with global operations have very lean teams based out of a single country. For example, Wikipedia, one of the most visited websites in the world, is run by the Wikimedia Foundation, a non-profit organisation based in the United States. This organisation is unlikely to have the resources to set up offices in every country in which it offers its services.
If a local incorporation requirement is introduced in India, the Wikimedia Foundation would most likely refrain from offering its services in the country. This would have a detrimental impact on Indian consumers that rely on Wikipedia’s services. Similarly, the local incorporation requirement could discourage other similarly placed organisations from offering their products in India, limiting the variety of services available to Indian consumers.
This requirement also carries the risk of being replicated in other countries, which would hurt the growth of Indian startups. This is because startups would be required to set up local establishments in every country that they offer services in. This would create excessive entry barriers for Indian startups with users in other countries, and eventually splinter the internet itself. Instead of requiring companies to set up offices in India, a viable alternative could be mandating the local presence of a nodal officer or any other authorised representative of an intermediary. This would address the government’s concerns around the accountability of off-shore businesses without discouraging tech startup ambitions or limiting access to services for consumers.
24-Hour Deadline For Takedown Of Content
The second key feature of the changes proposed by the government is the 24-hour timeline for the takedown of unlawful content. This is an excessively short timeframe that does not account for the internal review processes that companies have to undertake every time they receive a takedown request.
There is no accompanying rationale that justifies the need for a 24-hour takedown timeline in all cases. If companies are made to adhere to this timeline, it could lead to excessive censoring of content without any checks and balances, which would affect the freedom of speech and expression. In certain cases, it may even be technically impossible for companies to take down content at such short notice. Instead of adopting a uniform 24-hour deadline for takedowns of all categories of content, a more practical approach would be to assign timelines based on the sensitivity of the content involved.
Proactive Content Monitoring
The third and final key feature of the changes proposed by the government is the proactive content monitoring requirement for online platforms. Under this requirement, internet businesses must proactively identify and remove unlawful content on their platforms. There is no such requirement under the current rules. This requirement poses significant risks to the fundamental rights to privacy and free speech because it allows intermediaries to independently decide what is considered ‘unlawful’, an exercise that should be left to the judiciary.
As explained by Manuj Garg, co-founder of myUpchar.com, “the onus and responsibility of filtering content will be placed on platforms. This can, supposedly, be done either by manual review or through technology.”
He added, “Consider manual review – small platforms don’t have the resources for it, and large platforms won’t be able to do this given the quantum of content created. 500 hours of videos are uploaded on YouTube every minute, for instance. Now consider technology. It is presumed all startups will be able to build and deploy artificial intelligence (AI) systems to check content. Unfortunately, AI is still in its infancy when it comes to understanding the variety and nuances in content, especially in Indian languages. In short, if platforms are held responsible for content filtering, this could spell the end of user-generated content (UGC) and UGC platforms in India.”
“Ultimately, the regulators seem to be missing the fact that those who want to spread misinformation will just find another medium,” Garg said.
While the need to tackle unlawful content on the internet is indeed a legitimate one, compliance with such requirements should not be linked to the intermediary safe harbour, which is crucial for commercial survival.
An alternative could be to make this requirement part of a ‘good practice’ clause that requires good-faith compliance by intermediaries. This would be in sync with current business practices, since most online platforms already have monitoring processes for unlawful content in place.