In the run-up to the release of the revised rules for online businesses (the proposed intermediary guidelines) by the Ministry of Electronics and Information Technology (MeitY), expected in January 2020, Inc42, in association with Ikigai Law, organised a roundtable interactive session with startups on November 29 to discuss the government’s approach to regulating online businesses and the impact this would have on India’s startups.
Moderated by Vaibhav Agarwal, founder and CEO of Inc42, along with Tanya Sadana, principal associate, and Tuhina Joshi, associate, at Ikigai Law, ‘The Dialogue’ focussed on three key issues under the proposed intermediary guidelines framed under the Information Technology Act, 2000 (IT Act): local incorporation, proactive content monitoring and strict takedown timelines.
The invite-only roundtable was attended by startup founders who brought up a host of challenges that have not been addressed under the proposed draft of the intermediary guidelines.
The Key Takeaways From The Dialogue On Intermediary Guidelines
Local Incorporation Requirements
Currently, there is no requirement for online businesses (in technical terms, ‘intermediaries’ under the IT Act) to set up permanent registered offices in the country. This allows small businesses, particularly bootstrapped startups, to set up online businesses and offer services in the country with ease from any location in the world. This, in turn, fosters innovation and the spirit of entrepreneurship in the country.
In December 2018, MeitY released a set of proposed amendments to the intermediary guidelines that mandate local incorporation (i.e. a permanent registered office) for all online platforms with over 50 lakh users in India. Reacting to this requirement, Pramod Kumar, CBO of online skill-building platform AttainU, argued that requiring companies to have permanent registered offices in India was neither necessary nor useful. Instead, he suggested that nodal officers could serve as effective channels of communication between the government and a company’s core team.
The discussion then moved on to the effects such a local incorporation requirement would have on the global ambitions of startups. There is a very real possibility that other countries would introduce similar local incorporation requirements in their own jurisdictions.
This means that Indian startups wishing to offer services abroad would have to set up a local establishment in each and every country they operate in, creating excessive entry barriers for Indian startups expanding overseas.
Utkarsh Sinha of Bexley Advisors added that such requirements would threaten the very architecture of the internet by splintering it, ultimately reducing the free flow of services across countries that we enjoy today.
The discussion further highlighted that, instead of government regulation, a self-regulatory mechanism would allow the government’s concerns regarding law enforcement to be assuaged without unnecessarily impinging on startup growth in the country.
Proactive Content Monitoring Requirements
The proposed amendments also require online platforms to proactively monitor user-generated content and block ‘unlawful’ content; there is no such requirement under the current rules. According to Sankaranarayanan Devarajan, cofounder of vernacular content-sharing platform Pratilipi, it is extremely difficult for startups to draw the line between lawful and unlawful speech, so this is not an obligation that should be placed on them. He stressed that, for Pratilipi, recognising problematic content in visual media is relatively easy, but doing so with the written word, particularly in vernacular languages, is nearly impossible.
Himanshu Gupta, former head of growth at fintech app Walnut, highlighted a perverse incentive: the rule might encourage companies to keep their content moderation teams small. If a company is unaware of illegal content on its platform, it may be extended safe harbour protection because it fulfils the test of being a ‘passive’ intermediary. If, on the other hand, it runs a large content moderation team and is still unable to filter out illegal content despite its best efforts, it would inevitably lose safe harbour protection, exposing it to heavy penalties and liability.
Dibyojyoti Mainak, general counsel of mobile gaming platform Mobile Premier League, added that proactive filtering requirements would place a heavy burden on fast-growing startups and hamper their growth. In his view, a flagging system that labels potentially problematic content would work better than requiring outright bans.
Comparing this to television, he explained that he would be happy to sort content into different buckets, much as televised content is divided into categories such as news, sports and soap operas. This would let users make an informed choice about whether or not to consume a piece of content, rather than having it banned outright.
AttainU’s Kumar explained that when even large companies are undecided on the best method of content moderation, it would be unfair to place that burden on small companies; doing so would, in fact, deter entrepreneurs from entering the startup space in India.
Strict Takedown Timelines
The final topic of discussion was the strict 24-hour deadline, introduced under the proposed amendments, for complying with requests to take down content deemed unlawful or objectionable. As explained in our previous piece, this is an excessively short timeframe that does not account for the internal review processes companies must undertake every time they receive a takedown request.
On this issue, the room was in general agreement that 24 hours is nowhere near enough time to comply with takedown requests; in many cases it is simply technically infeasible. Participants at The Dialogue by Inc42 and Ikigai Law also largely agreed that if short timelines are to be introduced at all, they should apply only to sensitive content, not to all content generally.