TikTok (formerly known as Musical.ly), owned by Chinese content aggregator ByteDance, is reportedly taking steps to comply with the Indian government's concerns around content and privacy rules for social media companies.
The user-generated video sharing app, which counts 39% of its 500 Mn global users in India, mostly in the 16-24 age group, recently appointed Sandhya Sharma from Mastercard as TikTok India's public policy director. The social media app is said to be working to improve content hygiene on its platform.
It has put in place protective measures to remove unsuitable content, combining content moderation technology with a human moderation team based in over 20 countries and regions that now covers 36 languages. On TikTok, users can create short videos and, like any other social app, it has followers, hashtags, likes and comments. However, what separates it from others is its live-streaming feature, which allows users to send virtual gifts that can be bought with real currency.
In India, the company said, the moderation team covers major Indian vernacular languages including Hindi, Tamil, Telugu, Bengali and Gujarati. The social media app has its offices in Mumbai and Delhi-NCR. It has recently started an online campaign in partnership with Cyber Peace Foundation to promote online safety.
“We, at TikTok, are continuously working to introduce additional features to promote safety. TikTok’s first of a kind Digital Wellbeing feature which limits the time users can spend on the app is one such example,” Sharma said in a statement issued by TikTok.
In the past, TikTok introduced a feature that allows users to delete or block fans whose comments contain hate speech. The app also has a feature called ‘Restricted Mode’, which helps users filter out inappropriate content.
To make intermediaries (social media platforms) more responsive in blocking fake news, hate speech and nudity from being shared on their platforms, the Indian government has introduced draft guidelines under Section 79 of the IT Act.
“The said rules seek to ensure better regulation of content on platforms of various intermediaries as well as the source of the content. The said platforms could face liability if they do not take down the said content after attaining ‘actual knowledge’ of the illegal nature of the content, within a period of 72 hours,” law firm Athena Legal’s principal associate Simranjeet Singh said about the responsibility of social media platforms.