New IT rules: Need of the hour or vice on free speech?

Experts believe that self-regulation by social media companies needed a check and that the new IT rules amendment could be the answer, though it must be handled with care

e4m by Nilanjana Basu
Published: Nov 2, 2022 8:45 AM  | 6 min read

Last Friday, the Indian government amended its IT rules for social media companies in an attempt to tighten oversight of how controversial content is moderated on their platforms. The rules now require the formation of government-appointed panels to hear users' complaints about the moderation of their content on social media platforms.

Each Grievance Appellate Committee will consist of a chairperson and two whole-time members appointed by the central government. All social media companies, including Facebook, Twitter, Instagram, and YouTube, must comply with the new rule.

The Indian government has had a tough relationship with big tech social media companies. Several times in the past, the platforms have received requests to remove content that was not considered appropriate for public circulation.

Big tech companies have advocated self-regulation, but the government now wants to take matters into its own hands and ensure that users' complaints are looked into.

Tech companies had already been told to appoint a chief grievance officer to address user complaints about content and to have executives in place to handle legal matters.

The new amendment requires companies to acknowledge user complaints within a day, resolve them within 15 days, and take down content within 72 hours when it needs to be removed.

The move comes just as Tesla chief Elon Musk has taken over Twitter’s operations. Musk has been openly advocating free speech on the platform and has tweeted jokes about it, such as “Finally, the truth that carbs are amazing can be said on this platform! #FreeSpeech” and “Comedy is now legal on Twitter.” However, he also said, “Twitter will be forming a content moderation council with widely diverse viewpoints. No major content decisions or account reinstatements will happen before that council convenes.”

A necessary evil?

Speaking about the new amendment, Karan Taurani of Elara Capital thinks it could act as a useful filter for unwanted or hurtful content. “These regulatory issues are being spoken about for quite some time in the country. India, as you know, is a free country. People can speak whatever they want. The government keeping an eye on the platforms is a good thing in a way, because it will definitely lead to the filtration of some amount of content. People may think before posting anything controversial that can hurt religious sentiments or spoil the harmony of society. So, I think that is the bigger implication of this.”

Tanya Swetta, CEO & Co-Founder of id8 media solutions, says: “The moderation of content has always been an integral part of social media platforms like Facebook, Twitter, and Google enforcing their own rules, most likely by policing commercial content. The new rules certainly do give more control to the government over content moderation decisions, and in turn, could affect the way agencies handle content moderation for their own clients while keeping in mind the government's directive. id8 media solutions believe in being able to voice opinions as brand custodians and as industry experts, and therefore, I believe that this would raise the question of how a brand could remain agile during these times.”

Hareesh Tibrewala, Joint CEO of Mirum India, believes this sort of self-regulation by big tech needed a check. “Social media companies will need to set up people and processes that can actively moderate content on social channels and delete stuff that is unlawful and illegal. Social media companies have become extremely powerful in terms of their ability to shape public opinion. Thus, some kind of regulation and grievance addressal mechanism is imperative. The challenge is to ensure that the grievance mechanism does not become a kind of censorship, which allows governments to prevent dissent from being voiced.”

Viren Razdan, Managing Director of Brand-Nomics, also weighs in on how delicate the question of free speech can be. “Up until now, there has been some sort of unwritten self-policing but with a sort of ‘code of conduct’ coming into play, it would have to be seen how this is governed and executed. On one hand, it’s against the very nature of free speech and expression, which is the basis of the social space. On the other hand, if allowed to be policed, neutrality of the view becomes a critical factor.”

He further adds, “A code of conduct opens itself to bias and interpretations which could colour social space. While I say this, I do realise that freedom is and can be used to manipulate, at times to dangerous proportions. So, it is a thin line but depends on the maturity of markets and minds.”

Rashid Ahmed, Digital Head of Infectious Advertising, points out that one of the keys to successful content moderation is specifying what is and is not acceptable on social media. "A bulk of content could be successfully moderated by automated systems that check against previously disapproved content, and such systems could likely learn over time. Legal systems already regulate what constitutes acceptable standards for free speech, which in turn may need to be applied to self-regulating or panel-moderated social media platforms."
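As a purely illustrative aside, the kind of automated check Ahmed describes can be imagined along these lines: a toy Python sketch that flags new posts matching content already taken down, assuming a simple in-memory store of disapproved posts. All names here (record_disapproval, moderate_post, DISAPPROVED_HASHES) are hypothetical and do not reflect any platform's actual system.

```python
# Toy moderation filter: flags posts that match previously disapproved content.
# Illustrative only; real platforms use far more sophisticated matching and ML.
import hashlib

DISAPPROVED_HASHES: set[str] = set()  # fingerprints of content already taken down


def _fingerprint(text: str) -> str:
    """Normalise the text and hash it so trivially re-posted copies still match."""
    normalised = " ".join(text.lower().split())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()


def record_disapproval(text: str) -> None:
    """'Learn' from a moderation decision by remembering the removed content."""
    DISAPPROVED_HASHES.add(_fingerprint(text))


def moderate_post(text: str) -> str:
    """Return 'blocked' if the post matches previously disapproved content."""
    return "blocked" if _fingerprint(text) in DISAPPROVED_HASHES else "allowed"


# Example: once a post is disapproved, identical re-posts are caught automatically.
record_disapproval("Some content a grievance panel ordered removed")
print(moderate_post("some content a grievance panel  ordered removed"))  # blocked
print(moderate_post("An unrelated, harmless post"))                      # allowed
```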

Impact of content moderation

Taurani also talks about how the new rule might impact social media companies. “Even last year, the government came out with some kind of norms around digital media, wherein they clearly mentioned that they want social media companies to have a chief compliance officer, a resident grievance officer and a monthly compliance report. And this is something again which was done to avoid any kind of controversies, any kind of a statement which is hurting anybody's sentiment.

"Social media companies will need to invest a lot more in terms of legal issues in India. Despite a lot of the complaints on the platforms, the complaints are not being addressed by them due to lack of any kind of a grievance addresal system, but now they will need to invest a lot more in terms of getting the right people who can tackle these kinds of issues.”

Swetta highlights the potential usefulness of content moderation: “An effective content moderation guideline could have a positive impact on social media companies as well as their clientele. It would help in safeguarding community conversations, ensuring information accuracy, protecting individuals and their interests from being trolled, and could bring meaningful transparency. Considering how volatile the content market is today, it is crucial to find ways to rebuild trust in content platforms.”
