Ministry: Communication and IT

Rules and Regulations Review 

Amendments to IT Rules, 2021

Key Features of the Rules 

2022 Notified Amendments

  • Intermediaries must ensure compliance with rules and regulations, privacy policy, and user agreement, and make reasonable efforts to cause users to not create, upload, or share prohibited content.  They must also respect the Constitutional rights of users.

  • The central government will establish one or more Grievance Appellate Committees to hear appeals against the decisions of grievance officers.

2023 Draft Amendments

  • The draft Amendments propose to prohibit false/fake information and also regulate online gaming.

  • Intermediaries must remove from their platforms any information identified as false or fake by the fact check unit of the Press Information Bureau (PIB), or any other centrally authorised agency, in order to avoid liability for such content. 

  • Online games are defined as games played on the internet with a deposit and the expectation of earning winnings.  An online gaming intermediary has to ensure additional due diligence, such as displaying a random number generation certificate and a no-bot certificate.

  • Self-regulatory bodies (SRBs) will also be empowered to regulate the content of registered games.  SRBs must evolve a framework to test, verify, and register games such that the sovereignty, integrity, and security of the country are protected.

Key Issues and Analysis

2022 Notified Amendments

  • The Amendments may expand the role of intermediaries and enable them to regulate content at their own discretion, which may impact the freedom of speech of users. 

  • An intermediary may not be the appropriate entity to balance the decision on what content may be prohibited against the need to protect users’ right to freedom of speech. 

2023 Draft Amendments

  • The draft Amendments may violate Article 19 by regulating content on the ground of it being false or fake.  False information is not a constitutional ground for restricting speech.

  • It may not be appropriate to empower an executive body to cause removal of content, without any safeguards in place.  It may lead to instances where any criticism of the government, which is fundamental to a democracy, may be removed without adequate checks and balances.

  • The central government may not have jurisdiction to regulate gaming.  It would have competence to regulate only that aspect of online gaming which corresponds to communication, as communication is part of the Union List.

Intermediaries are entities that store or transmit data on behalf of other persons, and include telecom and internet service providers, online marketplaces, search engines, and social media sites.[1]  The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules) specify due diligence requirements for intermediaries to claim exemption from liability for any third-party information.  They also provide a framework for regulating the content of online publishers of news and current affairs, and of curated audio-visual content.  The 2022 Amendments require intermediaries to make reasonable efforts to cause users to not create, upload, or share prohibited content, and amend the grievance redressal mechanism.  

In December 2022, the Government of India (Allocation of Business) Rules, 1961 were amended to include matters related to online gaming under the purview of the Ministry of Electronics and Information Technology.[2]  Following this, draft Amendments to the IT Rules were released on January 2, 2023.[3],[4]  The 2023 draft Amendments add provisions to regulate fake information and online games.  The Ministry stated that the proposed Amendments intend to prevent user harm caused by playing online games.[5]

KEY FEATURES

2022 Amendments 

  • Obligations of intermediaries: The IT Rules require intermediaries to publish rules and regulations, a privacy policy, and a user agreement for access or usage of their services.  The Amendment adds that these details should be made available in English or any language specified in the Eighth Schedule of the Constitution.  Under the IT Rules, users are prohibited from creating, uploading, or sharing content that threatens the unity of India or public order, is pornographic, violates copyright or patent, or contains a software virus.  Intermediaries must inform users about these restrictions.  The Amendment adds that intermediaries must: (i) ensure compliance with rules and regulations, privacy policy, and user agreement, (ii) make reasonable efforts to cause users to not create, upload, or share prohibited content, and (iii) respect citizens’ rights under the Constitution of India, including Articles 14, 19, and 21.

  • Appeal mechanism against decisions of grievance officers: The IT Rules require intermediaries to designate a grievance officer to address complaints regarding violations of the Rules.  The Amendment adds that the central government will appoint Grievance Appellate Committee(s) to hear appeals against the decisions of grievance officers.  Such appeals may be made within 30 days of the decision of the grievance officer, and should be decided within 30 days.  Orders passed by the Committee shall be complied with by the intermediary and a report to that effect uploaded on its website.

  • Expeditious removal of prohibited content: The IT Rules require intermediaries to acknowledge complaints regarding violation of Rules within 24 hours, and dispose of complaints within 15 days.   The Amendment adds that complaints regarding the removal of specified prohibited content must be addressed within 72 hours.

2023 Draft Amendments

  • Online games: An online game is defined as a game that is offered on the internet and is accessible if the user makes a deposit with the expectation of earning winnings.  The deposit may be made in cash or kind.  A winning refers to any prize, in cash or in kind, that is distributed or intended to be distributed based on the rules of the game and the performance of a player.  The central government may notify any other game as an online game.  

  • Intermediary obligations for online games and fake news: All intermediaries must take reasonable efforts to ensure that users do not host an illegal online game or publish any information that is identified as false or fake by the fact-check unit of the Press Information Bureau (PIB) or any centrally authorised agency.

  • Online gaming intermediaries:  An online gaming intermediary is an intermediary that offers at least one online game.  To host, publish, or advertise an online game, an intermediary must confirm whether the game is registered with a self-regulatory body.  Additional obligations for these intermediaries include: (i) registering their games with a self-regulatory body, (ii) obtaining and displaying a random number generation certificate and a no-bot certificate (a simplified illustration of certified random number generation follows this list), (iii) informing users of the know-your-customer (KYC) procedure for user registration, the risk of financial loss and addiction associated with the game, and the measures taken to protect the user’s money, and (iv) verifying user identity as per RBI procedures for account-based relationships.  Such intermediaries must have a physical address in India.  

  • Grievance redressal: Online gaming intermediaries are required to publish a mechanism for addressing complaints regarding violations of the Rules.  If a person is aggrieved by the intermediary’s first level of grievance resolution, they may escalate the matter to the self-regulatory body (SRB), which is required to have a grievance mechanism in place.  If the complaint is not resolved at this level, it may be escalated to the Grievance Appellate Committee, which is appointed by the central government.
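As referenced above, a random number generation certificate relates to how game outcomes are generated.  The draft Amendments do not prescribe any particular implementation; the sketch below is purely illustrative, assuming a game that draws outcomes from a cryptographically secure generator, which is the kind of property such a certificate would typically attest to.  The function names and parameters are hypothetical.

```python
# Purely illustrative sketch: generating game outcomes with a
# cryptographically secure random number generator (CSPRNG).
# The draft Amendments do not prescribe any implementation; the
# function names and parameters here are hypothetical.

import secrets

def draw_card(deck_size: int = 52) -> int:
    """Return a uniformly random card index in the range 0..deck_size-1."""
    return secrets.randbelow(deck_size)

def roll_dice(sides: int = 6) -> int:
    """Return a uniformly random dice roll in the range 1..sides."""
    return secrets.randbelow(sides) + 1

if __name__ == "__main__":
    print("card index:", draw_card(), "dice roll:", roll_dice())
```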

KEY ISSUES AND ANALYSIS

2022 Amendments 

Amendments may expand the role of intermediaries    

Under the Information Technology Act, 2000, an intermediary is not liable for third-party information that it holds or transmits.[6]  However, to claim such exemption, it must fulfil due diligence requirements under the Act and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules).[7]  These requirements include specifying in service agreements the categories of content that users are not allowed to upload or share, and taking down content on receiving a government or court order.  Prohibited content includes material that is obscene, is harmful to children, impersonates another person, or threatens public order.  The Amendment adds that intermediaries must make “reasonable efforts to cause” users to not create, upload, or share prohibited content.  This requires intermediaries to use their own discretion in deciding what content is prohibited and take measures to prevent such content from being hosted on their platforms.  There are two issues with this.  

First, the Supreme Court (2015), in Shreya Singhal vs Union of India, held that under the IT Act, 2000, intermediaries can only disable content upon receiving an order from a court or from the appropriate government or its agency.[8]  Requiring intermediaries to apply their own minds in deciding what constitutes prohibited content may expand their role from being facilitators of user-generated content to regulators of content on their platforms.  

Second, intermediaries are also required to respect users’ fundamental right to speech and expression (Article 19).  This implies that they will have to weigh the decision to remove content against the possibility that such removal violates a user’s right to speech.  Intermediaries may not be the most appropriate entities to decide whether the removal of any content violates a citizen’s fundamental right, as such questions require judicial capability and are typically decided by courts.

2023 Draft Amendments

Rules may be going beyond the powers delegated under the Act

The IT Act provides a safe-harbour model for intermediaries.  Under the Act, the central government may make Rules specifying: (i) safeguards or procedures to block information for access by the public, and (ii) guidelines to be observed by intermediaries for exemption from liability for third-party information.[9]  The draft Amendments: (i) create a new ground of false information for restricting content, (ii) add the definition of online games, (iii) create a new category called online gaming intermediary, (iv) provide for creating self-regulatory bodies which would register such online gaming intermediaries, and (v) create a framework to regulate their content.  

The Rules may be going beyond the scope of the Act by adding: (i) a new ground of false information (content which does not violate any existing law) for intermediary protection, (ii) a new category of intermediaries (online gaming), and (iii) regulation of a new set of online activities.  The Act does not provide for the regulation of false information, nor does it delegate the power to regulate online gaming to the Executive.  The Supreme Court has held that Rules cannot alter the scope, provisions, or principles of the parent Act.[10],[11],[12]

Removing false information 

The IT Act regulates intermediaries through a safe harbour model.  Under this model, they are granted protection from liability for any illegal user-generated content if they fulfil certain obligations.  The IT Rules specify the intermediary obligations for claiming safe harbour.  The draft Amendments add that, in order to claim safe harbour, intermediaries must remove any content that is identified as false by the fact check unit of the Press Information Bureau or any other centrally authorised agency.  This raises several issues, as discussed below.

Removing information for being false may violate fundamental rights

Removing online content for being false may undermine: (i) citizens’ right to freedom of speech and expression, and (ii) journalists’ right to practise their profession.  Under Article 19(1)(a), all citizens have the right to freedom of speech and expression.[13]  Article 19(2) provides that this right may be restricted only on grounds of national security, public order, decency or morality, contempt of court, defamation, or incitement to an offence.[14]  False information, in and of itself, is not a constitutional ground for restricting speech.  Therefore, an individual has the right to speech that may be false, unless it falls within these grounds.  In 2015, while examining amendments to the IT Act, the Supreme Court struck down Section 66A since it restricted free speech beyond the grounds mentioned in Article 19(2).  Section 66A prohibited sharing information that is false or offensive and causes annoyance, danger, insult, or injury.[8]  

As per the draft Amendments, news articles, if identified as false by PIB or any other centrally authorised agency, may be removed from online platforms.  This may violate journalists’ right to carry out their profession.  Journalists use intermediary platforms for disseminating news and opinion pieces, and circulating news on such platforms may be integral to their profession.  The Supreme Court has held that the freedom to practise a profession through the medium of the internet is constitutionally protected under Article 19(1)(g), subject to reasonable restrictions in the public interest.[15]  These restrictions may also violate the freedom of the press, which is protected under the freedom of speech and expression in Article 19(1)(a).[15],[16]  

Intermediary protection for false information that may not cause harm

Intermediary liability can arise (and safe harbour is needed) only when an offence is committed on their platforms.  However, since simply posting false information is not an offence, there may not be a need for intermediary protection in such cases.  False information is currently regulated to address specific harms arising out of the spread of such information.  For instance, the Indian Penal Code (IPC), 1860 penalises defamation, i.e., making false statements about a person with the intention of harming their reputation.[17]  Section 171G of the IPC penalises false statements made concerning the personal character of a candidate, with the intent of affecting the result of an election.[18]  The Consumer Protection Act, 2019 prohibits misleading advertisements that make false claims regarding a product, its use, or its guarantee.[19]

Removal of content by the executive

To claim safe harbour, an intermediary must remove any information identified as false or fake by PIB or any other centrally authorised agency.  It may not be appropriate to empower an executive body to cause removal of content.  For example, there may be content that is critical of the government; authorising a government body to direct its removal may create a conflict of interest and violate the principle of separation of powers.  

There are some instances where the executive may direct the removal of content.  Section 69A of the IT Act empowers the central government to direct the blocking of information if it is necessary for certain objectives such as the security of the state, public order, or the prevention of incitement to offences related to these.  The Rules made under this Section authorise the Secretary of the IT Ministry to block access on the recommendation of a committee of Joint Secretary-level officers.[20],[21]  The originator of the information is provided an opportunity of being heard before a blocking order is made.  The Supreme Court has upheld the validity of Section 69A since it sets a high threshold for allowing the executive to block content.[8] 

Courts have ruled that only a high-level executive authority may curtail fundamental rights.  Under the Aadhaar Act, 2016, a Joint Secretary was empowered to order the disclosure of individual information (biometric data or Aadhaar number) in the interest of national security.[22]  The Supreme Court struck down this provision and held that a high-ranking officer, preferably along with a judicial officer, should be empowered to order such disclosure.[23]  Consequently, the Act was amended to empower a Secretary-level officer.

Further, under the IT Rules, a user may approach a centrally appointed Grievance Appellate Committee to complain against content removal.  This implies that the executive determines what content to remove, and also resolves complaints against such removal.  Since the removal of such content has implications for free speech, it may not be appropriate for the Executive to resolve such grievances.  Note that if a person is not satisfied with the decision of the Grievance Appellate Committee, they may approach the courts.

International experience with regulating false information 

The European Union addresses the issue of false/fake information through voluntary self-regulatory mechanisms.  The 2022 Code of Practice on Disinformation, signed by intermediaries, specifies several commitments to counter online disinformation.[24]  These include: (i) not funding the dissemination of disinformation, (ii) ensuring transparency in political advertising, and (iii) enhancing cooperation with fact-checkers.  The European Democracy Action Plan identifies disinformation as false information that is shared with an intent of causing harm, and that may cause public harm.[25]   

Several intermediaries already regulate false or misleading information through voluntary fact-checking.  Social media intermediaries create ‘community standards’ that users must abide by in order to use the service.  For instance, Twitter users can provide context on potentially misleading posts through the ‘Community Notes’ feature.  Contributors can leave notes on any tweet, and if enough contributors from different points of view rate a note as helpful, the note is publicly shown on the tweet.  Such tweets are not removed.  During the run-up to the 2020 Presidential elections in the United States, Twitter updated its Civic Integrity Policy, which labels tweets that make false claims about polling booths, election rigging, or ballot tampering, or that misrepresent affiliations.[26]  Such content is either prohibited or flagged as misleading, depending on the severity of the violation of Twitter’s policy.[26]  Similarly, Instagram prohibited false claims regarding COVID-19 and its vaccines, to ensure that misinformation regarding the spread of the disease and the effectiveness of the vaccine did not spread.[27]  It also flagged all content related to COVID-19 with a disclaimer to trust only verified medical research.  
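The mechanism described above can be illustrated with a simplified sketch.  This is not Twitter’s actual rating algorithm, which is more sophisticated; the thresholds, group labels, and function names below are hypothetical and are chosen only to show the idea that a note is displayed only when raters from different viewpoint groups agree it is helpful.

```python
# Simplified, illustrative sketch of a Community Notes-style check
# (not Twitter's actual algorithm): a note is shown only if raters
# from enough distinct viewpoint groups find it helpful.

from collections import defaultdict

def should_show_note(ratings, min_helpful_per_group=3, min_groups=2):
    """ratings: list of (viewpoint_group, is_helpful) tuples.

    Returns True if at least `min_groups` distinct viewpoint groups each
    contribute at least `min_helpful_per_group` helpful ratings.
    The thresholds are hypothetical, chosen only for illustration.
    """
    helpful_by_group = defaultdict(int)
    for group, is_helpful in ratings:
        if is_helpful:
            helpful_by_group[group] += 1
    groups_agreeing = sum(
        1 for count in helpful_by_group.values() if count >= min_helpful_per_group
    )
    return groups_agreeing >= min_groups

# Example: a note rated helpful by contributors from two different groups is shown.
ratings = [("group_a", True)] * 3 + [("group_b", True)] * 3 + [("group_a", False)]
print(should_show_note(ratings))  # True
```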

Regulation of online gaming

Jurisdiction of the Centre to regulate online gaming 

The central government seeks to regulate online gaming through the rule-making powers given under the IT Act, 2000.[9]  While ‘games’ is not included under any List in the Seventh Schedule of the Constitution, ‘sports’ (entry 33) and ‘betting and gambling’ (entry 34) are included in the State List.[28]  Communication (entry 31) is included in the Union List.  Online games are played through communication devices on the internet.  To determine legislative competence over a subject matter, courts typically use the doctrine of pith and substance.[29]  That is, they identify the central purpose of the law and see which of the Lists it falls under.

Based on this doctrine, the Centre would have competence to regulate only those aspects of online gaming which correspond to communication.  However, in 2018, the Law Commission observed that Parliament has the competence to legislate on online betting and gambling as it is played over communication media (such as telephones, wireless, or broadcasting).[30]

The draft Amendments provide for self-regulatory bodies (SRBs) that will be responsible for verifying games and regulating their content.  They also delineate the principles through which SRBs will regulate the content of online games.  The question is whether such requirements go beyond regulating the communication aspect of gaming, and hence beyond the legislative competence of the Centre.  Further, as discussed earlier, prescribing these regulations through Rules may exceed the permitted level of delegation.

Requiring game users to do a KYC verification may be excessive

The draft Amendments require online gaming intermediaries to inform users of the know-your-customer (KYC) procedure followed to register a user account.  Hence, gaming intermediaries must have a KYC procedure.  As per the Rules under the Prevention of Money Laundering Act, 2002, a KYC verification is carried out to prevent money laundering or the financing of terrorism.[31],[32]  It is unclear whether playing an online game requires such a high threshold for user identification.  Service providers such as ride-sharing apps, content streaming platforms, or physical lotteries also involve financial transactions but do not require a KYC for customer identification.  Further, a KYC verification must be carried out by banks or non-banking financial companies.[33]  To the extent that online game users are transferring money, a KYC verification would already be covered under the respective bank’s requirement for customer identification.

 

[1]. Section 2 (1) (w), The Information Technology Act, 2000.

[2]. Government of India (Allocation of Business) (Three Hundred and Seventieth Amendment) Rules, 2022, December 23, 2022. 

[3]. Draft Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2023, Ministry of Electronics and Information Technology, January 2, 2023.

[4]. Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[5]. Notice for Public Consultation, Ministry of Electronics and Information Technology, January 2, 2023.

[6]. Section 79, The Information Technology Act, 2000.

[7]. Rule 3 (1), The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

[8]. Shreya Singhal vs Union of India, Writ Petition (Criminal) No. 167 of 2012, Supreme Court of India, March 24, 2015.

[9]. Section 87, The Information Technology Act, 2000.

[10]. Agricultural Market Committee vs Shalimar Chemical Works Ltd, 1997 Supp (1) SCR 164, The Supreme Court of India, May 7, 1997.

[11]. Kerala State Electricity Board vs Indian Aluminium Company, 1976 SCR (1) 552, The Supreme Court of India, September 1, 1975.

[12]. State of Karnataka vs Ganesh Kamath, 1983 SCR (2) 665, The Supreme Court of India, March 31, 1983.

[13]. Article 19 (1), The Constitution of India.

[14]. Article 19 (2), The Constitution of India.

[15]. Anuradha Bhasin vs Union of India, Writ Petition (Civil) No. 1031 of 2019 and 1164 of 2019, The Supreme Court of India, January 10, 2020.

[16]. Express Newspaper (Private) Ltd. vs Union of India, The Supreme Court of India, January 8, 1958.

[17]. Sections 499 and 500, Indian Penal Code, 1860.

[18]. Section 171(G), Indian Penal Code, 1860.

[19]. The Consumer Protection Act, 2019.

[20]. Information Technology (Procedure and Safeguards for Blocking Access of Information by Public) Rules, 2009.

[21]. Section 69A, The Information Technology Act, 2000.

[22]. The Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016.

[23]. Justice K.S. Puttaswamy vs Union of India and others, Writ Petition (Civil) No. 494 of 2012 and connected matters, The Supreme Court of India, September 26, 2018.

[24]. The 2022 Code of Practice on Disinformation, June 16, 2022, as updated on July 4, 2022.

[25]. European Democracy Action Plan, European Commission, December 3, 2020.

[26]. Civic Integrity Policy, Twitter, October 2021, as accessed on January 26, 2023. 

[27]. Community Guidelines, Instagram Help Centre, as accessed on January 24, 2023.

[28]. Seventh Schedule, The Constitution of India.

[29]. Background Paper on Concurrent Powers of Legislation under List III of the Constitution of India, P.M. Bakshi.

[30]. Report No. 276, Law Commission of India, Legal Framework: Gambling and Sports Betting Including in Cricket in India, July 5, 2018.

[31]. Prevention of Money Laundering (Maintenance of Records) Rules, 2005.

[32]. Prevention of Money Laundering Act, 2002.

[33]. Master Direction – Know Your Customer (KYC) Direction, 2016, Reserve Bank of India, as on May 10, 2021. 

DISCLAIMER: This document is being furnished to you for your information.  You may choose to reproduce or redistribute this report for non-commercial purposes in part or in full to any other person with due acknowledgement of PRS Legislative Research (“PRS”).  The opinions expressed herein are entirely those of the author(s).  PRS makes every effort to use reliable and comprehensive information, but PRS does not represent that the contents of the report are accurate or complete.  PRS is an independent, not-for-profit group.  This document has been prepared without regard to the objectives or opinions of those who may receive it.