Thursday, 13 November 2025

An Open Letter to the Ministry of Electronics and Information Technology: A Critique of the India AI Governance Guidelines on the Omission of Mandatory Disability and Digital Accessibility Rules

To:

The Secretary, Ministry of Electronics and Information Technology (MeitY)
Government of India, New Delhi
Email: secretary[at]meity[dot]gov[dot]in

I. Preamble: The Mandate for Accessible and Inclusive AI

The recently issued India AI Governance Guidelines (I-AIGG) assert a vision of “AI for All” and commit India to inclusive technology, the optimisation of social goods, and the avoidance of discrimination. However, the guidelines fail to operationalise mandatory and enforceable disability and digital accessibility rules: a legal and ethical lapse that undermines both national and international obligations. As a professional engaged in technology policy and disability rights, and in light of the Supreme Court's judgment in Rajive Raturi v. Union of India (2024), I write to outline why voluntary commitments are insufficient and why robust, mandatory accessibility standards are immediately warranted.

II. The Policy Paradox: Aspirational Promises versus Legal Obligations

The I-AIGG framework advances “voluntary” compliance, elevates inclusive rhetoric, and references “marginalised communities” in its principles. However, it neither defines “Persons with Disabilities” (PwDs) nor mandates conformance with domestic accessibility rules, as legally required by the Rights of Persons with Disabilities Act, 2016 (RPwD Act). This introduces a regulatory gap: aspirational principles supplant the non-negotiable legal floor guaranteed to persons with disabilities. Such dilution is legally unsustainable given India’s obligations under the UNCRPD and under Sections 40, 44, 45, 46, and 89 of the RPwD Act.

III. The Rajive Raturi Judgment: Reinforcing Mandatory Compliance

The Supreme Court’s decision in Rajive Raturi (2024) unambiguously directed the Union Government to move from discretionary, guideline-based approaches to compulsory standards for accessibility across physical, informational, and digital domains. The Court found that reliance on non-binding guidelines and sectoral discretion violated statutory mandates, and it instructed the creation of enforceable, uniform, and standardised rules developed in consultation with persons with disabilities and stakeholders.

This is particularly relevant to digital and AI governance, where exclusion can be algorithmic, structural, and scaled, denying access to education, employment, health, and social participation. The judgment rejects the adequacy of sectoral or voluntary approaches: digital accessibility is a fundamental right, and non-compliance amounts to a denial of rights for PwDs in India.

IV. The EU Benchmark: Legal Mandates, Not Discretion

The European Union’s AI Act (Regulation (EU) 2024/1689), read with the accessibility directives it references (the Web Accessibility Directive (EU) 2016/2102 and the European Accessibility Act, Directive (EU) 2019/882), establishes mandatory, rights-based compliance for digital accessibility. The AI Act:

  • Explicitly enforces accessibility as a legal obligation, not a voluntary commitment, anchored in the UNCRPD and Universal Design principles.
  • Mandates that all high-risk AI systems comply with technical accessibility standards by design, with legal penalties for non-compliance.
  • Classifies systems impacting education, employment, healthcare, and public services as high-risk, subjecting them to strict regulatory scrutiny.
  • Prohibits AI practices that exploit the vulnerabilities of persons with disabilities (Article 5(1)(b)), addressing historical and algorithmic bias at source.

Thus, the EU approach demonstrates enforceable protection for PwDs, with stakeholder consultation, technical linkage to sectoral accessibility standards, and mechanisms for remediation and complaint.

V. Critique of I-AIGG: Core Deficiencies and Recommendations

  1. Absence of Disability-Specific Provisions:
    The term “marginalised communities” is insufficiently specific. India’s legal framework demands explicit protection for PwDs, including reasonable accommodation, accessible formats (such as EPUB and OCR-readable PDF), and compliance with domestic standards (GIGW and the Harmonised Guidelines, 2021).

  2. No Accessibility-By-Design Mandate for AI:
    While the I-AIGG insists on “Understandability by Design,” it fails to require “Accessibility by Design.” Systems that are explainable but not operable by PwDs remain discriminatory.

  3. Inadequate Response to Algorithmic Bias:
    AI bias mitigation in the I-AIGG does not extend to under-represented disability data or to the systemic exclusion caused by inaccessible training sets. The EU model, by contrast, mandates active audit and correction of disability-related data bias; a minimal illustration of such an audit follows this list.

  4. Weak Grievance Redressal Mechanisms:
    Voluntary or generic redress measures neglect the diversity of disability and the necessity for robust, accessible remedies in every sector where AI is used.

  5. Non-compliance with Judicial Mandate:
    Above all, the approach bypasses the Supreme Court’s explicit instructions to operationalise compulsory rules – an omission that is both ultra vires and constitutionally indefensible.
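
To make the third point concrete, the following is a deliberately minimal sketch, in Python, of what a dataset-level disability-representation audit could look like. Nothing in it is drawn from the I-AIGG or from any notified standard: the column name, the benchmark share, and the audit logic are illustrative assumptions only.

    # Illustrative sketch only: checks how well a tabular training set represents
    # persons with disabilities, assuming a hypothetical "disability_status" column
    # and a benchmark share that a regulator would fix from census or survey data.
    import pandas as pd

    BENCHMARK_SHARE = 0.022  # hypothetical population benchmark, to be set by regulation

    def audit_disability_representation(df: pd.DataFrame,
                                         column: str = "disability_status",
                                         benchmark: float = BENCHMARK_SHARE) -> dict:
        """Compare the share of disability-flagged records against a benchmark."""
        if column not in df.columns:
            # The absence of the attribute is itself a finding: representation cannot be audited.
            return {"auditable": False, "reason": f"no '{column}' column in dataset"}
        observed = float(df[column].astype(bool).mean())
        return {
            "auditable": True,
            "observed_share": observed,
            "benchmark_share": benchmark,
            "under_represented": observed < benchmark,
        }

    if __name__ == "__main__":
        sample = pd.DataFrame({
            "applicant_id": range(1, 7),
            "disability_status": [False, False, True, False, False, False],
        })
        print(audit_disability_representation(sample))

A binding standard would of course go much further, covering representation across all 21 recognised disability categories, intersectional attributes, data provenance, and correction and re-audit obligations.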

VI. Policy Prescription: Steps Toward Compliance

  • Draft and Notify Mandatory AI Digital Accessibility Standards:
    MeitY must codify and enforce AI digital accessibility standards as binding, not optional, rules. These must reference existing Indian standards (GIGW/HG21), adopt international best practices (WCAG), and be technology-agnostic; a deliberately minimal automated check is sketched after this list.

  • Classify High-Risk AI Systems with Disability Lens:
    Mandate Disability Impact Assessments, mirroring the EU approach, for all AI systems deployed in health, education, employment, and public services.

  • Institutionalise Disability Rights Expertise:
    Add disability rights experts and diverse PwD representatives to the AI Governance Group and the Technology Policy Expert Committee, to ensure continued compliance monitoring and gap correction.

  • Mandate Dataset Audits and Privacy Protections:
    Require dataset bias audits for disability, establish anonymisation protocols for disability-related data, and ensure the representation of persons with disabilities in AI datasets.

  • Create Enforceable, Accessible Grievance Redress Channels:
    Grievance and remedy processes must be operable by persons across all 21 recognised disability categories, available in multiple formats and languages, with offline options for digitally marginalised users.
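
To illustrate the first recommendation above, the sketch below runs a deliberately minimal automated spot-check of two WCAG success criteria (1.1.1 Non-text Content and 3.1.1 Language of Page) against raw HTML. It is an assumption-laden toy, not a conformance tool; genuine GIGW/WCAG conformance requires full-coverage audits, assistive-technology testing, and manual review.

    # Illustrative sketch only: a minimal spot-check of two WCAG success criteria
    # (1.1.1 Non-text Content, 3.1.1 Language of Page) on raw HTML.
    from bs4 import BeautifulSoup

    def quick_accessibility_check(html: str) -> list[str]:
        """Return human-readable findings for two basic WCAG checks."""
        soup = BeautifulSoup(html, "html.parser")
        findings = []

        # WCAG 1.1.1: every meaningful image needs a text alternative.
        for img in soup.find_all("img"):
            if not (img.get("alt") or "").strip():
                findings.append(f"Image without alt text: {img.get('src', '<no src>')}")

        # WCAG 3.1.1: the page language must be programmatically determinable.
        html_tag = soup.find("html")
        if html_tag is None or not (html_tag.get("lang") or "").strip():
            findings.append("Missing lang attribute on the <html> element")

        return findings

    if __name__ == "__main__":
        sample = "<html><body><img src='chart.png'></body></html>"
        for finding in quick_accessibility_check(sample):
            print(finding)

The point is not these particular checks but that machine-verifiable criteria of this kind can be written into binding rules and tested at scale, rather than left to voluntary self-assessment.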

VII. Conclusion and Urgent Appeal

Presently, the I-AIGG’s disability approach is aspirational, not enforceable; voluntary, not mandatory. This is contrary to the Supreme Court's directive, India's legal obligations, and international best practice. To prevent algorithmic exclusion and rights denial, MeitY must urgently revise the I-AIGG:

  • To operationalise mandatory disability accessibility safeguards across all AI and digital systems;

  • To implement Disability Impact Assessments as standard in high-risk domains;

  • To establish permanent, consultative mechanisms with Disabled Persons’ Organisations (DPOs) and subject-matter experts.

Failure to act will perpetuate digital exclusion and legal non-compliance, and will undermine the promise of “AI for All.” India’s technology policy must embrace enforceable accessibility, both as a legal imperative and as a standard of global leadership.

Yours faithfully,
Nilesh Singit
https://www.nileshsingit.in/


References

  • Rajive Raturi v. Union of India, Supreme Court of India, 8 November 2024.
  • India AI Governance Guidelines: Enabling Safe and Trusted AI Innovation, MeitY, 2025.
  • Rights of Persons with Disabilities Act, 2016, and associated Rules.
  • Finding Sizes for All: Report on Status of the Right to Accessibility in India (cited for facts on digital exclusion).
  • European Union, AI Act 2024 (Regulation (EU) 2024/1689), especially Recital 80, Article 5(1)(b), Article 16(l).
  • Web Content Accessibility Guidelines (WCAG) and Guidelines for Indian Government Websites (GIGW).
  • Open letter references and scope: blog.nileshsingit.org/open-letter-to-niti-ayog-ai-disability-inclusion.