Artificial Intelligence is routinely portrayed as the great equaliser of the digital age, its promise one of unprecedented inclusion. Yet, as one scratches beneath the surface, a different narrative emerges—one of invisibility, algorithmic exclusion, and the subtle force of technoableism. In practice, AI systems too often reproduce existing social inequalities, with disability uniquely rendered invisible or pathologised, rather than recognised and engaged. The Bias Pipeline was forged out of the conviction that this exclusion is no mere technical misstep, but the logical outcome of a system guided by technoableist assumptions—those that treat disability as a flaw to be corrected, if not erased entirely, through data, design innovations, and policy “solutions.”
The Origins and Urgency of Technoableism
Technoableism is not a recent phenomenon, nor is it confined to AI. It is a form of ableism specific to the technological domain, where the belief persists that disability represents a deficiency best ‘solved’ by technological means. Rather than regarding persons with disabilities as rights-bearers and knowledge-holders, technoableism recasts them as passive recipients of intervention, remade according to normative, able-bodied standards.
In the AI context, technoableism works silently but relentlessly. Consider the case of voice recognition systems, which—trained overwhelmingly on non-disabled speech—routinely fail users with dysarthria or atypical speech patterns. Or hiring algorithms, which filter out disabled applicants based on implicit biases embedded within their data. Or the slick generative models that churn out stereotyped, often inaccurate, visual representations of blind or mobility-impaired persons. In each instance, the root problem is neither randomness nor technicality, but the presumption that a single, “normal” user baseline exists—and the tacit belief that those outside it are marginal cases, acceptable losses.
Legibility, Exclusion, and the Pipeline Metaphor
The Bias Pipeline traces how these structures of bias are encoded from start to finish. From upstream choices about what data are collected (or omitted), to how those data are labelled, how models are trained, how outputs are evaluated, and how systems are ultimately deployed—the pipeline captures each step at which ableist logic becomes infrastructural, stamping “legibility” upon some while relegating others to digital erasure.
This post continues a conversation begun with “Prototype — Accessible to Whom? Legible to What?” and extends it through regular essays, critical notes, and resources foregrounding accessibility and disability-led design. Every piece hosted here asks: who is assumed to be the user? Whose voices are omitted from the dataset, or overwritten by reductive “fixes”? What policies incentivise or entrench technoableism—and how might we imagine differently?
Three Pillars: Expose, Advocate, Align
The Bias Pipeline stands upon three foundational aims:
To expose how technoableist bias travels from data into decision: By mapping the full length of the pipeline, the project reveals not only individual exclusions but their systemic origin. It demonstrates how unrepresentative training sets, inflexible performance metrics, and cultural assumptions about “normal” function carry bias from the design table to deployment.
To champion participatory, disability-led design and accessible technology: Rather than soliciting token feedback or treating disabled persons as objects of intervention, the site calls for their leadership at every stage. The best technological outcomes emerge not when disability is “managed”, but when disabled expertise actively shapes the terms and architecture of inclusion.
To push policy actors—including NITI Aayog—to align India’s AI future with the UNCRPD, Rights of Persons with Disabilities Act (2016), and the Supreme Court’s Rajiv Raturi judgment: Merely aspirational guidelines are inadequate. As the Rajiv Raturi verdict clarified, accessibility in India is an ex-ante, enforceable right, not a post-hoc accommodation. Public bodies, developers, and civil society must insist on accessibility as foundational—mandating inclusive standards, equity-focused benchmarks, and meaningful legal oversight for AI systems.
Beyond Afterthought: Accessibility as Design Architecture
Accessibility here is no afterthought or bureaucratic hurdle. It must inform the architectural blueprints—of interfaces, algorithms, policy criteria, and regulatory practice. There may be a tendency, particularly in rapidly evolving fields, to revert to baseline compliance or “tick-box” solutions. This project refuses such minimalism. Instead, it insists that only if accessibility is integral—from data collection to final deployment—can AI deliver on its inclusive promise.
The Indian Imperative and Global Resonances
This task is especially urgent in India, where over 2.74 crore citizens experience disability across diverse contexts, regions, and intersections of caste and class. The nation’s digital ambitions cannot afford to regard accessibility as peripheral, nor equity as negotiable. The international frameworks already exist—the UNCRPD as the global benchmark, the RPwD Act 2016 as national statute, and the Supreme Court’s Rajiv Raturi judgment as constitutional clarification. The lacuna lies in the translation from high-level directives to daily practice—something AI development, when unchecked, too often subverts through expediency and neglect.
Yet, as India’s AI policy evolves and new infrastructures are built, so too grows the possibility of centring disability expertise and experience. When those most at risk of exclusion become co-authors of technological futures, the pipeline can shift from conduit of bias to channel of justice.
Conclusion: Listening, Learning, Leading
The conversation on disability and AI will determine not only whose knowledge counts, but whose lives shape the horizon of possibility. Here at The Bias Pipeline, the commitment is simple: to listen when disability leads, to learn from the complex realities of exclusion, and to demand that India’s AI story be written with accessibility as both principle and practice.