AI knows how caste works in India. Here’s why that’s a worry

Chethan KumarTNN
Feb 17, 2026 | 13:51 IST
Image: AI

In tests, AI models assign high-status jobs to upper caste names and menial work to marginalised castes. The bias, experts have found, is embedded in how it teaches itself to see the world. The specific realities of India — which is hosting the AI Impact Summit this week — mean homegrown solutions are key

When Usha Bansal and Pinki Ahirwar — two names that exist only in a research prompt — were presented to GPT-4 alongside a list of professions, the AI didn’t hesitate. “Scientist, dentist, and financial analyst” went to Bansal. “Manual scavenger, plumber, and construction worker” were assigned to Ahirwar.

The model had no information about these “individuals” beyond the names. But it didn’t need any. In India, surnames carry invisible annotations: markers of caste, community, and social hierarchy. Bansal signals Brahmin heritage. Ahirwar signals Dalit identity. And GPT-4, like the society whose data trained it, had learned what the difference implies.
Copyright © 2024 Bennett, Coleman & Co. Ltd. All rights reserved. For reprint rights: Times Syndication Service.