Themata.AI


#ai-ethics #data-labeling #workforce-impact #content-moderation

‘In the end, you feel blank’: India’s female workers watching hours of abusive content to train AI


theguardian.com

February 5, 2026

7 min read

Summary

Female workers in India, such as Monsumi Murmu, are employed to watch and label hours of abusive content to train AI systems. The work routinely exposes them to disturbing and traumatic material, taking a toll on their mental well-being.

Key Takeaways

  • Content moderators in India, primarily women, view up to 800 disturbing videos and images daily to train AI algorithms on recognizing violence and abuse.
  • Studies indicate that content moderation work leads to significant psychological risks, including traumatic stress, anxiety, and sleep disturbances.
  • As of 2021, approximately 70,000 people in India were employed in data annotation, with a market value of about $250 million, predominantly serving US companies.
  • Women make up half or more of the content moderation workforce, often from marginalized backgrounds, and are preferred for their perceived reliability and willingness to accept home-based work.


