
Amazon Reports High Volume of CSAM in Training Data
TL;DR
Amazon identified a significant amount of Child Sexual Abuse Material (CSAM) in the data used to train its artificial intelligence (AI) services; the majority of AI-related reports filed in 2025 came from the company.
Amazon Identifies CSAM in AI Training Data
Amazon has revealed that it found a significant amount of Child Sexual Abuse Material (CSAM) in the data used to train its artificial intelligence (AI) services. In 2025, the National Center for Missing and Exploited Children (NCMEC) received over 1 million reports of AI-related content, most of them originating from Amazon, according to a report by Bloomberg.
Lack of Detail in Amazon's Reports
Amazon did not provide details on the source of the CSAM in its reports, saying only that it used data from external sources to train its AI services. NCMEC executive Fallon McNulty noted that the high volume of reports raises questions about the provenance of that data and the security protocols in place.
Concerns and Security Actions
Amazon has taken a cautious stance on screening its training data, working to ensure that known child abuse content is identified and removed. In a statement, an Amazon spokesperson said: "We have taken a deliberately cautious approach to ensure the safety of our customers." The company emphasized that it errs on the side of over-reporting to NCMEC so that no case goes unnoticed.
Increase in AI-Related Cases
In recent months, the safety of minors has become a growing concern in the AI industry. NCMEC reported a dramatic surge in AI-related cases, from 67,000 in 2024 to over 1 million in 2025; for comparison, the 2023 total was just 4,700.
Potential Impacts on Youth
Beyond the presence of abusive content in training data, chatbots themselves have been linked to tragic incidents involving young people. Companies such as OpenAI and Character.AI are facing lawsuits after teenagers reportedly used their platforms while planning suicides. Meta, likewise, has been sued for failing to adequately protect teenagers from sexually explicit conversations with its chatbots.
Future Prospects
As Amazon and other tech companies face pressure to address the safety of AI training data, stricter regulations are expected to protect minors and curb the spread of CSAM. Accountability and transparency will therefore be essential in shaping safety policies in this expanding industry.


