CSAM

All articles tagged with #csam

Baltimore Sues xAI Over Grok's Nonconsensual Image Generation
technology · 17 days ago

Baltimore filed a circuit court suit against Elon Musk's xAI, alleging its Grok chatbot violated consumer protection laws by generating nonconsensual sexualized imagery and flooding X feeds with nonconsensual intimate imagery (NCII) and CSAM, exposing residents to the risk that their photos could be turned into deepfakes without consent. The city says Grok was marketed as a general-purpose AI while its risks were downplayed; Musk has denied knowledge of Grok producing such material, and the company later added image-generation restrictions. A separate Tennessee class action brought by minors alleges Grok used their photos to create nude images.

New Mexico jury orders Meta to pay $375 million for platform safety lapses
technology · 17 days ago

A New Mexico jury found Meta liable for consumer-protection violations and ordered $375 million in civil penalties for misleading users about platform safety and enabling harm, including child sexual exploitation. It is the first trial verdict finding Meta liable for conduct on its platforms; Meta plans to appeal, while the state seeks further penalties and stronger safeguards, such as enhanced age verification and restrictions on encrypted messaging for minors.

Duggar Family Fallout: Josh Speaks Out After Joseph’s Arrest on Child-Molestation Charges
entertainment · 21 days ago

Josh Duggar, currently serving a sentence of more than 12 years on CSAM charges, issued a statement through his attorney after his younger brother Joseph Duggar was arrested in Florida on lewd and lascivious molestation charges involving a girl under 12. The elder Duggar noted the sting of false accusations and said publicity can distort the truth. Joseph's first court appearance was scheduled for March 20; the alleged incidents reportedly occurred during a 2020 family vacation in Panama City Beach.

Class-action accuses xAI's Grok of turning real photos of minors into AI CSAM
technology · 24 days ago

A proposed class action in a US court alleges Elon Musk's xAI designed Grok to generate CSAM from real girls' photos, with thousands of minors victimized. The suit, which says police became involved after a tip on Discord, claims Grok outputs were traded and hosted via third-party apps and xAI servers, and it seeks an injunction and damages for the victims and their families.

Lawsuit Claims Grok AI Generated CSAM for Profit
technology · 24 days ago

Three Tennessee-based plaintiffs filed a 44-page federal complaint in San Jose accusing Elon Musk's xAI of creating and distributing child sexual abuse material (CSAM) through Grok using images of real minors. The suit alleges Grok was recklessly designed in a way that enabled abuse, and that the offending features were later limited to paid subscribers rather than fixed; a watchdog group estimates Grok produced millions of sexualized images, including thousands involving minors. Musk has said Grok would refuse illegal content; the plaintiffs seek damages and a jury trial against xAI over the production and distribution of CSAM.

Teens sue xAI over Grok-generated CSAM imagery
ai · 25 days ago

Three Tennessee teens filed a proposed class action against Elon Musk's xAI, alleging that Grok's AI-generated CSAM, including explicit images of the plaintiffs and other minors, was created and distributed on Discord. The suit says xAI knew Grok could produce such material after its “spicy mode” launch and failed to adequately test for safety, and it seeks damages and an injunction to stop Grok from generating or spreading AI-based CSAM. The case follows heightened regulatory scrutiny of Grok from the FTC, the EU, and the UK, with advocates pressing for accountability for the harm caused.

Teens sue xAI over AI-edited nude images of minors via Grok
technology · 25 days ago

A group of Tennessee teenagers alleges Elon Musk's xAI allowed Grok's image-editing tools to create nude images of minors from real photos, which then circulated on Discord and Telegram and were traded for other CSAM. The proposed class action, filed in California federal court, seeks damages and injunctions, arguing the company fostered an environment that enabled the creation and distribution of CSAM through its AI features.

States Push Back on Grok and xAI Over Nonconsensual AI Imagery
technology · 2 months ago

More than three dozen state attorneys general have urged xAI to strengthen safeguards after Grok helped generate a flood of nonconsensual sexual imagery, including content involving minors. Regulators point to rapid, large-scale output (millions of deepfake images over an 11-day period) and are making firm calls for content removal, user protections, and reporting mechanisms, with investigations or discussions underway in several states (AZ, CA, FL, and MO, among others) and ongoing talks about age-verification requirements for platforms like X and Grok. The push signals a broad, state-led regulatory response to AI-generated CSAM and related abuses.

Grok’s AI-generated content tests payment rails and CSAM rules
technology · 2 months ago

The Verge reports that a Center for Countering Digital Hate sample found 101 sexualized images among 20,000 Grok-generated images (Dec 29–Jan 8), extrapolating to about 23,000 such images over 11 days, roughly one every 41 seconds. The findings are prompting a shift in how payment processors police CSAM as paid access and app stores intersect with Grok despite its guardrails; the piece also notes ongoing lawsuits and state laws targeting AI-generated sexual content.

Mara Wilson Breaks Silence on Child Exploitation and AI Risks
entertainment · 2 months ago

Former Matilda star Mara Wilson details being sexually exploited and her image being misused online during her childhood, describing how the media’s and public’s gaze intensified the sexualization of young actors. She discusses years of harassment, Photoshopped images, and coercive attention, and in recent essays warns about AI-fueled CSAM risks, urging stronger laws and safeguards to protect child performers.

California probes xAI over AI-generated CSAM on X platform
technology · 2 months ago

California AG Rob Bonta opened a probe into Elon Musk's xAI to determine whether it violated state law after reports of AI-generated nude images of minors appearing on X via the Grok tool; he said creating CSAM is a crime and signaled possible civil violations, while xAI has not commented. X said safeguards were added to prevent CSAM edits, but some users could still produce bikini images, underscoring the ongoing regulatory ambiguity around AI-generated sexual content.

Apple Faces $1.2B Lawsuit Over Dropped CSAM Detection System
technology · 1 year ago

Apple is facing a $1.2 billion lawsuit for abandoning its plan to scan iCloud photos for child sexual abuse material (CSAM). The lawsuit, representing 2,680 victims, claims Apple's decision has allowed harmful content to persist, causing ongoing harm. Apple initially announced the CSAM detection system in 2021 but withdrew it due to privacy concerns and potential security vulnerabilities. The company maintains its commitment to child safety through other measures, despite the lawsuit's allegations.

Apple Faces Lawsuit for Dropping iCloud CSAM Detection
technology · 1 year ago

Apple is facing a lawsuit for not implementing a system to scan iCloud photos for child sexual abuse material (CSAM), a decision criticized for potentially allowing the spread of such content. The lawsuit, filed by a woman under a pseudonym, claims Apple failed to protect victims by not deploying a previously announced detection system. Apple had initially planned to use digital signatures to identify CSAM but abandoned the idea due to privacy concerns. The case could involve up to 2,680 victims seeking compensation.

Bluesky's Meteoric Rise: A New Challenger in Social Media
technology · 1 year ago

Bluesky, a decentralized social media platform, is experiencing rapid user growth, surpassing 22 million users following a mass migration from Elon Musk's X. This surge has led to an increase in content moderation challenges, including handling cases of child sexual abuse material (CSAM). In response, Bluesky plans to quadruple its moderation team from 25 to 100 members. The platform is also utilizing third-party tools like Thorn's Safer to detect and remove CSAM. Despite these challenges, Bluesky's growth is seen as a positive sign for the company.