Australia says five major social platforms aren’t fully complying with its under-16 social media ban, citing Meta (covering Facebook and Instagram), Snapchat, TikTok and YouTube for not fully enforcing child account restrictions, signaling ongoing regulatory pressure on platforms to curb underage access.
An independent Roblox developer argues the platform's safety checks, including age verification, are insufficient and urges parents to monitor children on Roblox 24/7; Roblox counters that safety is a top priority with advanced safeguards and ongoing monitoring, amid broader concerns about protecting young users.
The Register reports that OS-level age verification attempts are tangled in a sprawling ad-tech ecosystem that relies on long-lived cookies and other storage to track users. The piece catalogs numerous vendors and their data practices—collecting IPs, device IDs, precise locations, and user profiles—with cookie lifetimes spanning months to years. Some vendors claim not to use cookies or rely on alternative storage, but the overall landscape shows pervasive data collection that undermines robust age checks and raises privacy concerns about how age gates are implemented and monitored.
The FTC issued a policy statement saying it will not pursue COPPA enforcement actions against general‑ or mixed‑audience sites that collect personal information solely to determine a user’s age through age‑verification tech, provided they meet strict conditions (no extra use or retention, proper third‑party assurances, clear notices, reasonable security, and likely accurate age results). The agency also plans to review the COPPA Rule as age‑verification technologies evolve, with the policy remaining in effect until final rule amendments.
Discord has postponed its planned global age-verification rollout after criticism and privacy concerns, with the company promising clearer timelines and more transparent communication about how the system will work.
Discord has pushed back its global age-verification rollout to the latter half of the year after user backlash. The company says only a minority of users will need to verify their age and that non-facial options such as credit-card verification will be explored; it will publish its age-determination methodology, insists it won’t read messages or store verification images, and aims to align with upcoming youth-access rules while addressing widespread trust concerns.
Discord has pushed its global age-verification rollout to the second half of 2026 after backlash and confusion. The company says it will add more age-verification options (including credit-card-based checks), publish a list of verification vendors, and hold facial-age estimation to a strict on-device requirement, while offering spoiler channels as an alternative to age-gated content. In regions with laws mandating verification (the UK, Australia, and soon Brazil), adults will verify via vendors like k-ID to access age-restricted content. Discord also plans to publish its age-estimation methodology and open-source its safety engine, Osprey; Persona did not meet the on-device bar, prompting greater vendor transparency. The plan reportedly would affect about 10% of accounts, and Discord emphasizes it does not read users’ messages.
Discord scrapped its UK use of Persona after backlash over age verification and data practices, including concerns about surveillance ties and government data screening; Persona denies direct government involvement, and Discord says it won’t partner with Persona again, though teen-by-default age checks remain part of a broader online-verification trend.
Germany’s ruling conservatives approved a motion at a Stuttgart party conference to ban social media use for under-14s, impose stricter digital-age verification, and levy fines on platforms that don’t enforce limits, while calling for EU-wide harmonization of age standards. The move signals growing momentum across Europe for tougher online restrictions, though Germany’s federal structure means state-level coordination is required for nationwide rules; public reactions from students and teachers reveal mixed views on practicality and parental choice.
Discord’s UK age-verification pilot with Persona drew backlash over how data was handled and stored, amid fears of government access and privacy risks. Discord said most data was deleted after age confirmation and that Persona is no longer a partner; the test ran for under a month, raising questions about data retention, vendor transparency, and trust as the platform plans broader age checks.
Discord’s planned March rollout will place many users in a teen-by-default mode, requiring facial scans or IDs to access full features unless an adult is clearly identified. High-profile streamers like Eret, Tubbo and Pikachulita warn this could expose sensitive data after a previous breach of ID photos, even as Discord says facial scans stay on-device and IDs are deleted after age verification. Experts say age-verification should protect minors without eroding trust, urging transparent data handling, while creators seek safety for their communities without compromising privacy.
As part of preparing for the UK Online Safety Act, Discord is testing global age-verification via the Persona service, with some users already prompted ahead of a March rollout. Persona is backed by Founders Fund, whose ties to Peter Thiel and Palantir have drawn privacy concerns about surveillance and data use. Discord describes the test as limited and says any collected data will be stored for seven days, though the broader long‑term use remains unclear.
Discord's global 'teen-by-default' age-verification rollout will require facial age estimation or an ID to access certain features. UK users now see FAQ language indicating that selfies and identity data may be processed by the Persona age-assurance vendor and stored temporarily for up to seven days before deletion, a shift from the earlier promise that data would never leave the device. Discord says data is blurred and only what's needed is used, but concerns persist about vendor ties and past data breaches as the March rollout begins.
The Electronic Frontier Foundation warns that Discord’s voluntary rollout of age verification—using an age-inference system for most users and forcing government IDs or facial scans via third‑party vendors for some—risks privacy, chilling effects, and misclassification, especially after a 2025 data breach exposed ID data. While Discord claims IDs aren’t linked to accounts and scans stay on-device, privacy risks persist given unreliable facial-age tech and limited audits. EFF urges stopping non-mandated age gates and offers guidance for users facing age gates.
Discord announced a phased global rollout starting in March that will mark all accounts as teen-appropriate by default. Adults will need to upload a face scan or government ID via a third-party verifier to regain full access to features and channels, with updated safety settings and restricted access to age‑restricted spaces. The plan follows a 2025 data breach affecting about 70,000 users and has drawn criticism over privacy and trust in third‑party verification systems.