Did Meta Cover Up Child Safety Risks in VR?

Warning: This article includes descriptions of sexual harassment of a child.

Meta employees allegedly deleted evidence of sexual harassment of minors on the company’s virtual reality headsets in 2023, according to a report from The Washington Post.

Two current and two former Meta employees disclosed documents to Congress showing Meta was aware of potential safety risks to children. Meta's legal team is accused of screening and, at times, suppressing internal studies about youth safety on the company's virtual reality platforms and apps.

The allegations will go before the Senate on Tuesday for review by a Judiciary subcommittee focused on laws and regulations around online safety.

Four current and former employees brought the allegations

Former Meta employee Jason Sattizahn and a researcher focused on youth and technology, who is Sattizahn's domestic partner and requested anonymity because she still works in tech, joined two current Meta employees in bringing the allegations.

They submitted “pages of internal messages, memos and presentations from the past decade,” according to The Washington Post. They allege Meta personnel instructed researchers not to record data that showed children were using the VR platforms.

Among the documents are records of situations in which Sattizahn said he was told not to collect data that could “implicate the company in future engagements with regulators,” according to the Post.

In one example, Sattizahn said a teenager told researchers that users of Meta's virtual reality headsets had sexually harassed his younger brother "numerous times." Sattizahn and the youth researcher alleged that Sattizahn's supervisor ordered the recordings of the teen making that claim deleted.

Meta said some material may have been deleted for privacy reasons and called other allegations 'false'

Dani Lever, a Meta spokesperson, told the Post that under the European Union's General Data Protection Regulation (GDPR), information gathered from minors under 13 years old without consent from a parent or guardian needed to be deleted.

The allegations were “stitched together to fit a predetermined and false narrative,” she said. She stated that the company did not have an overarching ban on research involving children under 13. Lever also said parents or teens can choose to block communication with users they do not know.

History of child safety concerns in VR

This is not the first time Meta insiders have investigated the risks posed when children join virtual worlds. In August 2021, an internal study found some adult users were “uncomfortable” because they encountered children in the VR space.

In March 2022, the FTC looked into Meta's compliance with the Children's Online Privacy Protection Act (COPPA), a federal law governing children's privacy. In response, Meta formed an internal working group.

In 2023, Meta released new companywide requirements for research into social issues, including considerations for “the risks that naturally stem from conducting and sharing research on sensitive topics and populations,” according to the documents obtained by The Post. These were instituted to enable “accuracy” and to make sure findings were considered as part of other company decisions, a Meta spokesperson told The Post.

Meta says its virtual reality products are intended for users 13 years of age and older. Age-appropriate protections and safety defaults for users between ages 13 and 17 were added in 2023.

Another complication is children lying about their ages. Before the working group was created, internal Meta documents showed that more than half of teens using the Quest 2 headset had misrepresented their ages.

Meta CEO Mark Zuckerberg was among the tech executives who attended a White House Task Force on Artificial Intelligence Education roundtable and a dinner with President Donald Trump last week.
