
A New Mexico court ruled that Meta must pay over $375 million (12.2 billion baht) in damages for spreading false information about the safety of its platform for children.
New Mexico sued Meta in 2023, alleging the company directed underage users to explicit sexual content, exposed them to images of child sexual abuse, and showed them solicitations for such acts and sex trafficking.
New Mexico stated the company did this through its recommendation algorithm, a tool Meta uses to automatically select the content users see on its platforms.
Meta is also facing a separate trial in Los Angeles, in which a young woman alleges that the deliberate design of platforms such as Instagram and YouTube (owned by Google) left her addicted to them from a young age; thousands of similar lawsuits are pending in U.S. courts.
The New Mexico jury found Meta, owner of Facebook, Instagram, and WhatsApp, responsible for how its platforms harmed children by exposing them to pornography and contact with sexual predators.
The jury determined Meta violated New Mexico’s Unfair Practices Act by providing misleading information to the public about the safety of its platform for underage users.
During the seven-week trial, the jury reviewed internal Meta documents and heard testimony from former employees about the company’s awareness of child sexual predators using its platforms.
Arturo Bejar, a former head of engineering at Meta who resigned in 2021, testified as a whistleblower about experiments on Instagram showing that underage users were exposed to sexual content.
He said his own daughter was solicited for sex by strangers on Instagram.
State prosecutors presented Meta’s internal research showing that at one point, 16% of Instagram users reported seeing unwanted nudity or sexual activity within a single week.
Raul Torrez, New Mexico Attorney General, called the verdict "historic," marking the first successful state lawsuit against Meta on child safety issues.
Meta, led by CEO Mark Zuckerberg, disagreed with the ruling and intends to appeal.
The company stated, “We work hard to keep people safe on our platforms and clearly understand the challenges of identifying and removing offenders and harmful content. We remain confident in our efforts to protect teens online.”
In 2024, Instagram launched Teen Accounts to give younger users more control over their experience, and last month it introduced a feature that notifies parents if their children search for self-harm content.
The New Mexico Attorney General argued that Meta executives knew their platforms harmed children, ignored employee warnings, and lied to the public about what they knew. The court ordered Meta to pay $375 million (12.2 billion baht) in civil penalties after the jury found thousands of legal violations, each carrying a fine of up to $5,000.
Source: BBC