Facebook and TikTok under fire as mother sues Meta over daughter’s self-harm and dangerous trends go unmonitored on TikTok

Social media platforms like Facebook, Instagram, and TikTok have been under scrutiny for years as teens grow depressed from using them, and in some cases have died attempting viral trends.

Facebook and Instagram logos, owned by Meta, which is being sued by mothers who blame the platforms for their children’s self-harm.

In recent years, one trend has had young people dancing outside their moving cars with nobody in the driver’s seat.

Other times the harm is psychological, and young people find themselves bullied and struggling with low self-esteem from constant social media use.

One mom is suing Meta, the company that owns Facebook, saying her daughter’s addiction to the app drove her to self-harm

A mother in Pueblo, Colorado, has sued Mark Zuckerberg’s Meta after her daughter self-harmed, blaming the social media platform, according to The Sun.

The mother, Cecilia Tesch, claims that her child ended up with mental and physical hardships over her addiction to Facebook.

Her daughter began using Facebook at the age of seven and is now thirteen years old.

Some of the symptoms of Tesch’s daughter listed in the lawsuit include:

  • sleep deprivation
  • body dysmorphia
  • an eating disorder
  • self harm
  • severe anxiety
  • severe depression

Tesch says the platform became a problem well before her daughter began self-harming; her daughter would stay awake all night because of constant notifications.

Eventually, her daughter lost interest in other activities.

Will a lawsuit like this against Facebook hold up in court?

The lawsuit was filed in Denver on July 20, 2022.

Facebook currently requires users to be at least 13 years old.

This means that Tesch’s daughter’s use of the app from the ages of seven through twelve was against Facebook’s user policy.

It’s currently unknown if Tesch tried to stop or monitor her daughter’s use of the app before age 13.

In response, the lawsuit states that the platform’s age verification is weak and that Meta knows it.

The suit claims that Facebook is aware of the weak verification process and intentionally keeps it that way to profit off vulnerable users.

The lawsuit also claims Tesch and her daughter weren’t aware of Facebook’s addictive and mentally harmful effects.

Facebook is accused of design flaws that keep these harmful effects hidden and of letting children use and abuse the platform without parental knowledge.

Tesch isn’t alone in filing lawsuits against Meta

Meta owns not only Facebook but also Instagram, another popular social media platform.

Two mothers chose to sue Instagram in July as well.

These moms claimed that the app pushes girls toward anorexia and suicide attempts.

They say this is due to the platform pushing calorie-restricting recipes and images of incredibly thin models.

If you or anyone you know is impacted by these issues, you can call the National Suicide Prevention Lifeline. They can be reached at 1-800-273-TALK (8255). You can also reach out by texting the Crisis Text Line at 741741.

Facebook isn’t the only app causing concern for parents: TikTok is failing to stop harmful trends from circulating on the app

According to the Daily Mail, TikTok has been failing to moderate content that promotes “rape culture,” teaches how to steal certain car models, and spreads a blackout challenge that has killed children.

More specifically, the app is failing to stop former kickboxer Andrew Tate from making misogynistic comments.

A new way to steal Kias and Hyundais has been shared, and a challenge teaching children to asphyxiate themselves for views has gained popularity.

TikTok has guidelines in place that prohibit dangerous acts, hateful commentary, and the promotion of suicide and self-harm.

This doesn’t appear to matter, as these subjects still circulate on the app.

In the U.S., this may change due to the Combating Harmful Actions with Transparency on Social Act.

The bill aims to fight dangers to children and to examine how apps are used to commit crimes.

Facebook, Twitter, and Instagram are all under fire in addition to TikTok, facing backlash for failing to moderate their platforms adequately.
