Louisiana School District Sues TikTok, Instagram Parent Companies Over Teen Mental Health Crisis


The mental health crisis among teens has been rising notably, especially since the COVID-19 pandemic prompted school shutdowns across the country. But COVID-19 didn't create the spike; it only exacerbated a problem that was already there.


We've been seeing it for well over a decade. The American Academy of Pediatrics, American Academy of Child and Adolescent Psychiatry and Children’s Hospital Association have declared it a national emergency.

Rates of childhood mental health concerns and suicide rose steadily between 2010 and 2020, and by 2018, suicide was the second leading cause of death for youth ages 10-24. The pandemic has intensified this crisis: across the country we have witnessed dramatic increases in Emergency Department visits for all mental health emergencies, including suspected suicide attempts.

The AAP-AACAP-CHA declaration, however, points to racial justice and social inequity as primary causes. While increased media and activist activity on those subjects may be causing anxiety among teens, what really seems to be causing some of the biggest issues is the growing influence of social media in teens' lives.

Down in Livingston Parish, Louisiana, the local school district wants to take action, and it is going after what it considers two of the biggest causes of the teen mental health crisis.

The board is suing TikTok and Instagram, along with their parent companies ByteDance and Meta, as well as two internet service providers, alleging they are responsible for "facilitating minors' access" to the social media platforms.

“This action is brought to protect children and families in the State of Louisiana from Defendants' intentional manipulation of children via sophisticated design elements deployed on Instagram and TikTok," the lawsuit says. "Defendants do this to keep Louisiana children addicted to their social media platforms."

The school board has asked the court to prevent the companies from “engaging in the conduct” that they say has negatively impacted their students. It is also seeking monetary recompense to fund preventative education for excessive social media use, along with seeking other damages.


So what is the "conduct" that the companies engage in that causes such problems?

I discussed it a bit in the second half of my radio show yesterday (which you can listen to here), but I want to expand on it a bit.

Instagram and TikTok are very addictive apps. The technology behind them - the algorithms that dictate what content you see and how and when you see it - is designed to keep you looking at the screen longer. Once the algorithm picks up on what you appear to like watching, it serves you more and more of that content to keep you scrolling.

The reason? The app is not the product. The app is the medium. You are the product - your interests, demographic information, and personal tastes are all up for sale to advertisers. The problem comes when children are involved. We have allowed screens to babysit our children, and the algorithms don't distinguish between child and grown-up. If your data is there, they will collect it.

Compounding this, we now have pre-teens and teens who are subjected to viral videos and influencers, many of whom are pushing the prominent social issues of the day. There is not a wide array of center-right influencers, so the issues prominent on the left - trans issues and racial/social justice - are influencing kids just as much as today's impossible beauty standards and the conflicting "body positivity" movements online.


The Internet has become a confusing and toxic place for everyone, but the ones most susceptible to it are minors, who have not yet developed a proper, healthy mental mindset. Their brains are still developing, and they still have a lot to learn.

TikTok and Instagram take advantage of this. The kids get addicted to all this messaging, keep scrolling, and see more ads. That ad revenue drives those companies to push forward, not back.

So Livingston Parish's school officials want to put a stop to it. I am not a lawyer and I have no idea just how successful they'll be, but I do think they are starting a good social conversation. Can Meta and ByteDance come to the table and make substantive changes that will improve children's experience on their apps?

I think there's a First Amendment issue, at the very least. But, more importantly, there is a moral and ethical concern that these companies should be more willing to address. Can you block the algorithms from tracking children's information? Sure, it will take kids longer to get to the content they want, and they'll close the app sooner than you might like, but the improvement to their mental health will be noticeable.

But I am also wary of blaming the companies for what kids are doing. The companies do bear some blame for how easily exploitable their apps are, and they have to work on that, but we also bear some responsibility for how much we're letting screens dominate our kids' time.


Instagram and TikTok are definitely addictive apps, and their services do contribute to the growing mental health crisis among teens. How do we as a society stop it?


