Two bills introduced into the State Legislature aim to protect kids from encountering harmful social media content. “Our children are in crisis," Gov. Kathy Hochul said. Credit: Office of Gov. Kathy Hochul/Darren McGee

ALBANY — New legislation aims to help keep children and teenagers from being subjected to a barrage of unwanted and potentially harmful "addictive" social media videos and messages.

One bill introduced in the State Legislature on Wednesday would restrict the sending of unwanted content to youths that includes negative messages about anxiety-provoking concerns and encourages self-harm and violence, behavior sometimes called “doom scrolling,” the proposal’s supporters say.

A second bill introduced in the State Legislature on Wednesday would prohibit online platforms from sharing or selling the personal data they collect about users under 18 years old for advertising purposes, data that can be used to send youths unsolicited content such as video links.

Both bills allow exceptions through verified parental consent. Youths would still be able to view content from friends and sites they follow or generally popular content.

The bills, which propose fining social media companies $5,000 for each offense, are a reaction to spikes in depression, self-poisoning and suicide among youths. Girls and gay and trans youths have been particularly hard hit by messages that can make young people overly focused on their appearance.

“Our children are in crisis and it’s up to us to save them,” said Gov. Kathy Hochul.

She and Attorney General Letitia James attended a Manhattan news conference on Wednesday in support of the bills sponsored by Assemb. Nily Rozic (D-Queens) and Sen. Andrew Gounardes (D-Brooklyn).

“The noise from the social media platforms becomes too much to bear,” James said. “Social media is fueling a national mental health crisis.”

Meta, which operates Instagram and Facebook, said it will evaluate proposed legislation and work with policymakers “on developing simple, easy solutions for parents on these important industrywide issues.”

“Content that encourages suicide, self-injury, eating disorders, or things like bullying and harassment break our rules and we remove that content when we find it,” Meta stated. “We continue to improve the technology we use to detect and remove this type of content.”

“We’ve built safety and privacy directly into teen experiences,” said Antigone Davis, head of Global Safety at Meta. “We’ve developed more than 30 tools to support families, including parental supervision tools that allow teens and parents to navigate social media safely together and tools to help ensure teens have age-appropriate experiences online.”

TikTok had no immediate comment on the bills, but the company said it works hard to protect its users, especially teenagers, and provides ways for parents to control their children’s use. Safeguards include a 60-minute daily screen time limit for users 13 to 18 years old. Users 13 to 15 years old don’t receive “push notifications” after 9 p.m., and users 17 and 18 years old don’t receive push notifications after 10 p.m.

The bills are prompted in part by two Long Island cases.

In 2022, Chase Nasca, 16, of Bayport was killed after stepping in front of a Long Island Rail Road train. His parents said he had been inundated with thousands of unsolicited TikTok videos that promoted suicide, although his search history showed he was instead seeking videos about Batman, kitchen hacks and motivational workouts. The family has sued the social media platform.

On Wednesday, middle school teacher Kathleen Spence of Yaphank said her daughter, Alexis, was harmed by unsolicited messages a decade ago, when Alexis was 11 years old, an age made clear in her online posts. Spence said her daughter, now 21, was hit with unwanted messages on Instagram about eating disorders, self-harm and suicide. The family is suing Meta, formerly known as Facebook, which owns Instagram.

“It took several years for our daughter to overcome her social media addiction … thousands of inappropriate posts and images,” Spence said. As a teacher, she said she sees students come to class “half asleep” because they were up late hooked on social media. “They can’t learn like that … social media is the silent killer of our youth."

The Stop Addictive Feeds Exploitation for Kids Act would:

  • Restrict the use of algorithms that send young users content they have not subscribed to or followed. Such content entices kids to stay online longer and exposes them to more advertising. The measure would allow users under 18 years old and their parents to opt out of receiving these “addictive feeds” they didn’t seek.
  • Block addictive, unsolicited feeds between midnight and 6 a.m. unless there is “verifiable parental consent.” Parents could also use the measure to limit the number of hours their children can spend on addictive feeds.

The New York Child Data Protection Act would prohibit all online sites from collecting, using or selling personal data of anyone under 18 years old for the purpose of advertising. An exception is provided if the online platform receives “informed consent or unless doing so is strictly necessary for the purpose of the website.”

Parents and guardians also would be able to seek damages of up to $5,000 per incident.
