SAFE For Kids Act: Protecting young users from harmful social media feeds

June 13, 2024

By Anuradha Gandhi and Rachita Thakur

Introduction

On June 7, 2024, Governor Kathy Hochul of the State of New York celebrated the legislative passage of two nation-leading bills to protect kids online. The Stop Addictive Feeds Exploitation (SAFE) for Kids Act will curtail a child’s access to addictive feeds on social media, and the New York Child Data Protection Act will protect children’s personal data. [1]

"New York is leading the nation to protect our kids from addictive social media feeds and shield their personal data from predatory companies," Governor Hochul said. "Together, we've taken a historic step forward in our efforts to address the youth mental health crisis and create a safer digital environment for young people. I am grateful to Attorney General James, Majority Leader Stewart-Cousins and Speaker Heastie, and bill sponsors Senator Gounardes and Assembly member Rozic for their vital partnership in advancing this transformative legislation."[2]

The New York Child Data Protection Act prohibits online sites from collecting, using, sharing or selling the personal data of anyone under the age of 18, unless they receive informed consent or unless doing so is necessary for the purpose of the website.[3]

The New York State Attorney General said that "children are enduring a mental health crisis, and social media is fueling the fire and profiting from the epidemic". She added that she hoped other states would follow with legislation "to protect children and put their mental health above big tech companies' profits".[4]

What is the SAFE Act?

The Stop Addictive Feeds Exploitation (SAFE) for Kids Act amends New York's General Business Law to prohibit the provision of an addictive feed to a minor. It requires social media companies to restrict addictive feeds on their platforms for users under 18 unless parental consent is granted. It further prohibits companies from sending notifications regarding feeds to minors between 12:00 a.m. and 6:00 a.m.[5]
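The Act's night-time notification rule is essentially a time-window check with a parental-consent override. The following is a minimal, purely illustrative sketch (the function name and structure are our own assumptions, not drawn from the statute or any platform's code) of how such a compliance check might be expressed:

```python
from datetime import time

# Prohibited window under the SAFE for Kids Act: 12:00 a.m. to 6:00 a.m.
CURFEW_START = time(0, 0)   # 12:00 a.m. (midnight)
CURFEW_END = time(6, 0)     # 6:00 a.m.

def notification_allowed(t: time, parental_consent: bool = False) -> bool:
    """Return True if a feed notification may be sent to a minor at local time t.

    Hypothetical helper for illustration only; real compliance logic would
    also need to handle time zones and verified consent records.
    """
    if parental_consent:
        return True
    in_curfew = CURFEW_START <= t < CURFEW_END
    return not in_curfew
```

For example, a notification at 2:30 a.m. would be blocked absent parental consent, while one at 7:00 a.m. would be permitted.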

What are addictive feeds?

The Act defines addictive feeds as technology that shows users personalized feeds of media designed to keep them engaged and viewing longer. This technology has become the primary way in which people experience social media. As these technologies have spread, companies have developed sophisticated machine learning algorithms that automatically process data about the behavior of users, covering not just what they "like" but hundreds of thousands of other data points, such as how long a user spent looking at a particular post. The machine learning algorithms then make predictions about mood and about what is most likely to keep each user engaged for as long as possible, creating a tailor-made feed to keep users glued. [6]
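The mechanism the Act describes can be sketched in a few lines: score each post by predicted engagement (here, a toy predictor based on how long the user has previously dwelt on a topic) and sort the feed by that score. This is a deliberately simplified, hypothetical illustration of the general ranking pattern, not any platform's actual system; all names and weights are our own assumptions.

```python
def predicted_engagement(post: dict, user_history: dict) -> float:
    """Toy engagement predictor: weight a post's appeal by the seconds the
    user has previously spent on its topic (a stand-in for the thousands of
    behavioral signals real systems use)."""
    affinity = user_history.get(post["topic"], 0.0)
    return affinity * post["base_appeal"]

def rank_feed(posts: list[dict], user_history: dict) -> list[dict]:
    """Order posts so the highest predicted-engagement items appear first,
    producing the 'tailor-made feed' the Act describes."""
    return sorted(posts, key=lambda p: predicted_engagement(p, user_history),
                  reverse=True)
```

A user whose history shows long dwell times on one topic will see that topic pushed to the top of the feed, which is precisely the self-reinforcing dynamic the legislation targets.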

How does it affect children?

The Act states that addictive feeds have a devastating effect on children and teenagers, since their use causes them to spend more hours on social media, which has been tied to significantly higher rates of youth depression, anxiety, suicidal ideation, and self-harm.

Children are particularly at risk, as these algorithms provide them with a non-stop supply of dopamine; they become addicted because they are less capable of exercising the impulse control required to counter these effects, which can stunt development and cause long-term harm. The association between poor mental health and social media use is stronger than the associations between poor mental health and binge drinking, obesity, or hard drug use. The Act further explains that self-regulation by social media companies will not work, as addictive feeds are profitable and enable companies to collect more data. [7]

The EU Perspective on Children's Safety Online

The European Union recently opened investigations into Facebook and Instagram for potential breaches of EU online content rules relating to the protection of children online. The Meta platforms are suspected of infringing the bloc's strict content moderation rules under the 27-nation Digital Services Act (DSA). The DSA took effect last year with the objective of cleaning up online platforms and protecting internet users.[8]

The DSA regulates online intermediaries and platforms. The European Commission, which functions as the bloc's executive arm, voiced its concern that Meta is failing to adequately assess and mitigate risks affecting children, particularly regarding the design of these social media platforms, which recommend content that can potentially "exploit the weaknesses and inexperience" of children and can lead to addictive behavior. This is also known as the "rabbit hole effect": a situation where a minor watching a certain video is directed by the algorithm to watch more similar videos.[9] This concern is in line with the concerns highlighted by the SAFE Act.

The UK Perspective: The Online Safety Act

The Online Safety Act 2023 is a new set of laws that protects children and adults online. It puts a range of new duties on social media companies and search services, making them more responsible for their users’ safety on their platforms. The government says the Online Safety Act is due to come into force in the second half of 2025. [10]

The Act applies to search engines and to services that allow users to post content online or to interact with each other. This includes websites, apps and other services, including social media services, content sharing sites, video sharing platforms, and online instant messaging services. The Act has extraterritorial reach, i.e., it applies to services even if the companies providing them are outside the UK; it is applicable if the UK is a target market and services are provided to UK users.[11]

These services will also be required to show they are committed to removing illegal content, including child sexual abuse, extreme sexual violence, animal cruelty, the sale of illegal drugs or weapons, terrorism, and the promotion or facilitation of suicide or self-harm, among others. Further, pornography sites will have to stop children viewing content by checking ages. New offences have also been created, including cyber-flashing (sending unsolicited sexual imagery online) and sharing "deepfake" pornography, where artificial intelligence is used to insert someone's likeness into pornographic content.

The regulator Ofcom has been given extra enforcement powers to ensure that companies comply with the new rules, and it has published draft codes for them to follow. The Act requires companies to reconfigure the algorithms that decide what content users see, to ensure that the most harmful content is shielded from children's feeds.[12]

Responses to the US Legislation

Many, including teachers and educators, welcomed the move. The President of the New York state teachers' union said that "educators see the harmful effects of social media on our kids every day, and this legislation is a tremendous first step toward ensuring these influences remain in their proper places".[13]

While many praised the legislation, some held different views as to its ultimate efficacy.

"A one-size-fits-all approach to kids' safety won't keep kids safe," said a policy analyst at the Center for Democracy and Technology. "This bill still rests on the premise that there is consensus around the types of content and design features that cause harm. There isn't, and this belief will limit young people from exercising their agency and accessing the communities they need to online."[14]

The Open Technology Institute commented that "rather than protecting children, this could impact access to protected speech, causing a chilling effect for all users and incentivizing companies to filter content on topics that disproportionately impact marginalized communities".[15]

Further, the Surveillance Technology Oversight Project, a technology watchdog group critical of social media companies' extensive data collection, has also raised concerns about the SAFE for Kids Act, opining that the Act could affect anonymity and privacy for adolescent internet users. The group fears that the bill's requirement that social media companies verify users' ages would result in users (including minors) surrendering even more private information to companies than they do now.[16]

The Indian Perspective: Reducing Addiction to Online Games Among Children and Young Adults

India too has taken steps to address concerns relating to children and their privacy. The Digital Personal Data Protection Act, passed by the legislature in August 2023, imposes specific conditions on the collection and use of children's personal data, including, but not limited to, the tracking and monitoring of children's activities.

Furthermore, India's proposed Digital India Act will enhance protections for children interacting with AI, with provisions addressing parental controls, cyber-bullying and the non-consensual sharing of children's data online.[17]

As a progressive step towards addressing growing concerns over addiction among children and young adults, the Tamil Nadu government is in the process of enacting new legislation that will impose strict time restrictions on online gaming activities. By limiting users below the age of 18 years to three (3) hours of gaming a day, the law aims to promote a healthier lifestyle and reduce the negative consequences associated with excessive gaming.

To read further on the topic, refer to our article Tamil Nadu to pass legislation to put time restriction on online gaming.

Ahana Bag, Junior Associate Advocate at S.S. Rana & Co. has assisted in the research of this article.

[1] https://www.theguardian.com/us-news/article/2024/jun/08/new-york-social-media-privacy-children

[2] https://www.governor.ny.gov/news/governor-hochul-celebrates-legislative-passage-nation-leading-bills-combat-addictive-social#:~:text=The%20SAFE%20for%20Kids%20Act,for%20specific%20topics%20of%20interest.

[3] https://www.nysenate.gov/legislation/bills/2023/S7695/amendment/A

[4] https://www.theguardian.com/us-news/article/2024/jun/08/new-york-social-media-privacy-children

[5] https://www.nysenate.gov/legislation/bills/2023/S7694/amendment/A

[6] Ibid

[7] https://www.nysenate.gov/legislation/bills/2023/S7694/amendment/A

[8] https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en

[9] https://economictimes.indiatimes.com/news/international/business/facebook-instagram-face-fresh-scrutiny-under-eus-digital-norms/articleshow/110186451.cms?from=mdr

[10] https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer#:~:text=The%20Online%20Safety%20Act%202023,users’%20safety%20on%20their%20platforms.

[11] Ibid

[12] https://www.bbc.com/news/technology-68225707

[13] https://www.theguardian.com/us-news/article/2024/jun/08/new-york-social-media-privacy-children

[14] https://www.theguardian.com/us-news/article/2024/jun/08/new-york-social-media-privacy-children

[15] https://www.newamerica.org/oti/in-the-news/new-text-same-problems-inside-the-fight-over-child-online-safety-laws/

[16] https://www.cityandstateny.com/policy/2024/05/tech-watchdog-fears-social-media-regulation-bill-could-harm-minors-privacy/396960/

[17] https://www.cnbctv18.com/technology/india-to-introduce-new-digital-act-to-protect-children-from-the-dangers-of-internet-15639141.htm

For more information please contact us at: info@ssrana.com