A recently released report from the Office of the Attorney General contains recommendations to protect Minnesotans, especially young Minnesotans, from the threats and harm of technology.
Rep. Zack Stephenson (DFL-Coon Rapids) has included many of them in what he calls the “Prohibiting Social Media Manipulation Act.”
Approved Monday by the House Commerce Finance and Policy Committee, HF4400 would require a social media platform to allow users to indicate what content they do or do not want and require a platform’s algorithm to abide by those preferences.
The central focus of the bill, Stephenson said, “is to put some guardrails, and some regulation on … an almost completely unregulated, massively important sector of our economy.”
The “Report on Emerging Technology and its Effect on Youth Well-Being” concluded, among other things, that many social media users, especially youth, are experiencing online bullying and harassment, facilitated by the choices of technology platforms. It also found that cases of online manipulation and fraud often begin with unwanted contact from strangers, facilitated by loose privacy defaults that new users unknowingly accept when joining a platform.
“Social media is having massive impacts on our society, on our children, and particularly on our children’s mental health,” Stephenson said.
Provisions of the bill heading to the House Judiciary Finance and Civil Law Committee would prohibit platforms from pushing content to users that does not align with users’ preferences simply to maximize their time and engagement on the platform.
New users would automatically have strict privacy settings by default. These would focus on keeping user-generated content within a user’s chosen social network.
Safeguarding new users’ privacy is paramount to protecting their online safety, said Ravi Iyer, research director at the University of Southern California Marshall Neely Center for Ethical Leadership and Decision Making. He previously worked for more than four years at Meta addressing the large-scale societal impact of Facebook.
“You sign up [for these platforms] to talk to your friends and you don’t realize that anyone in the world may be able to contact you,” he said.
In his presentation, Iyer said disturbing, graphic, and sexual content being sent to users is a rapidly growing problem. This unwelcome content is often recommended by AI-powered algorithms.
The bill would also prohibit a platform from allowing user-generated content to be scraped and utilized by generative AI without a user’s consent.
But Robert Singleton, director of policy and public affairs for California and the Western U.S. at the Chamber of Progress, a tech industry trade group, believes the bill is vaguely written, would have a chilling effect on social media platforms, would “broadly infringe” on the First Amendment, and is “destined to lose in court.”
Platforms, he said, would self-censor out of fear of litigation and decline to show any content that might possibly contradict a user’s preference.
“The lack of clear, specific definitions could prompt social media platforms to broadly interpret the law so as to avoid litigation and fees, resulting in over moderation, removing or restricting a wide variety of content and leaving users with a bland or unvaried online experience,” Singleton said.
Stephenson disagreed, saying the bill is crafted differently from legislation in other states that courts have struck down for being unconstitutional.