Kids Online Ad-Labeling Bill Returns
Legislation would also require annual public reports on potential harms of targeted ads
A newly reintroduced piece of bipartisan Senate legislation would put major new labeling and reporting requirements on advertisers targeting children, including annual public reports on how such advertising can harm them.
The bill, the Kids Online Safety Act (KOSA), was first introduced in February 2022 and put back into play Tuesday (May 2) by Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), two prominent players in the Big Tech oversight hearings over the past year-plus that led to the legislation.
There was a push to get the 2022 version passed in the lame-duck session of the last Congress, but time ran out.
“Our bill provides specific tools to stop Big Tech companies from driving toxic content at kids and to hold them accountable for putting profits over safety,” Blumenthal said in a statement. “Record levels of hopelessness and despair — a national teen mental-health crisis — have been fueled by black-box algorithms featuring eating disorders, bullying, suicidal thoughts and more. Kids and parents want to take back control over their online lives.”
Added Blackburn: “Big Tech has proven to be incapable of appropriately protecting our children, and it’s time for Congress to step in.”
Among the bill’s ad-related elements is a requirement that any ads aimed at children include “clear, conspicuous, and easy-to-understand” labels with a host of information, including why the child is being targeted and how their personal data was used to target them.
It would also put disclosure mandates on influencers, requiring the disclosure of “endorsements of products, services, or brands made for commercial consideration by other users of the platform.”
The Federal Trade Commission would be directed to come up with new regulations enforcing the labeling requirements.
The bill would also require social media and other user-generated content sites — including video-sharing sites with at least 10 million active U.S. users per month — to produce annual reports to the public.
Those reports must include “an assessment of how recommendation systems and targeted advertising systems can contribute to harms to minors.”
While the legislation’s goal is protecting children, some groups oppose it in its current form because they believe it allows the government too much room to dictate content.
Advocacy group Fight for the Future, for example, said the measure gives state attorneys general too much power to sue platforms over their content recommendations to children, a power it says can be weaponized against LGBTQ and other content. Because the bill imposes a duty-of-care requirement on platforms to police their sites, the group said, compliance would require some type of artificial-intelligence filter that “will end up cutting off young people from essential, life-saving resources, educational materials and more.”
Contributing editor John Eggerton has been an editor and/or writer on media regulation, legislation and policy for over four decades, including covering the FCC, FTC, Congress, the major media trade associations, and the federal courts. In addition to Multichannel News and Broadcasting + Cable, his work has appeared in Radio World, TV Technology, TV Fax, This Week in Consumer Electronics, Variety and the Encyclopedia Britannica.