Large social media companies and streaming platforms — including Amazon, Alphabet-owned YouTube, Meta-owned Facebook and TikTok — engage in a “vast surveillance of users” to profit off their personal information, endangering privacy and failing to adequately protect children, the Federal Trade Commission said Thursday.
In a 129-page report, the agency examined how some of the world’s biggest tech players collect, use and sell people’s data, as well as the impact on children and teenagers. The findings highlight how the companies compile and store troves of info on both users and non-users, with some failing to comply with deletion requests, the FTC said.
“The report lays out how social media and video streaming companies harvest an enormous amount of Americans’ personal data and monetize it to the tune of billions of dollars a year,” FTC Chair Lina Khan said in a statement. “While lucrative for the companies, these surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking.”
According to the FTC, the business models of major social media and streaming companies center on mass collection of people’s data, especially through targeted ads, which account for most of their revenue.
“With few meaningful guardrails, companies are incentivized to develop ever-more invasive methods of collection,” the agency said in the report.
“Especially troubling”
The risk such practices pose to child safety online is “especially troubling,” Khan said.
Child advocates have long complained that federal child privacy laws let social media services off the hook as long as their products are not directed at kids and their policies formally bar minors from their sites. Big tech companies also often claim not to know how many kids use their platforms, critics have noted.
“This is not credible,” FTC staffers wrote.
Meta on Tuesday launched Instagram Teen Accounts, a more limited experience for younger users of the platform, in an effort to assuage concerns about the impact of social media on kids.
The report recommends steps, including federal legislation, to limit surveillance and give consumers rights over their data.
Congress is also moving to hold tech companies accountable for how online content affects kids. In July, the Senate overwhelmingly passed bipartisan legislation aimed at protecting children called the Kids Online Safety Act. The bill would require companies to strengthen kids’ privacy and give parents more control over what content their children see online.
YouTube-owner Google defended its privacy policies as the strictest in the industry.
“We never sell people’s personal information, and we don’t use sensitive information to serve ads. We prohibit ad personalization for users under 18, and we don’t personalize ads to anyone watching ‘made for kids’ content on YouTube,” a Google spokesperson said in an email.
Amazon, which owns the gaming platform Twitch, did not immediately respond to a request for comment. Meta, which also owns Instagram, declined to comment.
The FTC report comes nearly a year after attorneys general in 33 states sued Meta, alleging the company for years kept kids online as long as possible to collect personal data to sell to advertisers.
Meta said at the time that no one under 13 is allowed to have an account on Instagram and that it deletes the accounts of underage users whenever it finds them. “However, verifying the age of people online is a complex industry challenge,” the company said.
The issue of how Meta’s platforms impact young people also drew attention in 2021 when Meta employee-turned-whistleblower Frances Haugen shared documents from internal company research. In an interview with CBS News’ Scott Pelley, Haugen pointed to data indicating that Instagram worsens suicidal thoughts and eating disorders for certain teenage girls.