A study by the FTC reveals a “broad surveillance” of social media users.


Meta, YouTube, and other sites collected more data than most users imagined, according to a new report from the Federal Trade Commission.

Credit: Cecilia Kang – The New York Times.

The Federal Trade Commission said Thursday that it found several social media and streaming services were conducting “extensive surveillance” on consumers, including minors, collecting and sharing more personal information than most users believed.

The findings stem from a study of how nine companies, including Meta, YouTube, and TikTok, collected and used consumer data. The sites, which mostly offer free services, monetized the data through targeted advertising aimed at specific users based on their demographics, according to the report. The companies also failed to protect users, especially children and teenagers.

The FTC said it began its study nearly four years ago to provide the first holistic look at the opaque business practices of some of the largest online platforms that have created multibillion-dollar advertising businesses using consumer data. The agency said the report highlighted the need for federal privacy legislation and restrictions on how companies collect and use data.

“Surveillance practices can jeopardize people’s privacy, threaten their freedoms, and expose them to a range of harms, from identity theft to harassment,” said Lina Khan, chairwoman of the FTC, in a statement.

Tech giants are under intense scrutiny for privacy abuses, and in recent years they have been partly blamed for contributing to a mental health crisis among young people and children, which some social scientists and the U.S. surgeon general have linked to the rampant use of social media and smartphones. But despite numerous proposals in Congress to more strictly protect the privacy and safety of children online, nearly all legislative attempts to regulate big tech have failed.

The companies’ efforts to police themselves have also fallen short, the FTC concluded in its report. “Self-regulation has been a failure,” it added.

Google, owner of YouTube, “has the strictest privacy policy in our industry: we never sell personal information, and we do not use sensitive information to show ads,” said José Castañeda, a Google spokesperson. He added: “We prohibit the personalization of ads for users under 18 and do not personalize ads for anyone watching ‘content made for kids’ on YouTube.”

The public policy director of Discord for the U.S. and Canada, Kate Sheerin, said in a statement that the FTC report “lumps very different models into one group and paints with a broad brush.” She added that Discord does not operate a formal digital advertising service.

TikTok and Meta, owner of WhatsApp, Messenger, and Facebook, did not immediately respond to requests for comment.

In December 2020, the agency opened an investigation into the nine companies operating 13 platforms. The FTC requested data from each company on operations conducted between 2019 and 2020 and then studied how the companies had collected, used, and stored that data.

The study included the streaming platform Twitch, owned by Amazon, the messaging service Discord, the photo and video sharing app Snapchat, and the discussion forum Reddit. Twitter, now rebranded as X, also provided data.

The study did not reveal the results for each company. Twitch, Snap, Reddit, and X did not immediately respond to requests for comment.

Companies have argued that they have tightened their data collection policies since the study was conducted. Earlier this week, Meta announced that Instagram accounts of users under 18 will be private by default in the coming weeks, meaning only followers approved by the account holder will be able to view their posts.

The FTC found that companies voraciously consumed data about users and often purchased information about people who were not users through data brokers. They also collected information from accounts linked to other services.

Most companies collected users’ age, gender, and language spoken. Many platforms also obtained information about education, income, and marital status. The companies did not offer users easy ways to opt out of data collection and often kept sensitive information far longer than consumers would expect, the agency said.

The companies used data to create user profiles (often merging the information they gathered with data about habits collected from other sites) to show ads.

The agency also found that many sites claimed to restrict access to users under 13, but many children remained on the platforms. Teenagers were also treated like adults on many apps, subjecting them to the same data collection as adults.

Many of the companies could not inform the FTC how much data they were collecting, according to the study.

Last year, the FTC proposed changes to strengthen regulations on children’s privacy, and lawmakers are seeking to increase privacy protections for users under 18. In 2022, Ms. Khan launched a regulatory effort to create rules for companies that show ads based on users’ browsing or search history.

The agency has already filed complaints against several tech companies for privacy violations, and in late 2022, it reached a $520 million settlement with Epic Games for allegedly violating a children’s privacy law and deceiving consumers with unjustified charges. That same year, the FTC fined Twitter $150 million for using security data about users for behavior-based advertising.
