Facebook can tell when teens feel insecure

Jessica Guynn
USA TODAY

SAN FRANCISCO — Facebook knows when teens are feeling "insecure," "worthless," "stressed" or "defeated" — and it quietly shared that information with an advertiser.

The social media company says it made a mistake handing over its research to an advertiser, and it says advertisers cannot target its nearly 2 billion users based on their emotional state. But the incident sheds new light on how companies like Facebook regularly mine our daily lives, and it raises privacy issues for young people whose emotions are being monitored and studied.

According to documents leaked to The Australian newspaper, two Facebook executives prepared a report for one of the country's top banks describing how Facebook gleans psychological insights into the mood shifts of millions of young people in Australia and New Zealand by monitoring their status updates and photos.

The 23-page report, prepared by two of Facebook's top Australian executives, David Fernandez and Andy Sinn, showed Facebook's ability to detect when users as young as 14 are feeling emotions such as defeat, stress, anxiety or being overwhelmed. It also described other insights into young people's emotional well-being, such as when they exhibit "nervous excitement" and feelings connected to "conquering fears."

Not only is Facebook able to track these emotions, it can track how they fluctuate during the week. "Anticipatory emotions are more likely to be expressed early in the week, while reflective emotions increase on the weekend," the report said. "Monday-Thursday is about building confidence; the weekend is for broadcasting achievements."
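
The report does not say how these weekly patterns were produced. As a rough illustration only — not Facebook's actual methodology — status updates could be bucketed by weekday and tallied against keyword lists for anticipatory and reflective language; the lexicons and data below are hypothetical:

```python
# Illustrative sketch only: bucket posts by weekday and tally hypothetical
# "anticipatory" vs. "reflective" keyword matches. Not Facebook's methodology.
from collections import Counter
from datetime import datetime

ANTICIPATORY = {"hope", "excited", "nervous", "can't wait"}      # hypothetical lexicon
REFLECTIVE = {"proud", "finally", "accomplished", "grateful"}    # hypothetical lexicon

def weekday_emotion_counts(posts):
    """posts: iterable of (iso_timestamp, text) tuples; returns per-weekday tallies."""
    counts = Counter()
    for ts, text in posts:
        day = datetime.fromisoformat(ts).strftime("%A")
        lowered = text.lower()
        if any(term in lowered for term in ANTICIPATORY):
            counts[(day, "anticipatory")] += 1
        if any(term in lowered for term in REFLECTIVE):
            counts[(day, "reflective")] += 1
    return counts

# Toy data: a Monday post and a Saturday post
print(weekday_emotion_counts([
    ("2017-05-01T09:00:00", "Nervous but excited for the week ahead"),
    ("2017-05-06T18:30:00", "Finally finished the project, so proud"),
]))
```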

Facebook says it fields requests from advertisers to conduct research on its users. This research was done about a year ago to help marketers in Australia and New Zealand understand how people express themselves on Facebook, and no ad campaigns resulted from it, the company said.

"We have a process in place to review the type of research we perform and in this case that process was not followed," Facebook said in a statement to USA TODAY.

Opening up about our lives on social media can provide much needed support and advice when people feel upset or isolated. But, while people are unburdening themselves, vast amounts of data are quietly being collected and analyzed in the background, some of which is relayed in anonymous, aggregated form to marketers so they can more effectively target advertising dollars.

"People experience social media as a place to share their ups and downs. To show themselves. Now, we must revise our expectations or demand a new standard of practice from Facebook," Sherry Turkle, professor of the social studies of science and technology at MIT, said in an email. "Is our emotional state something that should be 'sold' as a new commodity?"

That's a subject that may concern this next generation of consumers and their parents. "The idea that a child's depression could be turned into a commodity shocks us now, but these things become the 'new normal' very quickly," Turkle said.

Sentiment analysis is commonplace on the Internet. Computer algorithms regularly take our emotional pulse, sifting through mountains of data collected from our daily online expression in real time. This kind of analysis can be used to gauge how people feel about a political candidate or about a particular company or product. Social sentiment analysis focuses on feelings expressed on social media such as Facebook, Instagram or Twitter.
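
At its simplest, sentiment analysis scores text by the emotional words it contains. The sketch below is a bare-bones, lexicon-based scorer offered purely for illustration — far cruder than anything a large platform would deploy, and not a description of Facebook's internal tools; the word lists are hypothetical:

```python
# Minimal lexicon-based sentiment scoring, for illustration only.
POSITIVE = {"happy", "great", "love", "excited", "proud"}        # hypothetical lexicon
NEGATIVE = {"sad", "stressed", "worthless", "anxious", "overwhelmed"}

def sentiment_score(text):
    """Return a score in [-1, 1]: positive-word hits minus negative-word hits,
    divided by the number of words in the post."""
    words = text.lower().split()
    if not words:
        return 0.0
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return (pos - neg) / len(words)

print(sentiment_score("Feeling stressed and overwhelmed this week"))  # negative score
print(sentiment_score("So happy and excited about the weekend!"))     # positive score
```

Real systems replace the word lists with machine-learned models trained on large labeled datasets, which is where the sophistication researchers call for would come in.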

Gauging sentiment is a tricky business. Researchers say algorithms will need to become more sophisticated before they can read our feelings or predict our behavior. But Facebook's global reach gives it a distinct advantage. It employs teams of data scientists to crunch the massive quantities of data it has on its users, parsing it at highly granular levels.

That machine analysis can yield important insights such as detecting and preventing suicidal behavior. But Facebook has also come under heavy scrutiny in the past for secretly conducting research that manipulated the emotions of users by altering what they see in their News Feed without their consent.

When conducting consumer research, Facebook says it carefully weighs whether the research would benefit users or have adverse effects on them. It also looks at whether people would be surprised to learn the research was being conducted. All research is aggregated and anonymous, Facebook says.

Advertisers cannot target individuals based on how they are feeling or how they say they are feeling on Facebook. So someone feeling happy would not necessarily see a different ad than someone feeling sad. But advertisers can target ads to certain age groups and they can time them to run on a certain day.

Facebook critics like Jeffrey Chester want technology companies to provide lawmakers and regulators more insight into their data collection practices, and how that data is used to peddle ads, particularly in the vulnerable teen years, an anxiety-laden transition to adulthood filled with social, emotional and developmental challenges.

"The Australian leak reveals that Facebook continues to view its users — even young ones — as nothing more than cash cows that can be manipulated for marketers," said Chester, executive director of the Center for Digital Democracy, a nonprofit watchdog group in Washington. "Facebook appears to be engaging in its own form of 'psyops' where it can use the immense power of its platform to manipulate young people."

Follow USA TODAY senior technology writer Jessica Guynn @jguynn
