A Blog by Jonathan Low


May 13, 2017

What Did Facebook Think Would Happen When It Helped Brands Target Teens Who Feel 'Worthless'?

While no one specific action appears to have derailed Facebook yet, the accumulated violations of personal information and cultural norms may be fueling the rise of alternative media which may eventually arrest the company's growth. JL

Sam Machkovech reports in ars technica:

Facebook executives promote advertising campaigns that exploit Facebook users' emotional states. Facebook's algorithms can determine, and allow advertisers to pinpoint, "moments when young people need a confidence boost;" emotional states including "worthless," "insecure," "defeated," "anxious," "silly," "useless," "stupid," "overwhelmed," "stressed," and "a failure." Image-recognition tools used on Facebook and Instagram reveal to advertisers "how users express themselves."
Facebook's secretive advertising practices became a little more public thanks to a leak out of the company's Australian office. This 23-page document, discovered by The Australian (paywall), details in particular how Facebook executives promote advertising campaigns that exploit Facebook users' emotional states—and how these are aimed at users as young as 14 years old.
According to the report, the selling point of this 2017 document is that Facebook's algorithms can determine, and allow advertisers to pinpoint, "moments when young people need a confidence boost." If that phrase isn't clear enough, Facebook's document offers a litany of teen emotional states that the company claims it can estimate based on how teens use the service, including "worthless," "insecure," "defeated," "anxious," "silly," "useless," "stupid," "overwhelmed," "stressed," and "a failure."
The Australian says that the documents also reveal a particular interest in helping advertisers target moments in which young users are interested in "looking good and body confidence" or "working out and losing weight." Another section describes how image-recognition tools are used on both Facebook and Instagram (a wholly owned Facebook subsidiary) to reveal to advertisers "how people visually represent moments such as meal times." And it goes into great detail about how younger Facebook users express themselves: according to Facebook Australia, earlier in the week, teens post more about "anticipatory emotions" and "building confidence," while weekend teen posts contain more "reflective emotions" and "achievement broadcasting."
This document makes clear to advertisers that this data is specific to Australia and New Zealand—and that its eyes are on 6.4 million students and "young [people] in the workforce" in those regions. When reached for comment by The Australian, a representative for Facebook Australia issued a formal and lengthy apology, saying in part, "We have opened an investigation to understand the process failure and improve our oversight. We will undertake disciplinary and other processes as appropriate." Facebook Australia did not answer The Australian's questions about whether these youth-targeted advertising practices were the same or similar to those at other international Facebook offices. (The document's scope is actually more compliant with US FTC regulations, which apply to users 13 and younger, than with Australian ones, which apply to users 14 and younger.)
The Australian's report does not include screen shots of the document, nor does it describe sample advertising campaigns that would take advantage of this data. Two Facebook Australia executives, Andy Sinn and David Fernandez, are named as the document's authors.

Facebook's ability to predict and possibly exploit users' personal data probably isn't news to anybody who has followed the company over the past decade, but this leak may be the first tacit admission by any Facebook organization that younger users' data is sorted and exploited in a unique way. This news follows stories about Facebook analyzing and even outright manipulating users' emotional states, along with reports and complaints about the platform guessing users' "ethnic affinity," disclosing too much personal data, and possibly permitting illegal discrimination in housing and financial ads.

Update: Facebook has issued a statement disputing The Australian's report. "The premise of the article is misleading," the company wrote in its authorless statement. "Facebook does not offer tools to target people based on their emotional state. The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated."
The statement repeated the same vague explanation the company offered in its original apology: "Facebook has an established process to review the research we perform. This research did not follow that process, and we are reviewing the details to correct the oversight." However, the statement did not acknowledge why Facebook made no such distinction clear to The Australian. As of press time, The Australian has not updated its report, nor has it published full pages of the quoted document to either confirm or dispute Facebook's response.
