Great article this. "So she effortlessly recognised the telltale indicators of pseudoscience marketing – unproven and sometimes dangerous treatments, promising simplistic solutions and support." https://t.co/cdUyqEgt5B

— Hayden W. Bell 🍃🍂 (@wickertongue) July 18, 2020
Facebook, as an advertising machine, “knows only correlations between advertisers’ requirements and users’ profiles. It neither knows nor cares about cancer or anything else.”
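To make that point concrete, here is a minimal, purely illustrative sketch of what "knowing only correlations" looks like in code. This is not Facebook's actual system; the names (UserProfile, interest_tags, estimate_audience) are invented for illustration. The matcher treats every targeting category as an opaque string and simply counts overlaps with user profiles, so it will surface any category that enough profiles happen to match, regardless of what the words mean.

```python
# Toy sketch (not Facebook's real system) of correlation-only audience matching.
# Targeting categories are opaque strings; the matcher just intersects them with
# user profile tags. Nothing here understands what any tag means.

from dataclasses import dataclass, field


@dataclass
class UserProfile:
    user_id: int
    interest_tags: set[str] = field(default_factory=set)  # inferred from activity


def estimate_audience(users: list[UserProfile], target_tags: set[str]) -> list[int]:
    """Return the IDs of users whose interest tags overlap the advertiser's targets."""
    return [u.user_id for u in users if u.interest_tags & target_tags]


if __name__ == "__main__":
    users = [
        UserProfile(1, {"gardening", "baking"}),
        UserProfile(2, {"baking", "marathon running"}),
        UserProfile(3, {"gardening"}),
    ]
    # The advertiser picks categories; the machine only counts matches.
    audience = estimate_audience(users, {"baking"})
    print(f"Potential audience size: {len(audience)} people")  # -> 2
```

The design choice the article is gesturing at is exactly this indifference: the same string-matching logic that serves an ad to bakers will just as readily serve one to any other category its users' profiles happen to generate.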
This has been dramatically demonstrated in experiments conducted by imaginative journalists. In September 2017, for example, researchers from ProPublica ran a test to see whether the machine would help them promote three posts to antisemitic users. It did. At one point in the process, the automated system asked the researchers if they wished to "INCLUDE people who match at least ONE of the following: German Schutzstaffel, history of 'why Jews ruin the world', how to burn Jews, Jew hater". "Your potential audience selection is great!" it told the researchers. "Potential audience size: 108,000 people." And all for $30.

After ProPublica contacted Facebook, the company removed the antisemitic categories and said it would explore ways to fix the problem, such as limiting the number of categories available or scrutinising them before they are displayed to buyers.

There's no point in trying to anthropomorphise this. Facebook is clearly not run by Nazis. But what its software engineers have built is an incredibly powerful, beautifully engineered machine for matching advertisers with people who might be receptive to their messages. And it's clear that advertisers love that machine because it gives them a warm feeling that their advertising budgets might be better spent on Facebook than on billboards or TV ads. Which, of course, also means that the much-hyped advertising boycott spurred by the #blacklivesmatter protests will sadly have little impact on Facebook's bottom line. Morals matter, but money talks.

archived*
*a link – see a note on notes and links