The Price of Algorithmic Engagement
The New York Times has an article about business owners selling products for children who placed Instagram ads designed to attract women, only to have their ads seen mostly by men. And, of course, by men with a history of sex crimes. This is almost inevitable given that the algorithms are designed to drive engagement with the platform. Instagram doesn’t care about its users; it cares about manipulating the emotions of its users to keep them on the platform.
The Times found that several businesses that tried to advertise to women instead reached mostly men, despite what Instagram’s ad categorization claimed were the actual target audiences. Meta, Instagram’s parent company, probably avoids fraud claims through fine print noting that what it tells ad purchasers in the main copy isn’t actually true, but morally it certainly seems as if Instagram is making a promise to these people and failing to deliver. Instead, they get people who aren’t their target and have no intention of spending money on their products. All to serve engagement.
The Times discovered that there was a significant overlap between the accounts these ads were shown to and the accounts of men who follow child influencers:
An analysis of the users who interacted with the ads posted by The Times found an overlap between those two worlds. [the ad world and the world of the study the Times ran several months ago laying out how men send sexualized comments to child influencers -ed] About three dozen of the men followed child influencer accounts that were run by parents and were previously studied by The Times; one followed 140. In addition, nearly 100 of the men followed accounts featuring or advertising adult pornography, which is barred under Instagram’s rules.
And, of course, these ads also attracted brazenly sexual comments. One man even offered to pay for sex with one of the child models. That’s Instagram — bringing the world together.
Inevitably, someone is going to point out that there are studies showing that social media companies don’t actually drive radical content or this kind of behavior, and that people end up there because they choose to. Except we keep seeing stories like this — ads meant for women and children instead get pushed to men who sexualize children. And leaked documents show that, for example, Facebook deliberately tweaked its algorithm to emphasize angry and polarizing content. And several studies have shown that fresh YouTube accounts get pushed toward radicalizing content, usually right-wing radicalization.
The interplay of the algorithms and accounts is complex. The choices of users clearly play a role, but just as clearly so do the choices about how these companies build their algorithms. It is very hard to see where the balance of responsibility lies since the algorithms are secret, but it’s clear that engagement is the driving force behind what these algorithms produce, and that engagement often means driving anger or other anti-social behavior.
The behavior the Times discovered is not only about Instagram or the way it helps make the internet less safe for children, though those are obviously important issues. At its heart, it is as much about how little we know about what these companies are actually building and how they use their algorithmic tools to enrich themselves at the expense of society. It is long past time these tools were made public in some form. Access could be limited to researchers and government agencies, but something has to change.
Because it is not helpful to anyone but Instagram to have sexual predators overrun ads for children’s jewelry.