Facebook flags hundreds of thousands of kids as interested in gambling

We know that Facebook tracks what we do to flag our interests for use in targeted advertising.

But the algorithm it uses to do so is marking hundreds of thousands of kids as being interested in booze and gambling, which could lead to their being targeted with ads that aren’t appropriate to show to minors, according to a joint investigation by The Guardian and the Danish Broadcasting Corporation (DBC).

The investigation found that Facebook’s ad tools flag 740,000 children under the age of 18 as being interested in gambling. Another 940,000 kids are marked as interested in alcoholic beverages.

As the Guardian points out, such interests are automatically generated, based on what the platform observes of a user’s activity. That data then gets fed to advertisers, who can use it to target specific subgroups that show signs of potentially being interested in whatever the advertisers are pushing.

Facebook said in a statement that advertising alcohol or gambling to minors on the social network is forbidden:

We don’t allow ads that promote the sale of alcohol or gambling to minors on Facebook and we enforce against this activity when we find it. We also work closely with regulators to provide guidance for marketers to help them reach their audiences effectively and responsibly.

But there are reportedly instances where Facebook will, in fact, let kids be targeted over interests in these age-inappropriate areas. The investigation’s reporters got input from a Facebook insider who gave the example of an anti-gambling service that might reach out to offer support to children who show signs of a gambling problem.

The Guardian also highlights a more insidious example of how such targeting might be used. The publication pointed to young people who are addicted to video games such as Fortnite, Candy Crush and Call of Duty – addicts whom the UK’s National Health Service (NHS) recently opened up a clinic to treat.


Developers of such games, with their profitable loot boxes of consumable virtual items that can be redeemed to get yet more virtual loot, could target their ads to children who’ve been flagged as having an interest in gambling – all without breaching Facebook’s regulations about not marketing gambling to kids.

Facebook actually got in trouble earlier this year for knowingly refusing refunds to parents whose kids didn’t realize that the money they were spending in games like Ninja Saga was real. That naivete led to kids unknowingly racking up thousands of dollars in charges.

Advertisers of content prohibited on Facebook who decide to skirt its rules about advertising to children thus have preselected audiences, thanks to Facebook’s flagging of users by interest. Nor does the platform have a proactive way to stop them: it relies primarily on automated review to flag prohibited ads, but those reviews don’t necessarily stop the ads from running in the first place.

The Guardian points to a recent lawsuit Facebook settled over this failing. In January, a lawsuit brought by UK financial celebrity Martin Lewis pushed it into creating a scam ads reporting tool and donating £3m to a consumer advocacy group.

Lewis’s name and face had been slathered on all sorts of financial scams that he’d never endorse – scams that Facebook’s detection tools repeatedly failed to block. In fact, Lewis claimed that Facebook had published over 50 fake advertisements that used his face and name without his permission.
