It shouldn’t come as a surprise that whatever you do on Facebook eventually comes back to you in the form of targeted ads. It turns out, however, that Facebook’s advertising tools have also marked hundreds of thousands of children as interested in gambling and alcohol, opening them up to some not-so-great advertisements.
A joint investigation by The Guardian and the Danish Broadcasting Corporation found that 740,000 children under 18 years old are flagged in Facebook's ad tools as being interested in gambling. Another 940,000 are marked as interested in alcoholic drinks. Part of the reason this might have occurred is that Facebook's ad tools don't break out interest categories by age, nor do they explain why a given user was flagged with a given interest.
There are plenty of ways this automatic labeling can go wrong. For starters, exploitative games (think deceptive loot boxes) could target children marked as interested in gambling without technically breaching any of Facebook's rules. (Facebook already got in trouble earlier this year for knowingly refusing refunds to parents whose kids unwittingly racked up thousands of dollars in charges.)
Facebook currently lists ads referencing alcohol and gambling as “restricted content.” For alcohol, Facebook’s policy page states, “Ads that promote or reference alcohol must comply with all applicable local laws, required or established industry codes, guidelines, licenses and approvals, and include age and country targeting criteria consistent with Facebook’s targeting guidelines and applicable laws.” For gambling, it notes that ads for real-money games or lotteries require prior written permission and must target people 18 years or older.
A Facebook spokesperson told Gizmodo over email, “We don’t allow ads that promote the sale of alcohol or gambling to minors on Facebook and we enforce against this activity when we find it. We also work closely with regulators to provide guidance for marketers to help them reach their audiences effectively and responsibly.”
The problem is that Facebook isn’t exactly proactive about ads that run afoul of its policies. It relies primarily on an automated review process to catch ads that break the rules. In a perfect world, any ad that violated Facebook’s policies would be rejected before running. But the world is not perfect, and neither is automated review. As it stands, a dubious ad could reach a child’s eyeballs before Facebook is even aware of it, let alone able to take action. Users can report ads, sure, but by the time one is flagged, there’s no telling how many people it has reached.
In general, Facebook has fumbled how it handles children on its platform. While it restricts anyone under 13 years old from making an account, plenty of people break that rule and go unreported. It only stopped showing ads for gun accessories to children last year. That same year, Facebook launched Messenger Kids, an app designed exclusively for kids as young as 6, in spite of protests from child health advocates. Then, this past July, it was reported that the app let children talk to unauthorized strangers, the one thing it was specifically designed to prevent.