Facebook is taking aim at scammers who use bait-and-switch tricks to sneak sketchy ads past its inspectors.
Hackers have long been able to bypass the social network’s rules against ads for things like diet pills, pornography, and gambling through a practice called “cloaking.” Cloaked ads show a benign destination page when clicked by the company’s vetting team, then send everyone else to the real, prohibited one.
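The bait-and-switch can be sketched in a few lines. This is a minimal, hypothetical illustration of the cloaking logic described above, assuming the scammer keeps a blocklist of IP addresses believed to belong to reviewers; the IPs and page URLs here are invented:

```python
# IPs the scammer believes belong to the ad network's review team
# (hypothetical examples from a documentation-reserved range).
REVIEWER_IPS = {"203.0.113.10", "203.0.113.11"}

# The page shown to reviewers vs. the real, prohibited landing page.
COMPLIANT_PAGE = "https://example.com/harmless-recipe-blog"
SCAM_PAGE = "https://example.com/miracle-diet-pills"

def destination_for(visitor_ip: str) -> str:
    """Return a policy-compliant page to suspected reviewers,
    and the real landing page to everyone else."""
    if visitor_ip in REVIEWER_IPS:
        return COMPLIANT_PAGE
    return SCAM_PAGE
```

Real cloaking operations layer on more signals than a bare IP check (user-agent strings, traffic fingerprints, timing), which is exactly why detecting them requires the kind of automated and manual review Facebook describes.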
No longer, Facebook says. The company has built new artificial intelligence software to detect cloaking attempts and added more people to its review staff to identify them manually.
“We don’t want these bad actors and these negative experiences on the platform whatsoever,” says Facebook product director Rob Leathern, who’s in charge of policing the site for nefarious activity. “The effects it has on people: it’s jarring, it’s a bad experience. So we’re committed to addressing it.”
The company has been working to choke off these sorts of schemes for years by bulking up its vetting capabilities. But in recent months it has been far more public about its broader fight against misinformation, amid scrutiny of its role in spreading fake news and pressure from advertisers skeptical of its black-box metrics.
While cloaking clearly remains a headache for Facebook’s enforcers, it’s gotten much harder to pull off, according to various hacker forums. Doing so now requires a list of all the various IP addresses and traffic fingerprints of Facebook’s review teams and third-party partners as well as knowledge of how its automated systems work.
Still, it’s easy to find websites that purport to offer such services. Many of them come with a disclaimer about their risky nature.
“Cloaking isn’t magic, and it isn’t some get-rich-quick thing,” one of these warnings reads. “It’s a war between the marketer and the ad networks [and] search engines: a constant back and forth.”
Facebook hopes that its latest crackdown will finally raise the risk to the point where cloaking no longer makes economic sense for marketers to even attempt.
“We want to increase the costs for these scammers so they have less incentive,” Leathern said. “We want to make it clear to the bad actors out there that we’re ramping up enforcement: we’ll take down ad accounts, we’ll take down pages.”
The push is part of a broader effort, heavily publicized since last year’s presidential election, to drive misleading content of any kind off the platform. Other moves have included shutting down vast bot networks used to defraud advertisers, suppressing links from websites with bad ad experiences, and adding warning labels to links from sites that tend to get low ratings from reputable fact-checkers.