Did you know you can see the ad boxes Facebook sorts us into?
Fitbit? Pollination? Jaguars? Snakes? Mason jars?
OK, fine, Facebook, I’m not surprised that I’ve clicked on those things. But when did I ever click on anything related to Star Trek: Voyager? Or Cattle?!
My “this feels weird” reaction makes me one of the 51% of Facebook users who say they’re not comfortable with the ad-driven company creating a list that assigns categories to each of us based on our real-life interests.
It’s called “Your ad preferences.” You can view yours here. If you drill down, you can see where Facebook gets its categorization ideas, including the things we click on or like, our relationship status, who employs us, and far more.
Most people don’t even know that Facebook keeps a list of our traits and interests. In a new survey from Pew Research Center that attempted to figure out how well people understand Facebook’s algorithm-driven classification systems and how they feel about Facebook’s collection of personal data, the majority of participants said they never knew about it until they took part in the survey.
Overall… 74% of Facebook users say they did not know that this list of their traits and interests existed until they were directed to their page as part of this study.
Once the participants were directed to the ad preferences page, most – 88% – found that the platform had generated material about them. More than half – 59% – said that the categories reflected their real-life interests. But 27% said that the categories were “not very” or “not at all” accurate in describing them.
And then, after they found out how the platform classifies their interests, about half of Facebook users – 51% – said they weren’t comfortable with Facebook creating the list.
The Pew Research Center’s conclusions come from a survey of 963 US adults, ages 18 and older, who have a Facebook account. It was conducted from 4 September to 1 October, 2018. You can see the full methodology here.
What inputs does Facebook’s algorithm chew over?
The “Your ad preferences” page, which is different for every user, is only one factor that Facebook uses to slice users’ lives into categories to which advertisers can market. Unless you’ve drilled down on that page’s categories and told it to forget about certain things that you’ve posted, liked, commented on or shared, all of that activity will be taken into account by the algorithm.
But as we well know, Facebook also follows us around off the site. One of the outcomes of the two days of Congressional grilling that Facebook CEO Mark Zuckerberg went through in April 2018 was that Facebook coughed up details on how it tracks both users and non-users when they’re not even on the site.
In explaining why Facebook collects non-users’ data, David Baser, Product Management Director, said that one tool, Audience Network, lets advertisers create ads on Facebook that show up elsewhere in cyberspace. In addition, advertisers can target people with a tiny but powerful snippet of code known as the Facebook Pixel: a web targeting system embedded on many third-party sites. Facebook has lauded it as a clever way to serve targeted ads to people, including non-members.
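To make the mechanics concrete, here’s a minimal, hypothetical sketch of how a web tracking pixel works in general – this is not Facebook’s actual code, and the endpoint and parameter names are made up for illustration. The idea is simple: a third-party site embeds a snippet that requests a tiny resource from the tracker’s server, and the request itself carries the interesting data.

```javascript
// Conceptual sketch of a tracking pixel (hypothetical endpoint and params,
// NOT the real Facebook Pixel API).
function buildBeaconUrl(endpoint, pixelId, pageUrl, eventName) {
  const params = new URLSearchParams({
    id: pixelId,   // identifies which advertiser's pixel fired
    ev: eventName, // e.g. "PageView" or "Purchase"
    dl: pageUrl,   // the page the visitor is currently on
  });
  return `${endpoint}?${params.toString()}`;
}

// The embedding site would load a 1x1 image (or fire a script request)
// from this URL, so the tracker's server logs the visit along with any
// cookie the browser sends – which is how non-members can be tracked too.
const url = buildBeaconUrl(
  "https://tracker.example/collect", // hypothetical collection endpoint
  "1234567890",
  "https://shop.example/product/42",
  "PageView"
);
console.log(url);
```

Because the request goes straight to the tracker’s own domain, it works on any site that embeds the snippet, regardless of whether the visitor has an account with the tracker.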
Beyond those tools, Pew Research Center said that Facebook also has a tool that enables advertisers to track users who’ve “converted”: in other words, users who saw or clicked on a Facebook ad and then went off and purchased whatever it was advertising. Bear in mind that you can opt out of that on your ad preferences page.
Out of the ocean of data that comes from all those sources, Facebook knows us by demographic, by our social network and personal relationships, our political leanings, what’s happening in our lives, what foods we prefer, our hobbies, what movies we watch, what musicians we shell out money to hear, and what flavor of digital device we use. That’s a lot of grassland for advertisers to graze on.
It’s no surprise, then, that 88% of Facebook users in the study said they were assigned categories, while 11% found, after being directed to their ad preferences page, that they don’t exist in slice-and-dice advertising terms: they were told that they “have no behaviors.”