SINCE THE CAMBRIDGE ANALYTICA scandal erupted in March, Facebook has been attempting to make a moral stand for your privacy, distancing itself from the unscrupulous practices of the U.K. political consultancy. “Protecting people’s information is at the heart of everything we do,” wrote Paul Grewal, Facebook’s deputy general counsel, just a few weeks before founder and CEO Mark Zuckerberg hit Capitol Hill to make similar reassurances, telling lawmakers, “Across the board, we have a responsibility to not just build tools, but to make sure those tools are used for good.” But in reality, a confidential Facebook document reviewed by The Intercept shows that the two companies are far more similar than the social network would like you to believe.
The recent document, described as “confidential,” outlines a new advertising service that expands how the social network sells corporations’ access to its users and their lives: Instead of merely offering advertisers the ability to target people based on demographics and consumer preferences, Facebook now offers the ability to target them based on how they will behave, what they will buy, and what they will think. These capabilities are the fruits of a self-improving, artificial intelligence-powered prediction engine, first unveiled by Facebook in 2016 and dubbed “FBLearner Flow.”
One slide in the document touts Facebook’s ability to “predict future behavior,” allowing companies to target people on the basis of decisions they haven’t even made yet. This would, potentially, give third parties the opportunity to alter a consumer’s anticipated course. Here, Facebook explains how it can comb through its entire user base of over 2 billion individuals and produce millions of people who are “at risk” of jumping ship from one brand to a competitor. These individuals could then be targeted aggressively with advertising that could pre-empt and change their decision entirely — something Facebook calls “improved marketing efficiency.” This isn’t Facebook showing you Chevy ads because you’ve been reading about Ford all week — old hat in the online marketing world — rather, it’s Facebook using facts of your life to predict that in the near future, you’re going to get sick of your car. Facebook’s name for this service: “loyalty prediction.”
Facebook explains it can comb through its 2 billion users and produce millions “at risk” of jumping ship from one brand to a competitor.
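The document does not describe how FBLearner Flow actually builds these models. Purely as a hypothetical illustration, a “loyalty prediction” task would resemble a standard churn model: train a classifier on the behavioral histories of users who did and did not switch brands, then score everyone else and hand the highest-risk slice to advertisers. The feature names, model choice, and threshold in the sketch below are assumptions made for illustration, not details drawn from the document.

```python
# Minimal, hypothetical sketch of a churn-style "loyalty prediction" model.
# Feature names, model choice, and the 0.5 threshold are illustrative
# assumptions, not details from the Facebook document.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is one user; columns are behavioral signals observed so far,
# e.g. [days_since_visiting_brand_page, competitor_page_views, brand_ad_clicks].
X_train = np.array([
    [2,  0, 5],
    [40, 9, 0],
    [5,  1, 3],
    [60, 7, 1],
])
# Label: 1 = the user later switched to a competitor, 0 = stayed loyal.
y_train = np.array([0, 1, 0, 1])

model = LogisticRegression().fit(X_train, y_train)

# Score the wider user base and keep the highest-risk segment for targeting.
X_users = np.array([
    [3,  0, 4],
    [55, 8, 0],
    [30, 6, 1],
])
switch_risk = model.predict_proba(X_users)[:, 1]  # estimated P(switch) per user
at_risk = np.where(switch_risk > 0.5)[0]          # the "at risk" audience
print(at_risk, switch_risk.round(2))
```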
Spiritually, Facebook’s artificial intelligence advertising has a lot in common with political consultancy Cambridge Analytica’s controversial “psychographic” profiling of voters, which uses mundane consumer demographics (what you’re interested in, where you live) to predict political action. But unlike Cambridge Analytica and its peers, who must content themselves with whatever data they can extract from Facebook’s public interfaces, Facebook is sitting on the motherlode, with unfettered access to staggering databases of behavior and preferences. A 2016 ProPublica report found some 29,000 different criteria for each individual Facebook user.
Zuckerberg has acted to distance his company from Cambridge Analytica, whose efforts on behalf of Donald Trump were fueled by Facebook data, telling reporters on a recent conference call that the social network is a careful guardian of information:
The vast majority of data that Facebook knows about you is because you chose to share it. Right? It’s not tracking. There are other internet companies or data brokers or folks that might try to track and sell data, but we don’t buy and sell. … For some reason, we haven’t been able to kick this notion for years that people think we will sell data to advertisers. We don’t. That’s not been a thing that we do. Actually it just goes counter to our own incentives. Even if we wanted to do that, it just wouldn’t make sense to do that.
The Facebook document makes a similar gesture toward user protection, noting that all data is “aggregated and anonymized [to protect] user privacy,” meaning Facebook is not selling lists of users, but rather essentially renting out access to them. But these defenses play up a distinction without a difference: Regardless of who is mining the raw data Facebook sits on, the end result, which the company eagerly monetizes, is advertising insights that are very intimately about you — now packaged and augmented by the company’s marquee machine learning initiative. And although Zuckerberg and company are technically, narrowly correct when they claim that Facebook isn’t in the business of selling your data, what they’re really selling is far more valuable: the kind of 21st century insights only possible for a company with essentially unlimited resources. The reality is that Zuckerberg has far more in common with the likes of Equifax and Experian than with any consumer-oriented company. Facebook is essentially a data wholesaler, period.
The document does not detail what information from Facebook’s user dossiers is included or excluded from the prediction engine, but it does mention drawing on location, device information, Wi-Fi network details, video usage, affinities, and details of friendships, including how similar a user is to their friends. All of this data can then be fed into FBLearner Flow, which will use it to essentially run a computer simulation of a facet of a user’s life, with the results sold to a corporate customer. The company describes this practice as “Facebook’s Machine Learning expertise” used for corporate “core business challenges.”
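The document names these signal categories but says nothing about how they are represented. As an assumption-laden sketch only, one way such signals could be assembled is as a flat per-user feature vector that a supervised model (like the churn-style example above) would consume; every field name below is hypothetical.

```python
# Hypothetical encoding of the signal categories the document mentions
# (location, device, Wi-Fi, video usage, affinities, friend similarity)
# into a flat per-user feature vector. All field names are assumptions.
from dataclasses import dataclass

@dataclass
class UserSignals:
    home_region_id: int         # coarse location
    device_price_tier: int      # device information
    wifi_networks_seen: int     # Wi-Fi network details
    video_minutes_daily: float  # video usage
    category_affinity: float    # affinity with the advertiser's category
    friend_similarity: float    # how similar the user is to their friends

def to_feature_vector(u: UserSignals) -> list[float]:
    """Flatten the signal categories into the numeric features a
    prediction model would consume."""
    return [
        float(u.home_region_id),
        float(u.device_price_tier),
        float(u.wifi_networks_seen),
        u.video_minutes_daily,
        u.category_affinity,
        u.friend_similarity,
    ]

example = UserSignals(17, 2, 34, 41.5, 0.73, 0.61)
print(to_feature_vector(example))
```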
Experts consulted by The Intercept said the systems described in the document raise a variety of ethical issues, including how the technology might be used to manipulate users, influence elections, or strong-arm businesses. Facebook has an “ethical obligation” to disclose how it uses artificial intelligence to monetize your data, said Tim Hwang, director of the Harvard-MIT Ethics and Governance of AI Initiative, even if Facebook has a practical incentive to keep the technology under wraps. “Letting people know that predictions can happen can itself influence the results,” Hwang said.