
Facebook: Is your life in their database?

By Jonathan Gabay 04 Apr, 2018

Everything you need to know about the Cambridge Analytica scandal, the use of big data, and what the future holds for Facebook

Each year, targeting technologies get increasingly sophisticated. Machine learning has turned simple algorithms into powerful big-data tools that help companies focus specific messages on appropriate audiences at apt times.

Many rely on extrapolating data from Silicon Valley giants to target users with ads across platforms. Some companies openly market stockpiles of raw data to third parties. Companies like Nielsen, Comscore, Xaxis, Rocketfuel, and an array of anonymous data brokers sit atop a hoard of consumer information. Reportedly, much of this data has been exploited to power political campaigns around the world.

From a consumer perspective, the mushrooming sector, with its ever-blooming spores of bits and bytes of data on all of us, has remained mostly unregulated. For years, the wider public has trusted Facebook, in particular, to act as a responsible custodian of personal details.

However, some within the information-exchange market may have thought otherwise. Brands are able to amalgamate their own data with personal information from data brokers. By running matching algorithms and cross-referencing data against other information, such as voter files, brands from toothpaste producers to cosmetic companies can manipulate ‘anonymous’ user data from Facebook or other providers.

Facebook users can be microtargeted by deeply embedded interests and demographics. Facebook marketing tools like ‘Custom Audiences’ enable brands to upload pre-selected lists of people to the platform, and its ‘Lookalike Audiences’ tool then recognizes others with similar traits. (Facebook recently announced that it would put a stop to another feature that allowed large third-party data brokers, such as Acxiom and Experian, to directly add their own ad targeting to the social network.)
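The matching step behind tools of this kind is, per Facebook's own documentation, done on hashed identifiers rather than raw contact details: the advertiser normalizes and SHA-256-hashes its customer list before upload, and the platform intersects those hashes with its own. A minimal sketch of that idea (the email addresses and variable names are invented for illustration):

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Custom Audience-style matching: identifiers are trimmed,
    lower-cased, and SHA-256 hashed before upload, so raw emails
    are never exchanged directly."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Hypothetical brand CRM list (advertiser side).
crm_emails = ["Jane.Doe@example.com", "sam@example.org", "alex@example.net"]

# Hypothetical platform-side store of hashed user identifiers.
platform_hashes = {
    normalize_and_hash(e) for e in ["jane.doe@example.com", "alex@example.net"]
}

# The platform intersects the uploaded hashes with its own to build the audience.
uploaded = {normalize_and_hash(e) for e in crm_emails}
matched = uploaded & platform_hashes
print(len(matched))  # count of CRM contacts matched to platform accounts
```

Because the hash is applied after normalization, differently capitalized or padded versions of the same address still match, while unmatched contacts reveal nothing beyond an opaque digest.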


A timeline of the abandonment of an ethical approach to psycho-targeting

As a brand psychologist, over the years I have increasingly seen companies take psychographics into serious account when targeting. My own approach has been to deal with human emotions and biases, as opposed to relying purely on algorithm- or artificial-intelligence-driven campaigns.

In early 2014, two years prior to an extraordinary election marred by the bloodied scratches of false stories, cyberattacks, and extraneous disinformation campaigns, thousands of Americans were asked to participate in a quiz.

On Amazon’s Mechanical Turk platform, people were paid between $1 and $2 to answer personality questions, and to reveal not simply their own Facebook details but their friends’ too. A similar request was put out on Qualtrics, a survey website. Before long, people began to suspect the request violated Amazon’s own rules.

Aleksandr Kogan, a Cambridge University psychology lecturer, led the compilation of the data. He was allegedly paid by the political consulting firm Cambridge Analytica to gather as much Facebook data as possible on Americans in key U.S. states.

The firm later claimed that its digital armoury, which included, as its then CEO boasted in September 2016, a psychometric "model to predict the personality of every single adult in the United States of America", meant the company was instrumental in the 45th American President's winning campaign.

Given sufficient data, Cambridge Analytica could effectively gerrymander the mind of the electorate; micro-targeting voters with emotionally tailored, inconspicuous online ads.

Eventually, some 50 million user profiles were harvested. The controversy became the centre of a global firestorm, leaving executives at Facebook, which originally hosted the information, scrambling to douse the flames of its biggest crisis to date.

CEO Mark Zuckerberg explained that Facebook first learned about the Cambridge Analytica project in December 2015 from a Guardian newspaper article. Facebook said it had been assured that the data had been deleted.

Facebook barely mentioned Kogan’s main collaborator, Joseph Chancellor, a former postdoctoral researcher at Cambridge University who began working at Facebook that same month. Subsequently, Facebook said it was reviewing Chancellor’s role.

Concerns about the Cambridge Analytica project—also detailed in 2017 by reporters at Das Magazin and The Intercept—first emerged in 2014 inside the university’s Psychometrics Centre.  As the data harvest was underway, the school turned to an external arbitrator in an effort to resolve a dispute between Kogan and his colleagues. According to the magazine Fast Company, there were concerns about Cambridge Analytica’s interest in licensing the university’s own cache of models and Facebook data.

There were also suspicions that Kogan’s work for Cambridge Analytica, may have improperly used the school’s own academic research and database, which held millions of Facebook profiles.

Kogan denied he had used academic data for his project. The arbitration ended inconclusively after Kogan cited a nondisclosure agreement with Cambridge Analytica.

Michal Kosinski, then deputy director of the Psychometrics Centre, said in November 2017 that he couldn’t be certain that the centre’s data hadn’t been inappropriately used.

A Cambridge University spokesperson said that it had no evidence suggesting that Kogan had used the Centre’s resources, and that it had sought and received assurances to that effect. He emphasized that Cambridge Analytica had no affiliation with the University.

The university’s own database, with over 6 million anonymous Facebook profiles, remains perhaps the largest known public cache of Facebook data for research purposes. For five years, Kosinski and David Stillwell, then a research associate, used a popular early Facebook app created by Stillwell, “My Personality,” to collect Facebook data via personality quizzes (with users’ consent).

In a 2013 paper in the Proceedings of the National Academy of Sciences, they used the database to show how people’s social media data can be used to score and predict human personality traits with surprising accuracy.
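The mechanics behind such predictions can be illustrated with a toy model: each page a user has Liked contributes a learned weight to a trait score. This is only a sketch in the spirit of the 2013 study; the page names and weights below are invented for illustration, not taken from the paper (which fitted its weights by regression on a large labelled sample).

```python
# Invented illustrative weights linking page Likes to extraversion.
extraversion_weights = {
    "Beach Parties": 0.9,
    "Stand-up Comedy": 0.6,
    "Chess Problems": -0.5,
    "Quiet Reading Nooks": -0.8,
}

def trait_score(liked_pages, weights):
    """Sum the weights of the pages a user has Liked.
    Pages with no learned weight contribute nothing."""
    return sum(weights.get(page, 0.0) for page in liked_pages)

user_a = ["Beach Parties", "Stand-up Comedy"]       # positive score: leans extroverted
user_b = ["Chess Problems", "Quiet Reading Nooks"]  # negative score: leans introverted

print(trait_score(user_a, extraversion_weights))
print(trait_score(user_b, extraversion_weights))
```

The unsettling point of the study was that even such a simple additive signal, aggregated over enough Likes, predicted traits with accuracy rivalling judgements by friends and family.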

At Cambridge, Kogan led his own lab devoted to pro-sociality and well-being. He initially discussed psychometrics with Cambridge Analytica in January 2014, and subsequently offered to license the school’s prediction models to Cambridge Analytica’s affiliate, SCL Group. (That is not unusual: universities regularly license their research for commercial purposes to gain funding.) The negotiations collapsed. Kogan then enlisted Chancellor, and the two co-founded a company, Global Science Research, to build their own cache of Facebook data and psychological models.

Reportedly, Facebook said Kogan’s permission to harvest significant masses of data was constrained to academic use. Sharing the data with third parties contravened its rules.

At the Psychometrics Centre’s request, Kogan, Chancellor, and SCL offered written undertakings that none of the university’s intellectual property had been sent to the firm.  The matter was dismissed.

Within a few months, Kogan and Chancellor finished their own data harvest of more than 34 million psychometric scores and data on 50 million Facebook profiles. Cambridge Analytica paid around $800,000. By the summer of 2015, Chancellor boasted on his LinkedIn page that Global Science Research possessed “a massive data pool of 40-plus million individuals across the United States—for each of whom we have generated detailed characteristic and trait profiles.”

In December 2015, as Facebook investigated the data harvest, Chancellor began working at Facebook Research. (According to his company page, his interests included “happiness, emotions, social influences, and positive character traits.”)

In December 2015, after another Guardian report, Amazon banned Kogan. By then, thousands of Americans, along with their friends (millions of U.S. voters who never even knew about the quizzes), had been unknowingly drawn into a propaganda campaign, waged perhaps not by Russians, as suggested by propagators of ‘fake news’, but by Britons and Americans.

Special counsel Robert Mueller, who spearheaded investigations into possible links between the Trump campaign and Russia, reportedly still wants to know where Cambridge Analytica’s data went.

In 2017, his team obtained search warrants to examine Facebook’s records. It also interviewed the 45th President’s son-in-law, Jared Kushner, and Trump campaign staffers, as well as subpoenaing Steve Bannon. (From 2014 to mid-2016, the former Trump adviser was a vice president at Cambridge Analytica.)

Former Trump adviser Lt. General Michael Flynn, who pleaded guilty in the Mueller probe to lying about his conversations with Russian officials, disclosed in August 2017 that he had been employed as an adviser to Cambridge affiliate SCL Group.

Cambridge Analytica repeated its claim that it deleted the Facebook data in 2015, adding that in 2016 it had carried out an internal audit to ensure all data had been deleted.

Michal Kosinski, the former deputy director of the Psychometrics Centre, remained sceptical about Cambridge Analytica’s claims. “CA would say anything to reduce the legal heat they are in,” he wrote in an email in November 2017.

Whilst unnerving, pragmatically speaking, whatever Cambridge Analytica may or may not have done for the 45th President, from my experience exploring and speaking at digital marketing fairs around the world it seems clear that, for political, cultural, sports and commercial brands alike, psychologically based messaging, used ethically, is here to stay.

Information has become more than binary data: it is the DNA of billions of people’s life stories.

New Data – Same Story

Cambridge Analytica campaigns reportedly used freely available data to target voters along psychological lines. In a 2013 commentary in Science, Kosinski warned of the detail revealed in one’s online behaviour, and what might happen if non-academic entities got their hands on this data, too.

“Commercial companies, governmental institutions, or even your Facebook friends could use software to infer attributes such as intelligence, sexual orientation, or political views that an individual may not have intended to share,” Kosinski wrote.

Recent marketing experiments on Facebook by Kosinski and Stillwell have shown that advertisements geared to an individual’s personality—specifically an introverted or extroverted woman—can lead to up to 50% more purchases of beauty products than untailored or badly tailored ads.
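In experiments of this kind, the tailoring step amounts to routing each user to a different piece of creative based on a predicted trait. A minimal sketch, with an invented threshold and invented ad copy (the function name and lines below are hypothetical, not taken from the published experiments):

```python
def pick_ad_copy(extraversion_score: float) -> str:
    """Route a user to personality-tailored creative based on a
    predicted extraversion score; positive scores get the extroverted
    variant, zero or negative scores the introverted one."""
    if extraversion_score > 0:
        return "Dance like nobody's watching (but they totally are)."
    return "Beauty doesn't have to shout."

# Scores would come from a Likes-based prediction model upstream.
print(pick_ad_copy(1.5))    # extroverted variant
print(pick_ad_copy(-1.3))   # introverted variant
```

The design choice worth noting is that the product being advertised never changes; only the emotional framing does, which is precisely what makes personality-congruent persuasion hard for the recipient to detect.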

Marketing to the weak-willed?

At the same time, they noted, “Psychological mass persuasion could manipulate people to behave in ways that are neither in their best interest nor in the best interest of society.”

For instance, certain ads could be targeted at those who are considered “vulnerable” to believing fraudulent news stories on social media, or likely to share them with others.

A research paper seen by reporters at Cambridge Analytica’s offices in 2016 suggested the company was interested in research about people with a low “need for cognition”—that is, people who don’t use cognitive processes to make decisions or who lack the knowledge to do so. In late 2016, researchers found evidence indicating that Trump had found disproportionate support among that group—so-called “low information voters.”

In a 2017 document obtained by The Australian, a Facebook manager told advertisers that the platform could detect teenage users’ emotional states in order to better target ads at users who feel “insecure,” “anxious,” or “worthless.” Facebook has said it does not do this, and that the document was provisional.

Influence is contagious

Facebook’s own experiments in psychological influence date back at least to 2012, when its researchers conducted an “emotional contagion” study on 700,000 users. By putting certain words in people’s feeds, they could influence users’ moods in refined and predictable ways.

However, the report attracted widespread criticism for failing to obtain participant consent. Chief operating officer Sheryl Sandberg apologized.

Following the controversy, it was widely predicted that users would leave Facebook in droves. However, despite it all, most are (for now) choosing to remain on the platform. LIKE it or not, in today’s digital world it offers contemporary human contact; for millions, the alternative is simply untenable.


By Jonathan Gabay

Jonathan Gabay is one of Europe’s premier creative branding authorities. He is the author of 15 books, including university textbooks on copywriting. His latest title is Brand Psychology. Jonathan is a regular keynote speaker for major brands around the world. News organizations including CNN, BBC, Sky and many more trust Jonathan to explain the stories behind the biggest brand news headlines.
