BY TRACEY LIEN
LOS ANGELES TIMES/TNS
SAN FRANCISCO – Many Facebook users rely on the social network to figure out what’s going on in the world. But what if the world Facebook shows them is wildly distorted?
That’s the question raised after a former employee of a data mining firm that worked for Donald Trump’s presidential campaign alleged the company used Facebook to bombard specific individuals with misinformation in hopes of swaying their political views.
The accusations raised alarm across the Atlantic on Monday, sparking an investigation into the firm, Cambridge Analytica, by the United Kingdom’s Information Commissioner’s Office.
In the U.S., Sen. Ron Wyden, D-Ore., sent a letter asking Facebook Chief Executive Mark Zuckerberg whether the social media giant was aware of other data violations on its platform, and why it failed to take action sooner.
Stock price dives
The controversy drove Facebook’s stock price down nearly 7 percent on Monday, suggesting that investors are feeling skittish about the regulatory liabilities of a company that has spent the last year dogged by questions of fake news and Russian propaganda.
The scope of Facebook’s problems ballooned after Christopher Wylie, a political strategist who used to work for Cambridge Analytica, alleged on NBC’s “Today” show Monday that the firm believed that if it could “capture every channel of information around a person and then inject content around them, you can change their perception of what’s actually happening.”
Wylie said that by mining Facebook user data, the firm could tailor the ads and articles individual users would see – a practice he calls “informational dominance.”
In a video secretly recorded by Britain’s Channel 4, Mark Turnbull, managing director of Cambridge Analytica’s political division, suggests users targeted by the firm wouldn’t know their online experience was being manipulated.
“We just put information into the bloodstream of the internet … and then watch it grow, give it a little push every now and again … like a remote control,” he said. “It has to happen without anyone thinking, ‘that’s propaganda,’ because the moment you think ‘that’s propaganda,’ the next question is, ‘who’s put that out?’”
Turnbull, according to Channel 4, also bragged about the firm’s practice of entrapping politicians in compromising situations with bribes and sex workers, then recording them.
In a statement sent to the Los Angeles Times, Cambridge Analytica accused Channel 4 of entrapment and rejected the allegations made in the report. In a separate statement, also issued Monday, the firm said it did not carry out “personality targeted advertising” for President Donald Trump’s campaign.
The company obtained the Facebook data from millions of accounts through a Cambridge University psychology professor who had permission to gather information on users of the social media platform, but violated Facebook guidelines by passing it on to a third party for commercial purposes.
Although Cambridge Analytica said in a news release over the weekend that it deleted this data as soon as it learned it broke Facebook’s rules, Wylie alleged that the firm continued to use the information.
What’s worrisome about Cambridge’s alleged practice, say social media and psychology experts, is that it works on even the most rational of people.
“Attribution theory teaches us that if you hear the same thing from multiple sources, then you start believing that it might be true even if you originally questioned it,” said Karen North, a social media professor at the University of Southern California who has also studied psychology.
Full of disinformation?
In Cambridge Analytica’s case, Wylie on Monday accused the firm of going beyond simply serving targeted ads to people on Facebook. He alleged that the firm “works on creating a web of disinformation” so that unwitting consumers are confronted with the same lies and false stories both on and off Facebook.
The ability to target ads at individuals isn’t unique to Facebook. But what makes the social media giant’s role profound is the breadth and depth of information it collects and the sheer number of people who use the service.
Last year 67 percent of Americans told Pew Research that they get at least some of their news on social media. In 2016, 64 percent of those who got their news from social media got it from only one source – most commonly Facebook.
Since the 2012 presidential campaign, Facebook has been the “No. 1 destination” for digital media strategists looking to influence politics, according to Laura Olin, a digital strategist who ran social media strategy for former President Barack Obama’s re-election campaign.
Prime media outlet
Prior to that election, campaigns spread their focus among Facebook, Twitter and traditional media outlets, she said. But in 2012, three things became clear:
•People were spending more of their online time on Facebook than anywhere else.
•It reached a broader demographic than its competitors.
•Ads could be targeted more effectively on Facebook than on other platforms.
The Obama campaign that year was able to aim advertisements and messages at voters based on gender, location and existing political beliefs.
In 2013, 47 percent of Americans used Facebook as a source for news, according to research from Pew. By 2016, that number had grown to 63 percent. Facebook itself draws nearly 2.2 billion people to its website and app every month, and its subsidiaries continue to grow: Instagram commands nearly a billion monthly active users, WhatsApp more than a billion, and Messenger more than 900 million.
The social network has pledged to more than double its current team of 10,000 content moderators by the end of 2018 to keep false and misleading information in check. But with hundreds of millions of photos, videos and articles uploaded to Facebook every day, safety and security experts question whether this will be enough.