Your guide to Cambridge Analytica and Facebook
Get ready to find out if your Facebook data has been swept up in the Cambridge Analytica scandal.
Facebook users find out if their data was shared with Cambridge Analytica, starting Monday
Starting Monday, April 9, 2018, the 87 million users who might have had their data shared with Cambridge Analytica will get a detailed message on their news feeds. Facebook says most of the affected users (more than 70 million) are in the U.S., though there are over a million each in the Philippines, Indonesia and the U.K.
In addition, all 2.2 billion Facebook users will receive a notice titled “Protecting Your Information” with a link to see what apps they use and what information they have shared with those apps. If they want, they can shut off apps individually or turn off third-party access to their apps completely.
Reeling from its worst privacy crisis in history — allegations that this Trump-affiliated data mining firm may have used ill-gotten user data to try to influence elections — Facebook is in full damage-control mode. CEO Mark Zuckerberg acknowledged that he made a “huge mistake” in failing to take a broad enough view of what Facebook’s responsibility is in the world. He’s set to testify before Congress next week.
Cambridge Analytica whistleblower Christopher Wylie previously estimated that more than 50 million people were compromised by a personality quiz that collected data from users and their friends. In an interview aired Sunday on NBC’s Meet the Press, Wylie said the true number could be even larger than 87 million.
That Facebook app, called “This is Your Digital Life,” was a personality quiz created in 2014 by an academic researcher named Aleksandr Kogan, who paid about 270,000 people to take it. The app vacuumed up not just the data of the people who took it, but also — thanks to Facebook’s loose restrictions — data from their friends, too, including details that they hadn’t intended to share publicly.
Facebook later limited the data apps can access, but it was too late in this case.
Zuckerberg said Facebook came up with the 87 million figure by calculating the maximum number of friends that users could have had while Kogan’s app was collecting data. The company doesn’t have logs going back that far, he said, so it can’t know exactly how many people may have been affected.
Cambridge Analytica said in a statement Wednesday that it had data for only 30 million Facebook users.
Delete Facebook? It's a lot more complicated than that
In recent days, Facebook users have piled onto the hashtag #DeleteFacebook, threatening to desert their Facebook accounts to protest the social media giant's mishandling of their personal information.
Despite all the talk, it's unlikely a significant number of them will walk, even after allegations that the consulting firm Cambridge Analytica obtained and kept the data of tens of millions of users to help get Donald Trump elected, and Facebook didn't stop it.
Instead, some people are toying with a social media sabbatical or detox — or just using Facebook less. The prospect of severing that digital lifeline to family and friends and leaving behind an extensive archive of treasured moments is unthinkable, especially when there are few good alternatives apart from Instagram, which is also owned by Facebook.
Eighty-four percent of users are somewhat or very concerned about how their data may be used by Facebook, according to a new survey this week from investment firm Raymond James. Yet nearly half of people surveyed — 48% — indicated they would not cut back on how much they use the social network.
"While a relatively high percentage indicated they expect to use Facebook less, we believe these user concerns could ease as the news cycle slows," Raymond James said in a research note.
The reason? Facebook has become a utility that people the world over can't do without.
"It is part of the global Internet infrastructure now," says Safiya Noble, a University of Southern California professor and author of Algorithms of Oppression: How Search Engines Reinforce Racism. "Many people no longer use the phone book to find people or Consumer Reports to evaluate products and services. They rely upon their social networks through Facebook."
Many Facebook users say they made their peace with the Big Brother collection of their data a long time ago.
"While I don't love what’s come out of the Cambridge Analytica findings, I can't say that I'm surprised," says Josh Johnson, 28, of Louisville. Johnson, who calls himself a social media influencer, has no plans to delete his Facebook account.
"We're living in a digital age where everything we do, say, and search for is tracked, recorded and logged away somewhere. If people are really beginning to delete their Facebook over these findings, they’d better go ahead and delete all their social accounts and go back to landline phones as well."
Michele Brosius, a 49-year-old blogger from Pillow, Pa., says she's not deleting her Facebook account, either. She knew from the moment she put her data on the Internet that it was up for grabs. Facebook isn't the only one tracking her. Anytime she uses a store rewards card or a credit card, takes a survey or picks up an electronic device, she knows someone's watching her.
"Being connected is part of my life," Brosius says. "I have no plans to go off the grid to regain my privacy."
Still, Facebook has never in its 14-year existence seen such a tidal wave of negative sentiment.
Facebook's product is intimacy. It's the No. 1 place people connect with family and friends, where they share feelings, thoughts, opinions. That deep level of intimacy requires a corresponding level of trust. And people have been questioning whether they should trust Facebook on and off for years. But this is the first incident that has prompted so many of them to really reconsider — or at least recalibrate — their relationship with Facebook. And that's not good news for the Silicon Valley company.
"I don't think we've seen a meaningful number of people act on that," Facebook CEO Mark Zuckerberg told CNN last week of the #DeleteFacebook chatter on Facebook and other social media. "But, you know, it's not good."
Nearly half — 45% — of Facebook users say they will use Facebook somewhat less or significantly less, and 8% say they will stop using it altogether, according to the Raymond James survey.
Elon Musk deleted the Facebook pages for his companies, Tesla and SpaceX. Facebook, he tweeted, "gives me the willies." Playboy said Wednesday it would deactivate its Facebook accounts over concerns about the social network's mishandling of user data.
For years Facebook has been a happy online home for Matthew Frankel, a 46-year-old communications strategist and father of two from Montclair, N.J., where he kept a digital scrapbook of special moments to share with friends. It took Cambridge Analytica to convince Frankel to spend significantly less time on Facebook.
Frankel says he's not the only one rethinking his "blind trust" in Facebook, and he expects other people will cut back on their usage, too.
"At the end of the day, how much information do I need to share? How much information do I want to share? Does it really matter if I am getting a like or not?" he said. "I don't want my day to be dependent on that."
Cambridge Analytica has only compounded growing ambivalence about the role Facebook plays in people's lives. Even before the data leak, people had begun to wonder how healthy it is to spend hours a day or a week hunched over their phone or computer scrolling through friends' updates. Some users were already scaling back how much time they spend on Facebook, weary of the toxic content flowing through it: violent live videos, fabricated news articles, conflicts over the presidential election and Donald Trump and divisive messages from Russian operatives.
Facebook has also come under fire for exploiting vulnerabilities in human psychology to hook people on social media, hijacking their time and attention and undermining their well-being. In recent months, Facebook admitted that passive use of Facebook — aimless scrolling through the news feed — can be bad for mental health.
Restoring the faith of its users is critical for Facebook, which is hunkered down in damage control mode. Zuckerberg plans to testify on Capitol Hill for the first time. And on Wednesday, Facebook said it will roll out a new system for its users to better control their privacy and security settings.
It's too little too late for Rob Getzschman. The 40-year-old video editor from Los Angeles deactivated his account Monday, telling friends that "there wasn't a lot of meaningful human-to-human contact on Facebook, and in many cases, it was uselessly negative interactions with virtual strangers."
"We use Facebook by habit," he said in an interview. "That's the hard thing. The behavior is so ingrained. We have something to share, and Facebook is our default mode to share it."
Getzschman says he has been trying to break up with Facebook for years. He experimented with removing the Facebook app from his phone to dial back how much time he was spending on it. One time, he deleted the Facebook app for a year and had time to read the entire Game of Thrones series on his e-reader.
"The thing that I like is that Facebook is not even an option anymore," he says of quitting the social network. "There's always the mind-set, 'Oh maybe I'll just check in and see what people are doing.' It's nice to have Facebook off my brain."
Leaving Facebook gave Getzschman the same palpable relief he felt when he ditched MySpace years ago.
"There's this digital hoarding part of you. All of those memories. All of those images. All of those interactions. But you don't have to have that record for the rest of your life," he said. "That's just what Facebook wants you to do because it's monetizable content. It's nice to let go of all of it."
How to delete all those Facebook apps you probably have
More specifically, take a close look at apps that let developers glom onto your personal information and then potentially share, or re-sell, that information to others.
To recap: this week's tech headlines have been all about Facebook and the crisis for the social network. The social network revealed that consultancy Cambridge Analytica had used vast amounts of data from Facebook to build profiles of American voters to help Donald Trump's campaign. Cambridge got the data via a researcher who created one of those seemingly harmless personality quiz apps that asked users to answer questions about their digital lives.
Facebook says it has stricter controls than it used to, and will now take a good, hard look at all its app developers to weed out abuses. You can either take that at face value and believe the company, or be highly skeptical. (I'm in the latter camp.)
While you wait for Facebook to (hopefully) change, you can take action. Get rid of as many apps as you can now.
Unfortunately, Facebook makes this really hard to do. More on that in a second.
Many Facebook users don't realize how often they've clicked a button to grant app developers access to their lives, ages and likes in exchange for the luxury of not having to register with e-mail addresses or other personal information. They grant sign-on access via Facebook with one click, and in turn, those app developers can get personal data.
The takeaway: It's smarter to register for access with the app itself, instead of using the Facebook sign-in.
In the meantime, check your Facebook settings to see how many apps have been granted access. I did this week, expecting the list to be around 100 or so. (I'm a tech reporter — I check out apps all the time.) Instead, it nearly topped 400. Four hundred! They ranged from apps I actually use and like, such as Airbnb, Booking.com and Gas Buddy, to apps that have been dead for years, like Path, Phanfare and Revision 3.
To delete the apps, click the checkmark next to the question mark at the top right of the News Feed, select Settings, then Apps on the left-side menu, and then Apps, Web sites and Plug-ins.
From there, take a look at who you've granted access to, and start deleting those apps you don't use.
Here's where you'll see just how difficult Facebook makes it. There is no Select All button, or even a way to select multiple apps at once. You'll have to delete each one, one by one. And each time, Facebook will say: "This will remove the app from your account, your bookmarks and the list of apps you use." And then this kicker: the app "may still have the data you shared with them." For information about removing this data, contact the developer, Facebook advises.
Facebook's messaging is the same for all apps, and you have to endure it with every delete.
"It's easier to get rid of a car that's a lemon and return it to the dealer than it is to get rid of Facebook apps," says Jeremiah Owyang, an analyst with Kaleido Insights. "This is the Facebook business model. We are the inventory, and they're not going to let us just walk away."
But you can. It will just take some time. But it's worth it.
Read Facebook CEO Mark Zuckerberg's planned testimony before Congress
HEARING BEFORE THE UNITED STATES HOUSE OF REPRESENTATIVES COMMITTEE ON ENERGY AND COMMERCE
April 11, 2018
Testimony of Mark Zuckerberg Chairman and Chief Executive Officer, Facebook
I. INTRODUCTION
Chairman Walden, Ranking Member Pallone, and Members of the Committee,
We face a number of important issues around privacy, safety, and democracy, and you will rightfully have some hard questions for me to answer. Before I talk about the steps we’re taking to address them, I want to talk about how we got here.
Facebook is an idealistic and optimistic company. For most of our existence, we focused on all the good that connecting people can bring. As Facebook has grown, people everywhere have gotten a powerful new tool to stay connected to the people they love, make their voices heard, and build communities and businesses. Just recently, we’ve seen the #metoo movement and the March for Our Lives, organized, at least in part, on Facebook. After Hurricane Harvey, people raised more than $20 million for relief. And more than 70 million small businesses now use Facebook to grow and create jobs.
But it’s clear now that we didn’t do enough to prevent these tools from being used for harm as well. That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy. We didn’t take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.
So now we have to go through every part of our relationship with people and make sure we’re taking a broad enough view of our responsibility. It’s not enough to just connect people, we have to make sure those connections are positive. It’s not enough to just give people a voice, we have to make sure people aren’t using it to hurt people or spread misinformation.
It’s not enough to give people control of their information, we have to make sure developers they’ve given it to are protecting it too. Across the board, we have a responsibility to not just build tools, but to make sure those tools are used for good.
It will take some time to work through all of the changes we need to make, but I’m committed to getting it right. That includes improving the way we protect people’s information and safeguard elections around the world. Here are a few key things we’re doing:
II. CAMBRIDGE ANALYTICA
Over the past few weeks, we’ve been working to understand exactly what happened with Cambridge Analytica and taking steps to make sure this doesn’t happen again. We took important actions to prevent this from happening again today four years ago, but we also made mistakes, there’s more to do, and we need to step up and do it.
A. What Happened
In 2007, we launched the Facebook Platform with the vision that more apps should be social. Your calendar should be able to show your friends’ birthdays, your maps should show where your friends live, and your address book should show their pictures. To do this, we enabled people to log into apps and share who their friends were and some information about them.
In 2013, a Cambridge University researcher named Aleksandr Kogan created a personality quiz app. It was installed by around 300,000 people who agreed to share some of their Facebook information as well as some information from their friends whose privacy settings allowed it. Given the way our platform worked at the time this meant Kogan was able to access some information about tens of millions of their friends.
In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the Facebook information apps could access. Most importantly, apps like Kogan’s could no longer ask for information about a person’s friends unless their friends had also authorized the app. We also required developers to get approval from Facebook before they could request any data beyond a user’s public profile, friend list, and email address. These actions would prevent any app like Kogan’s from being able to access as much Facebook data today.
In 2015, we learned from journalists at The Guardian that Kogan had shared data from his app with Cambridge Analytica. It is against our policies for developers to share data without people’s consent, so we immediately banned Kogan’s app from our platform, and demanded that Kogan and other entities he gave the data to, including Cambridge Analytica, formally certify that they had deleted all improperly acquired data — which they ultimately did.
Last month, we learned from The Guardian, The New York Times and Channel 4 that Cambridge Analytica may not have deleted the data as they had certified. We immediately banned them from using any of our services. Cambridge Analytica claims they have already deleted the data and has agreed to a forensic audit by a firm we hired to investigate this. We’re also working with the U.K. Information Commissioner’s Office, which has jurisdiction over Cambridge Analytica, as it completes its investigation into what happened.
B. What We Are Doing
We have a responsibility to make sure what happened with Kogan and Cambridge Analytica doesn’t happen again. Here are some of the steps we’re taking:
Safeguarding our platform. We need to make sure that developers like Kogan who got access to a lot of information in the past can’t get access to as much information going forward.
• We made some big changes to the Facebook platform in 2014 to dramatically restrict the amount of data that developers can access and to proactively review the apps on our platform. This makes it so a developer today can’t do what Kogan did years ago.
• But there’s more we can do here to limit the information developers can access and put more safeguards in place to prevent abuse.
• We’re removing developers’ access to your data if you haven’t used their app in three months.
• We’re reducing the data you give an app when you approve it to only your name, profile photo, and email address. That’s a lot less than apps can get on any other major app platform.
• We’re requiring developers to not only get approval but also to sign a contract that imposes strict requirements in order to ask anyone for access to their posts or other private data.
• We’re restricting more APIs like groups and events. You should be able to sign into apps and share your public information easily, but anything that might also share other people’s information — like other posts in groups you’re in or other people going to events you’re going to — will be much more restricted.
• Two weeks ago, we found out that a feature that lets you look someone up by their phone number and email was abused. This feature is useful in cases where people have the same name, but it was abused to link people’s public Facebook information to a phone number they already had. When we found out about the abuse, we shut this feature down.
Investigating other apps. We’re in the process of investigating every app that had access to a large amount of information before we locked down our platform in 2014. If we detect suspicious activity, we’ll do a full forensic audit. And if we find that someone is improperly using data, we’ll ban them and tell everyone affected.
Building better controls. Finally, we’re making it easier to understand which apps you’ve allowed to access your data. This week we started showing everyone a list of the apps you’ve used and an easy way to revoke their permissions to your data. You can already do this in your privacy settings, but we’re going to put it at the top of News Feed to make sure everyone sees it. And we also told everyone whose Facebook information may have been shared with Cambridge Analytica.
Beyond the steps we had already taken in 2014, I believe these are the next steps we must take to continue to secure our platform.
III. RUSSIAN ELECTION INTERFERENCE
Facebook’s mission is about giving people a voice and bringing people closer together. Those are deeply democratic values and we’re proud of them. I don’t want anyone to use our tools to undermine democracy. That’s not what we stand for. We were too slow to spot and respond to Russian interference, and we’re working hard to get better. Our sophistication in handling these threats is growing and improving quickly. We will continue working with the government to understand the full extent of Russian interference, and we will do our part not only to ensure the integrity of free and fair elections around the world, but also to give everyone a voice and to be a force for good in democracy everywhere.
A. What Happened
Elections have always been especially sensitive times for our security team, and the 2016 U.S. presidential election was no exception.
Our security team has been aware of traditional Russian cyber threats — like hacking and malware — for years. Leading up to Election Day in November 2016, we detected and dealt with several threats with ties to Russia. This included activity by a group called APT28, which the U.S. government has publicly linked to Russian military intelligence services.
But while our primary focus was on traditional threats, we also saw some new behavior in the summer of 2016 when APT28-related accounts, under the banner of DC Leaks, created fake personas that were used to seed stolen information to journalists. We shut these accounts down for violating our policies.
After the election, we continued to investigate and learn more about these new threats. What we found was that bad actors had used coordinated networks of fake accounts to interfere in the election: promoting or attacking specific candidates and causes, creating distrust in political institutions, or simply spreading confusion. Some of these bad actors also used our ads tools.
We also learned about a disinformation campaign run by the Internet Research Agency (IRA) — a Russian agency that has repeatedly acted deceptively and tried to manipulate people in the US, Europe, and Russia. We found about 470 accounts and pages linked to the IRA, which generated around 80,000 Facebook posts over about a two-year period.
Our best estimate is that approximately 126 million people may have been served content from a Facebook Page associated with the IRA at some point during that period. On Instagram, where our data on reach is not as complete, we found about 120,000 pieces of content, and estimate that an additional 20 million people were likely served it.
Over the same period, the IRA also spent approximately $100,000 on more than 3,000 ads on Facebook and Instagram, which were seen by an estimated 11 million people in the United States. We shut down these IRA accounts in August 2017.
B. What We Are Doing
There’s no question that we should have spotted Russian interference earlier, and we’re working hard to make sure it doesn’t happen again. Our actions include:
• Building new technology to prevent abuse. Since 2016, we have improved our techniques to prevent nation states from interfering in foreign elections, and we’ve built more advanced AI tools to remove fake accounts more generally. There have been a number of important elections since then where these new tools have been successfully deployed. For example:
• In France, leading up to the presidential election in 2017, we found and took down 30,000 fake accounts.
• In Germany, before the 2017 elections, we worked directly with the election commission to learn from them about the threats they saw and to share information.
• In the U.S. Senate Alabama special election last year, we deployed new AI tools that proactively detected and removed fake accounts from Macedonia trying to spread misinformation.
• We have disabled thousands of accounts tied to organized, financially motivated fake news spammers. These investigations have been used to improve our automated systems that find fake accounts.
• Last week, we took down more than 270 additional pages and accounts operated by the IRA and used to target people in Russia and Russian speakers in countries like Azerbaijan, Uzbekistan and Ukraine. Some of the pages we removed belong to Russian news organizations that we determined were controlled by the IRA.
• Significantly increasing our investment in security. We now have about 15,000 people working on security and content review. We’ll have more than 20,000 by the end of this year.
• I’ve directed our teams to invest so much in security — on top of the other investments we’re making — that it will significantly impact our profitability going forward. But I want to be clear about what our priority is: protecting our community is more important than maximizing our profits.
• Strengthening our advertising policies. We know some Members of Congress are exploring ways to increase transparency around political or issue advertising, and we’re happy to keep working with Congress on that. But we aren’t waiting for legislation to act.
• From now on, every advertiser who wants to run political or issue ads will need to be authorized. To get authorized, advertisers will need to confirm their identity and location. Any advertiser who doesn’t pass will be prohibited from running political or issue ads. We will also label these ads, and advertisers will have to show you who paid for them. We’re starting this in the U.S. and expanding to the rest of the world in the coming months.
• For even greater political ads transparency, we have also built a tool that lets anyone see all of the ads a page is running. We’re testing this in Canada now and we’ll launch it globally this summer. We’re also creating a searchable archive of past political ads.
• We will also require people who manage large pages to be verified. This will make it much harder for people to run pages using fake accounts, or to grow virally and spread misinformation or divisive content that way. To require verification for all of these pages and advertisers, we will hire thousands more people. We’re committed to getting this done in time for the critical months before the 2018 elections in the U.S. as well as elections in Mexico, Brazil, India, Pakistan and elsewhere in the next year.
• These steps by themselves won’t stop all people trying to game the system. But they will make it a lot harder for anyone to do what the Russians did during the 2016 election and use fake accounts and pages to run ads. Election interference is a problem that’s bigger than any one platform, and that’s why we support the Honest Ads Act. This will help raise the bar for all political advertising online.
• Sharing information. We’ve been working with other technology companies to share information about threats, and we’re also cooperating with the U.S. and foreign governments on election integrity.
At the same time, it’s also important not to lose sight of the more straightforward and larger ways Facebook plays a role in elections.
In 2016, people had billions of interactions and open discussions on Facebook that may never have happened offline. Candidates had direct channels to communicate with tens of millions of citizens. Campaigns spent tens of millions of dollars organizing and advertising online to get their messages out further. And we organized “get out the vote” efforts that helped more than 2 million people register to vote who might not have voted otherwise.
Security — including around elections — isn’t a problem you ever fully solve. Organizations like the IRA are sophisticated adversaries who are constantly evolving, but we’ll keep improving our techniques to stay ahead. And we’ll also keep building tools to help more people make their voices heard in the democratic process.
IV. CONCLUSION
My top priority has always been our social mission of connecting people, building community and bringing the world closer together. Advertisers and developers will never take priority over that as long as I’m running Facebook.
I started Facebook when I was in college. We’ve come a long way since then. We now serve more than 2 billion people around the world, and every day, people use our services to stay connected with the people that matter to them most. I believe deeply in what we’re doing. And when we address these challenges, I know we’ll look back and view helping people connect and giving more people a voice as a positive force in the world.
I realize the issues we’re talking about today aren’t just issues for Facebook and our community — they’re challenges for all of us as Americans. Thank you for having me here today, and I’m ready to take your questions.
Can Facebook be trusted with your personal info? Voter harvesting scheme shows perils for users
That's the question many Americans are asking after revelations that a data-mining firm working for the Trump campaign improperly got its hands on the personal information of tens of millions of Facebook users and created detailed profiles that were used to target unsuspecting voters in the presidential election.
For many, the incident raises troubling new questions about how Facebook manages third-party access to the sensitive information of its 2 billion users, including what safeguards the social media giant has in place to prevent apps from sharing information and whether it has any way of knowing when it's shared more broadly than intended.
Facebook says a researcher, Cambridge University's Aleksandr Kogan, gained access to the data of 270,000 Facebook users in 2013 through a personality quiz app that required Facebook users to grant access to their personal information including friends and "likes."
According to Facebook, he then gave that information to Cambridge Analytica, the firm that claimed it helped President Trump win the 2016 election. The Trump campaign says it did not use data from Cambridge Analytica.
Facebook says the transmission of data to Cambridge Analytica was a violation of its rules and, on Friday, it suspended the firm. On Monday Facebook announced that Cambridge Analytica had agreed to an independent audit by a digital forensics firm. But the auditors were turned away by the UK Information Commissioner's Office, which is pursuing its own investigation.
Before apps gain access to Facebook users' data, the Silicon Valley company says it conducts "a robust review" to determine if apps have a legitimate need for that data. It also noted it has restricted how much personal information outsiders can obtain since the Cambridge Analytica incident.
"We actually reject a significant number of apps through this process. Kogan’s app would not be permitted access to detailed friends’ data today," Facebook said.
It's unclear if that statement will assuage worried users. It certainly hasn't lowered the political heat in the U.S. and Europe, where calls are intensifying for new regulations.
And on Monday the markets reacted. Facebook had one of its worst trading days since 2012. The nearly 7% plunge in Facebook shares led a sell-off in tech stocks.
Cambridge Analytica "is another indication of systemic problems at Facebook," Pivotal Research analyst Brian Wieser wrote in a research note Sunday night.
Those systemic problems have dramatically worsened since the presidential election, with Facebook coming under intense fire on multiple fronts: Russian operatives using Facebook to manipulate voter sentiment during the presidential election, Facebook accounts spreading "fake" news, the potential for its advertising system to be used for racist targeting and its slow response to violent or harmful content on the platform.
Wieser says he does not believe the latest public relations nightmare will dent Facebook's advertising business because advertisers are unlikely to suddenly pull back on spending.
And that's the problem, says Jeffrey Chester, executive director of the Center for Digital Democracy.
"The Cambridge Analytica scandal gives us a glimpse of how Facebook makes billions of dollars off of our personal information without ever dealing with the consequences," said Chester, a longtime privacy critic of Facebook.
Tapping the personal information people freely share on the social network to target advertising is the special sauce that has turned Facebook's business into one of the world's most powerful and lucrative. But it's also Facebook's greatest weakness, making it vulnerable to criticism from privacy activists, regulators and lawmakers.
Over the weekend, U.S. and British lawmakers and privacy activists slammed Facebook, with some demanding that Facebook chief executive Mark Zuckerberg personally appear at legislative hearings.
Siva Vaidhyanathan, professor of media studies at the University of Virginia and author of the upcoming book about Facebook, Antisocial Media, says Facebook's policies betrayed users.
"This was not a data breach. This was Facebook being Facebook," said Vaidhyanathan, who is calling on the FTC to open a new investigation into how people's Facebook information flows to third-party applications.
In 2007, Facebook gave third parties who created an app on the Facebook platform access to the personal information of Facebook users including friend lists, interests and "likes." The move coaxed more people to join Facebook and spend more time there, fueling the rapid rise of the social network from 58 million users to more than 2 billion.
Hundreds of app developers requested, and in some cases required, access to information in Facebook users' profiles, tapping into mountains of information about Facebook users and their friends, including hometowns, education, careers, birthdays, photos, relationship statuses and religious and political leanings.
From the Obama reelection campaign in 2012 to the success of Farmville and Words with Friends, Facebook's lax policy allowed many outsiders to vacuum up data on Facebook users and their friends, Vaidhyanathan said.
In 2015, with criticism growing, Facebook restricted third-party access to information about friends.
Marc Rotenberg, president of the Electronic Privacy Information Center, says the Cambridge Analytica incident is a textbook violation of the settlement Facebook reached with the FTC in 2011, which required that Facebook users give permission before their data was shared beyond the privacy limits they set on Facebook.
"EPIC and many consumer privacy organizations worked hard to get that order in place. It prohibited precisely what happened. But the FTC failed to enforce the order and we now live with the consequences," Rotenberg said.
The consequences can be chilling. Some third parties have been able — with the help of voter registration files, shopping histories, real estate records and other documents — to identify exactly who users are and pitch them with pinpoint accuracy on everything from a new car to a presidential candidate.
Facebook says it discovered in 2015 that Cambridge Analytica had obtained the data of those 270,000 Facebook users and their friends, and received assurances that the firm had deleted the data. On Friday, Facebook acknowledged that Cambridge Analytica may have held onto it, prompting Facebook to start an investigation and suspend the firm.
The average Facebook user has nearly 200 friends, so if Cambridge Analytica accessed the friends of 270,000 users, the data of more than 50 million people could have been obtained.
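That back-of-envelope estimate is easy to reproduce; a minimal sketch, using the rough figures cited above (270,000 quiz-takers, an average of about 200 friends each, with overlap between friend networks ignored):

```python
# Rough reach estimate using the article's approximate figures (not exact counts)
quiz_takers = 270_000   # people who installed the personality-quiz app
avg_friends = 200       # approximate average friend count cited above

potential_reach = quiz_takers * avg_friends
print(f"{potential_reach:,}")  # 54,000,000
```

Real overlap among friend lists would pull the total down, which is consistent with the 50 million-plus figure reported at the time.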
That news, which broke over the weekend, didn't seem to rattle Facebook users, who have shrugged each time the Silicon Valley company has played fast and loose with their privacy.
"That really doesn’t concern me," McKenzie Guymon, a 29-year-old blogger from Utah, said of the Cambridge Analytica leak. "I didn’t let Facebook make up my mind in the election. In fact, I stayed off of Facebook because of the amount of nastiness that was going on with people who were fighting about the election."
Even John McGrath, who says he rushed to delete his Facebook account when he heard the news, didn't end up pulling the trigger. He says a four-year-old photograph of his daughter playing the piano stopped him in his tracks.
"It feels like the Gambinos have hijacked my family album," McGrath, a 48-year-old software developer and product manager from San Francisco who works for Amazon, wrote in a Facebook post.
So instead he removed all the identifying information he could: where he lives, works, where he went to school, family relationships, contact information and more.
"You can't do psychometric profiling with no data," McGrath said, "so I'm taking my data back."
Here's how to protect your personal info on Facebook
Reports that the political research firm Cambridge Analytica obtained the personal data of more than 50 million Facebook users through a personality-quiz app should make anybody on the social network uneasy about the third parties they’ve let into their account.
Facebook allows apps to access much of users' profile information but has tightened some controls. For instance, it now prevents apps from seeing the personal data of people on your friends list, the giant loophole the British researcher used, legitimately at the time, to collect data he then allegedly sold to Cambridge Analytica in violation of Facebook's rules. The social network knew about that misuse three years ago.
An app can ask for access to anything in your profile — and can declare some of that information “required” — and you have to decide if you trust it with that data and if you trust the developer to delete your information should you later remove the app.
Apps used to be a big deal on Facebook, leading to the huge popularity of Farmville and Words with Friends, meaning that even if you haven't downloaded a Facebook app, you may already have given an app developer leeway to access your details.
Once you add an app to your Facebook profile or use your Facebook account to log into another site, it’s easy to forget the exposure you incurred and with whom you did business. Both things — the “what” of an app or site’s access to your data and the “who” of that outside company — matter.
You can check both in a desktop browser or Facebook’s mobile apps.
In a browser, click or tap the downward-facing triangle at the top right, then select “Settings” then “Apps.”
In Android, tap the three-line button at the top right, select “Account settings,” then “Apps.”
In iOS, that button is at bottom right, after which you tap “Settings,” “Account Settings” then “Apps.”
In either mobile app, tap a “Logged in with Facebook” banner.
You’ll now see a list of apps and sites, grouped by who on Facebook can see you use them — everyone, friends only, a custom setting or only you.
At a minimum, they all get what any stranger logged into Facebook would: “your name, profile picture, cover photo, gender, networks, username, and user ID.”
Things after that depend on what the app or site asked and what you allowed.
For example, in my account it showed that the Amex Offers app had access to my friends list, Likes and current city. I turned off friends-list and Likes sharing but let the city setting stand: American Express already knows where I live. The reading app Flipboard, however, only had access to my public profile.
To evict an app, click or tap the “X” next to its name. If you don’t recognize it, bring up its details and then click or tap “App Privacy Policy.”
Login with Facebook?
When you add an app or use Facebook to sign up at a different site, you should see a Facebook dialog explaining your data exposure, some of which you can decline.
For instance, using Facebook to log into the Guardian — the British newspaper that, with The New York Times, first reported the extent of the Cambridge Analytica data heist — yields a notice that the Guardian will see your public profile and email address.
That Facebook dialog lets you hide your email, although the Guardian’s own site will then ask you to provide it anyway.
(USA TODAY also lets readers sign up via Facebook, subject to sharing public profiles; in addition, this site employs Facebook’s comments system.)
What happened before the election?
This entire system assumes that an app developer will tell the truth, which did not happen in the Cambridge Analytica case.
As Facebook explained in a Friday night post announcing that British company’s suspension, the “thisisyourdigitallife” app created by University of Cambridge psychology professor Aleksandr Kogan purported to be an academic research exercise.
But Kogan then gave Cambridge Analytica the data coughed up by some 270,000 people in 2013 — including details about their Facebook friends, an option that Facebook ended in 2014 and that allowed the app’s reach to hit that 50 million figure.
After this clear violation of Facebook’s rules — “Don't sell, license, or purchase any data obtained from us or our services” — Cambridge Analytica used that information to construct an ad-targeting operation used by President Trump’s 2016 campaign, according to The New York Times.
Cambridge Analytica has said it fully complied with Facebook's terms of service, while the Trump campaign denied using voter data from Cambridge Analytica. Kogan hasn't responded to USA TODAY requests for comment.
Facebook’s post added that it now reviews apps seeking “detailed user information.” In a statement forwarded by Facebook's corporate communications, vice president for global operations Justin Osofsky said, “We actually reject a significant number of apps through this process.”
Facebook did not, however, say if it had individually notified users of Kogan’s app about this treachery. The company does not have a policy requiring any such notification of what it does not consider a data breach. After all, you allowed the app in the first place.
In this case, the social network is leaving you on your own.
Facebook CEO Mark Zuckerberg: 'My mistake' for abuse of voter info
"We didn’t take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here," Zuckerberg said in written testimony released ahead of his Wednesday appearance before the House Committee on Energy and Commerce.
Zuckerberg spoke with lawmakers Monday, a day before he is scheduled to testify before the combined Senate Commerce and Judiciary committees Tuesday and the House Commerce committee Wednesday.
The Facebook co-founder comes to Washington amid a rising drumbeat of concern about Facebook's recent operational miscues. The social network was manipulated during the 2016 election season, with Russian operatives attempting to foment discord among voters. Subsequently, it was learned that tens of millions of Facebook users' information may have been misused by a consulting firm that assisted the presidential campaign of Donald Trump.
The Facebook CEO has recently announced several initiatives to better secure user information and prevent misuse of the network. The latest: an election research commission of independent researchers who will assist Facebook in rooting out weaknesses.
Facebook will seek researchers' help in preventing future election manipulation on the social media platform and, in its campaign to improve privacy, has suspended two more data firms as part of an ongoing investigation into the potential misuse of user data by political targeting firm Cambridge Analytica.
The formation of an independent election research commission is part of Facebook's strategy to help prevent a repeat of the 2016 presidential election campaign in which more than 3,000 ads were bought by 470 fake accounts and pages run by the Internet Research Agency, a shadowy organization in St. Petersburg, Russia. Many of the ads sought to spread divisive messages to stir up voters and public outrage.
"The goal is both to get the ideas of leading academics on how to address these issues as well as to hold us accountable for making sure we protect the integrity of these elections on Facebook," said Facebook CEO Mark Zuckerberg in a note posted on his Facebook page Monday. "Looking back, it's clear we were too slow identifying election interference in 2016, and we need to do better in future elections."
This new election research initiative — and the additional suspension of the firms — comes as Zuckerberg prepares to testify to Congressional committees Tuesday and Wednesday. He is also expected to meet with some Congressional leaders today.
Facebook is improving its privacy settings to make them easier for users to understand and is toughening advertising verification methods to prevent tampering with ads on political issues, the company said last week.
The social networking giant continued to take steps to recover from the scandal, in which the data of as many as 87 million people, mostly in the United States, may have been improperly shared with Cambridge Analytica. The social network said it will begin notifying users Monday if their data was shared as part of the incident.
Concerns about possible ties to the British political consulting firm led Facebook to suspend two more firms — San Francisco-based research firm CubeYou and Canadian digital marketer AggregateIQ — as part of its investigation.
Facebook suspended CubeYou Sunday after CNBC notified Facebook that the firm had quizzes on the platform that collected user data and shared it with marketers. CubeYou had been working with the Psychometrics Lab at Cambridge University, also the workplace of psychology professor Aleksandr Kogan, who Facebook said passed along data to Cambridge Analytica.
The data was generated by a personality app Kogan created, which was downloaded 270,000 times but connected to more Facebook users who were friends of those who used the app. Facebook last month said it had suspended Cambridge Analytica because it improperly obtained that data from Kogan.
As part of its investigation, CubeYou faces an audit by Facebook. "In addition, we will work with the UK (Information Commissioner's office) to ask the University of Cambridge about the development of apps in general by its Psychometrics Centre given this case and the misuse by Kogan," said Ime Archibong, Facebook's vice president of product, in a statement.
CubeYou's "You Are What You Like" quizzes had disclaimers similar to the Kogan-created "This Is Your Digital Life" app, about the data being used for academic purposes. Under Facebook's settings at the time, the app could gain access to friends of those who answered the quiz. CubeYou CEO Federico Treu told CNBC the app did not gain access to friends' information, but only to friends who also opted into the app on their own.
Facebook suspended the other company, AggregateIQ, based in Victoria, British Columbia, after whistleblower Christopher Wylie said AggregateIQ had worked with Cambridge Analytica parent company Strategic Communication Laboratories.
“In light of recent reports that AggregateIQ may be affiliated with SCL and may, as a result, have improperly received FB user data, we have added them to the list of entities we have suspended from our platform while we investigate," said Facebook in a statement. "Our internal review continues, and we will cooperate fully with any investigations by regulatory authorities.”
AggregateIQ says on its website that it "has never managed, nor did we ever have access to, any Facebook data or database allegedly obtained improperly by Cambridge Analytica."
Also in advance of Zuckerberg's testimony, U.S. and European consumer groups asked Facebook to establish privacy standards as stringent as those due to be implemented by the European Union next month.
"The targeting of internet users, based on detailed and secret profiling with opaque algorithms, threatens not only consumer privacy but also democratic institutions," says a letter addressed Monday to Zuckerberg from the members of the Transatlantic Consumer Dialogue.
The E.U. will adopt new data rules called the General Data Protection Regulation (GDPR) on May 25. Facebook's chief operating officer Sheryl Sandberg said in January Facebook's planned improvements to user privacy settings would give the company "a very good foundation to meet all the requirements" of the GDPR.
The U.S. and E.U. consumer groups urged Zuckerberg to commit to "global compliance with the GDPR" when testifying before Congress this week. "These are protections that all users should be entitled to no matter where they are located," read the letter, signed by Jeffrey Chester, executive director of the Center for Digital Democracy, and Finn Lützow-Holm Myrstad of the Norwegian Consumer Council.
Cambridge Analytica active in elections, big data projects for years
LONDON — Cambridge Analytica, the British firm accused of improperly harvesting Facebook data to help Donald Trump win the U.S. presidency, and its parent company quietly worked behind the scenes in elections and on big data projects for years with clients that spanned the globe.
In its 25 years of existence, the firm or its parent company, Strategic Communication Laboratories Group (SCL), has worked for political and military clients in Afghanistan, Kenya, Mexico, Nepal, Somalia and for the U.S. State Department.
The State Department paid SCL nearly $500,000 in 2017 for information on how Islamic State extremist propaganda motivates recruits to commit terrorism, according to public data held by its Global Engagement Center unit.
Cambridge Analytica’s investors have included Trump benefactor Robert Mercer and former White House aide Steve Bannon, two of Trump’s biggest boosters.
Now the firm is under scrutiny for its information-gathering tactics.
Facebook accuses the firm of harvesting private information improperly from 50 million of the social media site's users to make predictions about their likely voting habits. The breach was troubling enough to draw scrutiny from the Federal Trade Commission.
Special counsel Robert Mueller wants Cambridge Analytica to turn over its internal documents as part of his investigation into Russian meddling in the U.S. presidential election. Britain's Parliament is investigating the firm's involvement in "Brexit," the nation's 2016 referendum to leave the European Union.
On Monday and Tuesday, British broadcaster Channel 4 aired a video of Cambridge Analytica CEO Alexander Nix saying his firm could entrap politicians in compromising situations, prompting British regulators to seek a warrant to inspect the company's databases and servers. Cambridge Analytica suspended Nix on Tuesday.
The company denies any wrongdoing in the Facebook case. It said in a statement that it does not use, hold or have access to Facebook data or data from other social media platforms.
Mark Zuckerberg, Facebook's CEO, said the social network had "made mistakes" over the scandal. "I started Facebook, and at the end of the day I'm responsible for what happens on our platform," he said in a Facebook post Wednesday.
What is Cambridge Analytica, and who owns it?
"When you try to wind SCL back and figure out where it came from and how it got to where it is now, it's very difficult to do," said Martin Moore, director of the Center for the Study of Media, Communication and Power at King's College London.
SCL, founded in 1993 by former Saatchi & Saatchi advertising executive Nigel Oakes, has operated at least 18 separate companies in Britain and 12 in the United States, according to corporate filings in Britain and the U.S. compiled by Wendy Siegelman and Ann Marlowe, independent researchers in New York, and verified by USA TODAY. Some of these companies are now dormant or dissolved.
SCL has 17 international offices, in such countries as Argentina and the United Arab Emirates. Cambridge Analytica says it has 107 full-time employees. Most of the employees work in its central London headquarters.
Cambridge Analytica first emerged as an offshoot of SCL about five years ago, according to public filings.
The company declined through a spokesman to comment on its corporate structure.
Public records reveal a smattering of shareholders and some of its most prominent clientele.
Cambridge Analytica's election work has been partly funded by Mercer, the American hedge fund billionaire, according to federal election data published by OpenSecrets.org. Mercer is a major contributor to conservative causes, including Breitbart News.
Bannon, the ousted White House chief strategist, was among the shareholders in a U.S.-based affiliate, Cambridge Analytica LLC. Bannon, who divested his stake in April, held shares worth between $1 million and $5 million, according to his White House financial disclosure form. He was also a vice president at the company from 2014 to 2016 and received a monthly consulting fee until 2016.
Bannon could not be reached for comment.
President Trump's former national security adviser Michael Flynn disclosed in August that he held a brief advisory role with Cambridge Analytica. Flynn has pleaded guilty to a felony count of lying to the FBI about conversations he had with Russia's ambassador.
Who are Cambridge Analytica's clients, and what work has it done?
The Trump campaign paid $6 million in 2016 to Cambridge Analytica to conduct large-scale polling and place hyper-targeted messages and online ads in front of U.S. voters in key swing states, Federal Election Commission records show.
U.S. Sen. Ted Cruz's 2016 presidential campaign paid Cambridge Analytica nearly $6 million for work before the company went to work for Trump, the records show.
In most instances, SCL and Cambridge Analytica won't divulge who pays for their work. But case studies published on SCL's website show the scope and complexity of its projects.
- In Afghanistan, SCL undertook "audience profiling" to understand local attitudes toward religious affairs and governance.
- In Kenya, Cambridge Analytica has taken credit for helping to re-elect President Uhuru Kenyatta, a vote marred by concerns over its legitimacy.
- In Mexico, it used commercial and political data to gauge the effect of United States policy on the drug trade and violent crime in 13 cities under the influence of drug cartels.
- In Nepal, it gathered data on anti-social behavior among Maoist insurgents.
- In Somalia, it assessed the viability of establishing a telephone network across the war-torn country.
"I call (SCL) information mercenaries. They go wherever there is something to be done," said Paul-Olivier Dehaye, a Switzerland-based mathematician who studies and campaigns for issues related to data protection and privacy through his organization PersonalData.IO.
Dehaye has filed freedom of information requests in Britain to determine how Cambridge Analytica collects its data and whether that data extraction violates British law.
"There's no doubt they have been trying to manipulate data for their purposes," said David Miller, a professor of sociology at Bath University in England and an authority on propaganda.
"It's not clear how good they are at doing it," Miller said. "The difficulty in understanding this organization is that a lot of what it says about itself turns out not to be quite right."
What role did Cambridge Analytica play in the 2016 U.S. presidential elections?
The Facebook case is lifting the veil on some of Cambridge Analytica's actions and tactics.
Facebook, which suspended Cambridge Analytica from its platform on Friday, said the data analytics firm improperly received, used and then held onto user data passed to it by Cambridge University psychology professor Aleksandr Kogan.
Kogan, Facebook claims, devised and then exploited a personality predictor application to get unsuspecting Facebook users to give away information about themselves and their Facebook friends.
The app Kogan created, "thisisyourdigitallife," was downloaded only 270,000 times but was able to connect to other data points, such as "likes" and hometowns, for millions of other Facebook users through the app users' network of friends and family.
He then passed the information to Cambridge Analytica, which worked for Trump's campaign.
Facebook has since changed its privacy policies to prevent such connections.
Kogan told USA TODAY late last year that he had "very limited knowledge" of Cambridge Analytica's activities. He did not return a request for comment on the Facebook allegations or to further explain his side of the story.
In an emailed reply to USA TODAY questions, Cambridge Analytica said it provided research, data science and digital and TV marketing services for Trump's campaign. The company said it used polling, voter files, early and absentee voting returns released by each state and both its own and Trump campaign data to compile more than 5,000 data points on 230 million American voters. The company said it did not use "marketing psychology" to target Trump supporters.
Gary Coby, director of advertising at the Republican National Committee and a top media strategist on the Trump election campaign, said Cambridge Analytica was on the ground with the Trump social media team, sharing San Antonio office space with other Trump media execs, in the months leading up to the 2016 presidential election.
"They had a lot of great people, really smart people," he said.
But Coby downplayed the firm’s role in Trump’s victory.
"They helped on some surveying, some of the data sides, had multiple ad buying teams under me," Coby said.
The campaign, as a whole, focused more on direct-response marketing — or getting voters to donate or volunteer immediately after seeing an ad — rather than “psychographic targeting,” he said. "Psychographic targeting" uses data points to create personality profiles of American voters that can then be reached with more targeted political advertising, a tactic Cambridge Analytica claims as a specialty.
Around the time Cambridge Analytica began work for Trump, Nix, the company's CEO, reached out, via a public-speaking agency, to Julian Assange, the founder of the whistle-blowing website WikiLeaks, seeking information about 33,000 missing emails belonging to candidate Trump's opponent, Hillary Clinton.
Cambridge Analytica said through a spokesperson who declined to be named that Nix made the approach to WikiLeaks to find what information it had and where it came from. The official said the company was not asked by the Trump campaign or anyone else, including Russia, to make the inquiry.
Assange in a tweet confirmed the approach from Nix but said WikiLeaks never acted on the request.
In October and November of 2016, WikiLeaks published thousands of emails allegedly stolen from Clinton's campaign boss John Podesta by Russian-linked hackers.
Last December, the House Intelligence Committee, which had been investigating Russia's interference in the U.S. presidential election, called on Nix to testify behind closed doors.
On Tuesday, in a video broadcast on Britain's Channel 4, Nix is shown belittling representatives of the committee, saying Republican members asked him just three questions. "After five minutes — done," he said.
Cambridge Analytica board suspends CEO, pending probe into misuse of Facebook user data
The board of Cambridge Analytica, the political data firm that allegedly exploited information from 50 million Facebook users to help Donald Trump's campaign, suspended CEO Alexander Nix on Tuesday for his comments secretly recorded by a British broadcaster.
In a series of broadcasts by Britain's Channel 4, Nix was filmed making controversial statements about his firm's work on elections, including a claim that Cambridge Analytica played a major role in Trump's presidential victory, handling "all the data, all the analytics, all the targeting."
The suspended CEO also suggested to a potential client that his company could portray politicians in compromising situations. Nix's suspension was effective immediately.
"Mr. Nix’s recent comments secretly recorded by Channel 4 and other allegations do not represent the values or operations of the firm and his suspension reflects the seriousness with which we view this violation," the board of directors said in a statement.
The broadcasts come amid questions about how Cambridge Analytica gained access to people's online profiles, as well as criticism against Facebook for its alleged inaction to protect users’ privacy.
A British parliamentary committee on Tuesday summoned Facebook CEO Mark Zuckerberg to answer questions on whether personal data was improperly used to influence elections.
Facebook sidestepped questions on whether Zuckerberg would appear, saying the tech giant is focused on its own reviews.
In the latest broadcast, which aired Tuesday evening in Britain, Nix downplayed his private testimony before the House Intelligence Committee in December when he was asked about his firm's work for Trump's presidential campaign.
Nix claimed that Republican lawmakers asked him just three questions. "After five minutes — done," he said about his testimony behind closed doors. "They’re politicians, they’re not technical. They don’t understand how it works," he added.
Nix, in the video shown Tuesday, also claimed credit for Cambridge Analytica's work with data and research that he said allowed Trump to win the election with a narrow margin of "40,000 votes" in three swing states, giving Trump an electoral college victory, despite losing the popular vote.
Since Trump’s election, Cambridge Analytica has flip-flopped over its role in the campaign. The company initially claimed credit for helping elect Trump, but Nix also sought to portray the firm's role as minimal amid investigations into alleged Russian meddling in the 2016 election.
Channel 4's broadcast came a day after the network showed surreptitiously obtained video of Nix saying his company could entrap politicians. Monday night's broadcast in Britain showed one exchange in which Nix said the company could "send some girls around to the candidate’s house." Ukrainian girls, he said, "are very beautiful. I find that works very well."
Cambridge Analytica, in a statement Monday, denied that it or its affiliates "use entrapment, bribes or so-called honey-traps" against politicians. It also denied any wrongdoing over the Facebook data it acquired from Cambridge University psychology professor Aleksandr Kogan.
The television station said it filmed a series of meetings at London hotels from November to January, during which a Channel 4 reporter posed as an operative for a wealthy client hoping to get candidates elected in Sri Lanka.
In addition to Nix, other senior Cambridge Analytica executives, including Mark Turnbull, the firm's managing director, attended the meetings.
In videos of the meetings broadcast by Channel 4, Cambridge Analytica executives boasted that it and its parent, Strategic Communications Laboratories, had worked in more than 200 elections across the world, including Nigeria, Kenya, the Czech Republic, India and Argentina.
In another exchange, Turnbull described how Cambridge Analytica can discreetly publicize damaging material about a political opponent on social media and the Internet.
"We just put information into the bloodstream of the Internet, and then, and then watch it grow, give it a little push every now and again ... like a remote control. It has to happen without anyone thinking, ‘that’s propaganda,’ because the moment you think ‘that’s propaganda,’ the next question is, 'Who’s put that out?' "
Tuesday night's broadcast also featured an on-the-record interview with Hillary Clinton conducted in October, when she was promoting her book. She said it would be "very disturbing" if Cambridge Analytica were found to be involved in Russia’s alleged attempt to influence the election.
"So you’ve got CA, you’ve got the Republican National Committee — which of course has always done data collection and analysis — and you’ve got the Russians. And the real question is, how did the Russians know how to target their messages so precisely to undecided voters in Wisconsin or Michigan or Pennsylvania — that is really the nub of the question," Clinton said in the interview.
Cambridge Analytica denied any involvement with Russia and said any such allegation is false.