With Facebook still reeling from the Cambridge Analytica scandal that impacted upwards of 87 million users, Mark Zuckerberg is headed to D.C. this week to testify before Congress about a range of privacy issues. With the social networking giant still in the midst of a public relations nightmare, Zuckerberg will be under a lot of pressure to answer some hard-hitting questions from lawmakers. In fact, Facebook went so far as to hire a team of consultants to ensure that Zuckerberg — who has historically been somewhat shy and awkward in interviews — appears more charming than robotic.
Ahead of Zuckerberg’s appearances, Congress today released Mark Zuckerberg’s prepared testimony. As you might expect, Zuckerberg’s remarks touch on a range of issues and include an admission that Facebook didn’t do enough to prevent third-party actors like Cambridge Analytica from abusing tools that were offered to developers with good intentions. What’s more, Zuckerberg takes full responsibility for not realizing how Facebook’s platform might be used for sinister purposes.
“But it’s clear now that we didn’t do enough to prevent these tools from being used for harm as well,” Zuckerberg said. “That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy. We didn’t take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I’m sorry. I started Facebook, I run it, and I’m responsible for what happens here.”
As to what Facebook is doing to make the platform safer going forward, Zuckerberg explains:
We need to make sure that developers like Kogan who got access to a lot of information in the past can’t get access to as much information going forward.
We made some big changes to the Facebook platform in 2014 to dramatically restrict the amount of data that developers can access and to proactively review the apps on our platform. This makes it so a developer today can’t do what Kogan did years ago.
But there’s more we can do here to limit the information developers can access and put more safeguards in place to prevent abuse.
- We’re removing developers’ access to your data if you haven’t used their app in three months.
- We’re reducing the data you give an app when you approve it to only your name, profile photo, and email address. That’s a lot less than apps can get on any other major app platform.
- We’re requiring developers to not only get approval but also to sign a contract that imposes strict requirements in order to ask anyone for access to their posts or other private data.
- We’re restricting more APIs like groups and events. You should be able to sign into apps and share your public information easily, but anything that might also share other people’s information — like other posts in groups you’re in or other people going to events you’re going to — will be much more restricted.
- Two weeks ago, we found out that a feature that lets you look someone up by their phone number and email was abused. This feature is useful in cases where people have the same name, but it was abused to link people’s public Facebook information to a phone number they already had. When we found out about the abuse, we shut this feature down.
Investigating other apps. We’re in the process of investigating every app that had access to a large amount of information before we locked down our platform in 2014. If we detect suspicious activity, we’ll do a full forensic audit. And if we find that someone is improperly using data, we’ll ban them and tell everyone affected.
Building better controls. Finally, we’re making it easier to understand which apps you’ve allowed to access your data. This week we started showing everyone a list of the apps you’ve used and an easy way to revoke their permissions to your data. You can already do this in your privacy settings, but we’re going to put it at the top of News Feed to make sure everyone sees it. And we also told everyone whose Facebook information may have been shared with Cambridge Analytica.
Zuckerberg’s full testimony can be read here.