
Irony: Facebook employees hate it when their private memos are shared with third parties

Updated Mar 30th, 2018 8:47AM EDT
Image: Stuart Ramson/AP/REX/Shutterstock


Facebook is still in damage control mode after the world found out that a shady company like Cambridge Analytica was able to extract the personal data of 50 million users and use it to influence the presidential election. It all happened with Facebook’s assistance: the company made that data available to developers without considering that it might be abused. Or perhaps the company did consider it, but it simply didn’t care.

An internal memo that leaked on Thursday reveals that Facebook’s top execs have been thinking about the “ugly” side effects of sustained growth, even as growth remained a top priority for the company. The leak also comes with an ironic twist: Facebook employees are now finding out what it’s like to have data they thought was private exposed to the world.

BuzzFeed got hold of the internal memo, written by Andrew Bosworth and shared with employees in June 2016, just a day after a shooting death in Chicago was broadcast on Facebook Live. Bosworth, a top Zuckerberg lieutenant, is known internally for his provocative remarks and his bluntness. Here’s an excerpt from his memo:

So we connect more people

That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools.

And still we connect people.

The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good. It is perhaps the only area where the metrics do tell the true story as far as we are concerned.

That isn’t something we are doing for ourselves. Or for our stock price (ha!). It is literally just what we do. We connect people. Period.

That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it.

Bosworth is practically acknowledging that Facebook can have adverse effects on our lives — and the memo was shared in the summer before the presidential election.

https://twitter.com/boztank/status/979478961582325760

What’s weird about the memo is that Bosworth deleted it after it hit the press. He also took to Twitter to distance himself from these remarks. He wrote the note, but doesn’t stand by it. Okay.

Zuckerberg also distanced himself from these opinions.

We’ve never believed the ends justify the means. We recognize that connecting people isn’t enough by itself. We also need to work to bring people closer together. We changed our whole mission and company focus to reflect this last year.

Then again, you can easily argue that the whole Cambridge Analytica scandal happened because of Facebook’s desire for growth at all costs. The company shared all that user data with developers so that more developers would build Facebook apps, and more users would stay entertained inside Facebook.

The desire for growth, the drive to keep users hooked on the news feed, and the need to make ad-based cash are also what made Russian meddling in the US election possible, including the ability to create and share fake news and buy ads on the network without any real interference from Facebook.

Facebook employees, meanwhile, aren’t happy that their internal chats are being shared with the world. So let me get this straight: Facebook employees do not appreciate having content that was shared privately with a specific group of people broadcast to a third-party audience? You don’t say!

Following BuzzFeed’s scoop, The Verge published reactions from employees.

Some took Bosworth’s side, others criticized the deletion of the post, and still others were annoyed that leakers were operating inside the company.

“Deleting things usually looks bad in retrospect,” said one person. “Please don’t feed the fire by giving these individuals more fuel (eg, Facebook execs deleting internal communications). If we are no longer open and transparent, and instead lock down and delete, then our culture is also destroyed — but by our own hand.”

“How fucking terrible that some irresponsible jerk decided he or she had some god complex that jeopardizes our inner culture and something that makes Facebook great?” a different person said, calling out leakers.

Others pointed out that leakers can’t really be identified before they share internal content.

“I don’t think we’ve seen a huge internally leaked data breach, but I’ve always thought our ‘open but punitive’ stance was particularly vulnerable to suicide bombers,” the person said. “We would be foolish to think that we could adequately screen against them in a hiring process at our scale. … We have our representative share of sick people, drug addicts, wife beaters, and suicide bombers. Some of this cannot be mitigated by training. To me, this makes it just a matter of time.”

Some people also suggested that Facebook might be infiltrated by foreign agents whose purpose is to destabilize the company.

“Imagine that some percentage of leakers are spies for governments,” a person said. “A call to morals or problems of performance would be irrelevant in this case, because dissolution is the intent of those actors. If that’s our threat — and maybe it is, given the current political situation? — then is it even possible to build a system that defaults to open, but that is able to resist these bad actors (or do we need to redesign the system)?”

Then some Facebook employees realized that it’s only natural for people to leak previously shared content, given that Facebook’s products are built around sharing things with others. Here’s that post:

It’s interesting to note that this discussion is about leaks pushing us to be more cognizant of our sharing decisions. The result is that we are incentivized toward stricter audience management and awareness of how our past internal posts may look when re-surfaced today. We blame a few ill-intentioned employees for this change.

The non-employee Facebook user base is also experiencing a similar shift: the move toward ephemeral and direct sharing results from realizing that social media posts that were shared broadly and are searchable forever can become a huge liability today.

A key difference between the outside discussion and the internal discussion is that the outside blames the Facebook product for nudging people to make those broad sharing decisions years ago, whereas internally the focus is entirely on employees.

Others got the irony of the situation too, with another employee making a similar plea for empathy: “Can we channel our outrage over the mishandling of our information into an empathy for our users’ situation? Can the deletion of a post help us better understand #deletefacebook? How we encourage ourselves to remain open while acknowledging a world that doesn’t always respect the audience and intention of that information might just be the key to it all. Maybe we should be dogfooding that?”

Maybe you should.

Chris Smith Senior Writer

Chris Smith has been covering consumer electronics ever since the iPhone revolutionized the industry in 2008. When he’s not writing about the most recent tech news for BGR, he brings his entertainment expertise to Marvel’s Cinematic Universe and other blockbuster franchises.

Outside of work, you’ll catch him streaming almost every new movie and TV show release as soon as it's available.