Facebook certainly likes to tinker. You might even say it’s in the company’s DNA. And with over 1 billion registered users across the globe, Facebook is in a unique position to run all sorts of tests at a scale most other companies could only dream of. Unfortunately, most of the stories we’ve seen about such testing suggest that Facebook is more prone to abusing its power than using it for productive purposes.
For instance, you might recall a Facebook research experiment from 2014 where the company intentionally altered the news feeds of upwards of 700,000 users in order to see if it affected their moods. The stated purpose of the study, which was published in Proceedings of the National Academy of Sciences, was to measure “emotional contagion through social networks.”
When word of Facebook’s study emerged, many people were indignant, quick to call the experiment unethical. While that characterization might be a bit extreme, news of another Facebook experiment may fit the description more aptly.
In a recent report from The Information, we learn that Facebook at one time purposely caused its Android app to crash repeatedly, all in an effort to see how many crashes it would take to prevent users from returning. In other words, Facebook set out to determine just how addictive its service truly was, even in the face of a frustrating and ‘broken’ user experience.
Facebook has tested the loyalty and patience of Android users by secretly introducing artificial errors that would automatically crash the app for hours at a time, says one person familiar with the one-time experiment. The purpose of the test, which happened several years ago, was to see at what threshold a person would ditch the Facebook app altogether. The company wasn’t able to reach that threshold. “People never stopped coming back,” this person says.
Oddly enough, the uproar over Facebook ‘manipulating’ the moods of its users was far greater than the reaction this story received when it first broke. Personally, I think the outrage stemming from Facebook’s news feed experimentation was a tad overblown. But purposely crashing an app? That seems far more problematic and worrisome.
Notably, the report claims that Facebook only conducted this experiment once and that it occurred several years ago. The larger takeaway, though, is that Facebook now has a whole lot of power at its disposal. It has a gargantuan user base and the site today truly has no competitors to speak of. That being the case, Facebook should be more cognizant of the power it wields and, to be blunt, should stop treating its users like pawns in ill-conceived experiments.