Every social network user has, at some point, typed something they knew they’d regret sharing and erased it before clicking “post.” However, Slate’s Jennifer Golbeck reports that these discarded thoughts don’t completely disappear: Facebook runs code that keeps track of every time you delete a would-be message and sends metadata about that message back to its own databases.
Just what is Facebook doing with information on these non-posts, you ask? Golbeck cites a new research paper written by Facebook data scientist Adam Kramer and Carnegie Mellon Ph.D. student Sauvik Das that examines the reasons for Facebook users’ “self-censorship” and takes a look at millions of users’ “aborted status updates, posts on other people’s timelines, and comments on others’ posts.”
Facebook isn’t keeping a database of all these non-posts’ contents, mind you; it simply records the data surrounding each self-censored post, such as when it was almost posted and whether it was destined for a friend’s page or the user’s own. Kramer and Das say that Facebook wants to understand all the reasons people decide against posting, because the company “loses value from the lack of content generation” every time a would-be post gets the axe.
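To make the distinction between content and metadata concrete, here is a minimal, purely hypothetical sketch (in TypeScript) of how client-side code could report an abandoned draft while transmitting only metadata, never the text itself. The endpoint, field names, and function below are illustrative assumptions, not Facebook’s actual implementation.

```typescript
// Hypothetical sketch only: none of these names or endpoints come from
// Facebook's real code. It illustrates logging metadata about an abandoned
// draft without ever transmitting the draft's text.

interface SelfCensorshipEvent {
  timestamp: string; // when the draft was abandoned
  target: "own_timeline" | "friend_timeline" | "comment"; // where it would have been posted
  hadText: boolean;  // whether anything was typed at all
  // Deliberately absent: the draft's content.
}

function reportAbandonedDraft(
  target: SelfCensorshipEvent["target"],
  draft: string
): void {
  if (draft.trim().length === 0) return; // nothing typed, nothing to report

  const event: SelfCensorshipEvent = {
    timestamp: new Date().toISOString(),
    target,
    hadText: true,
  };

  // Send only the metadata to a (hypothetical) analytics endpoint.
  fetch("https://example.com/analytics/self-censorship", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(event),
  });
}

// Example: a user typed a comment, then cleared the box before posting.
reportAbandonedDraft("comment", "I probably shouldn't say this…");
```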
“Consider, for example, the college student who wants to promote a social event for a special interest group, but does not for fear of spamming his other friends — some of whom may, in fact, appreciate his efforts,” the authors write in explaining their interest in self-censoring behavior.
Golbeck concludes that there is something perverse about Facebook’s desire to have users post absolutely everything that comes into their heads: the company is essentially encouraging people to lower their standards for what they share with their friends.
“So Facebook considers your thoughtful discretion about what to post as bad, because it withholds value from Facebook and from other users,” she writes. “Facebook monitors those unposted thoughts to better understand them, in order to build a system that minimizes this deliberate behavior.”