Now You See It, Now You Don’t! A Study of Content Modification Behavior in Facebook

@SNOW/WWW, 2017, by Fuxiang Chen and Ee-Peng Lim

In social media, content posts and comments can be edited or removed. We therefore define two types of content modification: (a) content censorship (CC) and (b) content edit (CE). Content censorship refers to the complete deletion of a post or comment, whereas content edit refers to changes made to the content of a post or comment.

Dataset

We selected 57 public Facebook pages from three regions (Hong Kong, Singapore, and the United States), spanning News, Community, Event, and Group pages. All these pages contain content mainly in English.


Figure 1: Studying and Tracking Periods

We downloaded the posts and comments created from 1 January 2016 to 23 August 2016 (Study Period) and tracked them for changes occurring from 8 August 2016 to 23 August 2016 (Tracking Period). The tracking period covers many versions of the Facebook pages. By comparing every two consecutive versions of posts and comments, we identify two types of changes to the posts and comments in these pages. A post or comment is edited if its content differs between two consecutive versions. A post or comment is deleted if it is present in the earlier version but absent from the next.
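The detection step can be illustrated with a small sketch. The code below is our own illustration (not the authors' implementation) and assumes each tracked version of a page is available as a dictionary mapping a post/comment ID to its text; it flags an item as edited when its text differs between two consecutive versions and as deleted when it disappears from the later version.

```python
# Illustrative sketch (not the authors' code): detecting edits and deletions
# by diffing two consecutive versions of a page's posts/comments.
# Each version is assumed to be a dict mapping item id -> text content.

def diff_versions(earlier, later):
    """Return (edited_ids, deleted_ids) between two consecutive versions."""
    edited = [item_id for item_id, text in earlier.items()
              if item_id in later and later[item_id] != text]
    deleted = [item_id for item_id in earlier if item_id not in later]
    return edited, deleted

# Example usage with hypothetical snapshots
v1 = {"post_1": "Breaking: storm warning", "post_2": "Happy Monday!"}
v2 = {"post_1": "Update: storm warning lifted"}          # post_2 disappeared
edited, deleted = diff_versions(v1, v2)
print(edited)   # ['post_1']  -> content edit (CE)
print(deleted)  # ['post_2']  -> content censorship (CC)
```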

Empirical Data Analysis

We first investigate the likelihood of posts and comments being edited or removed.


Figure 2: Likelihood of Posts/Comments getting Edited/Removed

We observe that posts are more likely to be edited than removed (see Figure 2). We believe that Facebook users generally spend more time crafting and writing posts, and are therefore reluctant to remove them. In contrast, comments are more likely to be removed than edited. We believe that Facebook users generally spend much less time writing comments, so the loss of effort when a comment is removed is minimal.
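As a rough sketch of how the likelihoods in Figure 2 could be computed (under our own assumed data layout, with a boolean per item recording whether it was ever edited or removed during the tracking period), one could take the fraction of items with each flag set:

```python
# Hypothetical sketch: likelihood of edit/removal as the fraction of tracked
# items modified at least once during the tracking period.

def modification_likelihood(items):
    """items: list of dicts like {"edited": bool, "removed": bool}."""
    n = len(items)
    return {
        "edited": sum(i["edited"] for i in items) / n,
        "removed": sum(i["removed"] for i in items) / n,
    }

# Toy example data (illustrative only)
posts = [{"edited": True, "removed": False}, {"edited": False, "removed": False}]
comments = [{"edited": False, "removed": True}, {"edited": False, "removed": False}]
print("posts   ", modification_likelihood(posts))
print("comments", modification_likelihood(comments))
```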

We also investigate the recency effect of content modification in Facebook in two aspects. We first analyse the content censorship and edit actions performed on posts and comments created during the tracking period. For each censorship and edit action, we determine the number of days between the content creation date and the action date. We then bin each action by this number of days and count the censorship and edit actions in each bin.
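A minimal sketch of this binning step is shown below; the action records and field names are illustrative assumptions rather than the paper's actual data schema.

```python
# Hypothetical sketch of the day-binning step: for each censorship/edit action,
# compute the gap (in days) between content creation and the action, then
# count actions per day bin.
from collections import Counter
from datetime import date

actions = [
    {"type": "edit",   "created": date(2016, 8, 10), "acted": date(2016, 8, 10)},
    {"type": "edit",   "created": date(2016, 8, 10), "acted": date(2016, 8, 12)},
    {"type": "censor", "created": date(2016, 8, 9),  "acted": date(2016, 8, 15)},
]

bins = {"edit": Counter(), "censor": Counter()}
for a in actions:
    day_gap = (a["acted"] - a["created"]).days
    bins[a["type"]][day_gap] += 1

print(bins["edit"])    # Counter({0: 1, 2: 1})
print(bins["censor"])  # Counter({6: 1})
```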


Figure 3: Recency Effect in CE (Tracking Period)


Figure 4: Recency Effect in CC (Tracking Period)

Figures 3 and 4 show the edit and censorship action counts for the different day bins respectively. Figure 3 shows that the number of post and comment edits peaks on the first day and then decreases exponentially until the 7th day. Figure 4 shows the same exponentially decreasing trend in the number of post and comment censorships. This suggests that users are more likely to modify their more recent posts/comments.
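As a quick illustration (our own, using made-up counts rather than the paper's data) of what an exponentially decreasing trend implies, the per-day action counts should be roughly linear in log space, so a simple least-squares fit of log(count) against day recovers an approximate per-day decay factor:

```python
# Illustration only: checking for exponential decay via a log-linear fit.
import math

day_counts = [(0, 950), (1, 430), (2, 210), (3, 100), (4, 48), (5, 22), (6, 11)]  # made-up numbers

xs = [d for d, _ in day_counts]
ys = [math.log(c) for _, c in day_counts]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
den = sum((x - mean_x) ** 2 for x in xs)
slope = num / den
print(f"estimated decay factor per day: {math.exp(slope):.2f}x")  # < 1 means decreasing
```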

CC & CE Annotation

After detecting the edited and removed posts and comments, we then seek to understand the reasons behind these modifications using a manual annotation task.

Manual judgements on these post/comment removals and edits show that the majority of content censorship is related to negative reports on events and personal grouses, while content edits are mainly performed to improve content quality and correctness.

For more details, please refer to our paper entitled: “Now You See It, Now You Don’t! A Study of Content Modification Behavior in Facebook” by Fuxiang Chen and Ee-Peng Lim, 4th Workshop on Social News on the Web @ WWW ’17 (SNOW 2017), Perth, Australia, April 2017.

Acknowledgement

This research is supported by the National Research Foundation, Prime Minister’s Office, Singapore under its International Research Centres in Singapore Funding Initiative.