
Let’s pick our own social media editors

A design and policy proposal for improving the quality of social media

    Marc Smith and Ben Shneiderman

The great promise of social media is being eclipsed by the dismal reality of abuse and attack that many users experience.  

Athletes, celebrities, and prominent voices, along with everyday people, have retreated from social media spaces that are polluted with abuse and threats. Faced with these dangerous experiences, many users have turned to the platforms for assistance, only to be disappointed. While platforms can and do moderate, they prefer not to intervene in every dispute and conflict among billions of culturally diverse users. When they do choose to intervene, the diversity of views around the world means that no decision will satisfy everyone.

The recent English Football League and Premier League four-day boycott of social media raised awareness of abusive behavior. While the hashtags #StopOnlineAbuse and #EnoughIsEnough trended on Twitter, these calls to reduce toxic content and abuse were clearly ineffective. Even if the platforms did a better job of moderating social media content, our societies may not want to grant that much power to a central authority, no matter how benevolent.


Could a marketplace for editors deliver a safer social media space? Today, paying moderators is costly for social media companies, often more expensive than their highly paid developers, their high-tech hardware, or the vast amounts of electricity they consume. An editors’ marketplace for moderation could be more affordable and tailored to each user.

Why not put the power of moderation in the hands of all users? We already pick whom to follow on social media; why not also pick who can be our editors? Every user could become an editor as soon as some other user picks them as one. People may grant editor status to their sports teams, religious or community organizations, political parties, or prominent influencers and thought leaders. These people and organizations can hide and label any content or user they select. People who have selected them as editors will not see the hidden messages. Editors may make good choices or bad ones, but individuals and organizations will become known for the quality of their editing choices.

A model for user-selected editors in social media

The tweets or posts that editors hide will be flagged with a thin line:

This tweet was hidden by your editor, [USERNAME], because of [REASON].
Click to display the message.

If users occasionally check the messages hidden by their editors, they can judge whether they still agree with their editors’ choices. If needed, a single click can revoke the editor status they previously granted.
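One way to picture these mechanics is as a small client-side filter. The sketch below, written in Python, is illustrative only; the names (Editor, UserSettings, render_feed) and data structures are our assumptions for exposition, not an existing platform API.

# Illustrative sketch of the user-selected editor model (all names are hypothetical).
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: str
    author: str
    text: str

@dataclass
class Editor:
    username: str
    hidden_posts: dict = field(default_factory=dict)    # post_id -> reason
    hidden_authors: dict = field(default_factory=dict)  # author -> reason

    def hide_post(self, post_id: str, reason: str) -> None:
        self.hidden_posts[post_id] = reason

    def hide_author(self, author: str, reason: str) -> None:
        self.hidden_authors[author] = reason

@dataclass
class UserSettings:
    editors: list = field(default_factory=list)  # editors this user has granted status to

    def revoke(self, username: str) -> None:
        # A single action revokes previously granted editor status.
        self.editors = [e for e in self.editors if e.username != username]

def render_feed(posts: list, settings: UserSettings) -> list:
    """Return the feed, replacing hidden posts with a one-line notice."""
    rendered = []
    for post in posts:
        notice = None
        for editor in settings.editors:
            reason = (editor.hidden_posts.get(post.post_id)
                      or editor.hidden_authors.get(post.author))
            if reason:
                notice = (f"This tweet was hidden by your editor, {editor.username}, "
                          f"because of {reason}. Click to display the message.")
                break
        rendered.append(notice if notice else post.text)
    return rendered

In this sketch, filtering happens at display time, so revoking an editor immediately restores the posts that editor had hidden, and the original content is never deleted from the platform.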

Each user’s portfolio of editors can scour social media, so that users are relieved to find a stream of content that is more likely to be free of toxic material, low-quality information, and disinformation. As editors reduce the audience for confrontational, abusive content, the incentive to create that content declines as well. The Premier League and others who face these problems can ask their followers to select them, or other good-faith actors, as editors, enabling a self-service solution to harassment and abuse.

Social media makes a promise that all may speak. Social media platforms accelerate and amplify information, both high and low quality. The power to diminish the distribution of information is a critical power to grant. But that is exactly why it should be explicitly granted and easily revoked: something even the best moderation from the platforms or other single authorities cannot provide. While all may speak, not all deserve to be heard. Users have a right to choose whom they hear, aided by the editors they choose.

One size does not fit all, and a single distant authority should not govern what social media users can say to one another.

Referees on the playing field make for fair play; editors would do the same on social media. Protecting social media spaces and their users from abuse is a valid goal, and achieving it would make these environments welcoming and attractive to more users.

Social media creates marketplaces of ideas, but these social markets require the scrutiny of accountants and auditors who can work to reduce fraud. Social media platforms are young technologies with many negative aspects that will take time to discover and mitigate. Just as cars needed seat belts and air bags, social media needs safety equipment. Better design standards and requirements could lead to platforms built with the systems and features needed for safe, high-quality knowledge sharing. Car manufacturers must comply with many safety requirements; vast social media platforms could do the same, given their comparable impact on health and safety. Safer social media platforms would similarly encourage greater participation and healthier discussions.

Marc Smith is a sociologist who directs the Social Media Research Foundation and the Connected Action consulting group.

Ben Shneiderman is a computer scientist at the University of Maryland and a member of the National Academy of Engineering.
