
On Comments

August 23rd was the last day for comments at NPR's website. Given limited resources for moderation, the relatively small cohort of readers who comment, and the growing scale of interactions across social media platforms, it's understandable that NPR made this decision.
 
As I told NPR's former ombudsman, as a life-long consumer of NPR news and programs, I'm saddened that one of the world's great public media organizations is backing away from investing in a healthy forum where the public can discuss the news on a platform owned by the public, not by private technology companies.
 
On the one hand, this decision frees NPR staff from moderation duties, lifting the weight of battling trolls, adjudicating disputes and enduring abuse, and allowing community managers to focus on moderating discourse on social media. On the other, if NPR and other public media houses back away from hosting these conversations and shift them to social media platforms, the data and relationships those communities represent move with them.
 
Getting online comments wrong is easy. Building a healthy online community is hard, but outlets like Techdirt and forums like MetaFilter show that it's not only possible but sustainable. Good comments are valuable in their own right. At their best, they improve upon the journalism they respond to, but they require convening a community and investing in editorial moderation and tools. At their worst, online comment sections are some of the most toxic spaces online, not only turning off readers but damaging public understanding of science and technology.
 
Ideally, comment sections provide valuable forums for people to share their thoughts on the issues and decisions that affect them, but the technologies and strategies that create architectures of participation need to continue to improve. Given political polarization, public spaces that reward meaningful participation and foster civic dialogue instead of shouting matches are critical to our politics.
 
Communities across the country rely upon public media to report on local government and inform us about what’s being done in our name. Social media and smartphones offer new opportunities for journalists and editors to report with communities, not just on them.

Like Margaret Sullivan, I think news organizations should fix comments, not abandon them. That's why the Coral Project matters. I hope that as that effort matures, it will demonstrate the value of comments done well. If it does, NPR should adopt it and once again host conversations with the people formerly known as its audience on public media webpages.

On Moderation

The First Amendment prohibits Congress from making laws abridging the freedom of speech and generally has been interpreted to apply to state and local governments. In my experience, it does not provide untrammeled rights for an individual to say anything, at any time, in any context. The First Amendment also does not apply to a Facebook community created and maintained by a private individual.

There are many public spaces and contexts in America where judges, speakers, teachers and other community leaders moderating discussions can and must make decisions about speech.

To put it another way, moderation is not the antithesis of open government.

Many parliamentary procedures are based upon Robert's Rules of Order, which require whoever is leading the meeting to serve, in effect, as a moderator, wielding a mighty big gavel.

Courtrooms are moderated by a judge, who maintains order in the court. Town halls are conducted by mayors, councils and/or media, all of whom serve as moderators. Classrooms and libraries are moderated by teachers and librarians, who lay out rules for participation and use that enable all students and members of a community to have the opportunity to learn and participate.

In each context, there are rules and consequences. People in a courthouse may be held in contempt after sufficient outbursts. If someone keeps making off-topic comments at the microphone at a town hall, for instance, a town councilor running the meeting might ask him or her to answer the question that was posed or to cede the space. Students who insult other students or the teacher, interrupt class, answer questions with off-topic subjects or threaten others with violence are asked to leave — or even suspended or expelled.

In online forums, I think a team of moderators who rotate and adjudicate decisions based on a transparent set of rules would be appropriate. I generally think of the blogs and communities I maintain as classrooms and moderate accordingly.

As the creator and moderator of the Google Plus Open Government & Civic Technology community, I've been faced with decisions every week since I clicked it into life, including removing posts or, unfortunately, sometimes banning users. Spam has been an ongoing challenge. I've shared my own standards for community moderation online, which inform how I handle comments on social media and blogs in general.

It's critical for online forum creators and moderators to be clear about the expectations for members of a community, from topical focus to frequency of postings to commercial content to behavior towards others, and to act transparently to address the concerns of those communities. It's not easy, as we've seen on Wikipedia, Reddit and blog comments, but if we're going to have any hope of fostering civic dialogue online, it's critical that we figure it out together, building better tools and models that neither amplify the loudest voices in the chat room nor chill voices speaking truth to power that need to be heard.


Reflections on online cruelty and kindness

This morning, I read an interesting reflection on dealing with online cruelty in the New York Times by Stephanie Rosenbloom:

In the virtual world, anonymity and invisibility help us feel uninhibited. Some people are inspired to behave with greater kindness; others unleash their dark side. Trolls, who some researchers think could be mentally unbalanced, say the kinds of things that do not warrant deep introspection; their singular goal is to elicit pain. But then there are those people whose comments, while nasty, present an opportunity to learn something about ourselves.

Easier said than done. Social scientists say we tend to fixate on the negative. However, there are ways to game psychological realities. Doing so requires understanding that you are ultimately in charge. “Nobody makes you feel anything,” said Professor Suler, adding that you are responsible for how you interpret and react to negative comments. The key is managing what psychologists refer to as involuntary attention.

When I checked her reference, I found that Rosenbloom miscited the research and failed to link to it: the 2011 report on teens, kindness and cruelty on social network sites by the Pew Research Center's Internet & American Life Project found that a vast majority of young people (88%) had “seen someone be mean or cruel to another person on a social network site,” not 69%. That percentage refers to a happier statistic: “69% of social media-using teens think that peers are mostly kind to each other on social network sites.”

On that count, I’m glad the author chose to end with a reflection on kindness and the psychology involved with focusing on positive comments and compliments, as opposed to the negative ones. Anyone who wants to see how a positive feedback loop works should look at how Justin Levy’s friends & networks are supporting him, or how dozens and dozens of friends, family and strangers supported me when I lost my beloved greyhound this week.

I'm not sure about the New York Times editor's summary — that the “Web encourages bad behavior” through anonymity and a lack of consequences.

I think that what we see online mirrors who we are as humans, and that what we see on social media (which is really what's under discussion here, not the World Wide Web as a whole) is
1) a function of what the platforms allow, with respect to their architecture of participation, and
2) a product of the community norms established there.

Compare newspapers' online comments, YouTube comments and Twitter to what you find in the comments at Ars Technica, Boing Boing or even, dare I say it, in the blogs or public profiles I moderate. As Anil Dash has observed, the people who create and maintain online forums and platforms bear responsibility for what happens there.

It's a surprisingly delicate balance to allow robust debate and disagreement on politics, current events, technology choices, or even sports (hello, tribalism) while guiding conversations away from cruelty, anger, or even hatred, whether we lead a classroom or an online discussion. The comments we allow to stand, offline or online, largely determine the culture of the class, town hall or thread they're made within.

While people always bear responsibility for their own cruel actions or words, it's incumbent upon those of us who host conversations or share our thoughts publicly online to respond with empathy, kindness and understanding where we can, and with polite but resolute moderation when others do not respond to those tactics or attack our friends and communities.

[IMAGE SOURCE: Amanda Lenhart, Pew Research Center]


Classrooms and community: my moderation standards for Google+, Facebook and blog comments

Over the past few months, I've seen a lot of spam and pornography links on Google Plus, on Facebook and on the blogs I maintain. Fortunately, blogs, Google and Facebook all give us the ability to moderate comments and, if we wish, to block people who do not respect the opinions or character of others.

Last night, I saw some lack of clarity about my approach to online community, so here's how I think about it, with a nod to the example set by Arizona State University journalism professor Dan Gillmor.

I can and do block spammers and people posting links to pornography.

I will leave comments enabled on my blogs, precisely because I value conversations, despite the issues that persist online. I have been moderating discussion in online forums and on blogs for many years, including those of my publishers. My full thoughts on the value of blog comments — and the social norms that I expect commenters to observe — are on this blog. To date, there are 196 comments on the post.

Vilely insulting me won't help your case. Insulting others will ruin it. I was a teacher in my twenties and would not tolerate disrespectful behavior in my classroom, either toward me or toward other students. If you couldn't be civil and continued to insult others, much less the person hosting the class, you were asked to leave and see the principal.

If the behavior persists, you will lose the privilege of participating in “class.” Eventually, you get expelled. On Google+ or blogs, that takes the form of being defriended, banned or blocked from my public updates. I prefer not to block users but I will do it. I respect your right to speak freely on your own blog, Twitter, Facebook or Google+ account, whether that involves cursing or ignorance.

I strongly believe in the First Amendment. Governments should not censor citizens. That said, I do not feel obligated to host such speech on my own blog, particularly if it is directed at other commenters. I believe that building and maintaining healthy communities, online or offline, requires that the people hosting them enforce standards for participation that encourage civil dialogue.

I hope that makes sense to friends, readers and colleagues. If not, you are welcome to let me know in the comments.
