One of my favorite insights on the subject of online community is from Tom Chick:
This is a companion discussion topic for the original entry at http://blog.codinghorror.com/what-if-we-could-weaponize-empathy/
Engaged moderators are an important factor in maintaining empathy. The community as a whole can slip, because let’s face it, we’re all human. It’s easy for us to make a mistake and fall into one of the traps listed above, and without guard rails (moderators), the community will naturally lose its sense of empathy. Moderators are only a part of the recipe, but they really are the secret ingredient.
Love this article and every single thing you wrote, even though I’m guilty of at least a few of those… while reading, I kept wondering: is he talking about me?
But that’s not the point of this comment. I feel that forums need some mechanism outside of flagging for users to point out these things. It’s impossible for moderators and administrators to read every single topic and post and see whether people are acting out. Sure, the really bad instances will be reported through the flag interface, or turn into a rant that everybody can see, but for the most part people think of “report” and “flag” as a way to report really bad behavior, and aren’t willing to take that step.
But what about the smaller things, like griefing, that can often be done in what appears to be a polite way to outsiders? What about people just generally being a little unpleasant? Or those Linux fanatics that feel the need to post on every single Windows topic no matter what?
I’m not sure what the answer is, but I feel like if we want empathy we also need to build some type of karma system, even if it’s completely behind the scenes and just alerts the moderators. Something that effectively is an “I don’t really like this person” button.
Because moderators can’t chaperone every single conversation in every single topic.
“At the risk of sounding aspirational…”
Jesus Christ, what have we come to? That’s a sickness in America, that positive attitudes are viewed with suspicion. Truly a disease.
Putting that aside…
Okay, I come from a long background in Eastern practice - 45 years. My teacher, Swami Kriyananda, defined maturity as “the ability to relate to realities other than our own.”
That’s a standard I can live with, because the Internet proves it every day. I bet if we conducted a study we would find that 99.5 percent of Internet whiners and attention-seekers are under 19 (or, given the brain-addling influences of modern culture, let’s include a reasonable buffer: say, 25).
Past that age, people who haven’t at least begun to learn that their own happiness is at stake when they cultivate contractive attitudes must be considered spiritually ill, i.e., lacking awareness of the purpose and possibilities of life.
I say, why NOT exclude them? Do we eat iron bars? Do we consume viruses simply because we can? (Hey, there are no values. Right.) Why invite these dangerously infectious people into our presence? My teacher rightly gave no energy to negative people, chronic complainers, whiners, and prideful “contrarians.” And, of course, he was accused of discouraging free discussion.
Really, I see the company I keep as a life-or-death matter. Not only can trolls harm our physical and emotional health, they can keep us from realizing the true purpose of life which is to expand the heart and decrease our own and others’ suffering, and cultivate happiness and fulfillment for all.
Facebook is apparently looking at this:
each week eight million Facebook members use tools that allow users to report a harmful post or photo. (The tools can be used by clicking on the little upside-down arrow in the upper right corner of a post or the options button at the bottom of photos.)
So it’s buried under “options”, or an expansion?
Creating empathy on Facebook has not been easy. Researchers have learned that a few letters can have a profound impact. For example, in the first iteration of these tools, Facebook gave users a short list of vague emotions — like “embarrassing” — to communicate why they wanted a post removed. At the time, 50 percent of users seeking to delete a post would use the tool, but when Facebook added the word “it’s” to create a complete sentence (“It’s embarrassing”), the interaction shot up to 78 percent.
This, on the other hand, is fascinating. Changing from “embarrassing” to “It’s embarrassing” increased interactions by 28 points? Wow. Looks like I’ll be doing that for Discourse…
+1 on this idea. I only wish we could shout this from the rooftops of every Internet discussion site. Way to go, guy!
The most common problem is that people either won’t initiate empathy, or make it so conditional that they will turn it off completely if you identify with the “wrong group”. And empathy is weaponized, often by the very behaviors that are problematic: words designed to appeal to emotion rather than reason. When you’ve lost the argument, or don’t have one on the rational side, it is all too easy to start pushing emotional buttons.
Human shields are weaponized empathy.
Hear, hear…more like this, please. For empathy, I suggest Nonviolent Communication as a resource; for beginners, the book, “Nonviolent Communication: A Language of Compassion” (1999), covers the basics and then some.
Here is one place where empathy has been weaponized. Not a discussion forum, but the game Journey. In this long video, John Nesky (feel designer) talks about all the things they had to do to prevent players from being mean to each other.
For example, even after trimming a lot of features out of the game, players still found ways to be mean to each other. A player would lead another to the top of a mountain only to push him off when they reached it. As a result, they removed the characters’ arms.
Limiting the number of online players to only two also kept players from banding together against others and forced the pair to stick together.
The talk also has other gems that I think could immensely improve the way we discuss online.
Either Sara needs to let that topic go, or she needs to find a dedicated place (e.g. GMO discussion areas) where others want to discuss it as much as she does, and take it there.
Sounds like a simple suggestion, right? But isn’t the tendency to do just that what creates the “echo chamber” and polarization? Sometimes that persistence is exactly what is necessary to get an important point through. Sure, it’s pretty hard to distinguish the difference most of the time.
I’m a notorious axe-grinder. In a professional setting, an example might be pointing out how the decision to accept an unreasonable timeline is compromising the requirements, and the project is slowly coming to be about doing the exact things it was supposed to provide an alternative to. In a personal setting, its main expression is probably the pain I feel listening to a discussion about the latest football game. Meanwhile I’m thinking, “C’mon, someone, please, can we talk about how to influence the US government to address climate change?”
In a personal setting, I concede ground (at least some of the time), or just go find a different conversation, but in a professional or activism situation it’s self defeating to do so. The status quo is the ultimate axe, and it’s always grinding.
Honest atheists are usually welcome in religious discussions (it is called evangelism). There is a line between being contrary and clearly stating the reasons (on both sides). The “home team” can also become hateful. And I’m not sure how you learn or strengthen your opinions without exercising by looking at contrary views.
I rarely see GMOs, but often see Lizzie Borden, Social Justice Warrior. (Or Carrie Nation, if you prefer, since you axed). This is one of the roots of #Gamergate - gamers want to discuss game play, not misogyny, portrayals, in-game demographics, etc. especially with someone who doesn’t care about games.
But what if everything Fred says is true? What if the game in question is really that bad or worse? There is negativity, but there is simple truth. It isn’t “negativity” to warn people not to walk where the pet poop area is. Sometimes the truth stinks or hurts. If you follow the “always be positive” rule, no review would allow less than 3 out of 5 stars, even for a hotel that burned down, or a restaurant where you got food poisoning.
I was a moderator for some gaming message boards for a few years, and encountered all of these characters regularly.
Regulating this behaviour is easy with small forums like this. You know the community. With massive forums like that, however, picking up the context of someone’s behaviour is an intensive process. I knew who the “bad apples” were on the boards I frequented, but when dealing with hundreds of active boards about varying subjects, you have three choices:
Regardless, you then must contend with the user response. Some people will be happy with your actions, and others will think you’re a detriment to the community. I would lean more towards options 1 or 3 if I didn’t know the context. Some people thought I was out for blood, or was “corrupt”. If they’re especially persistent, they might even persuade harmless users to believe their accusations, and then you’re the one breeding apathy.
I’ve always heard that SomethingAwful ran a tight ship. Maybe their model is the one to follow.
Okay, this right here. This is a great example of the kind of thing that I consider prejudicial hate speech from a poster that apparently is incapable of empathy (most surely not actually incapable of empathy, but it is not apparent in the context of this post), and it truly upsets me.
It doesn’t clearly and unambiguously violate any guidelines that can easily be pointed to, so I don’t expect it to be removed, but I don’t want to see it, because I believe people voicing opinions like this are toxic.
This is why Discourse needs an ignore list.
Interesting theme for posts last few days.
Online “community” is an illusion. If you want real community, stop spending so much time online and actually talk to your neighbors, you know, your ACTUAL community. You’ll never build a community online when people can hide behind a persona and say whatever they want.
SomethingAwful also required a paid subscription to even go on there. I imagine people would behave themselves if they had something tangible, like money, to lose should someone in power ever find their behavior unacceptable.
The blog title reminded me of
which truly would be an awesome weapon to wield (even more so over TCP/IP)…
Without reading any comments or the full article for now, I just wanted to add this quick analogy: sports. In my case, soccer.
Being a bad amateur who plays at every opportunity I can get (not only in my homeland, btw), I empathize with this.
When you join a non-competitive game, there are many kinds of people you will find and, in general, the game is much much better for everyone if people really do play cooperatively, as you said, acting as a group of friends who care about each other.
There’s a space, you join in, some folks always go there around the same time, others want to get in, score a goal, and get out… However the community behaves within the space, it is indeed only fun when people start caring.
This is also a reason why politics is so bad in my country, but that’s a whole nother topic!
I feel a key factor in empathy building is transparency about identity. One reason why Jane Elliott’s Blue Eyes/Brown Eyes exercise works so well is because in the time it takes for a person to notice eye color, the brain has already started the neurological process of building empathy. This is why the inability to make eye contact is undoubtedly the most frustrating part of video chatting. Eye contact and the other tacit forms of trust-building are impossible to replicate in written discourse, even if a picture of the user is next to their remarks. Likewise, shaking hands used to be a way to see if the other person had a knife up their sleeve, and waving at a distance was a signal that you were unarmed — basically ways to identify yourself as “someone you don’t need to fear.” What are the internet equivalents to these early forms of trust in a discussion?