Online social interaction has been a fraught topic since its earliest days, and although the technology has progressed, the fear remains. So, too, does the promise.
“Stranger danger” has new relevance in the digital age, where users take it as a given that the person on the other end of the keyboard may not be who they say they are.
In addition to the age-old stranger danger trope, online forums are full of cyberbullying, hate speech, and abuse.
But social media offers benefits that often outweigh these risks, and many vulnerable and isolated groups are finding community online.
How can users and providers tip the balance, facilitating more positive and pro-social engagement and discouraging abuse and bullying?
Experts have been weighing in on this issue for years, and many of the suggestions made a few years ago have been tried, without much success.
Ideas listed in a 2013 BBC article, including "add a report abuse button to everything," "get the police to do it," and "force people to use their own names," have all been tried, and all have backfired.
For example, Facebook’s real name policy has been widely panned for its negative impact on women, abuse victims, and minorities.
Its option to report violations of community standards has been co-opted by trolls, is enforced inconsistently, and is frequently used to silence the very people it was meant to protect.
Twitter’s introduction of a report button met significant skepticism, for fear of the same issues cropping up.
And as for having the police handle it, criminalization is often ineffective against abusive speech online.
Dr. Martin Graff, head of research in psychology at the University of South Wales and an expert in online interaction, told Vice, “most of the harassment that people are worried about is not actually illegal, not something you can be jailed for. When words are hurtful, the best solution is to sit people together, and bring some context.”
Adding context is critical.
When Nextdoor found that their neighbourhood social app had a racial profiling problem, they tackled it through design changes that encouraged users to think more carefully about what they were posting.
They saw a 75% reduction in racial profiling posts.
This offers hope for designers of other platforms as well.
Quartz reminds designers that, “There’s no such thing as an unbiased platform. Every digital interaction encourages certain behaviors, and every media channel has limitations. Actively shaping those parameters to encourage civil, factual discussion isn’t only justifiable—it’s nearly an ethical necessity.”
Virtual reality may provide a new frontier for the “active shaping” that Quartz is so optimistic about, by re-humanizing the voice across the digital divide.
There is evidence that people are more civil when speaking face to face, and VR may bring that dynamic back into the digital realm.
But without careful design choices, VR will run into the same rampant misogyny and abuse that other platforms experience; there have already been reports of sexual harassment in VR spaces.
In 2015 UNESCO released a 60-page report on the topic of countering online hate speech.
Their conclusion? The issue is complex.
The digital environment is not fully understood, and human behaviour online is adapting at a rapid pace.
Awareness, responsiveness, and intentional design are going to have to be part of the solution on the side of providers.
And for users? Psychology may offer some answers.
Cognitive behavioural training, cognitive bias modification, inhibition training, and empathy training are all strategies that can help prevent abuse and shift the internal motivations that drive abusers to tweet, post, and hashtag with such violence.
Changing the way users think may help cut down on the negative aspects of online interaction, and it may also help more people find meaningful connections with each other.
That potential for connection is one of the true gifts of the digital era.
About Tiffany Sostar
Tiffany is a published academic, an editor with the Editors Association of Canada, an independent scholar and researcher, and a self-care and narrative coach. She is particularly interested in the intersection of technology and identity: how our tools shape our selves and change our stories, and how the nature of work is changing as we incorporate more technology into our daily lives.