Twitter Launches New Tool That Lets Women Report Harassment

Twitter has been taking an active approach against harassment and inappropriate content this year, and now, the social media site is using yet another weapon in its fight against online trolls and bullies.

The company has partnered with a nonprofit known as Women, Action & Media, or WAM, on a tool that will allow people to report abuse and harassment on Twitter and get the issue resolved within 24 hours. WAM will monitor the incoming reports and bring them to Twitter, as well as track Twitter's responses to help the company improve its policies around harassment.

The move, which was first reported by The Wall Street Journal on Friday, comes just months after Twitter CEO Dick Costolo was bombarded with questions about harassment on the platform during a question-and-answer session on an earnings call this summer. Since then, the company has taken what some view as an aggressive approach to chasing bad actors off the site. In August, Costolo announced that Twitter would suspend any account spreading images from the video of photojournalist James Foley being beheaded by ISIS. And after Robin Williams' daughter Zelda received a flood of disturbing images of him on Twitter in the wake of her father's death, the company also committed to deleting images of deceased users at their families' request.

More recently, however, gender-based abuse has become one of Twitter's biggest problems, as women who speak out against sexism in gaming have been subjected to death and rape threats online. The partnership with WAM will focus on gender-related harassment, but will also watch for abuse that includes racist or violent threats.

Managing all of that may be a challenge, however, for WAM's two-person team. As WIRED's recent feature story showed, content moderation is not only a thankless job but a psychologically damaging one. And even once an account is suspended, it's easy for trolls to create a new one and pick right back up again, which is one reason trying to mitigate harassment online has often been likened to a web-wide game of whack-a-mole. What's more, WAM's own website explains that the nonprofit can't force Twitter's hand or require it to moderate or delete content. "We're not Twitter, and we can't make decisions for them," it says. "We're going to do our best to advocate for you with them, though."

The larger goal is to give Twitter a better understanding of just what is happening on its site, so the company can better address these issues in the future. "We're using this pilot project to learn about what kind of gendered harassment is happening on Twitter, how that harassment intersects with other kinds of harassment (racist, transphobic, etc.), and which kinds of cases Twitter is prepared (and less prepared) to respond to," WAM's site reads.

Still, it's a promising sign that Twitter is listening to its users and addressing their safety concerns head-on. It may take time to develop systems that can truly keep dangerous and damaging content at bay, but understanding the root of these issues is a good place to start.