Facebook's Plan to Stop People From Livestreaming Sex? You.

The live broadcasting of dick vids, beheadings, and shootings on Facebook seems inevitable. And it’ll be up to you to help stop them from spreading.

The internet, the old adage goes, is for porn. But, with a few clicks online, you can find a lot more than that: drugs, graphic violence, you name it. In a way, the internet is the unbound and tangled web of our collective human psyche (at times, more id than ego). So, if someone wants to see or show pretty much anything, you can bet it's happening somewhere online.

That’s why it can be somewhat perplexing when social media companies like Facebook pretend that their island of the internet is a completely controlled, family-friendly space.

Facebook announced yesterday that, along with some other new features, it’ll soon incorporate a new dedicated, searchable place on its mobile app for live video broadcasts from around the world---an important business move that keeps it a step ahead of competitors and will hopefully keep you and the other 1.5 billion people on Facebook on the site longer to look at ads. As part of the announcement, it showcased how newscasters, celebrities, and athletes have welcomed you into their lives live. What Facebook didn't mention is that if anyone can broadcast live, that means, well, anyone can broadcast live. (Remember Chatroulette? Yeah, people on the internet can be gross.)

Even with standards in place, the live broadcast of dick vids, sex, beheadings, shootings, and their ilk seems inevitable. The company understands that and says it’ll be largely up to people like you to report them to a global review team to stop them from spreading. As live streaming on Facebook becomes more popular, whether that approach will work---and how Facebook will ultimately adjudicate what we can share live---remains to be seen.

The Community

Facebook and other social media companies have long grappled with how to monitor your activity on their platforms. As a business dependent on you sharing lots of stuff, Facebook wants you to feel free to share; but it also wants a space where everyone feels comfortable. To maintain that kind of environment, Facebook has "Community Standards" (read: official policies) outlining in part what kind of stuff is banned from the platform, including nudity, hate speech, and graphic violence.

Live broadcasts will be subject to those same rules. A Facebook spokesperson says that the company believes that the vast majority of broadcasters will go live to "share experiences in the moment with their friends and family." "But if someone does violate our Community Standards while using Live, we want to interrupt these streams as quickly as possible when they're reported to us," she said in an email. "So we've given people a way to report violations during a live broadcast."

In other words, it will be up to you to stop crude behavior from popping up live on Facebook. The company also seems to be hoping that the very fact that you’re sharing live with people in your network will serve as a check to keep your streams PG-13. But depending on context to keep behavior in check depends upon, well, context; not everyone uses Facebook the same way.

This kind of reporting process has long been the norm for social media platforms like Facebook. "They rely on the labor of people given freely to police content," says Sarah T. Roberts, an assistant professor at the University of Western Ontario, who studies commercial content moderation on social media.

Roberts says it's disingenuous to argue that Facebook Live won't be used for all kinds of things that could potentially violate the company’s community standards ("as well as perhaps the law," she adds). "For many people, Facebook is their primary experience of the internet, and they’ll use whatever tools are given to them to propagate material that others would find unsavory," she says. "That’s a condition of the internet itself."

The Reality

So what’s a company like Facebook to do? For its part, Facebook seems to believe that depending predominantly on all of us to report unsavory activity can and will work.

However, Roberts says, if Facebook relies on regular people to see and report crude content, that means we’ve already seen it. As my colleague Davey Alba reported, many people were outraged last year when a video shot from the perspective of a Virginia gunman began to automatically play as they passed it in their News Feeds. Even if they reported the video, it had already gone viral. It was too late for those who had unwillingly seen the graphic images.

You can easily imagine the same kind of situation happening with live video---except it's also live. The immediacy and intimacy of live broadcasting make it an extremely powerful way for politicians, musicians, and journalists to speak directly to you from around the world. But it could also be jarringly powerful for, say, a terrorist organization live broadcasting a beheading or a small porn business live streaming sex. And since Facebook defaults to saving the live video as a replayable one, unsavory "live" content can spread even after the broadcast ends.

With live video, time will be its own challenge. Last year, Monika Bickert, Facebook’s head of global policy management, told The New York Times that reviewing these kinds of reports typically takes around 48 hours for safety issues. Since then, Facebook has tried to review (and, if needed, remove) most videos in under 24 hours, a spokesperson says. But when it comes to stopping the spread of an unsavory livestream, even a 24-hour review period won’t be fast enough.

"The big thing for them is that you can never actually moderate bad things to go away completely," says Annemarie Dooling, the engagement editor for Racked, who has been managing commenting communities for ten years at major media companies like the Huffington Post. "You just can't do it."

Part of the problem, she says, is that while flagging bad behavior is helpful, too many people will simply report things they don't like, meaning moderators have to sift through tons of complaints before getting to the most egregious reports. People who report bad behavior, she adds, need to use discretion and the correct reporting tools.

The Work

Roberts says Facebook will ultimately have to depend not only on people like you or me to report violations, but also on compensated content moderators working for Facebook, which the company acknowledges. "We've been aggressively growing the global team that reviews these reports and blocks violating content as quickly as possible," a company spokesperson tells WIRED. In fact, moderators may review a live video that suddenly becomes hugely popular, she says, even if it hasn't been flagged for the company.

And yet there are steps Facebook can take to keep the grosser streams from popping up in your News Feed or Facebook Live experience. Dooling argues that instead of trying to get rid of all the bad things, Facebook should make sure that the best stuff rises to the top. To do that, it could promote verified people it trusts (like celebrities or public figures) as well as other people who have consistently streamed cool things, as in the sketch below.
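To make that idea concrete, here's a minimal sketch of what such a ranking could look like, assuming a simple trust score built from verification status and a broadcaster's track record. The fields, weights, and names here are hypothetical illustrations, not anything Facebook has described.

```python
# Hypothetical sketch of the "best stuff rises to the top" approach:
# score broadcasters on verification and history, then sort live streams
# so the most trusted sources surface first. All values are illustrative.

from dataclasses import dataclass

@dataclass
class Broadcaster:
    name: str
    verified: bool        # public figure confirmed by the platform
    clean_streams: int    # past broadcasts that drew no valid reports
    flagged_streams: int  # past broadcasts removed for violations

def trust_score(b: Broadcaster) -> float:
    score = 10.0 if b.verified else 0.0
    score += b.clean_streams * 1.0
    # A violation costs far more than a clean stream earns.
    score -= b.flagged_streams * 5.0
    return score

def rank_streams(broadcasters: list[Broadcaster]) -> list[Broadcaster]:
    """Order live streams so the most trusted broadcasters surface first."""
    return sorted(broadcasters, key=trust_score, reverse=True)
```

The asymmetry in the weights reflects Dooling's point: promotion is about rewarding a consistent track record, so a single violation should outweigh many good streams.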

Hemanshu Nigam, the chief executive of SSP Blue, an online safety consultancy that helps companies moderate content, has worked with one social media company on how to moderate live broadcasts. He explains that a three-second delay, like the one networks have long used on live TV, could help social media companies with big moderation teams monitor activity before you see it. In reality, however, he says that as live becomes more popular, companies won’t be able to field teams large enough to watch every feed. "Live is at the nascent stage when it comes to moderation," he says.
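For illustration, here's a minimal sketch of how such a broadcast delay might work, assuming an in-memory buffer that releases frames only after a fixed window; the class and method names are hypothetical, not taken from any real streaming system.

```python
# Hypothetical sketch of a TV-style broadcast delay: frames sit in a
# buffer for a fixed window before viewers see them, giving a moderator
# a chance to kill the stream first.

import time
from collections import deque

class DelayedStream:
    def __init__(self, delay_seconds: float = 3.0):
        self.delay = delay_seconds
        self.buffer = deque()   # (arrival_time, frame) pairs awaiting release
        self.killed = False

    def ingest(self, frame) -> None:
        # Incoming frames from the broadcaster enter the holding buffer.
        if not self.killed:
            self.buffer.append((time.monotonic(), frame))

    def kill(self) -> None:
        # Moderator action: stop the stream and discard unreleased frames,
        # so nothing past the delay window ever reaches viewers.
        self.killed = True
        self.buffer.clear()

    def release(self):
        """Yield only frames that have aged past the delay window."""
        now = time.monotonic()
        while self.buffer and now - self.buffer[0][0] >= self.delay:
            yield self.buffer.popleft()[1]
```

The catch, as Nigam notes, is the human side: a delay only helps if someone (or something) is actually watching during those three seconds.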

Instead, he argues, companies like Facebook need to enforce their community standards as harshly as they can---the threat of enforcement may help prevent bad behavior over time, allowing the community to self-regulate. Nigam says the technology will also evolve such that a company like Facebook could use, say, filters to monitor certain pixel colors (those that could signal nudity or blood), pushing something like that to the front of the moderation queue. (A Facebook spokesperson says the company isn't currently using any algorithmic methods to monitor videos.)
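As a rough illustration of the kind of filter Nigam has in mind (and, again, not anything Facebook says it uses today), here's a sketch that flags frames whose pixels skew toward skin tones or blood reds and pushes them to the front of a review queue. The RGB ranges and the threshold are guesses for demonstration only.

```python
# Illustrative only: crude color heuristics standing in for the pixel
# filter Nigam describes. Flagged streams jump the human review queue.

from collections import deque

def looks_like_skin(r: int, g: int, b: int) -> bool:
    # Rough skin-tone test: red dominant, moderate green, lower blue.
    return r > 95 and g > 40 and b > 20 and r > g > b and (r - g) > 15

def looks_like_blood(r: int, g: int, b: int) -> bool:
    # Rough blood-red test: strong red with muted green and blue.
    return r > 150 and g < 80 and b < 80

def frame_is_suspect(pixels, threshold: float = 0.30) -> bool:
    """Return True if too many pixels fall in a suspect color range."""
    if not pixels:
        return False
    suspect = sum(1 for (r, g, b) in pixels
                  if looks_like_skin(r, g, b) or looks_like_blood(r, g, b))
    return suspect / len(pixels) > threshold

review_queue = deque()

def enqueue_stream(stream_id: str, sample_frame) -> None:
    # Suspect streams go to the front of the moderation queue;
    # everything else waits its turn at the back.
    if frame_is_suspect(sample_frame):
        review_queue.appendleft(stream_id)
    else:
        review_queue.append(stream_id)
```

Note that a filter like this doesn't decide anything on its own; it only reorders the queue so human moderators see the likeliest violations first.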

Ultimately, real, fallible humans at Facebook will have to make quick assessments about live streams (as they already do for photos and videos), judgment calls that point to larger questions about free speech online. As we’ve seen before, one person’s terrorist group can be another person’s freedom fighters. One person's art can be another's pornography. One culture's totally normal practice (like breastfeeding) can be sexualized in another. Facebook Live is only available in 60 countries so far, but like all things Facebook, the company plans to go fully global.

And as it spreads, it will have to continue to address how it can foster a safe environment and protect people without ignoring important cultural norms---all as the world becomes more immediate, intimate, and live.