Facebook’s New Attempt at Transparency Is Obliviously Opaque
After years of being notoriously tight-lipped about its inner workings, Facebook announced Thursday that it is launching an initiative meant to provide answers to the "hard questions" about the social media platform. The initial discussions will center on three predetermined subjects: terrorism, censorship, and fake news. The platform has also set up a dedicated email address where users can submit their own questions and concerns about the choices Facebook has made over the years.
Facebook introduced the new program in a blog post:
Today, we're starting a new effort to talk more openly about some complex subjects. We hope this will be a place not only to explain some of our choices but also explore hard questions, such as:
- How should platforms approach keeping terrorists from spreading propaganda online?
- After a person dies, what should happen to their online identity?
- How aggressively should social media companies monitor and remove controversial posts and images from their platforms? Who gets to decide what's controversial, especially in a global community with a multitude of cultural norms?
- Who gets to define what's false news — and what's simply controversial political speech?
- Is social media good for democracy?
- How can we use data for everyone's benefit, without undermining people's trust?
- How should young internet users be introduced to new ways to express themselves in a safe environment?
In theory, Facebook is taking a step in the right direction. After nearly a year of ongoing drama over the potential influence of fake news disseminated on Facebook during the presidential election, the questionable promotion of fake content, the use of the platform for terrorist recruitment, and murders and suicides being broadcast live, the company is finally trying to tackle the outstanding questions that aren't answered in its press releases. But the execution thus far feels like too little, too late.
The first post, "How We Counter Terrorism," reads less like a curtain-raiser and more like a presentation to a business school classroom. It's littered with buzzwords and self-congratulatory statements, and it promises things that are already in motion rather than putting forth new moves to fix what isn't working. Yes, Facebook has made real strides in combating an increasingly global problem. Still, the post adds little to conversations that have already been underway for months.
Even if you're willing to look past these shortcomings, the same problem that has plagued all of Facebook's previous attempts at transparency is still present in this "hard questions" rollout: it's about the appearance of engagement, not a genuine integration of user and platform. That's not to say users should have access to proprietary information like algorithms and internal processes (they shouldn't, and can't, for very good reason), but there should be a way to create an open dialogue that doesn't feel like a lecture.
Examining this first attempt at an official forum makes it immediately clear that discussion is not the true focus of the initiative. If it were, there would be a section for public comments, not just buttons to "like" the post or share it with friends. The dedicated email address is a good step, but as with all of the public inboxes that Facebook and other social media platforms maintain, you're unlikely to get a thoughtful or personal response to whatever you send into the abyss. More importantly, a sense of community, the very foundation on which Facebook is built, is missing. The whole effort reads like a clinical attempt to justify practices that have faced heavy scrutiny in the past.
To truly understand the "hard questions" facing social media, there has to be a real conversation between people: an exchange of views, a critical look at some potentially ugly facts, and a sense that there's a person on the other side of the screen taking responsibility for what is and isn't happening. Until Facebook is willing to talk with, rather than at, its users through initiatives like this, it will continue to come across as an algorithm answering what it believes your questions might be. And as we've seen over the last few years, that algorithm doesn't always speak truth to power.