On Nov. 7 a women’s advocacy group, the Uprising of Women in the Arab World, complained that Facebook had deleted one of its photos and suspended five administrator accounts for its Facebook page. The apparently offensive photo showed an unveiled Arab woman in a sleeveless top holding, in a call for liberation, a passport photo of herself wearing the hijab.
Among other content Facebook temporarily censored in 2012 was an image and caption criticizing President Obama for his handling of the attack on the U.S. consulate in Benghazi, Libya. The social media platform also took Mike Huckabee’s “Chick-fil-A Appreciation Day” event page offline for 12 hours during the summer media firestorm over Chick-fil-A president Dan Cathy’s support for heterosexual marriage.
In each of these cases, Facebook later apologized and claimed the content had been removed by mistake. But Craig Parshall, director of the John Milton Project for Free Speech at National Religious Broadcasters, a Christian association, says such incidents have become all too common. The John Milton Project published a report in September detailing examples of censorship from new media companies like Facebook, Google, and Apple. The companies’ policies allow them to remove user-generated content the companies deem offensive, Parshall says, including so-called “hate speech, or controversial political or religious content, even if it would be otherwise lawful.”
More examples: In 2010 and 2011, responding to complaints from homosexual activists, Apple permanently removed from its App Store applications from two groups that promote heterosexual marriage, Manhattan Declaration and Exodus International. Last May, Google's video-sharing service YouTube blocked as "hate speech" a youth speaker's video warning against gay "marriage." Earlier in 2012, Ryan Faust, senior pastor of Grace Church in Seattle, told me Facebook had censored lengthy comments he had posted about the Bible and gay "marriage."
Parshall is concerned these social media platforms may develop a habit of censoring any speech, Christian or otherwise, they consider controversial or offensive. The John Milton Project has published a framework calling for media platform providers to adopt content policies aligning with First Amendment rights: Criminal, obscene, or sexually explicit material that might be seen by children can be censored, but not speech that merely offends some person or group.
Parshall admits the First Amendment probably doesn’t apply to private businesses, but offers the analogy of a phone company that might disagree with a church’s beliefs: “What if your church [staff] go into work one day at the local church and they pick up their phones and they’re all dead?” That would seem unfair to most people, and federal regulations already prohibit utilities from such discrimination.
Like a phone company, Parshall argues, YouTube, Facebook, and the iPhone are "platforms" that enable users to publish content. The users ("content providers"), whether video makers, phone users, or Facebook members, should be free, like newspaper editors, to publish whatever they choose. Platform providers, on the other hand, should serve all customers, even those whose content they disagree with.
“We are not looking for government regulation,” says Parshall. His group isn’t ready to call social media companies “utilities” yet, but hopes public awareness about censorship—and awareness on Capitol Hill—will pressure companies to adopt a First Amendment approach voluntarily: “We think we can strike a balance between free enterprise and free speech.”