Case Study of Facebook

Offensive Content Versus Free Speech

The social networking site Facebook has over 200 million active members and is available in 40 languages.
Seventy percent of Facebook users live outside the United States, less than a third are college students, and the fastest-growing demographic is individuals thirty-five years and older. With this kind of diversity in membership, there are often opposing viewpoints regarding acceptable content. Facebook has rules prohibiting hateful, threatening, or pornographic content, or content that contains nudity or graphic or gratuitous violence.
Some content, however, resides in the gray area between natural and obscene, between inflammatory and hateful. When an issue falls into this gray area, Facebook has to make a judgment call based on its ethics and value system, which may be at odds with its users' values. In January 2009, Facebook removed pictures of nursing women from personal pages, citing its policy against nudity.
In 2008, Facebook received criticism for not removing content, specifically the pages of hate groups. Members of the UK Parliament condemned Facebook for hosting pages that included images of the Ku Klux Klan.
And in 2009, Facebook was pressured to remove groups that denied the Holocaust. No doubt Facebook is in a tough position. It would like to maintain freedom of speech on the Web, but critics argue that Facebook has the responsibility to decide what is appropriate for users within its terms of service, and that hate groups should not be tolerated.
Issues like these are likely to continue as Facebook adds users with diverse backgrounds and viewpoints. What steps has Facebook taken to deal with these ethical dilemmas? Currently, Facebook does not actively search for content that doesn't adhere to its policies.
Instead, it relies on users to flag this content. Questionable items go before a team that either approves or deletes each item based on its interpretation of the company's guidelines. Facebook also seeks outside counsel, meeting regularly with human rights organizations and even the US Department of State for advice on these issues. In the end, the decision to allow or remove content is a judgment call based on the company's values, which are clearly not always going to be in line with those of every user.
As a result, Facebook has received some criticism for its decisions.