URLs of wisdom – W.T.Facebook edition – April 2018

This is a special instalment of the URLs of wisdom in which I round up some new and some not-so-new links about privacy, fake news and the business of community as relates to Facebook. If you have additional reading material on these topics to recommend, please add it as a comment.


When you add a feature that is really a bug…
Image credit: https://www.flickr.com/photos/cambodia4kidsorg/2825261107/

Mis-using the C word

Several community professionals have pointed out that Mark Zuckerberg’s use of community doesn’t reflect their understanding of the term – and is one of the issues at the heart of Facebook’s problems.

  • A platform is not a community – Carrie Melissa Jones responds to Mark Zuckerberg’s statement that Facebook is a community by outlining the key differences between a platform and a community:

“Communities have two key characteristics: members have a shared sense of identity and participate in shared experiences. I don’t know about you, but I’ve never felt a shared sense of identity with someone just because we’ve both used Facebook or Instagram or watched videos on YouTube.”

She goes on to lament that talking about community in this way reveals a fundamental lack of understanding about how to build community – including how to make the online spaces safe: 

“…none of the large platforms that have come to define the first 20 years of the 21st century want to admit they’re not building community, and many have no clue how to do so. However, there are people all around the world who build community professionally, study it, and could tell you that strong identity and community self-governance are key to creating community resilience and long-term safety.”

Could we have predicted this?

But could we have predicted the problems with fake news, incivility and privacy breaches on Facebook that we’re now becoming increasingly aware of? Some of us were already thinking and writing about them…

  • danah boyd has for around a decade referred to the affordances of “networked publics”, where the ways we are able to interact online can change how those interactions play out – the unintentional sharing of access to your Friends’ personal data being one example, mediated by Facebook’s poor privacy controls and its decisions around developer access to APIs.

“While such affordances do not determine social practice, they can destabilize core assumptions people make when engaging in social life. As such, they can reshape publics both directly and through the practices that people develop to account for the affordances.”

  • But Zeynep Tufekci argues that expecting the user to give “informed consent” about how their data is used is unrealistic in a world of big data and unpredictable consequences. Earlier this year she argued that data privacy is a public good and should be regulated as such:

“Data privacy is not like a consumer good, where you click “I accept” and all is well. Data privacy is more like air quality or safe drinking water, a public good that cannot be effectively regulated by trusting in the wisdom of millions of individual choices. A more collective response is needed.”

  • As for the fake news problems… if we dive into the social sciences literature, Luke Stark says there are lessons we could learn from Merton and Lazarsfeld, who identified three conditions under which propaganda in mass media can take hold – monopolization, canalization and supplementation:

“Under monopolization, they wrote, “there is little or no opposition in the mass media to the diffusion of values, policies or public images…[and] occurs in the absence of counterpropaganda.” One set of messages, uncontested, could swamp public opinion. With canalization, the authors argued, “once [a] gross pattern of behavior or a generic attitude has been established, it can be canalized in one direction or another.” In other words, channeling existing social values and beliefs towards particular ends would be easier than totally changing those values and beliefs. Finally, Lazarsfeld and Merton pointed out that supplementation through face-to-face contacts made mediated propaganda vastly more effective.”

How do we fix things?

So what options do we have as social network users – and friends of social network users – to improve our online interactions and protect our privacy?

  • Stronger privacy laws all round? “…what we have learned about the data collection practices of social media firms, advertisers, political campaigns, online publishers and other groups suggests that company-specific changes like Facebook’s will be insufficient. What is needed is for Congress to adopt rigorous and comprehensive privacy laws.”
  • Move to Europe where regulation to protect users is about to become tighter?
  • Delete your Facebook account? 

“While deleting your account may give you some temporary satisfaction, it has little impact on finding real solutions to the way data is being collected, sold, and used against the public. This issue isn’t just about one platform like Facebook, and the issues of surveillance and experimentation on the public, it’s about the many companies that are tracking and profiling us, and the abuses of power that come from having vast troves of information about us, available for exploitation.”

  • Emphasise the importance of communities – as Facebook says it’s doing with groups? A BuzzFeed investigation explains why this may not be a cure-all:

“Joining Facebook groups that are related to the content being promoted, then spamming links throughout those groups using various aliases, is quite effective,” he said. “Members of the group then essentially become ‘bots’ and often share content to their network (outside of the group) creating a more organic-looking campaign.”

  • Put more emphasis on media literacy? Is increased media literacy the solution, enabling users to discern the legitimacy of content? Not so fast, argues danah boyd in a recent talk, “You think you want media literacy, but do you?” She starts by describing what we think we mean by media literacy:

“For better and worse, by connecting the world through social media and allowing anyone to be amplified, information can spread at record speed. There is no true curation or editorial control. The onus is on the public to interpret what they see. To self-investigate. Since we live in a neoliberal society that prioritizes individual agency, we double down on media literacy as the “solution” to misinformation. It’s up to each of us as individuals to decide for ourselves whether or not what we’re getting is true.”

She then goes on to look at the ways in which media – and those consuming it – can be manipulated, describing the hard-to-address “boomerang effect”, where talking about something incorrect automatically raises its profile:

“One of the main goals for those who are trying to manipulate media is to pervert the public’s thinking. It’s called gaslighting. Do you trust what is real? One of the best ways to gaslight the public is to troll the media. By getting the news media to be forced into negating frames, they can rely on the fact that people who distrust the media often respond by self-investigating. This is the power of the boomerang effect.”


What effect is the current news about Facebook having on your personal or professional community activities? Are there other links that you’d add to these perspectives?
