How the Meta(verse) stokes division

The trove of documents (“The Facebook Papers”) released by whistle-blower and former Facebook employee Frances Haugen in 2021 suggests that Facebook curates and displays content in users’ news feeds to polarise opinion and keep people engaged, if not “addicted.” Social media platform developers configure their systems to encourage controversy.

At a US Congressional hearing, Haugen asserted: “I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy.” The charge of harming children relates to making “young girls and women feel bad about their bodies.” At the time of writing, “The Facebook Papers” are not available for public reading, so we have to rely on press reports.

The charge is that the programs that configure your “newsfeed” on Facebook tilt what you read towards the controversial, polarise readers and contributors, and further influence people’s politics. Facebook (now under the parent brand Meta) is motivated to keep audience numbers high and stimulate advertising revenue.
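The basic mechanism alleged here can be sketched in a few lines. This is a hypothetical illustration, not Facebook’s actual ranking system: the `Post` fields, the weights, and the scoring function are all assumptions, chosen only to show how weighting reactions that signal argument more heavily than ordinary approval pushes divisive material to the top of a feed.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    comments: int
    angry_reactions: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: comments and "angry" reactions count for more
    # than likes, so content that provokes argument scores highest.
    return post.likes + 5 * post.comments + 10 * post.angry_reactions

def rank_feed(posts: list[Post]) -> list[Post]:
    # Order the feed by engagement, highest first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Cat photo", likes=120, comments=4, angry_reactions=0),
    Post("Divisive political claim", likes=30, comments=40, angry_reactions=25),
])
print([p.title for p in feed])  # the divisive post ranks first
```

Even with far fewer likes, the divisive post outranks the popular one, because the scoring function rewards the signals of controversy.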

The methods fall within the toolkit of the “active measures” by foreign agents outlined in a previous post (Individual-1: Kompromat 101). That’s part of the opportunity, challenge and peril of big data. Such processes are covert: they involve company policies hidden from users, as well as algorithms that are little understood by the people who use the social media platforms.

Similar hidden processes are at work in online retail and booking systems. We become aware that some calculation is happening in the background of our interactions when we receive directed advertising, or the platforms appear to come pre-loaded with preferences derived from our purchasing history on this or other platforms.
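The kind of background calculation at work in retail platforms can also be sketched. The data and functions below are hypothetical, a toy “customers who bought X also bought Y” counter rather than any real platform’s system; it only illustrates how purchase histories quietly become suggestions.

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase histories: each set is one customer's order.
orders = [
    {"tent", "sleeping bag"},
    {"tent", "camping stove"},
    {"novel", "bookmark"},
]

def co_purchases(orders: list[set]) -> Counter:
    # Count how often each pair of items appears in the same order.
    counts = Counter()
    for order in orders:
        for a, b in combinations(sorted(order), 2):
            counts[(a, b)] += 1
            counts[(b, a)] += 1
    return counts

def suggest(item: str, orders: list[set]) -> list[str]:
    # Recommend items most often bought alongside the given one.
    counts = co_purchases(orders)
    related = [(other, n) for (a, other), n in counts.items() if a == item]
    return [other for other, _ in sorted(related, key=lambda x: -x[1])]

print(suggest("tent", orders))  # items co-purchased with a tent
```

None of this is visible to the shopper; only the resulting suggestions are.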

Critical commentators such as Shoshana Zuboff indicate how patterns detected in consumers’ activity online provide information about people’s behaviour, much of which constitutes a hidden commodity traded and sold between companies, often without our knowledge or explicit consent. See posts: The future of prediction, and Surveillance capitalism and its discontents.
