Social Cohesion and Filter Bubbles
Category: Social and Cultural Issues
The UK House of Lords Select Committee notes that "The use of sophisticated data analytics for increasingly targeted political campaigns has attracted considerable attention in recent years, and a number of our witnesses were particularly concerned about the possible use of AI for turbo-charging this approach" (Clement-Jones et al., 2018: para 260). One example is the use of bot Twitter accounts to sow division during the Covid-19 pandemic: "More than 100 types of inaccurate COVID-19 stories have been identified, such as those about potential cures. But bots are also dominating conversations about ending stay-at-home orders and 'reopening America,'" according to a report from Carnegie Mellon (Young, 2020).
In a digital environment we are deluged with information with no real way to make sense of it all, creating what Benkler (2006) calls the Babel objection: "individuals must have access to some mechanism that sifts through the universe of information, knowledge, and cultural moves in order to whittle them down into manageable and usable scope." Using data about our previous reading or viewing behaviour, data analytics identifies patterns that we do not detect and do not know about before they are mined (Ekbia et al., 2014; Chakrabarti, 2009), and feeds these back to us through recommender systems.
This creates a cycle that augments and reinforces these patterns, placing people in "filter bubbles" (Pariser, 2012) whereby over time they see only content from a point of view consistent with their own. For example, "In a study of Facebook users, researchers found that individuals reading fact-checking articles had not originally consumed the fake news at issue, and those who consumed fake news in the first place almost never read a fact-check that might debunk it" (Chesney and Citron, 2018: 1768).
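The feedback cycle described above can be sketched in a toy simulation. This is a deliberately simplified illustration, not any platform's actual algorithm: "viewpoints" are modelled as points on a one-dimensional axis from -1 to 1 (an invented assumption), the recommender simply ranks items by closeness to the average of what the user has already seen, and the simulated user always picks from that shortlist.

```python
import random

def recommend(history, catalog, k=3):
    """Rank catalog items by similarity to the mean of past views
    and return the k closest ones (a minimal stand-in for a
    similarity-based recommender)."""
    taste = sum(history) / len(history)
    return sorted(catalog, key=lambda item: abs(item - taste))[:k]

random.seed(1)
# 200 items spread across the whole viewpoint axis
catalog = [random.uniform(-1.0, 1.0) for _ in range(200)]
history = [0.2]  # one mildly slanted first click

for _ in range(20):
    shortlist = recommend(history, catalog)
    # the user only ever chooses among what was recommended
    history.append(random.choice(shortlist))

# how wide a range of viewpoints the user actually saw
spread = max(history) - min(history)
print(f"viewpoint spread after 20 rounds: {spread:.2f}")
```

Because each recommendation is drawn from items closest to past behaviour, and each click then feeds back into that average, the range of viewpoints the simulated user encounters stays narrow even though the catalogue spans the full axis: a mechanical version of the bubble dynamic.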
An ethical issue arises here because "information is filtered before reaching the user, and this occurs silently. The criteria on which filtering occurs are unknown; the personalization algorithms are not transparent" (Bozdag & Timmermans, 2011). Additionally, "We have different identities, depending on the context, which is ignored by the current personalization algorithms" (ibid.). Moreover, the algorithms that drive filter bubbles may be influenced by ideological or commercial considerations (Introna & Nissenbaum, 2000: 177).
Examples and Articles
Smart Algorithm Bursts Social Networks' "Filter Bubbles"
IEEE Spectrum. Michelle Hampson
21 Jan 2021. "Eventually, people tend to forget that points of view, systems of values, ways of life, other than their own exist… Such a situation corrodes the functioning of society, and leads to polarization and conflict."