Surveillance Culture
Category: Social and Cultural Issues
Above, we discussed the ethics of surveillance itself. Here, we address the wider question of surveillance culture. This refers not only to specific technologies but also to the creation of a new social reality. "Focusing on one particular identification method misconstrues the nature of the surveillance society we're in the process of building. Ubiquitous mass surveillance is increasingly the norm" (Schneier, 2020). Whether in China, where the infrastructure is being built by the government, or in the West, where it is being built by corporations, the outcome is the same.
Surveillance becomes data, which in turn becomes a mechanism that disadvantages the surveilled. Insurance companies use surveillance data to adjust rates, penalizing those they deem higher risk (Davenport & Harris, 2007; Manulife, 2020; Allstate, 2020). Café chains are using facial recognition to bill customers (Sullivan & Suri, 2019). "A man tries to avoid the cameras, covering his face by pulling up his fleece. He is stopped by the police and forced to have his photo taken. He is then fined £90 for 'disorderly behaviour'" (Lyon, 2017). "The Republican National Committee and the Trump campaign have reportedly compiled an average of 3,000 data points on every voter in America," enabling it, arguably, "to wage an untraceable whisper campaign by text message" (Coppins, 2020).
What we are finding with surveillance culture is the 'elasticity' of analytics ethics (Hamel, 2016): each step of surveillance stretches what we are willing to accept, making the next step seem inevitable. The uses of streetlight surveillance are allowed to grow (Marx, 2020). Surveillance becomes so pervasive that it is impossible to escape its reach (Malik, 2019). And nowhere is this more true than in schools and learning. The goal is "to connect assessment, enrollment, gradebook, professional learning and special education data services to its flagship student information system" (Wan, 2019). Or, as Peter Greene (2019) says, "PowerSchool is working on micromanagement and data mining in order to make things easier for the bosses. Big brother just keeps getting bigger, but mostly what that does is make a world in which the people who actually do the work just look smaller and smaller."
Audrey Watters captures the issue of surveillance culture quite well. It's not just that we are being watched; it's that everything we do is being turned into data for someone else's use, often against us. She writes: "These products — plagiarism detection, automated essay grading, and writing assistance software — are built using algorithms that are in turn built on students' work (and often too the writing we all stick up somewhere on the Internet). It is taken without our consent. Scholarship — both the content and the structure — is reduced to data, to a raw material used to produce a product sold back to the very institutions where scholars teach and learn" (Watters, 2019).
She continues (ibid.): "In her book The Age of Surveillance Capitalism, Shoshana Zuboff calls this 'rendition,' the dispossession of human thoughts, emotions, and experiences by software companies, the reduction of the complexities and richness of human life to data, and the use of this data to build algorithms that shape and predict human behavior."
Examples and Articles
The Age of Surveillance Capitalism by Shoshana Zuboff review – we are the pawns
"Zuboff is no stranger to this territory. In her 1988 book "In the Age of the Smart Machine", she addressed at the moment of their appearance in the business world many of the issues that have come to achieve dominance in our everyday life. Embedded within a large pharmaceutical company in the 1980s, she observed first-hand how new tools for internal communication, first welcomed by employees as novel social spaces in which they could better converse, plan and access information, were gradually recognised as tools for management intrusion and control. Aspects of employees’ personal experience that were implicit and private suddenly became explicit and public, were exposed to scrutiny and made the basis for evaluation, criticism and punishment. Now it is the interiors of all our lives that are exposed to invisible overseers, who do not merely profit from our actions, but increasingly control their every expression.
Consider the apparently benign game Pokémon Go, both a ridiculous and a transparent example of the link between behavioural surplus and physical control. While its initial players lauded the game for its incitement to head outside into the “real world”, they in fact stumbled straight into an entirely fabricated reality, one based on years of conditioning human motivation through reward systems, and designed to herd its users towards commercial opportunities. Within days of the game’s launch in 2016, its creators revealed that attractive virtual locations were for sale to the highest bidder, inking profitable deals with McDonald’s, Starbucks and others to direct Pokémon hunters to their front doors. The players think they are playing one game – collecting Pokémon – while they are in fact playing an entirely different one, in which the board is invisible but they are the pawns. And Pokémon Go is but one tiny probe extending out from Google and others’ vast capabilities to tune and manipulate human action at scale: a global means of behaviour modification entirely owned and operated by private enterprise."
Direct Link
A Detroit community college professor is fighting Silicon Valley’s surveillance machine. People are listening.
"Far from academia’s elite institutions, Gilliard, 51, has emerged as an influential thinker on the relationship between trendy tech tools, privacy and race. From “digital redlining” to “luxury surveillance,” he has helped coin concepts that are reframing the debate around technology’s impacts and awakening recognition that seemingly apolitical products can harm marginalized groups.
While some scholars confine their work to peer-reviewed journals, Gilliard posts prolifically on Twitter, wryly skewering consumer tech launches and flagging the latest example of what he sees as blinkered techno-optimism or surveillance creep. (Among his aphorisms: “Automating that racist thing is not going to make it less racist.”) It’s an irony of the world Silicon Valley has constructed that an otherwise obscure rhetoric and composition teacher with a Twitter habit could emerge as one of its sharpest foils.
Among a growing chorus of critics taking on an industry that’s remolding the world in its image, Gilliard is not the most prominent or credentialed. Yet his outsider status is integral to a worldview that is finding an audience not only on social media but in the halls of academia, journalism and Washington."
Direct Link
Surveillance Studies Research Center
"Surveillance Studies is a burgeoning, global, interdisciplinary field that is producing new and important theoretical and empirical understandings of human behavior."
Direct Link
Do you have another example of Surveillance Culture? Suggest it here