Discrimination
Category: When Analytics Works
Schneier (2020) writes, "The point is that it doesn't matter which technology is used to identify people… The whole purpose of this process is for companies — and governments — to treat individuals differently." In many cases differential treatment is acceptable; in many others, it raises ethical concerns.
The accuracy of analytics creates an advantage for companies in a way that is arguably unfair to consumers. For example, the use of analytics data to adjust health insurance rates (Davenport & Harris, 2007) works in favour of insurance companies and, thereby, arguably to the disadvantage of their customers. Analytics are used similarly in academia, sometimes before the fact (screening applicants away from programs) and sometimes after (dismissing students already enrolled), as the examples below illustrate.
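For illustration only, here is a minimal sketch of the pricing logic at work, assuming a simple risk loading. The base rate, the formula, and the scores are invented for this sketch, not taken from any actual insurer; the point is that accurate individual prediction lets the insurer price each customer at their expected cost, shifting risk away from the shared pool.

```python
# Hypothetical sketch: individualized premiums from an analytics-derived
# risk score. The base rate, loading formula, and scores are invented.
BASE_PREMIUM = 200.0  # pooled monthly rate before individualization

def premium(risk_score: float) -> float:
    # Price each customer at their predicted cost: accurate prediction
    # moves risk from the insurer's pool back onto the individual.
    return BASE_PREMIUM * (1.0 + risk_score)

for score in (0.1, 0.9):  # low- vs. high-risk profiles from analytics
    print(f"risk={score:.1f} -> monthly premium ${premium(score):.2f}")
```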
Additionally, "We are shown different ads on the internet and receive different offers for credit cards. Smart billboards display different advertisements based on who we are" (Schneier, 2020). Perhaps this is appropriate if the differentiation is based on interests or affiliations, but it becomes problematic if it is based on gender, age, or race. For example, predictive analytics may turn potential students away from programs based on prejudice. "Algorithms might be reinforcing historical inequities, funneling low-income students or students of color into easier majors" (Barshay & Aslanian, 2019).
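How an algorithm "reinforces historical inequities" can be shown in a few lines. What follows is a hypothetical sketch using entirely synthetic data; the variable names (zip_group standing in for a feature correlated with race, ability for true aptitude) are illustrative assumptions, not drawn from any real system.

```python
# Hypothetical sketch: a model trained on historically biased outcomes
# reproduces the bias, even though true aptitude is identical by group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# zip_group stands in for a feature correlated with a protected
# attribute (e.g., neighbourhood correlated with race).
zip_group = rng.integers(0, 2, n)
ability = rng.normal(0, 1, n)  # true aptitude, identical across groups

# Historical "success" labels encode past inequity: group 1 was
# under-resourced, so its recorded outcomes are worse at equal ability.
success = (ability - 0.8 * zip_group + rng.normal(0, 1, n)) > 0

# A naive predictor fit on those labels learns to penalize group 1 ...
X = np.column_stack([ability, zip_group])
model = LogisticRegression().fit(X, success)

# ... so at identical ability, the predicted chance of success differs:
same_ability = [[0.0, 0], [0.0, 1]]
print(model.predict_proba(same_ability)[:, 1])
# roughly [0.5, 0.2]: group 1 is scored lower and would be "funneled"
# away from demanding programs on that basis alone.
```

Nothing in the code mentions race; the discrimination arrives entirely through the historical labels and the proxy feature.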
We may already be facing a worst-case scenario in Facebook. The case was made by Sacha Baron Cohen in a recent address: he "calls the platforms created by Facebook, Google, Twitter, and other companies 'the greatest propaganda machine in history' and blasts them for allowing hate, bigotry, and anti-Semitism to flourish on these services" (Baron Cohen, 2019; ADL, 2019).
A significant impact of such discrimination is the denial of basic human rights. For example, Access Now (2018) writes, "Looking forward: If AI technology is used for health and reproductive screening, and some people are found to be unlikely to have children, screening could prevent them from marrying, or from marrying a certain person if the couple is deemed unlikely to conceive. Similarly, AI-powered DNA and genetics testing could be used in efforts to produce children with only desired qualities."
It is not a stretch at all to see more fine-grained discrimination applied to the allocation of learning opportunities, limitations on employment, and other impacts. Indeed, it has already happened, in a case where failure was predicted rather than observed: the "Mount St. Mary's University... president used a survey tool to predict which freshman wouldn't be successful in college and kicked them out to improve retention rates" (Foresman, 2020).
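The mechanics of such a decision can be startlingly simple. Below is a purely hypothetical sketch: the survey items, weights, and cutoff are all invented, and no claim is made that the actual tool worked this way. The point is that a threshold applied to a prediction, not any observed failure, is what produces the dismissal.

```python
# Hypothetical sketch: a survey-based risk score turned into a
# dismissal decision. Items, weights, and cutoff are invented.
survey = {"belonging": 2, "financial_stress": 5, "study_hours": 1}

def retention_risk(s: dict) -> float:
    # Toy linear score; a real tool would be fit on historical data.
    return 0.3 * s["financial_stress"] - 0.2 * s["belonging"] - 0.1 * s["study_hours"]

CUTOFF = 0.5  # arbitrary: above this, the student is "predicted to fail"

risk = retention_risk(survey)
decision = "dismiss" if risk > CUTOFF else "retain"
print(risk, decision)  # the prediction alone drives the outcome
```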
Examples and Articles
A Detroit community college professor is fighting Silicon Valley’s surveillance machine. People are listening.
"Far from academia’s elite institutions, Gilliard, 51, has emerged as an influential thinker on the relationship between trendy tech tools, privacy and race. From “digital redlining†to “luxury surveillance,†he has helped coin concepts that are reframing the debate around technology’s impacts and awakening recognition that seemingly apolitical products can harm marginalized groups.
While some scholars confine their work to peer-reviewed journals, Gilliard posts prolifically on Twitter, wryly skewering consumer tech launches and flagging the latest example of what he sees as blinkered techno-optimism or surveillance creep. (Among his aphorisms: “Automating that racist thing is not going to make it less racist.â€) It’s an irony of the world Silicon Valley has constructed that an otherwise obscure rhetoric and composition teacher with a Twitter habit could emerge as one of its sharpest foils.
Among a growing chorus of critics taking on an industry that’s remolding the world in its image, Gilliard is not the most prominent or credentialed. Yet his outsider status is integral to a worldview that is finding an audience not only on social media but in the halls of academia, journalism and Washington."
Micro-Targeting Voters
"The 2004 Bush campaign sorted voters into 30 categories according to their lifestyles, affinities, interests and ideologies and developed different campaign messages to suit each category."
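A hedged sketch of how such sorting can be done: unsupervised clustering over voter records. The features and messages below are synthetic assumptions (only the count of 30 categories comes from the quote above), and the actual campaign's method is not public in this detail.

```python
# Hypothetical sketch of micro-targeting: cluster voters into segments
# and attach a different campaign message to each segment.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Synthetic voter records; columns stand in for lifestyle, affinity,
# interest, and ideology scores (invented for illustration).
voters = rng.normal(size=(5000, 4))

# 30 categories, matching the figure reported for the 2004 campaign.
segments = KMeans(n_clusters=30, n_init=10, random_state=1).fit_predict(voters)

# Each segment gets its own tailored message.
messages = {k: f"Message tailored to segment {k}" for k in range(30)}
print(messages[segments[0]])  # the message the first voter would see
```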