Easing Distress
Category: Deontic Analytics
Systems that track and analyze feelings and emotions can nudge users toward healthier emotional states. For example, some products and services specifically address mental health issues. “These applications aim to coach users through crises using techniques from behavioral therapy. (A program called) Ellie helps treat soldiers with PTSD. (Another program called) Karim helps Syrian refugees overcome trauma. Digital assistants are even tasked with helping alleviate loneliness among the elderly” (Kleber, 2018).
Similar systems could interact with students and modify sentiments and emotions that may be interfering with their learning and socialization. “Applications will act like a Fitbit for the heart and mind, aiding in mindfulness, self-awareness, and ultimately self-improvement, while maintaining a machine-person relationship that keeps the user in charge” (ibid.).
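To make the idea concrete, the sketch below is a minimal, hypothetical illustration of the kind of sentiment tracking such applications depend on (not the method of any product named above): it scores a student's journal entries against a small affect lexicon and flags sustained negative sentiment for follow-up. The lexicon, threshold, and function names are invented for this example.

```python
import re

# Hypothetical affect lexicon; real systems use far richer models.
NEGATIVE_TERMS = {"anxious", "overwhelmed", "hopeless", "stressed", "lonely"}
POSITIVE_TERMS = {"confident", "calm", "motivated", "hopeful", "supported"}


def sentiment_score(text: str) -> int:
    """Crude sentiment score: positive term count minus negative term count."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(w in POSITIVE_TERMS for w in words) - sum(w in NEGATIVE_TERMS for w in words)


def flag_for_support(entries: list[str], threshold: int = -2) -> bool:
    """Flag a student when total sentiment across recent entries falls at or below a threshold."""
    return sum(sentiment_score(e) for e in entries) <= threshold


if __name__ == "__main__":
    journal = [
        "I feel overwhelmed and anxious about the next assignment.",
        "Still stressed, and a bit hopeless about catching up.",
    ]
    print(flag_for_support(journal))  # True: sustained negative sentiment
```

In practice, such a flag would only be a prompt for a supportive intervention (mindfulness exercises, a check-in), with the user kept in charge, as the quoted passage emphasizes.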
Examples and Articles
Journal responds after controversial Facebook emotion study
The study, published June 17 in the journal Proceedings of the National Academy of Sciences (PNAS), was conducted by Facebook researchers to investigate a phenomenon dubbed "emotional contagion."
Direct Link
Empathy in Artificial Intelligence
"The challenge will be finding the right mixture and chemistry for the agents to be assisted by machines in providing, in combination, a more empathic and effective service."
Direct Link