Unedited audio transcript from Google Recorder.
Sherida, I was watching your face throughout, because you are very expressive. I am, yes. I've been told not to play poker.
So let me go first. Well, the first thought that I have is: a while ago, I did some research on trust, specifically online trust. And one of the things that I read, which was fairly new as opposed to, let's say, a 30- or 40-year-old view on how trust was formed,
was that empathy, yeah, was really one of the antecedents of trust; that if you could communicate empathy in an environment, let's say, where you don't have face-to-face contact or whatever, it tended to aid in the process of trust. So what's going around in my mind, then, is: if care is somehow involved with some kind of empathy, some kind of empathic behavior, and that in a way is connected to trust, is the duty of care
somehow related to trust in a situation? Does that make any sense? I know I'm skipping something. It makes sense to me.
So, I mean, again, maybe I'm coming at this as the analytic philosopher, but to me it sounds like a causal chain from trust, and perhaps trust-forming mechanisms, to this connection that creates empathy, which in turn leads to, or is part of, care. Does that sound like a fair characterization?
Yeah, sort of. I mean, I look at trust and empathy as a chicken-and-egg kind of thing. Sure. In that they revolve around each other. And if you were to take that frame, then within that you are looking at care, yeah, for another entity. That's interesting. So let me throw a rock into your pond.
So, I'm a devotee, less so now than I used to be, of something called fail videos, and you might not know what fail videos are; certainly a lot of the viewers won't. Basically, a fail video is when somebody tries to do something and fails, preferably spectacularly.
I actually sat down a while back and tried to think to myself, you know, what are the components of a good fail video, right? Because there's lots of failure out there in the world, but a good fail video will include the spectacular element. Sometimes the best ones include, you know, an element that's unexpected, right?
And that's where the fail website has actually begun to, if you will, fail me recently, which is why I'm less interested now. There's any number of people on a skateboard, sliding down the rail, losing their balance, and landing with the rail right between their legs, right? That's a fail. Now that gets me, because when they do that, I go, oh, right?
And that I attribute to empathy, right? Or, more accurately, that seems to me to be a feeling of empathy, right? And the more spectacular and the more unusual the fail, the more I'm inclined to feel this sort of empathy, you know. I'm trying to think of an unexpected way of failing, you know.
It's like the slide down the rail: everything's perfect, I'm expecting them to lose their balance, but no, they land, they're rolling along, then a car hits them. That's unexpected. Or I saw a video just yesterday: a person standing, or on their bike, or in their car, whatever; they're looking at an intersection, and you see the two cars approach each other.
There's a collision. One of them goes flipping off the side of the screen. Wait two beats. Then another car comes rolling along with the flipped car on top of it. Yeah. And I didn't feel so much empathy there, because I didn't see any people; all I'm seeing are cars. But anyhow, the rock-in-the-pond part of this is: there's no trust anywhere here.
I'm watching these things, and when they're well done, I'm having this feeling of "oh, no." And that feeling, to me, seems to be the empathy. Would that make me trust them more, or make me trust them less? Because... yeah. So that's not to disagree with what you're saying, but it is to suggest that there might be varieties of empathy.
Yeah, I hadn't thought of that. Hang on, let me look that up quickly. Hey, why not? It might be another database, but: varieties of empathy. Let's see if anyone has thought of that before. Of course they have, right? Okay. So we have a website called skillsyouneed.com, which is promoted by Google, apparently.
And there are three types of empathy: cognitive, emotional, and compassionate. So maybe the one that I'm talking about is emotional, and the one that you're talking about is compassionate, and I'm not sure what cognitive empathy would even look like. But don't know, don't know. Yeah, we'd have to look it up.
I'm not gonna spend our video looking things up on Google, but so that's maybe, yeah, you know, a way of looking at it. Um, and then there's this whole thing about mirror neurons.
Yeah. So, mirror neurons. Just quickly quoting from, oh, some US government site, okay? "Mirror neurons are a class of neuron that modulate their activity both when an individual executes a specific motor act and when they observe the same or similar act performed by another individual." So the idea here, and that's consistent with what I've read before...
The idea is that when you do something and have a feeling, that, say, creates, you know, stimulates neurons in the head. Like, for example, right now I feel pain from that, which was actually a bit unexpected. Then if you were to do the same thing and I were to observe you, some, certainly not all, of the neurons in my head that reacted
when I did this would react when you did this, producing, presumably, similar feelings. And so the theory is that that's the basis for empathy: these similar feelings produced when you do something or have an experience, similar to when I do something or have a feeling. So, mirror neurons certainly seem to be a thing.
Mm-hmm. And it's not surprising that they would occur, because, at the very least, the visual perception is similar, so why wouldn't some of the neural reactions be similar? Whether they tie into ethics, and whether we can found an ethics on them, is, I think, a bit of a different story.
Um, empathy also has to do, when you're doing therapy, if you mirror somebody's emotions, or if you mirror, you know, through words, or show your understanding, empathy, of what a person goes through, that's one of the ways that you establish a relationship with the other person.
The point is, that's considered being empathetic, and in many ways you're trained to do that when you're doing certain forms of therapy. And when you think about that mirroring motion, that in some way, I think, relates to the building of trust. Yeah. I'm not sure where I've seen this, but I've heard of this being used, believe it or not, by, like, pickup artists in bars.
Yes, totally will believe that, yeah. It's a form of manipulation, really. Oh yeah, it absolutely is. Yeah. So I wonder if ethics is based on manipulation; that's an interesting thought. Well, it's supposed to be the opposite, right? Yeah. Like, well, it is those manipulators that keep telling us what ethics we're supposed to be following.
Yeah. Yeah, politicians. Yeah, so it's not that ethics is completely free of the concept of manipulation. I know the concept under the heading of "active listening," okay? And again, the idea of active listening is: you listen to what they said, and then you repeat back, not their words, but your own interpretation of what they said, which is sort of what I'm doing here. But of course it can be misused too. I was reading an article just the other day about how to do this when somebody is suffering from depression or whatever.
And the instinct on the part of a lot of people is to say, "oh yeah, I know how you feel, I've gone through the same thing, and I felt blah blah blah," which of course is exactly the wrong thing to do, because you probably don't know how they feel.
Although we feel like we know how they feel, don't we? At least I do. But, you know, I think a lot of the time I might be mistaken. What do you think of all this, Mark? Does this make sense to you? Well, it made sense, but, you know, I'm gonna throw a pebble in your pond. You sort of drove by biological determinism, yeah, zipped right by it.
Yeah. Yeah. Which is good, but it was brought into the conversation, so I just wanted to share that I immediately thought of it, so I didn't want to drop that in here: determinism. And you used a man for your example of a feminist theory. But yeah, I understand; you even commented about that.
Yeah, I felt badly about it. Yeah, and I understand. So anyway, those were just asides, maybe. But as far as artificial intelligence and analytics and empathy, that's where I hit the wall, you know. And again, with your mirror neurons, you know, another, you know, more neurological determinism. Not that...
I mean, it is true. And I always thought it was the basis of learning, you know: what you see, what you do, monkey-see-monkey-do kind of a thing. I never thought of it in terms of empathy, so I will think about that. But based on our prior discussions, I mean, you've convinced me that AI is unstoppable and that ethics aren't going to stop that, that ethics have nothing to do with the spread of AI.
And so then my question, and it is a question, is whether technology can be used in an empathetic way. As someone who has been using way too much technology, for way too much of my time, for way too long, it seems to me to inhibit everything. And, you know, it's a problem everybody's working on.
I took Maha Bali's class last summer, with Mia Zamora, at the Digital Pedagogy Lab; it was very good. Yeah. You know, we talked about the duty of care a bit, but we certainly, in my opinion, didn't make any progress on how we could use this technology to provide care, to enact this duty of care.
So that's where I'm stuck; that's what I had to say. I'm sort of stuck on how AI, or digital technology in general, can promote care. And I think that reflects a dominant trend in the literature overall. I mean, go through any number of writers. Donald Clark just referenced Jaron Lanier, or...
I forget how to pronounce the name. I'm so terrible with names.
Sorry, I'm just looking it up. No, that's not him... I think it's Lanier, L-A-N-I-E-R. Jaron. Jaron. And I know him especially from the video series, All Watched Over by Machines of Loving Grace, right? Oh, that's a blast from the past. Oh wait, that wasn't him, though.
That was someone else. Curtis? I mean, Lanier, though, advances arguments along a similar vein. But that documentary film series, it was terrific: All Watched Over by Machines of Loving Grace, and it was by Adam Curtis. Yeah. Terrific series; all of his films are dark and troubling.
So, but I mean, it's, you know, I think it's an empirical question still. I think that, because the machines are still under development, we don't actually know whether they could be empathetic, caring. It would sure be nice if they could, because that's way better than the alternative, you know, in which machines are uncaring and dystopian and interact with us along the lines of machine guns and robot dogs.
All right, you know? Which is not a caring relationship. But yeah, well, yeah. I constantly say we're at the very beginning of this. I mostly work in business and higher education, and I have to keep saying, look, we're at the very beginning of this, and culture has to adapt. But, unfortunately, our tiptoe through ethics has made me a little less positive
than when I arrived, and I wasn't all that positive to begin with. So that's, you know, that's where I am so far in the course. I'm hoping to turn it around here, and to see how the culture can
mitigate the negative effects and promote the positive. And then you did bring up power. And here we are, you know, in a very hierarchical system, and it's a small group of people that control the direction of these machines and how they evolve. And that's very worrisome. And, I mean, we can teach people to have the outward behavior and appearance of empathy.
We can teach people to have that. So given that, you know, given machine learning, given that it at this point starts from some kind of human involvement, can we not teach machines to be empathic? Well... yeah, that's what I was gonna ask, right?
Yeah. When we teach people to be empathic, we're teaching the overt behaviors. Does it follow that they eventually feel empathic? That's an argument in psychology, yeah, that says yes, you can do that. Yeah. Right. And that argument has been made; there are some pretty good sources for that. William James, in Principles of Psychology,
suggests that this is what happens. Or, Søren Kierkegaard, from a religious context, talks about taking a leap of faith. And the idea is that if you practice your faith over time, you will come to believe your faith. And I think that we can see that empirically, at least in the sense that, you know, people report that this happens: that they do something, they create a habit or whatever, and then they begin to have the feeling associated with that habit.
So, arguably, that could happen with empathy. Would that happen if we taught a machine that way, or are feelings intrinsic to human embodiment? But then again, could there be a machine embodiment that's just as good? Yeah. That reminds me: do you know anybody on the spectrum personally? No. Although, that said, I wouldn't know if I did.
Yeah. And, you know, I work in a research centre full of scientists; odds are pretty good that I do. So, I have worked with people on the spectrum, and they work very hard to appear empathetic and engaged and caring, and, you know, yeah. But they have to talk themselves into it.
Even people in their sixties; these aren't children, right? The one I speak of, you know, actually is a very good scientist, but even in her sixties she has to constantly remind herself to present in an acceptable way. So, based on my experience, I'm doubtful that a machine would learn anything that we would call innate behavior.
Yeah, and that influences... so then that gets back to trust. I'm someone who has been working with machines his whole life, and I do not trust machines, okay? From taking apart my Volkswagen as a teenager to working in one of the most advanced technology facilities. And as I moved up in the technology world, it only got more difficult for the humans.
So on that one, you're gonna have a real hard time with me. Yeah, fair enough; that's very strong. Yeah. And let me get down to real simple things: if a machine falls on you, you are dead, end of story. Well, it depends on the machine, and depends on... but I know what you mean, if it's an order of magnitude heavier than you.
Yeah, that's it. Yeah. You know, I mean, yeah, that's a thing; there's no negotiating, there's no... yeah, it's over. And I worked with electricity, and, yes, I've met people that have been hurt by the technology. So, even though I spent my whole life in technology, it's very interesting.
You know, and now I'm here; I thought I was doing a good thing, helping make computer chips. Now I realize that it was way more dangerous than anything. So that's part of the skeptical attitude, unfortunately, that I have, and it's not unfounded. Well, it's funny; we could make the leap from what you've just said to saying something like, machines are malicious and evil, and that would... I know.
But that's a leap people wouldn't have nearly as much difficulty making, you know, when you give them artificial intelligence, and you can start making those parallels. Yeah, I was just talking about physics. Yeah, but now, if you're talking about these robot quadrupeds, I'm not contradicting that. Yeah. Right. Can't resist.
Yeah, I know. And that's why they use the term, because, you know... And we don't talk enough about dominating cultures, yeah, propaganda and all of it, because that's what goes through my mind when I just hear anybody... yeah, our education, politics. We went to church as a class, well, virtually, as a class project, something I haven't done in, you know, 50, 60 years, and as people talked, I just, you know...
Yep. That's my deal. Yeah, my little bit of education is in sociology, and I'm like, what is this person trying to do with this group? Yeah, and I just can't get away from that, now that I've got that in my head, you know; so, you know, this constant critique of groups.
So there's a bunch of threads that we can tug at this week, and we will. I've got 12:57 here, so I'm just sort of moving into wrap-up mode, I'm sorry, because this was a pretty good discussion and I wanted it to go on for a long time. But at least we have Friday, again, to revisit some of this.
I'm going to look especially at some of the ethics of care philosophers, people like Carol Gilligan, Nel Noddings, and others, and talk about how that influences our topic. And even if we can't make our machines love us, and even if they only hate us, you know, I think there are some interesting things we'll be able to say. And I'm not trying to convince you that machines will care for us.
Like I said, I still think that that's an open question. But the interesting question here is: how would we know? You know, what are the criteria, if we can even speak in those terms, of caring for us? How would we know if a machine cares for us or not?
And does it matter if they do the right sort of thing? And, even more to the point, what lessons can we draw from this discussion of care with respect to the wider topic? How does it inform us about, you know, our ethics in terms of analytics and AI with respect to learning?
I know there are some things that we can learn from it, but again, we have to look at that to see what it is. So I'll have more videos; my apologies and, at the same time, a promise. But I'm also going to be including more readings in the newsletter, so that you're not dependent on my videos.
I think that's something I should have been doing from the start, which I haven't been doing. And as well, I'll try to improve some of the activities that the course offers. They've been pretty lame so far, although I've been trying. But, you know, I think there are things that I can do; we'll see how that goes.
It all depends on time, of course. If you have suggestions or ideas, by all means, share them. And certainly, you know, I encourage you to blog some thoughts about how care ties into ethics and learning analytics specifically. I think that would be helpful. Certainly this discussion has been very helpful to me already, so that's a good start.
Any last words before I close off the recording? Well, all right. Well, it was good to see you and talk to you both again. Thanks for joining me.