“Last year when I was in Asia, I watched a class being taught by a robot and it was really fascinating because at the end of the session…I had the opportunity to talk with the kids and I asked them about the experience. And you know the story: the teacher-student relationship is critical; the student-student relationship is critical. What did these kids say? ‘Eh, we prefer the robot to the teacher.’”
- John Hattie
I really like this whole premise. It tickles me. I want to be clear right here that I don't for one instant take it seriously. Robots are coming to take my job, and the kids will be happy about it? K. My school doesn't even have Chromebooks for everyone yet and the printer outside my classroom has been repaired so many times it's basically a brand new printer that still constantly jams. Less than half the classrooms at my school have SMARTboards. But the robots are coming.
I have to assume that part of this whole thing is a) not everything from the speech is being reported and b) he's intentionally being hyperbolic because that's how we get heard now. But, for a moment, let's take his points at face value. To review and expand, Hattie says that the robot is not judgmental of the students, it doesn't care if the kid has a disease or a problem, it doesn't get frustrated, and it will help us normal fleshy teachers overcome bias.
First- HA!
Second- Hahaha!
Ok, who programs these robots? Because they aren't programming themselves. The assumption that technology lacks bias simply because it's technology is purposefully blind. He's making the leap that whoever is coding these teaching robots will code them to, what, not see color? Or they'll be coded in such a complex way that they'll see and understand every student's cultural background? How would that even work? You code a robot to respond in a completely neutral way to students, regardless of what inputs the student delivers. How does that make it a better teacher, or even helpful?
Listen, I can't stand the constant "Relationships Matter" nattering that we're subjected to by thought leaders, but just because it's annoying doesn't mean it's wrong. My problem with it is more "Yeah, we know, move forward" than "I disagree". The kids do want a teacher who cares, they want a connection, humanity in general operates better when there's a connection. And I honestly think Hattie knows this, which makes me suspect he's using the "kids told me they prefer the robot" story as a hook to get people to listen to the "robots will reduce bias" part. Because I would not be surprised if he and others believe that is a thing that is possible. Taking "I don't see color" to the nth degree and then wiping your hands with a satisfied, "Welp, we solved it." Because in what universe does data support that? And don't tell me that's not what he's saying, because what else can it mean about reducing human bias? Believing you can remove bias entirely moves this whole idea out of science fiction and into fantasy. You can't remove bias, you can only see it within yourself and respond to it. That goes for the programming team too.
The idea that it's good that a robot could answer the same question multiple times without getting frustrated tells me that Hattie does not live with both a child and an Amazon Alexa. Did you know that if you say, "Alexa, ask for a fart" she will make one of a dozen hilarious fart sounds? She will. And she will never get tired of the question. Never get frustrated with the child laughing and shouting for another fart, a juicier one this time. I'm a grow'd up, and if I were presented with a robot teacher that would answer the same question over and over without getting frustrated, I know for a fact I'd see that as a challenge. Can I kill this thing's battery making it tell me if my essay is good enough yet? Of course the kids liked it more!
Robot teachers are the antithesis of everything I hold dear as a teacher, but I'm still not taking his idea personally because I still can't take it seriously. How far away is this technology? I mean, what he's describing, a fully automated robot teacher. In my lifetime? Are we talking Robby the Robot? Are we talking Robot Daddy? Are we talking my favorite golden-skinned android Data? If so, listen- I want Data as a teacher's aide. Data is cool. I mean, he only took over the ship and endangered everyone on board four or maybe five times. I teach in the United States; we won't ban anything just because it endangers students. Give. Me. Data.
If we do get robot teachers, will they be coded with Asimov's Three Laws? To review, for those of you who didn't grow up nerdy-
First Law- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law- A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.
Third Law- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
The crux of most of Asimov's Robot books hinges on the inherent paradoxes of the Three Laws. I would give my students until lunchtime to completely wreck a Three Laws Certified robot.
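For the non-nerds: the Laws are a priority list, not three separate rules, and the fun lives in the gaps between them. Here's a toy sketch, in Python, of what a Three Laws Certified decision check might look like. To be clear, this is entirely made up by me; the scenario and every name in it are mine, not Hattie's and definitely not Asimov's:

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    injures_human: bool = False      # First Law, active clause
    allows_harm: bool = False        # First Law, inaction clause
    ordered_by_human: bool = False   # Second Law
    destroys_robot: bool = False     # Third Law

def permitted(action: Action) -> bool:
    """Check an action against the Three Laws, highest priority first."""
    # First Law: no injuring humans, no allowing harm through inaction.
    if action.injures_human or action.allows_harm:
        return False
    # Second Law: obey human orders (First Law conflicts were caught above).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation is only the tiebreaker.
    return not action.destroys_robot

# The fourth-grade stress test: an endless loop of "again, a juicier one"
# is a lawful order, and nothing protects the battery from Law Two.
fart_joke = Action("tell another fart joke", ordered_by_human=True)
run_dry = Action("answer 'is my essay good yet' until the battery dies",
                 ordered_by_human=True, destroys_robot=True)

print(permitted(fart_joke))  # True -- the Second Law compels it
print(permitted(run_dry))    # True -- the Third Law loses to the Second, every time
```

Notice the ordering: any order that doesn't hurt a human beats the robot's self-preservation, every single time. That's the loophole a room of fourth graders finds before morning recess.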
I'm also forced to assume that Hattie, a distinguished researcher, studied many, many classrooms that were taught by a robot teacher before reaching this conclusion. The examples in the speech in question couldn't possibly all come from a single experience, cherry-picked to prove a point. That's bad data. (Not bad Data though, this bad Data.) Most of my information is coming from this article on tes.com written by Emma Seith. But I'm not trying to make a point at a giant conference, I'm trying to mock one made at a giant conference.
At the end of that article there's a reference to this one by Tes Reporter from 9/11/17 about Sir Anthony Seldon saying teachers will be glorified aides while robots do the real work within ten years. To which my response is *copy+paste entire blog post*.
Hattie's biggest failing here is the idea that not getting frustrated, not remembering who got in trouble yesterday, and not understanding bias are all positives. That the Human Element is what's holding teachers and students back. The programming needed for this to be actualized will require something to be centered as "normal." And I wonder wonder Ooh, ohOhOhOh, who'll write the book on that.
Honestly, I love technology and I'm really excited about all the possibilities of the future once we get past the backwards bigots holding us back in a past that never existed, and I think having robots would be real neat. But until a Roomba won't drag dog poop all over the house, I'm gonna feel pretty secure in my job.
If you like this post and the other posts on this blog you should know I’ve written three books about teaching- He’s the Weird Teacher, THE Teaching Text (You’re Welcome), and A Classroom Of One. I’ve also written one novel- The Unforgiving Road. You should check them out, I’m even better in long form. I’m also on the tweets @TheWeirdTeacher.