1.2 – AI On The Job: Robot Co-Workers

Jessica Chobot takes command of a robot tasked with sorting and fulfilling online orders in search of the truth – will AI really replace us all? Find out if robots are up for the job or just a task.

Is AI really coming for all of our jobs? Or is fear of a robot takeover based on hype? To find out, host—and fellow human—Jessica Chobot talks with leading AI experts to uncover the source of the hype, reveal just how challenging it is to train a robot, and clearly define the role of machines in the workplace (spoiler: they may work, but they don’t have jobs). Then she heads to Kindred in Toronto, where she’ll pilot the aptly named SORT robot, putting it to the task of sorting online orders and answering the question: AI co-workers—hype or reality?

What you’ll hear in this episode:

  • Meet SORT, the robot that… you guessed it… sorts
  • Will robots really take over our jobs?
  • The uncertain next decade for AI and automation
  • That time Elon Musk made his Tesla factory too automated
  • The difference between a task and a job
  • How does a robot know when it’s being taught?
  • Tennessee by way of Toronto
  • Jessica plays a very expensive game of “claw machine”
  • Who knew grabbing packages via robot would be so easy to … pick up?

Guest List

  • Dave Graham is the director of emerging technologies messaging at Dell Technologies and specializes in AI and social transformation.
  • James Bergstra is the co-founder and head of AI Research at Kindred.ai, a robotics and artificial intelligence company that develops robots to solve real-world problems.
  • Sai Cherla is an AI robotics pilot and product manager at Kindred.ai.
  • Geoffrey Hinton is an engineering fellow at Google, an emeritus professor at the University of Toronto and the chief scientific adviser of the Vector Institute. Hinton is a pioneer in the branch of machine learning referred to as deep learning.
  • Andra Keay is the Managing Director of Silicon Valley Robotics, a non-profit industry group supporting innovation and commercialization of robotics technologies.
  • Dylan Losey is a postdoctoral scholar at Stanford. His research focuses on robotics at the intersection of human-robot interaction, machine learning, and optimal control.

Jessica Chobot:                 I’m Jessica Chobot, and this is AI Hype Versus Reality, an original podcast from Dell Technologies. And I’m at Kindred, an AI company in Toronto, to meet their sorting robot.

Jessica Chobot:                 I don’t even know where to begin describing this thing. It kind of reminds me of those car assembly line arms that you see.

Jessica Chobot:                 So the robot is called Sort, and it’s used in warehouses to sort online purchases. Basically, a bunch of orders, like tee shirts and stuff, get dumped into a big bin and it’s this robot’s job to sort which product is going to which person. And it uses AI to figure out how it’s going to pick something up.

Jessica Chobot:                 At the front, where the head is, there’s a little suction cup, and then what look like mandibles from an insect that also grab things.

Jessica Chobot:                 I’m also standing next to James Bergstra. James is Kindred’s co-founder and head of their AI research.

James:                                  I am very proud to see that you talk about our robot moving in a human-like way, because that was a real inspiration for us when we were designing this thing.

Jessica Chobot:                 Yeah. It moves fast, too. Right there when it did that quick arc, I’m like, “Oh, that thing’s going to miss and slap me right in the face.”

Jessica Chobot:                 Sometimes Sort can’t pick something up and it needs a human’s help, a human who can teach the AI how to grab the package. And to do that, I’m going to have to learn to pilot this beast. But before we get to any of that, here’s the hype around AI and robots.

Jessica Chobot:                 Intelligent robots will clean our homes and cook our food. All labor will be automated, rendering all jobs obsolete. Robots will become self-aware and realize that they don’t need humans anymore. All of this is going to happen any day now.

Jessica Chobot:                 I am joined by our resident AI expert, Dave Graham, the Director of Emerging Technologies Messaging at Dell Technologies. So quick reality check: what do you think of all the hype surrounding AI robots? Do we have to worry about robots taking over?

Dave:                                    I don’t think we really have to worry about robots taking over. They’re very functional, and functionally appropriate for what they do. A lot of what we’re looking at is augmentation, not replacement, the helping-hand aspect of human-machine partnerships, and that’s what we want it to be. We want it to be symbiotic in nature. You know, we’re at that stage where we just can’t conceive of that next place. It just kind of boggles the mind what we could possibly do.

Dave:                                    I heard from Geoffrey Hinton on this exact subject. He’s the godfather of deep learning and a recent winner of the A.M. Turing Award, which is kind of like the Nobel Prize of computing. He thinks it’s very hard to predict what’s going to happen in the next decade.

Geoffrey:                            If you’d asked me 10 years ago whether we’d have neural nets today doing machine translation using no linguistic knowledge, just learning it all from scratch, I’d have said, “No, that’s crazy. That’s too much to hope for.” But we’ve got it. So I don’t think you can predict 10 years into the future with this stuff.

Geoffrey:                            The other thing is, I think it’s very sensible for people to be thinking about it now. I’m not saying people shouldn’t think about it. It’s just a question of what are the chances, what are the chances that in 10 years’ time, computers will have taken over everything? I think the chances of that are very, very slight. But if you ask me what the chances are that in 50 years’ time there’ll be some power struggle? I don’t know.

Jessica Chobot:                 Well, that sounds ominous. I mean, 50 years is still not that far off. On a more grassroots level, should we be concerned about the effect that robots are actually going to have on jobs? I know a lot of people are very nervous about that, and maybe not just in factories, but even in something like what you and I do.

Dave:                                    I think there’s always an opportunity in society to enable the workforce to develop new skills, right? So alongside robots being, again, augmentation, we talk about robotic limbs and robotic helpers that let warehouse workers lift heavier objects, right? You get more efficacy out of your workforce that way. So that’s a true augmentation, not a replacement.

Dave:                                    I think some of the skills that people are worried about have already been a foray for robotics, right? Look at even the overcorrection that Mr. Musk talked about at Tesla, where he thought he could build the entire line in an automated fashion. He actually had to step back from it and say, “You know what? I over-rotated on this. Perhaps it wasn’t truly something that should’ve been 100% automated.”

Dave:                                    You have to take a step back and realize that human beings in the workforce inherently can draw logical conclusions. We can do things that machines can’t necessarily synthesize right now.

Dave:                                    I also reached out to someone else to get an insider perspective. Andra Keay is the Managing Director of Silicon Valley Robotics, a nonprofit that supports innovation and commercialization of robotic technologies.

Andra:                                  We use robot as an avatar for all of our concerns about the digital world. So we often get scared because we think, “Wow, if a robot can do that one thing well, what else can it do?” The reality is that they are not capable of multitasking or multipurpose behaviors. Humans are capable of doing a job. A robot is capable of doing a task.

Andra:                                  And I’ve just had a lot of conversations with various industries who’ve deployed robotics, and they say by and large, even when a robot does the work of four people, those four people remain employed because the company is more productive, but they shift to higher value tasks.

Andra:                                  Where we see the matchup happening is where there are constant labor shortages. That’s a pain point where people are willing to change their infrastructure to allow for the robotization of tasks. We see things like people re-architecting their orchards so that they’re more amenable to machine operations, where trees are trained to grow in a more two-dimensional shape than a three-dimensional one. That’s allowing for various combinations of machine-human augmentation at work in the orchards.

Jessica Chobot:                 So Dave, it sounds like what she’s saying is that rather than the AI or the robot being an actual worker per se, it’s more of a tool that the worker can use in order to free up time for them to think about things in the bigger picture.

Dave:                                    Yeah, absolutely. Like a smart hammer that’s banging away: you feed that pipeline, and it just does what it knows to do, right?

Jessica Chobot:                 Well, actually, Dave, I’m going to get hands-on myself, trying to pilot one of these worker-bee robots, so wish me luck that I don’t destroy somebody’s incoming package that they paid a lot of money for.

Dave:                                    Well, best of luck to you, Jessica.

Jessica Chobot:                 But before I get to that, I’m going to talk with a researcher who’s convinced that what’s missing when it comes to training AI robots is a human touch.

Dylan:                                   So I’m Dylan Losey. I have a PhD in mechanical engineering, and I study human-robot interaction at Stanford University within the Stanford Artificial Intelligence Laboratory.

Jessica Chobot:                 So Dylan and his team have this demo where a robotic arm is holding a cup of water and keeps trying to place the cup on the table, but it keeps spilling the water.

Dylan:                                   The robot knows it’s carrying this cup, but it doesn’t understand that it needs to keep the cup upright when it’s moving between the start and the goal. So the human can come along, physically grab the robot arm, and twist it so that the cup is upright.

Jessica Chobot:                 But most robots just ignore the human and keep spilling the water.

Dylan:                                   The work that we’re interested in is for the robot to recognize that there’s a reason why the human has corrected it and is now pushing it to carry the cup upright, and then, for the rest of the current task, the robot will keep that cup upright. It will change its underlying behavior to match what the human has shown it, and so now it’s treating this interaction as a meaningful correction.
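
A quick technical aside: what Dylan describes is often modeled as online learning from physical corrections, where the push is treated as evidence about a term missing from the robot’s objective. Below is a minimal sketch of that idea, assuming a linear reward over hand-picked features; the feature names and update rule are illustrative, not the lab’s actual code.

```python
import numpy as np

# Trajectory features the robot can measure. "cup_tilt" is the one the
# robot initially ignores, which is why it spills the water.
def features(traj):
    return np.array([traj["path_length"], traj["cup_tilt"]])

# The robot plans by maximizing a weighted sum of features. A zero
# weight on cup_tilt means tilting the cup costs nothing.
weights = np.array([-1.0, 0.0])

def update_from_correction(weights, planned, corrected, lr=0.5):
    """Treat a physical push as a meaningful correction.

    The human grabbed the arm and produced `corrected`; shifting the
    weights along the feature difference makes trajectories that look
    like the correction score higher for the rest of the task.
    """
    return weights + lr * (features(corrected) - features(planned))

planned   = {"path_length": 1.0, "cup_tilt": 0.8}  # short path, spills
corrected = {"path_length": 1.2, "cup_tilt": 0.0}  # human keeps cup level

weights = update_from_correction(weights, planned, corrected)
print(weights)  # cup_tilt is now penalized, so future plans stay upright
```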

Jessica Chobot:                 Brilliant. That sounds great. So, what’s the problem?

Dylan:                                   I think there are two main challenges to getting this onto a factory floor. The first is that the way robots are set up right now, they’re put behind fences or in boxed-off areas on the factory floor, and you never want a human to be close to the robot, just because you want to ensure that at no point is the robot going to swing and hit that human by accident.

Dylan:                                   And to solve that challenge, we need to make robots that are physically safer, that are softer and more compliant, and we also need programming guarantees to ensure that at no point will the human and robot have a dangerous interaction.
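
Another aside: the programming guarantees Dylan mentions are often realized as a runtime safety monitor that caps the arm’s speed as a person approaches. A toy sketch, with made-up distance thresholds rather than values from any real safety standard:

```python
def safe_speed_limit(distance_to_human_m: float, max_speed: float = 1.5) -> float:
    """Cap the arm's allowed speed based on proximity to the nearest person.

    Thresholds are illustrative: full stop inside 0.3 m, then a linear
    ramp up to full speed at 1.0 m.
    """
    if distance_to_human_m < 0.3:
        return 0.0  # full stop inside the danger zone
    if distance_to_human_m < 1.0:
        return max_speed * (distance_to_human_m - 0.3) / 0.7
    return max_speed  # no one nearby: run at full speed
```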

Dylan:                                   I think the second challenge is just convincing these different manufacturers of the utility of having robots that can change their programs and learn from interactions. Right now, I think the state of the art is just pick and place: I know exactly what I’m doing beforehand, I never change my behavior, it’s fixed.

Dylan:                                   But if you’re doing smaller-volume manufacturing, or you’re manufacturing a variety of parts, I think that’s when this technology becomes more useful.

Jessica Chobot:                 All right. I like what Dylan is saying, especially the safety bit, because I don’t want to lose a finger, or anyone else’s finger, when I pilot the sorting robot at Kindred, which, if you remember, is a big, powerful beast of a thing that kind of looks like a giant metallic caterpillar.

Jessica Chobot:                 But first, Kindred’s James Bergstra is back to explain how Sort does its job. Well, actually, as we’ve been saying all show about AI robots, how it does its task.

James:                                  Imagine you’re in a huge facility that’s fulfilling eCommerce orders. Lots of people have been picking. 100 people’s orders all came in sort of within an hour, and then all those orders are all jumbled together, and they’re falling into the front of this machine into this-

Jessica Chobot:                 Oh, it just missed!

James:                                  … pickup zone.

Jessica Chobot:                 Oh, it just totally missed! In its defense-

James:                                  It will try again.

Jessica Chobot:                 It sucked it, but I think it’s the suction … The plastic bag threw it for a loop.

James:                                  So what’s going on is that there are cameras all over the place. They’re taking pictures of the items in the input bin, and they’re making a plan for how the arm is going to get in and grab something. The suction gets first contact with the item; the fingers come in after, so it’s got a good grip on it. It picks it up, and then you see it there with all those laser scanners. They’re looking for barcodes.

James:                                  So like I said, all these orders are mixed together. Then we need to look up which item is which to know, “Oh, this is Alice’s order. This is Bob’s order. Okay, that’s going to have to go in that cubby up there,” and that’s where the arm just put it.

James:                                  What makes it special as a new kind of product is the AI that drives it. Whenever the robot’s looking at the input area, the bin where the stuff falls, it’s a difficult computer vision challenge to figure out what’s in the bin and how the robot should interact with it: where it should grab so that it doesn’t damage the item and so that it picks up just one item.

James:                                  It’s like stuff that really feels like common sense for a person, but as with so many things about AI and computer vision and mobility, things that seem effortless for people end up being really a challenge for AI. So it’s the special sauce that makes the AI and Sort special is that it’s really good at knowing where to pick things so that you’ve got a nice grasp and it can be quick about it.

Jessica Chobot:                 Okay. I’m dropping in here to point out that this whole time, while James and I are standing here talking about the robot, it is completely autonomously sorting out a huge pile of tee shirts. This is exactly the kind of work that we were talking about earlier.

Jessica Chobot:                 So I want to ask a couple questions. I just saw it pick up a shirt, try to scan the shirt. I don’t know what happened. It decided to then change its mind. It put the shirt back and grabbed a different shirt to scan and then moved that one into a cubby hole. So what was it doing there? Was it not finding the code and it’s like, “Okay, I’ll come back to this later?”

James:                                  It recognized that it didn’t get a scan, so it put the shirt back and tried again. It feels intuitive: you just pick stuff up, scan it, and put it where it needs to go. But there are a lot of edge cases. What if you grabbed on top of the barcode? What if the lighting is wrong, or whatever else is going wrong on a given day? How do you deal with all those edge cases? That’s where our product is uniquely capable.

Jessica Chobot:                 Is it learning that if it can’t get a scan, then by dropping the item it has a better chance, the second time it grabs it, of getting a flat surface or of the barcode showing up?

James:                                  Yeah. It does do that. That is a thing that it does. Sometimes items that maybe weren’t supposed to land in our product, for whatever reason, find their way in there. There are two kinds of people, two roles I should say, that interact with our robot. There are people on the floor who deal with it physically, in person, and then there are also remote tele-operating pilots who can see through the sensors of the robot over the internet.

Jessica Chobot:                 Ah, right. So that’s the person who steps in and helps the robot, teaches it how to pick something up if there’s a problem. That’s what I want to try to do, if I can, without, you know, breaking the robot or ripping someone’s expensive order.

James:                                  To see a bit more about the piloting, I’m going to introduce you to Sai here, who is an expert.

Jessica Chobot:                 Hi. Jessica.

Sai:                                         Nice to meet you.

Jessica Chobot:                 Nice to meet you.

James:                                  And I will leave the two of you and see you in a bit.

Jessica Chobot:                 All right. Awesome. Thanks.

Sai:                                         Thank you, James.

Jessica Chobot:                 Can I sit down next to you here?

Sai:                                         Absolutely.

Jessica Chobot:                 See what you’re doing.

Sai:                                         What’s going on here is that these are actually running logs of what’s happening with each of these robots.

Jessica Chobot:                 Okay.

Sai:                                         So it’s telling me what’s happening. If I see a warning or an error, something that’s a little bit not the best thing to see, I can always jump to that robot, take a look, figure out what’s going on with it, and make sure it’s still sorting.

Jessica Chobot:                 Okay. And so then what’s going on in the middle monitor?

Sai:                                         So right now, this is the control interface that we use to keep the robots running. So right now, you’ve got on this side, this is what our algorithm uses to kind of make its picks, and then on this side, this is a live camera that I would be using to understand how the arm is moving, if it’s going into anything that might damage it or that we might even need to just park the arm in case so that we can have someone jump in and grab something. This right here is exactly what the robot is seeing, and so these little dots are previous candidates it was going to want to try for.

Jessica Chobot:                 Just jumping in again to explain that the dots are white circles with crosses on them, like targets, and there’s one on every shirt. Not in real life, just in the robot’s vision. Those targets are where it’s going to try to pick up the item.

Jessica Chobot:                 Got it. And so then on the right is the live action shot-

Sai:                                         Exactly.

Jessica Chobot:                 … of that in actual motion.

Sai:                                         Exactly.

Jessica Chobot:                 Got it. And so then going back to the middle monitor, I see that you have some, I guess for lack of a better phrase, buttons at the bottom where it’s like yield, override, abort, pinch, vacuum, which is what the robot is supposed to be able to do on its own, right? So is that to kind of kickstart it? What are those buttons for?

Sai:                                         So these ones would only be used in a case where I get pulled into a robot that needs my help to do something. So if it’s run into an item that it’s not familiar with because it maybe hasn’t interacted with many of that type, I would then be able to say, for example, override the algorithm and then choose whether or not I want it to be a vacuum-only grasp or a pinch-only grasp, and even if I want to send the item directly to the error bin just to have someone else deal with it.

Sai:                                         This is more like a … I would actually call this a security guard position, to be honest, because you’re responsible for making sure that a large group of robots, you can monitor them through either this control interface but really, you can do the majority of the work from this dashboard here. You don’t even need to necessarily be hands on unless there’s a situation that requires it.

Jessica Chobot:                 Okay. So you would have somebody that would be hands on, go in, grab the stuff, fix it-

Sai:                                         Like unloading cubbies, exactly.

Jessica Chobot:                 And you would be there monitoring the entire situation.

Sai:                                         Yeah.

Jessica Chobot:                 But would you need to be onsite, as well?

Sai:                                         Very, very rarely. I don’t think there would ever be a situation where we would have to send someone to the site.

Jessica Chobot:                 So wait, this isn’t onsite? This robot is not in this building?

Sai:                                         Nope. This is in Gallatin, Tennessee.

Jessica Chobot:                 Well, is there anything that I should know before I hop into the driver’s seat, or the pilot’s seat?

Sai:                                         The pilot’s seat.

Jessica Chobot:                 The pilot’s seat.

Sai:                                         The only thing to really note is-

Jessica Chobot:                 Don’t break it.

Sai:                                         No. The only thing to avoid is barcodes. So if you’re suggesting something for the robot to do, try not to plant your crosshair exactly where that sticker is. It’s best to avoid that just so that it gets a nice clean scan.

Jessica Chobot:                 Okay. Got it. So actually while you’re picking the items or while you’re suggesting the items that it should go for next, you’re also suggesting where on the bag it should grab?

Sai:                                         Exactly. Exactly.

Jessica Chobot:                 Oh, okay. Cool.

Jessica Chobot:                 So if you think about it, now that the boring, repetitive job of sorting stuff is being done by the AI, the humans get to play this cool claw machine game of helping the robot pick stuff up when it needs it. Okay, it’s more complicated than that, of course, because you’re looking after the inner workings of a whole fleet of robots, but still, claw machine, which I excel at. I mean, seriously, I win so much stuff.

Sai:                                         So I’m going to actually override the algorithm so that we can have you suggest what the next pick is going to be. Okay. So I’m going to press the override button, and whenever you’re ready, go ahead and right-click on whichever item you want to get rid of.

Jessica Chobot:                 Okay. We’re going to try to not do the barcode, right?

Sai:                                         And away it goes.

Jessica Chobot:                 It did it. Oh, cool. Well, this is really fun. Oh no, but it fell out of the bag.

Sai:                                         So that’s exactly what we mean about the packaging, right? The packaging can sometimes be a little bit annoying to deal with, and so let me show you something [crosstalk 00:18:13].

Jessica Chobot:                 So yeah. Can I error it?

Sai:                                         Yeah, so we’re going to override. We’re going to target the pinch grasp, and then we’re going to put it into error mode. So now, go ahead and click on that white item.

Jessica Chobot:                 Anywhere?

Sai:                                         Anywhere you want.

Jessica Chobot:                 I’ll do it right in the center.

Sai:                                         Cool. And it’s going to take that away and put it to the error bin.

Jessica Chobot:                 So this really is a giant claw machine.

Sai:                                         Yeah, it is.

Jessica Chobot:                 Yeah. Okay.

Sai:                                         It’s a giant game of claw.

Jessica Chobot:                 I was like, “Gosh, you know, I’m really good at claw machines.” So I want to try another one where the bag doesn’t implode on me.

Sai:                                         Let’s go for that black one.

Jessica Chobot:                 Oh, go to the black one? We’ll go up here in the corner. Where’s my … Oh, I don’t have my finger on this.

Sai:                                         Go ahead, and perfect.

Jessica Chobot:                 Get it. Ooh.

Sai:                                         That’s a nice one.

Jessica Chobot:                 That was a good one. Wow, this is really satisfying. Are those flip flops? I wonder if they’re flip flops.

Sai:                                         Those are flip flops. Those are flip flops.

Jessica Chobot:                 Oh, it already grabbed it.

Sai:                                         It wants them.

Jessica Chobot:                 It knew it. Dang it.

Sai:                                         It beat you to it.

Jessica Chobot:                 All right. I’m going to go for the … It’s faster than me. I also keep hitting the wrong button. There we go.

Jessica Chobot:                 And it’s not just that Sort is faster than me. It’s sorting so quickly that with my little bit of help, it figured out right away how to pick up that item and immediately went off and picked up a bunch of other stuff while I was still looking around trying to figure out what to do next.

Jessica Chobot:                 Thanks so much.

Sai:                                         Absolutely.

Jessica Chobot:                 That was super fun, oddly satisfying. I legit really enjoyed this.

Sai:                                         Good. I’m glad. I’m glad.

Jessica Chobot:                 Can I add robot pilot to my license now? I was actually surprised at how easy it was for me to pick up. That being said, is the hype justified? Will AI robots take over some, if not most or all of our jobs in the future? I got to say, based on what I’ve seen, while I do think robots will be playing more and more of a role in our jobs, we’re going to always need the human element. Hopefully this will just free us up from some of the more boring tasks.

Jessica Chobot:                 That’s AI Hype Versus Reality from Dell Technologies, and if you want to see me piloting that robot, check out DellTechnologies.com/HypeVReality. Next time on the podcast, I perform in a sitcom, yes, an actual sitcom, in front of a live audience, co-written by a comedy bot. Oh my. Wish me luck.