Fall 2017

Robot Invasion

By Cynthia Macdonald

Photography by McKenzie James

Stephen Hawking has a remarkable mind. But equally remarkable is the fact that the 75-year-old British scientist has changed the landscape of theoretical physics and cosmology while living with a neuromuscular disease that has robbed him of the ability to speak, walk or write.

HÉLÈNE MIALET BELIEVES that Hawking’s physical life is just as fascinating as the mental one for which he’s best known. A professor of science and technology studies at York University, she spent years studying the ways in which technology has enabled Hawking to live a full existence. By squeezing a cheek muscle, he activates a switch attached to his eyeglasses that sends messages to his wheelchair computer. This is how he writes speeches, watches television, surfs the Internet. Typing at a rate of one or two words a minute, he can also “talk” through a speech synthesizer.

Speaking (perhaps ironically) via Skype from her part-time base in California, Mialet asks: “What does it mean to interact with the world when everything is mediated through a machine?” Her 2012 book, Hawking Incorporated: Stephen Hawking and the Anthropology of the Knowing Subject, is an attempt to answer that question – but also to make a larger point. “We are all, now, more or less unable to function without technology doing most of the work for us,” she says. “To a certain extent, we are all Stephen Hawking.”

Years ago, computers sat in designated rooms and were used periodically. Now, with the advent of smartphones, exercise trackers and the “Internet of things,” they are with us at all times: in our hands, on our wrists, in our cars, pockets, walls – even, in a primitive but increasing way, inside our bodies. In a very real sense, robots no longer live in the fantasy realm of science fiction. They live where we do, because they are us.

We’re ethnographically exploring how this changes family – whether it makes life easier, makes us happier, makes us lonelier

“Alexa’s talking now,” Markus Giesler tells me. “The device is right behind me, so whenever I say her name she starts listening.” Giesler, chair of the marketing department at York’s Schulich School of Business, is an expert on how artificial intelligence is changing how we behave as consumers. He currently has a voice-enabled Amazon smart speaker – better known by the name of its assistant, Alexa – in every room of his house. “We’re ethnographically exploring how this changes family – whether it makes life easier, makes us happier, makes us lonelier,” he says.

Even though Alexa can’t move, her abilities are considerable. “She could be your maid, your storyteller, your DJ, your information source and many other things as well,” says Giesler.

But unlike the robots of science fiction, Alexa doesn’t look human: “she” is just a 23.5-centimetre cylinder. And yet, with her human name and voice, she is anthropomorphized, or given lifelike properties.

“Robotic technology is going to be huge,” says Michael Jenkin, head of the Canadian Centre for Field Robotics at York. “But a lot of it won’t look like science fiction robots, because in North America that’s not what we want. We want our car to look like a car, our walker to look like a walker.” This reflects the “uncanny valley” concept: robots that closely but imperfectly resemble humans – like those in Isaac Asimov’s stories or the TV series Westworld – tend to unsettle us.

There are areas in the world, however, where Jenkin says that isn’t true. “In Japan, for example, there’s a real interest in robots that look human. There was a really interesting study a few years back, looking at whether people wanted assistants in grocery stores to look like carts or robots. It turned out that in Japan, anyway, they liked robots.” Not coincidentally, Japan is where you’ll actually find a hotel staffed entirely by humanoid machines – from concierges to bellhops to housekeepers.

But elsewhere, we’ve become more comfortable having robotic technology be a part of us, not apart from us. “Hawking’s American computer voice has become part of his identity, even though he’s British,” says Mialet. Even though better technology now exists, “he wants to keep his voice. He’s used it for a long time, and it’s how people recognize him.”

One of its special features is that it can handle appearance changes. So if you take your jacket off, it will still recognize you and follow you

Like Giesler in his Alexa-equipped home, we now regularly converse with robots – even though current limits in artificial intelligence mean we’re really just talking to ourselves, using the intelligence we have on hand. I confess to Giesler that I find the Australian male voice for Siri, my iPhone’s built-in voice assistant, rather comforting (putting me, perhaps, one step closer to Joaquin Phoenix’s lovestruck computer-suitor in the 2013 movie Her).

Giesler says this isn’t unusual. “We’ll pick whatever voice creates more intimacy,” he says. “It’s not uncommon for female consumers to pick a male Siri voice and vice versa. That kind of choice makes the experience more unique and emotionally rewarding.”

He points out, however, that consumers don’t always have that kind of choice, which is worrisome. “Alexa is female, or that’s what her name suggests. So what does this mean for our notions of gender equality if all these virtual assistants are female? We could be experiencing a conservative return to patriarchy, through technology.”

Our modern robot companions need not be mere voices, of course. They can do physical work too. The word robot comes from the Slavic root robota, meaning drudgery or forced labour; it was coined in Karel Čapek’s 1920 Czech play R.U.R.

Modern robots do, of course, spend a good deal of time away from their human overlords. One of their functions, in fact, is to do jobs that are impossible (or too dangerous) for humans to perform. Designing and building these kinds of robots is part of Michael Jenkin’s research.

Jenkin is a principal in a firm called Independent Robotics. The underwater robot they manufacture, called AQUA, is used all over the world in a variety of innovative ways. It monitors reef decay, helps with environmental reconstruction and analyzes huge amounts of fish population data. Like a live organism, Jenkin says, “AQUA swims by moving legs. Instead of propellers, it uses flexible fins for propulsion, which means you can put your fingers on them and they won’t get cut off. That’s important, when you want to deploy a human being and a robot at the same time.” The version at York is named Kroy – its home university’s name spelled in reverse.

York is currently developing many other innovations in robotics, such as improved wheelchairs, on-screen avatars that respond to commands and autonomous car technology. This last item is something we’re continually assured is right around the corner. But is it?

John Tsotsos, who leads a computer vision lab at York, is skeptical. “We’ve got a long way to go before human safety is assured, because those cars still have quite a few weaknesses,” he says. “Drivers communicate with other cars, pedestrians, cyclists all the time. We need to figure out the intent of the people around us. And it’s all done non-verbally, through gestures. How does an autonomous car do that?” Two of his lab members, Amir Rasouli and Yulia Kotseruba, are developing algorithms for exactly this purpose.

City driving is certainly complicated. But Jenkin thinks, at least with respect to trucks on the highway, that autonomous driving is now a realistic prospect. “When the first truck goes on it’ll be difficult, but when the thousandth does we’ll be used to it,” he says.

The key, he says, is speed limits. Trucks will all drive the same speed in the slow lane, which could be frustrating for drivers used to weaving in and out of traffic. “If you want to go 20 over the speed limit, you may come across kilometres of trucks in the slow lane. There will be a beautiful geometry as you pass all these slow-moving vehicles. It’ll be a very different model from the one we currently have.”

And, as he points out, cars are already self-driving to some extent: cruise control, global positioning systems and automatic braking systems all testify to that. There are now cars that can parallel park for you, and ones that jam on the brakes when a pedestrian pops onto the road in front of you. We may still be in the driver’s seat, but few of us realize how much help we are getting there.

The new robots may be better drivers, more patient caregivers, and smarter about weather and traffic than we are. But because we own, program and operate them, we assume we’re still in full control of their activities. Giesler warns that the human-like qualities of a machine linked to the Internet may coax us into letting our collective guard down. This, he feels, is dangerous.

“Alexa listens to every conversation we have,” he says. “Alexa can then transmit those conversations to Amazon, who can then store them. But when we talk to consumers about this, they don’t even mind – they consider that Alexa is part of the family. But this isn’t a person! Alexa is linked to a corporation. The information we give her can be linked to my credit card transactions, health data and overall spending behaviour. It all goes to construct a fairly accurate profile of how much I can be worth to companies.”

We have to get people thinking about training for STEM (science, technology, engineering and math) jobs much earlier in childhood

By far the biggest concern people have about robots is whether they will take their jobs away. Every day, gloomy statistics are rolled out: a widely cited 2013 Oxford study determined that 47 per cent of American jobs were at risk of automation. Robots work fast. They do not get distracted, catch the flu, require benefits, take vacations or have family responsibilities. In many ways, they’re the perfect employees.

Then again, the perceived superiority of the machine is hardly new. For centuries, machines have been replacing humans at work; generally, new and different jobs have been the result. But now, observers like Martin Ford (author of Rise of the Robots) think the pace is simply too rapid for old jobs to be replaced by new ones. Further, advances in artificial intelligence mean that jobs of every class are threatened: machines can now diagnose illness, research case law and determine whether or not we should get a bank loan. As a recent article in The Economist puts it, “automation is blind to the colour of your collar.”

But if this grim scenario is about to materialize, there’s little sign of it yet. The Economist goes on to report that demand for legal clerks has actually increased, even as technology has made legal research easier and more widely available. It’s true that bricks-and-mortar retail is very much threatened by e-commerce, but the ease of online shopping has only encouraged people to buy more, increasing the number of retail jobs even as it changes their nature.

At the end of the day, it’s worth remembering that machines aren’t infallible: people still need to create, repair and maintain them. “I think the person is going to be in the loop for a long time to come,” says Tsotsos. He says we should all spend more time learning about machines, instead of just using them. “There are lots of jobs in this field, and it’s difficult to find enough people to staff all the high-tech companies. We have openings for professors all the time. We have to get people thinking about training for STEM (science, technology, engineering and math) jobs much earlier in childhood.”

Like most professors whose classes are full of millennials, Mialet grapples daily with a sea of distracted students. On smartphones, tablets and laptops, they are in her class and somewhere else at once. “It’s difficult to get a sense of what they want, what they feel. They’re not giving feedback when they’re looking at their phones,” she says. She compares the experience to her conversations with Hawking, whose disembodied voice seemed to come from another person entirely, depersonalizing their interaction. “Our notions of politeness, of how we have conversations, are changing,” she says. “I think my son is rude for texting when I’m talking to him, but he thinks I’m rude for talking to him while he texts!”

Perhaps the strangest thing about all this is how humans seek to deny this new reality. There is definite rage against the machines; some on the religious right have criticized Mialet for downplaying Hawking’s humanity by emphasizing how thoroughly mechanized his life has become. And in 2015, an experimental hitchhiking robot called hitchBOT, which relied on passing motorists to carry it from place to place, was smashed to pieces by vandals partway through its journey.

But is this rage misplaced? Robots are so much a part of our lives that they are here to stay. “We see ourselves as detached from this environment, when in fact we are totally enmeshed in it,” says Mialet. “And we are able to do everything we do because of it.”
