Do Unto Robots As You...

Kim Bellard

It was very clever of The New York Times to feature two diametrically different perspectives on robots on the same day: Do You Take This Robot... and Why Do We Hurt Robots? They help illustrate that, as robots become more like humans in their abilities and even appearance, we're capable of treating them just as well, and as badly, as we do each other.

As fans of HBO's Westworld or Channel 4's Humans know, it's possible, perhaps even likely, that at some point the robots are going to realize this -- and they may be pissed about it.

We're going to have robots in our healthcare system (Global Market Insights forecasts assistive healthcare robots could be a $1.2b market by 2024), in our workplaces, and in our homes. Some will be unobtrusive, some we'll interact with frequently, and some we'll become close to. How to treat them is something we're going to have to figure out.

Do You Take This Robot..., written by Alex Williams, focuses on people actually falling in love with (or at least preferring to be involved with) robots. Sex toys, even sex robots, have been around for a while, but this takes things to a new level. The term for it is "digisexual."

As Professor Neil McArthur, who studies such things, explained to Discover last year:

We use the term 'digisexuals' to describe people who, mostly as a result of these more intense and immersive new technologies, come to prefer sexual experiences that use them, who don't necessarily feel the need to involve a human partner, and who define their sexual identity in terms of their use of these technologies.

Apparently, we're already in digisexuality's second wave, in which people take advantage of those immersive technologies -- VR/AR or AI-enabled robots -- to form deeper relationships. Professor Markie Twist, who co-wrote The Rise of Digisexuality with Professor McArthur in 2017, told Mr. Williams she has several patients in her clinical practice who qualify as digisexuals.

Writer Emily Witt told Mr. Williams, "Digital sexuality allows for possibilities of anonymity, gender-bending, fetish play and other modes of experimentation with a degree of safety and autonomy that's not present in the physical world," and Dr. Twist added:

Research already shows that people can achieve orgasm with inanimate objects, and we already see how people have a longing for their tech devices, and feel separation anxiety when they are not around. I think it's easily possible that people might develop actual love for their technology. They already come up with affectionate names for their cars and boats.

And it's not just about sex. There are a number of companion robots available or in the pipeline, such as:

  • Robear, Japan's nursing robot.
  • Ubtech's Walker. The company describes it as: "Walker is your agile smart companion -- an intelligent, bipedal humanoid robot that aims to one day be an indispensable part of your family."
  • Washington State University's more prosaically named Robot Activity Support System (RAS), aimed at helping people age in place.
  • Toyota's T-HR3, part of Toyota's drive to put a robot in every home, which sounds like Bill Gates's 1980s vision for PCs. One Toyota advisor said: "The idea is for the robot to be a friend."
  • Intuition Robotics' "social robot" ElliQ. The company's testing summed up users' reaction: "It's clearly not just a device, but it's clearly not a person. They said it's a new entity, a new creature, a presence, or a companion...They fully bought into ElliQ's persona."
  • A number of cute robot pets, such as Zoetic's Kiki or Sony's Aibo.

All that sounds very helpful, so why, as Jonah Engel Bromwich describes in Why Do We Hurt Robots?, do people go out of their way to abuse them? One might understand a factory worker taking an opportunity to damage the robot that took his job, but what do the kids in this YouTube video have against the robot?

Cognitive psychologist Agnieszka Wykowska told Mr. Bromwich that we hurt robots in much the same way we hurt each other. She noted: "So you probably very easily engage in this psychological mechanism of social ostracism because it's an out-group member. That's something to discuss: the dehumanization of robots even though they're not humans."

As Mr. Bromwich concluded: "Paradoxically, our tendency to dehumanize robots comes from the instinct to anthropomorphize them." In a previous article, I discussed how easy it was to get people to treat robots like persons, and quoted researcher Nicole Kramer: "We are preprogrammed to react socially. We have not yet learned to distinguish between human social cues and artificial entities who present social cues."

Get ready for it. Sextech expert Bryony Cole told Mr. Williams:

In the future, the term 'digisexual' will not be relevant. Subsequent generations will have never known a distinction between their online and offline lives. They may grow up with sex education chatbots, make love to the universe in their own V.R.-created world, or meet their significant other through a hologram. This will be as normal as the sex education we had in schools using VHS tapes.

And you were worried about Fortnite.

Robots have already gotten married, been granted citizenship, and may be granted civil rights sooner than we expect. If corporations can be "people," we had better expect that robots will be as well.

We seem to think of robots as necessarily obeying Asimov's Three Laws of Robotics, designed to ensure that robots could do no harm to humans. But we often forget that even in Asimov's universe, where the laws applied, humans weren't always "safe" from robots. More importantly, that was a fictional universe.

In our universe, though, self-driving cars can kill people, factory robots can spray people with bear repellent, and robots can learn to defend themselves. So if we think we can treat robots however we like, we may find ourselves on the other end of that same treatment.

Increasingly, our health is going to depend on how well robots (and other AI) treat us. It would be nice (and, not to mention, in our best interests) if we could treat them at least considerately in return.

Do Unto Robots As You... was authored by Kim Bellard and first published in his blog, From a Different Perspective.... It is reprinted by Open Health News with permission from the author. The original post can be found here.