Robotic seals comfort dementia patients but raise ethical concerns

Aug 17, 2015

This story originally aired on December 9, 2014.

At the Livermore Veterans Hospital, there are a few animals residents can see: wild turkeys that run around the grounds, rattlesnakes that hide out in the dry grass, and therapy dogs that make weekly visits. But there’s one animal in particular that Bryce Lee is always happy to see: a baby harp seal.

This seal isn’t alive. It’s a robot called Paro that was invented in Japan, but Lee doesn’t necessarily know that as he pets it while it coos and purrs. He and the other senior patients here have dementia or similar loss of cognitive function, caused by stroke or traumatic brain injury, and Paro the seal was designed to comfort them. It’s a type of tool known by scientists as a “carebot.”

Lee interacts with Paro under the supervision of Cassandra Stevenson, a recreation therapist here at the V.A. hospital. Because of his condition, Lee doesn’t normally speak much, but Stevenson gets him to talk by asking him questions about the seal: what he thinks it eats, and whether it catches the fish by itself.

Cute but complex

Paro is pretty adorable. It has big black eyes that open, close, and follow your movements. It’s about the size of a large cat, and when you pick it up, it’s heavier than you’d expect. It weighs exactly six pounds, so it feels like you are holding a newborn baby. It charges by sucking on an electric pacifier. Inside its fuzzy, white exterior, the seal has sensors that detect touch, sound, light, heat, and movement, and it reacts in different ways. It can recognize its own name.

“We started using it with the residents and a lot of them think it’s real,” says Kathy Craig, another therapist at the V.A. “They’ll bark at it, they'll pet it, they'll sing to it. We find it works better with people with dementia because if the residents are aware that it’s not real, we find that sometimes they don’t engage with it as much.”

Craig thinks it’s a useful tool for residents who are antisocial, agitated, or sad.

“We'll bring out the Paro robot and set it down and they'll start talking to the Paro, they'll talk to other people, it'll brighten their mood. And if they’re maybe at risk of wandering and getting lost, instead of that happening, they might sit down with Paro for a while and spend some time with it.”

Craig says they’re even doing a study on whether seal time can replace anti-anxiety medication. Nursing and therapy staff have noticed Paro also brings out a sense of nurturing and caring in patients. The veterans smile as they stroke Paro’s fur. They ask questions about it, call it baby names, and even flirt with it.

Dog vs. seal

In addition to Paro, live dog therapy is available for the residents at the V.A. A few times a month, volunteers come with their dogs and let the veterans play with them. The veterans’ interactions with the dogs are very similar to their interactions with Paro. In fact, Bailey, the little white dog that visits frequently, is the same size and color as Paro the seal.

“There’s a pretty large body of evidence to show that interacting with animals can help things like lower blood pressure, reduce depression, reduce subjective pain, decrease the time it takes to recover from chronic ailments,” says Dr. Geoffrey Lane, the psychologist who brought Paro to the Livermore hospital three years ago. He says watching a particularly difficult patient interact with live therapy dogs was the reason he brought the robot to the hospital in the first place.

“She was screaming and yelling a heck of a lot, most of the time medications weren’t working, and all the other things that staff were doing weren’t working,” says Lane. “But one thing I did notice is that when the dogs were brought into the room, that’s when she stopped.”

As useful as they are, Lane says live dogs present some problems: they are unpredictable, they can transmit disease, and most importantly, they go home at the end of the day.

“So I thought to myself, ‘Is there some way we could bring animals into her room and just kind of leave them there?’ For practical reasons we can’t do that, so I went to the computer and...found an article on a blog about the Paro.”

Dr. Lane thinks there isn’t much of a difference whether a resident plays with Bailey or Paro. He says humans are wired for connection.

“People are able to connect with this robot. It’s designed to behave in a way and interact with the person so that you want to touch it, you want to pet it, you want to interact with it. They have the same reaction that they do to any other cute animal or cute baby.”

Moral and ethical quandaries

However, not everyone is on the same page as Dr. Lane. Shannon Vallor is a virtue ethicist and philosophy professor at Santa Clara University. She studies the ways our habits influence the development of our moral character, and she thinks there are a few ethical issues to worry about when using carebots.

“People have demonstrated a remarkable ability to transfer their psychological expectations of other people’s thoughts, emotions, and feelings to robots,” Vallor says.

Nurses and therapists at the Livermore V.A. don’t explicitly tell the patients Paro the seal is a robot. They play along with questions about where it lives and what type of fish it eats. Vallor says with dementia patients, the line between reality and imagination can already be blurred, but that “we should worry about it with people who are in the facility for other reasons, who are lonely and who want to feel like somebody cares about them.”

And there’s another problem. It has to do with us, the people who are actually doing the caring.

“My question is what happens to us, what happens to our moral character and our virtues in a world where we increasingly have more and more opportunities to transfer our responsibilities for caring for others, to robots?" Vallor asks. "And where the quality of those robots increasingly encourages us to feel more comfortable with doing this, to feel less guilty about it, to feel in fact maybe like that's the best way that we can care for our loved ones?” 

She says that caring is really hard, even for the most well-meaning human beings.

“At a certain point we just run out of emotional resources, and at that point both the human caregiver and the person they are caring for is at risk. The robots are reliable, the robots are trustworthy, we don’t have to worry that the robots are going to get burned out, stressed out, that they are going to lose their patience, and we have to worry about that with human caregivers.”

So Vallor says she doesn’t deny the potential usefulness of carebots, but thinks we should be wary of our intentions when we design them.

“Not ‘How could we replace you?’ but ‘How could we help you become a better caregiver?’”

That means making robots that might challenge us, ones that make us work to form a relationship, and encourage conversation with others.

Back at the Livermore V.A., Bryce Lee is talking to therapist Cassandra Stevenson about Paro.

“She’s a pretty domesticated seal, right?” says Stevenson.

“Yeah, she is,” Lee responds with a laugh.

Paro could be an example of the middle ground that ethicist Shannon Vallor speaks of. It’s helping therapists like Stevenson do their jobs better. It’s getting patients like Lee out of their rooms and helping them socialize. And by not getting in the way of human-to-human interaction, it could help us develop our caring responsibilities rather than deplete them.