
Trust us, this research into human-robot interaction is going to be useful


The robots are coming. Correction: they’re already here. Think drones, driverless cars and all those Roombas with hitchhiking cats.

And at Chapman University, a new kid on the block – a spindly stick of a fellow tucked away in Smith Hall. That would be CASEY, amusingly described by the student researchers who work with the device as “an iPad on a Segway.” But CASEY is the robot that a group of Chapman communication science students and their teacher, Yuhua (Jake) Liang, Ph.D., assistant professor in the Department of Communication Studies, use to study the future of human-robot communication.


Chapman University researchers using CASEY the robot in their work include, from left, Clairre Abeyratne ’16, Yuhua (Jake) Liang, Ph.D., assistant professor, Kirk Mulligan ’17 and Lauren Henderson ’16.


“We’re interested in how robots can connect with humans more efficiently,” Liang says.

Some of the latest research conducted by Liang and a colleague at Northern Kentucky University, with assistance from a team of Chapman undergraduates, recently received international attention. A paper reporting Liang’s research into trust and human-robot communication was published in the proceedings of the 11th Annual ACM-IEEE International Conference on Human-Robot Interaction, held this spring in New Zealand.

The title is a mouthful: “Employing User-Generated Content to Enhance Human-Robot Interaction in a Human-Robot Trust Game.” The crux of the findings, though, is something everyone can understand. Reputation, just as is often the case with regular old person-to-person communication, can make a world of difference.

That’s where CASEY, an acronym for Computerized Autonomous System Engineered for genY, and the Chapman research assistants came in. Study participants would arrive at the Smith Hall lab and learn about a dice game they would play with the robot, which resembles a small electric scooter with a computer tablet for a head. Some participants then visited a mock website where they read glowing reviews of the robot’s helpfulness. The control group didn’t see such reviews.

CASEY’s happy face would appear on the tablet screen and the gaming would begin. At the conclusion of the experiment, the people who had CASEY’s accolades floating in the backs of their minds reported better mood, a positive perception of the technology and more trust in the robot. They even let it have more turns during the game, which the researchers interpreted as a willingness to take a risk with CASEY.

Meanwhile, control group participants who didn’t see the reviews tended to come away thinking the device was incompetent and untrustworthy.

The potential applications are not about developing ploys to make people toadies to the likes of Rosie, the apron-wearing robot from The Jetsons, the student researchers explain. Rather, they’re interested in developing strategies that people can employ when they create robots that will work with and be helpful to people, such as lifting an elderly person from bed to chair or guiding a student through a tutoring session.

“We’re seeing a lot more robots and automated voice messages being integrated into our lives, so it’s important to figure out how we can get people to communicate with robots better. That’s our overall goal,” says communication sciences major Lauren Henderson ’16, one of the students who helped conduct the research.

Kirk Mulligan ’17 says the implications of this research are already timely. A native of the San Francisco Bay Area, he watched the region squirm when its landmark toll bridges replaced toll takers with robotic systems.

“There’s a huge spectrum of where this could really be going, and I’m just excited to be part of the studies that get it going there,” he says.

Likewise, Clairre Abeyratne ’16 says the research reaches into the future — even if CASEY doesn’t have the benefit of arms.

“I thought there were going to be hands,” she says with a laugh, when she recalls her first encounter with the robot. “But they’re going to be in different shapes and sizes and we’re going to have to be prepared for that.”

For Liang, whose studies are part of the work conducted through the Communication and Influence Technology Lab (CITlab), an interdisciplinary collaboration with Seungcheol Austin Lee at Northern Kentucky University, it’s still early days for this body of research. Ethical questions abound, he says, such as when one should distrust a robot’s choice. But hardware engineers are forging ahead, so communication scientists need to keep up and study this tool of modern life that is part new medium and part technology partner.

Liang is particularly interested in trust and fear issues. One finding out of CITlab is that about 25 percent of the U.S. population fears robots. That could be a conundrum, he says.

“Do we need to trust robots? We do,” he says. “Because the way society is moving, there are going to be a lot of automated processes that are going to be done by robots in the near future.”

Dawn Bonker
