Story by Jonny Hart, KLN ’18
Illustration by Robert Frawley

It’s a busy Sunday morning at the grocery store, and dozens of shoppers are weaving through the aisles of stocked shelves. Near the dairy aisle, an autonomous floor-scanning robot pauses for a bit too long. It has wedged itself between a refrigerator door and a shopping cart, creating a bottleneck that forces annoyed customers to reroute around it.

The scene catches the attention of Philip Dames, associate professor of mechanical engineering. Moments like this fuel the work he leads as director of Temple’s Robotics and Artificial Intelligence Lab (TRAIL). He’s used to robots getting in the way.

“Robots are pretty socially incompetent a lot of the time,” Dames says. “One thing we’re studying is how robots can act more naturally and use social context to integrate into the flow of the crowd.”

Photos: Philip Dames holding robot parts and a small drone, and working on a robot in the lab.

Dames spends most of his time working with robots, as is evident from a glance around his lab. TRAIL’s space on the first floor of the College of Engineering building contains a computer cluster, where Dames and his student researchers program robots and run simulations testing different robot behaviors. There’s a work bench where they tinker with robotic arms, wheels and other components.

And then there are the robots themselves—dozens of small drones, circular robots that resemble big and small Roombas, and four-wheeled robots that the TRAIL team drives through the building’s hallways to test how they respond to people in motion.

Dames is helping this fleet of robots read the room, literally and figuratively. He is outfitting robots with cameras, microphones and sensors to help them gather information about their environment. And he’s programming robots to integrate contextual information into their behavior to become more seamless companions and complete more complex tasks.

“We largely work on projects related to multirobot coordination and autonomous operation in the real world,” he says. “We’re trying to build robotic systems that can help people in different workplaces.”

Assembling a robot dream team

Dames has spent nearly 10 years researching how a team of robots can best work together to accomplish tasks. For this work, he’s borrowing a concept from the sports world.

“In sports, moneyball involves using data to quantify how much different players contribute to team success,” Dames says. “You want to find players that are good value for the money.”

The concept has spread to industries beyond sports, challenging decision-makers to scrutinize the intuition-based assumptions they’ve historically operated on. Dames is applying it to multirobot, multitarget tracking problems, analyzing how the composition of a robot team affects its success at tracking targets.

The research could have applications in infrastructure inspection or search and rescue missions.

“In infrastructure inspection, the target you’re tracking might be an area of damage that needs to be addressed. Or in a disaster response context, the target you’re tracking might be a missing person,” he says.

“Basically, you’re using a team of robots to search for targets in a defined space. We want to understand what size and composition of a robot team can best accomplish that task.”

Dames’ team is running thousands of computer simulations to test what team size is optimal for searching a given space. Later, he wants to test how different robot capabilities contribute to the team’s success.

“We’re trying to build robotic systems that can help people in different workplaces.”

—Philip Dames
Associate professor of mechanical engineering

A firefighter uses a hose to put out a fire. A pixelated illustration of a robot helps hold the hose.

“Then we can ask, is it better to have robots that see further or drive faster?” he says. “We want to understand which robot features are the most helpful for a given task.”

He’s found that the ratio between the number of robots and the number of targets they’re tracking is the most important factor in the team’s success. But simply adding more robots isn’t always the best approach for every task.
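As a rough illustration of that ratio finding, a stripped-down Monte Carlo search study might look like the sketch below. This is not the lab’s actual code: the arena size, sensing radius, step budget and random-walk motion are all invented for the example. Robots wander a square arena, a stationary target counts as found once any robot comes within sensing range, and sweeping the team size while holding the number of targets fixed traces out how the robot-to-target ratio drives success.

```python
import random

def simulate_search(n_robots, n_targets, size=20.0, sensing_radius=2.0,
                    steps=200, seed=0):
    """Robots random-walk a size x size arena; a stationary target counts
    as found once any robot comes within sensing_radius of it.
    Returns the fraction of targets found within the step budget."""
    rng = random.Random(seed)
    targets = [(rng.uniform(0, size), rng.uniform(0, size)) for _ in range(n_targets)]
    robots = [(rng.uniform(0, size), rng.uniform(0, size)) for _ in range(n_robots)]
    found = set()
    for _ in range(steps):
        # Each robot takes a random step, clamped to the arena walls.
        robots = [(min(size, max(0.0, x + rng.uniform(-1, 1))),
                   min(size, max(0.0, y + rng.uniform(-1, 1))))
                  for x, y in robots]
        for i, (tx, ty) in enumerate(targets):
            if i not in found and any(
                    (rx - tx) ** 2 + (ry - ty) ** 2 <= sensing_radius ** 2
                    for rx, ry in robots):
                found.add(i)
    return len(found) / n_targets

def mean_detection_rate(n_robots, n_targets, trials=30):
    """Average the detection rate over several randomized trials."""
    return sum(simulate_search(n_robots, n_targets, seed=s)
               for s in range(trials)) / trials
```

In a toy model like this, a larger robot-to-target ratio reliably pushes the average detection rate up, which is the kind of trend the thousands of real simulations are designed to quantify.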

Dames also works on multirobot path planning problems, such as a fleet of robots staging and organizing orders in a warehouse before they ship out. The more robots you have in a contained space, the more time they spend dodging each other, causing delays, Dames explains. He wants to pinpoint the critical density at which adding more robots starts to degrade the team’s performance.

“Practically speaking, you could use this research to determine how many robots you should buy for your given space,” he says. “Or if you already have a team of robots, you can look at how to design your warehouse or store to maximize the performance of your robot team.”
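One way to picture the critical-density idea is a deliberately simple congestion model. This is purely illustrative, not Dames’ model: it just assumes each robot’s effective speed falls linearly as the space gets more crowded, so the fleet’s total output rises, peaks, then deteriorates.

```python
def throughput(n_robots, free_speed=1.0, capacity=40):
    """Toy congestion model: each robot's effective speed drops linearly
    with crowding (time lost dodging teammates), so the fleet's total
    output is n * v(n): it rises, peaks, then deteriorates."""
    effective_speed = free_speed * max(0.0, 1.0 - n_robots / capacity)
    return n_robots * effective_speed

# In this model the best fleet size is capacity / 2: past that point,
# every added robot costs more in congestion than it contributes.
best_fleet = max(range(1, 41), key=throughput)
```

Here the hypothetical `capacity` stands in for how many robots the space can hold before avoidance maneuvers dominate, which is exactly the quantity a warehouse owner would want measured before buying more robots.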

A construction worker places a chain at a job site. A pixelated illustrated worker is helping him do so.

Making robots more socially aware

Another area of Dames’ work is inspired by the rising popularity of robots in places like grocery stores, airports and shopping malls. It’s not uncommon to see autonomous robots in these spaces completing tasks alongside humans. But as scenes in grocery stores and other public spaces make clear, robots and humans still struggle to navigate the environment together.

“A lot of traditional robotic navigation approaches focus on geometry and safety,” Dames explains. “We want them to be safe, but these approaches can lead to very conservative behavior where robots will just stop and wait for everyone to get out of the way. They become tripping hazards, and they’re not really part of the flow. In robotics, we call this the frozen robot problem.”

Solving this problem is part engineering, part psychology, and Dames is collaborating with Donald Hantula, associate professor of psychology and neuroscience in the College of Liberal Arts, to study how humans perceive and interact with robots in shared spaces.

The pair are conducting experiments in which participants watch videos of humans and robots and are asked to stop the video when a robot seems too close to a person. They’re also running experiments with humans and robots walking in groups, tracking how much space people want to keep from robots, especially in constrained spaces.

“If you drop a person in Times Square, they can still move through it, even if it’s not easy,” Dames says. “A robot, on the other hand, might just sit there until the crowd disperses. We don’t want robots to be too aggressive, but we also don’t want them to be so conservative that they end up being an impediment.”

Dames is looking for a middle ground where robots move fluidly but safely through the environment without disrupting the people around them. To find it, he is collaborating with Slobodan Vucetic, professor of computer science in the College of Science and Technology.

Dames and Vucetic are using machine learning algorithms to predict the flow of pedestrians, then using those models to train robots to better navigate pedestrian traffic.
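For a sense of what flow prediction means in code, here is a hypothetical constant-velocity baseline, the simplest possible stand-in for the learned pedestrian models described above (the function name, data and horizon are all invented for illustration):

```python
import numpy as np

def predict_path(track, horizon=5):
    """Constant-velocity baseline: extrapolate a pedestrian's last observed
    step `horizon` time steps into the future.
    track: sequence of observed (x, y) positions, length >= 2."""
    track = np.asarray(track, dtype=float)
    velocity = track[-1] - track[-2]               # last observed step
    steps = np.arange(1, horizon + 1)[:, None]     # column vector 1..horizon
    return track[-1] + steps * velocity            # (horizon, 2) positions

# A robot can plan around the predicted corridor instead of freezing.
future = predict_path([(0.0, 0.0), (0.5, 0.1), (1.0, 0.2)], horizon=3)
```

A learned model replaces this straight-line guess with predictions that account for crowds, obstacles and social behavior, but the robot uses the output the same way: as a forecast of where people will be, so it can keep moving instead of stopping.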

“The big question we’re asking here is: How can we make robots that can move in a more socially competent way?” Dames says. “How can we design the behavior of robots to be natural, predictable and comfortable for people? This will only become more important as robots start to occupy retail environments, restaurants, hotels and even sidewalks.”

Designing better robots for the future

In 2015, the friendly hitchhiking robot, hitchBOT, made its way across Canada and parts of Europe, only to be destroyed shortly after arriving in Philadelphia. The incident highlighted the anti-robot bias that many robot designers know well.

“My goal isn’t to make everyone like robots,” Dames says. “But I don’t see robots going away, at least not in the short term. So, how can we make them better if they are going to be here?”

That attitude drives Dames to continue improving robots. For one of his next projects, he wants to explore selective attention, the concept that describes how humans focus on certain details and tune out others depending on the task.

Dames wants to apply this to robotics by adjusting the level of detail a robot operates on, depending on the task it is performing.

A Temple-themed robot rolling down the hallway. It has a white, boxy frame with a cherry Temple 'T' logo on the side and four wheels.

Philip Dames uses the busy hallways inside the College of Engineering as a testing ground for autonomous robots. He hopes to design robots that seamlessly navigate spaces alongside humans.

“Consider a bookshelf. If a robot’s goal is simply to move through a space, then a bookshelf is a big box that the robot wants to avoid,” he says. “But if its goal is to retrieve a certain book, now it needs a lot more information. And if it needs more information, it needs more computing power.”
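The bookshelf example can be sketched as a task-dependent map query. Everything here is hypothetical (the class, task names and contents are invented for illustration): the same object answers a navigation query with only its footprint, and a retrieval query with its full, computation-hungry inventory.

```python
class ShelfModel:
    """A map entry that answers queries at a task-dependent level of detail."""

    def __init__(self, footprint, contents):
        self.footprint = footprint   # (x, y, width, depth) of the obstacle
        self.contents = contents     # per-shelf book lists: costly detail

    def query(self, task):
        if task == "navigate":
            # Moving through the room only needs the obstacle's footprint.
            return {"obstacle": self.footprint}
        if task == "retrieve":
            # Fetching a book needs the full inventory as well.
            return {"obstacle": self.footprint, "contents": self.contents}
        raise ValueError(f"unknown task: {task}")

shelf = ShelfModel(footprint=(3.0, 1.0, 0.8, 0.3),
                   contents={"top": ["atlas"], "middle": ["dictionary"]})
```

The navigation answer stays cheap, and the robot only pays for the detailed representation, and the computing power behind it, when the task demands it.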

Dames is also excited to expand on the psychology research he and Hantula have started. By better understanding how humans learn, move and make decisions, he hopes to design robots that are more intuitive teammates and companions. The work might just help future robots move through busy public spaces as effortlessly as the people around them.

“I’m excited to think more about neuroscience and psychology in my work,” he says. “If we can bring those concepts to robots, directly or indirectly, we can make them smarter and help them navigate this complex world that we live in.”

“I don’t see robots going away, at least not in the short term. So, how can we make them better if they are going to be here?”

—Philip Dames
Associate professor of mechanical engineering