The article below, posted at Beech Blog, is taken from Popular Mechanics.

  • Disney Research scientists recently created a system for a more realistic robot gaze.
  • The team demoed the technology with a humanoid animatronic bust.
  • To make interactions with the robot more realistic, the team programmed movement into not only the eyes but also the neck and eyebrows.

Disney World could one day feature some of the most realistic animatronic characters on the planet, making your stay that much more magical. Imagine robots that can accurately follow your gaze while talking to you, raise their eyebrows, and even periodically break eye contact, like any stranger would.

Scientists at Disney Research, the network of labs supporting the company’s technological endeavors, have recently devised a new system for creating a lifelike robotic gaze.

By introducing minute “secondary behaviors” that humans exhibit in a conversation—from the flicker of the pupils between focal points, to the faint tilt of the head—the team managed to craft a machine that feels sort of human. The scientists presented their paper at the International Conference on Intelligent Robots and Systems last fall.

In effect, the humanoid robot comes off as incredibly lifelike, despite its face being mostly uncovered, exposing the electronics beneath. For now, that’s fine; Disney artists can enhance the face later, Doug Fidaleo, director of Disney Research Los Angeles, tells Pop Mech.

Fidaleo’s team is responsible for the hardware and software that could one day appear in Disney’s proprietary “Audio-Animatronics” figures, which the company uses to create repeatable live shows and experiences (like “It’s a Small World” and its 300 Audio-Animatronics dolls). So far, the results have been pretty convincing.

“I know the first time that I sat in front of [the robot], [I got] a little nervous, because you actually believe that this thing is alive,” Fidaleo says. “That threshold of feeling something, nervousness or something, is critical.”

“IT WOULD BE QUITE UNNERVING FOR [THE ROBOT] TO FIXATE ON A SINGLE POINT ON YOUR FACE.”

It takes some serious finesse on the software side to build that sense of realism. The engineering team places most of the emphasis on transitions and blending, so from one moment to the next there are no hard stops that might betray the animatronic figure's robotic nature and break the illusion.

If the robot is looking at one person, for instance, and then another child walks up to it, the animatronic figure can sense that with its onboard RGB camera, says James Kennedy, a research scientist with Disney Research Los Angeles. From the perception side, the robot gets a new set of coordinates to look at, and it will slowly transition its gaze to focus on that second child.

“Now, you have to make the decision of, ‘How do I go from where I am currently facing and get to these new coordinates?’ And, you know, as a robot, that’s a very simple problem,” Kennedy tells Pop Mech. “You can draw a straight line and do that. But that’s not particularly believable. People don’t move in that way. And so there are a lot of these small details that we do.”

Through the programs his team has built, Kennedy says it’s possible to dictate how long one glance should last before slowly blending into another motion. From there, the software is fit with rules for the acceleration and deceleration of the motors that control the robot’s neck, face, and torso along a particular curve.
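The easing Kennedy describes can be sketched in a few lines. This is a hypothetical illustration, not Disney's actual code: it replaces the "straight line" (constant-speed) sweep with a smoothstep curve that accelerates gently out of the old gaze target and decelerates into the new one.

```python
def smoothstep(t: float) -> float:
    """Ease-in/ease-out curve: zero velocity at both endpoints."""
    return t * t * (3.0 - 2.0 * t)

def blend_gaze(start_deg: float, end_deg: float, steps: int) -> list[float]:
    """Interpolate a gaze angle from start to end over a fixed number of
    frames, starting and finishing slowly rather than at constant speed."""
    return [start_deg + (end_deg - start_deg) * smoothstep(i / (steps - 1))
            for i in range(steps)]

# A linear sweep would cover equal angles per frame; the eased sweep
# takes small steps at first, larger ones mid-motion, then small again.
angles = blend_gaze(0.0, 30.0, steps=6)
```

The same shaping would apply per motor (neck, face, torso), each with its own duration and curve, which is what makes the blended motion read as believable rather than mechanical.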

For example, one of the most impressive features relates to saccades, or quick, simultaneous movements of the eyes between fixation points. Think about making eye contact during a job interview, when you’re probably most aware of your body language. Your eyes don’t remain static while looking at your future boss, but rather, they subtly dart back and forth.

So, if you wanted to have a staring competition with this animatronic bust, you’d probably win—and that’s by design.

“It would be quite unnerving for [the robot] to fixate on a single point on your face,” Kennedy says. “It’s something we could do technologically, but it would be quite unnatural to people.”
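One simple way to avoid that unnerving fixation is to jitter the eye target with small random offsets around the face, so the gaze hovers rather than locks on. The sketch below is an assumption about how such saccades might be generated; the function name and amplitude are invented for illustration.

```python
import random

def saccade_offsets(n: int, amplitude_deg: float = 1.5,
                    rng=None) -> list[tuple[float, float]]:
    """Generate small random fixation offsets around the current gaze
    target, mimicking the quick darting movements of human eyes."""
    rng = rng or random.Random()
    return [(rng.uniform(-amplitude_deg, amplitude_deg),
             rng.uniform(-amplitude_deg, amplitude_deg))
            for _ in range(n)]

# Each frame, the eye target becomes (face_x + dx, face_y + dy), so the
# robot's gaze subtly darts around the face instead of pinning one point.
offsets = saccade_offsets(10, rng=random.Random(42))
```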

The researchers have programmed the robot to sort of mimic what the people in its line of sight are doing, from tilting its head in sync with guests, to blinking, and even subtly “breathing.” Engineers combine these motions in a few different states of being based on a “curiosity score” that records the number and type of stimuli in the surrounding environment.
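A "curiosity score" driving behavioral states could look roughly like the toy sketch below. The stimulus weights, thresholds, and state names here are hypothetical stand-ins, not values from the paper: the point is only that counting and weighting nearby stimuli yields a scalar, and that scalar selects which bundle of motions the robot blends into.

```python
def curiosity_score(stimuli: list[dict]) -> float:
    """Weight the number and type of stimuli in view (weights invented)."""
    weights = {"person": 1.0, "motion": 0.5, "sound": 0.3}
    return sum(weights.get(s["kind"], 0.1) for s in stimuli)

def select_state(score: float) -> str:
    """Map the score to a behavioral state (thresholds and names invented)."""
    if score < 1.0:
        return "idle"    # blink, "breathe," glance around
    if score < 2.0:
        return "glance"  # briefly look toward the stimulus
    return "engage"      # turn the head and hold mutual gaze

state = select_state(curiosity_score([{"kind": "person"}, {"kind": "motion"}]))
```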

About Author

BB
