Japan's latest neural network robot "Alter" moves all by itself, just like a human
Japan's National Science Museum is no stranger to eerily human androids: it employs two in its exhibition hall already. But for a week, they're getting a new colleague. Called "Alter," it has a very human face like Professor Ishiguro's Geminoids, but goes one step further with an embedded neural network that allows it to move itself.
The technology powering this involves 42 pneumatic actuators and, most importantly, a "central pattern generator." That CPG has a neural network that replicates neurons, allowing the robot to create movement patterns of its own, influenced by sensors that detect proximity, temperature and, for some reason, humidity. The setup doesn't make for human-like movement, but it gives the viewer the very strange sensation that this particular robot is somehow alive. And that's precisely the point.
The project is an attempt to bridge the gap between programming a robot to move and allowing it to move for itself. With a neural network in place, movement is given a loose degree of flexibility, which the researchers are calling "chaos." Alter's arm movements, head and posture adjust and change of the system's own volition. The neural network ticking behind the scenes offers multiple movement modes, switching between a longer patterned-movement mode and a more random "chaos" mode. The decision to switch is influenced by the sensors dotted around the base, which take in what's happening around Alter: proximity, humidity, noise and temperature. These sensors operate like the robot's version of skin, mirroring our own senses, even if the system is far, far simpler. If the proximity sensors detect a lot of people nearby, for example, the torso shudders as the robot's body reacts to its environment.
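To picture how sensor readings could tip the system from one mode to the other, here is a minimal sketch in Python. The function name, sensor weights and the way stimulation biases the choice are illustrative assumptions; the Alter team hasn't published this logic.

```python
import random

# A toy sketch of the mode-switching idea described above. The weights
# below are assumed values for illustration, not the team's real tuning.

def choose_mode(proximity, humidity, noise, temperature):
    """Pick between a sustained patterned mode and the random "chaos"
    mode based on readings (each normalized to 0..1) from the
    base-mounted sensors."""
    stimulation = (0.5 * proximity + 0.2 * noise
                   + 0.2 * temperature + 0.1 * humidity)
    # A busier, more stimulating environment makes "chaos" more likely.
    return "chaos" if random.random() < stimulation else "patterned"

# Example: a crowd gathers close by, so the environment is stimulating.
print(choose_mode(proximity=0.9, humidity=0.4, noise=0.7, temperature=0.5))
```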
Alter also sings -- in a horrific, nightmare-inducing way. The haunting melody that comes from the machine is made of sine waves vocalizing how the robot's fingers move. (The team apparently tested other noises and melodies but decided to keep things simple for this early model.)
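One simple way to read "sine waves vocalizing how the fingers move" is to map each finger's position to the frequency of a sine tone and sum the tones. The sketch below does exactly that; the frequency range and the 0..1 angle convention are assumptions for illustration only.

```python
import math

SAMPLE_RATE = 44_100  # audio samples per second

def fingers_to_samples(finger_angles, duration=0.1):
    """Turn a list of finger joint angles (assumed 0..1) into raw audio
    samples: one sine tone per finger, mixed to a single channel."""
    frequencies = [220 + angle * 440 for angle in finger_angles]  # 220-660 Hz
    n = int(SAMPLE_RATE * duration)
    return [
        sum(math.sin(2 * math.pi * f * t / SAMPLE_RATE)
            for f in frequencies) / len(frequencies)
        for t in range(n)
    ]

samples = fingers_to_samples([0.1, 0.5, 0.9, 0.3, 0.6])
print(f"generated {len(samples)} samples")
```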
The theory behind the CPG is based on one of the simplest artificial models for neurons, the Izhikevich neuron, which reacts in a way that's called "spiking and burst behavior": something builds up, and the robot's system creates a signal spike, which chains together with other neurons. Professor Ikegami from the University of Tokyo describes the central pattern generator as "coupled pendulums" -- one bumps into another, into another, and a movement is formed. While not an equal, balanced rhythm, this becomes Alter's own rhythm. The researchers didn't make the movement; the robot made it itself.
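To make that "spiking" behavior concrete, here is a minimal simulation of a single Izhikevich neuron. The equations and parameters come from Izhikevich's 2003 model; the constant input current is an illustrative choice, and Alter's CPG couples many such neurons together where this shows just one.

```python
def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, current=10.0, steps=1000):
    """Integrate the Izhikevich neuron model for `steps` milliseconds.
    Returns the times (in ms) at which the neuron spiked."""
    v, u = -65.0, b * -65.0  # membrane potential and recovery variable
    spikes = []
    for t in range(steps):
        for _ in range(2):  # two 0.5 ms sub-steps for numerical stability
            v += 0.5 * (0.04 * v * v + 5.0 * v + 140.0 - u + current)
        u += a * (b * v - u)
        if v >= 30.0:        # input built up into a spike: record and reset
            spikes.append(t)
            v, u = c, u + d
    return spikes

print(izhikevich()[:10])  # first ten spike times, in milliseconds
```

With a steady input the neuron fires at a regular rhythm; coupling several of these so each one's spikes feed the next as input is what produces the "coupled pendulums" effect the researchers describe.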
Osaka University's Kouhei Ogawa, who worked on previous humanoids at the Ishiguro lab, added: "This time, Alter doesn't look like a human. It doesn't really move like a human. However, it certainly has a presence." It's true. It feels like there's something alive in there that's neither human nor robot. Its movement seems random, even if it's nonsensical.
"Until now, making androids talk or interact for 10 minutes was an incredible amount of hard work -- simply to program something to react for that long. Alter, moving for itself, can do so easily." The robot will be on display to the public for a week while the Tokyo and Osaka teams hope interactions will inspire new ideas on what they should teach Alter next.
Source: Engadget