When it comes to the cutting edge of robotics, the Japanese are sharpening it year-on-year while the rest of the world simply looks on in wonder. Maybe it’s because they are so fascinated by the subject that they are driven to explore every area of robotics, in the same way, perhaps, that the Russians were particularly interested in space travel and so were the first to explore that arena.
Whatever the reason, practically whatever you might crave in
the way of a robotic entity, from manservant or receptionist to
companion dog or even cat, has been created in the last
couple of decades by one of the Japanese manufacturers. True, not
all of them have been profitable and so have, in some cases, gone
out of production, or in other cases haven’t quite made it to market
as yet. But the technology is there. And the ability to use that
technology to turn a dream into a tangible reality.
So, let’s start with ASIMO (Advanced Step in Innovative Mobility),
which is currently the most high profile of the huge Japanese
robotics family. No short timeline this, as the crux of the challenge
was to create a robot that could walk upright on two legs in a stable
and intuitive manner, something that has taken Honda two decades
to achieve. And if you are wondering why they would bother with
something, well, so mundane, the engineers see ASIMO as
something that can help people in need, a robotic replacement for
tasks that are too dangerous for humans to perform, such as
fighting forest fires or cleaning up toxic waste. Such situations are
highly likely to involve uneven surfaces underfoot and a stranded
robot might as well not be there, hence the quest to create ASIMO.
Honda’s first humanoid robot, P1, introduced in 1987, was a rather
startling 6’ 2” (nearly 2 metres) tall. Reaction to a humanoid
robot of that size was unanimously negative, and so the height was
swiftly reduced in the following versions and the present P4 is a
much more manageable 3’ 11” (1.20m) tall, which puts it at eye
level with a seated adult. The present ASIMO (P4) can run, walk on
uneven slopes and surfaces, turn smoothly, climb stairs, and reach
for and grasp objects, as you have probably already seen on
Honda’s charming adverts. It can also use its camera to map its
environment and avoid knocking into obstacles, as well as
comprehend and respond to simple voice commands and recognize
the faces of individuals.
People instantly seem to take to ASIMO, perhaps because of its
chunky design, which Honda calls ‘friendly’ and which is reminiscent of
Star Wars’ R2-D2. The same can be said of Sony’s robotic dog
AIBO (Artificial Intelligence BOt in English, and meaning
‘companion’ in Japanese), which was produced by Sony for seven
years and sold well over 130,000 units worldwide before being
discontinued a year ago following a cost-cutting drive.
Whatever standard you measure against, Sony’s AIBO represents
the most sophisticated product ever offered in the consumer robot
marketplace. AIBO was programmed to walk, feel objects with its
feet, ‘see’ in colour via camera (and take photographs), hear in
stereo, and understand commands – although as Sony were at
pains to point out, because they strove for it to show true dog-like
behaviour, it was also programmed to occasionally ignore them.
It also has (and we’ll revert to the present tense now because
although production has ceased there are still thousands out there
in the world continuing to do their robotic doggy thing) the ability to
learn, adapt to its environment, and express emotion. In fact, these
robotic pets are considered to be autonomous robots, since they are
able to learn and mature based on external stimuli from their owner
or environment. Owners responded to this by forming emotional
attachments to their AIBOs, especially when Sony introduced the
sophisticated AIBOware software, which enabled the robot to be
raised from pup to fully grown adult while going through various
stages of development.
AIBO’s robust platform also made it particularly attractive to
students of computer science and robotics. In the technical arena,
the RoboCup competition has used AIBO since 1998. Dr. Manuela
Veloso, professor of computer science at Carnegie Mellon University
and a leader in international RoboCup competition, praised the AIBO
for its contributions to research of multi-robot systems:
“Teams of AIBO soccer players have captured the hearts of
researchers and spectators while they search for the ball, struggle
to take possession, beautifully move, kick, and aim at the goal.
They eventually score, dance with happiness, and receive roaring
cheers of enthusiasm and praise.
The RoboCup Federation and Research Community are in debt to
SONY for the development of these remarkable AIBO robots which
researchers and students have used in research and development of
four-legged robot soccer teams since 1998.”
In the consumer arena, there are numerous internet ‘roboblogs’
filmed independently by individual AIBOs as they wander around
taking periodic pictures, mainly of ankles and chair legs, to be fair,
but still, these robots are definitely part of the family, not the
furniture, and Sony certainly spared no expense with the creation of
the AIBO. The sounds were programmed by acclaimed Japanese
DJ/avant-garde composer Nobukazu Takemura, who is considered
by many to be one of the foremost masters at fusing mechanical and
organic concepts, while the bodies of the ‘3x’ series (Latte and
Macaron, the round-headed AIBOs released in 2001) were designed
by visual artist Katsura Moshino.
But perhaps you prefer cats? Then you may have been tempted by
the NeCoRo, a fur-covered cat animoid launched by Omron in 2001
at about the same cost as AIBO. Omron is a company best known
for electronic health products such as blood pressure gauges and
digital thermometers, and the NeCoRo was initially designed as a
substitute pet for use in therapy and in cases of emotional
deprivation. However, NeCoRo had glass eyes, a synthetic fur coat,
and has been described in some quarters as looking like an
animated zombie cat, so perhaps the design wasn’t completely
thought through for the market.
That apart, the technology that was utilized in the NeCoRo was right
up there with the best, as it was able to perceive human action and
thoughts via internal sensors of touch, sound, sight, and
orientation. In addition, using 15 actuators inside the body, it
behaved in response to its feelings, so became angry if someone
was violent towards it, and purred when stroked, cuddled, and
treated with lots of love. Like AIBO, NeCoRo was designed to inspire
emotional attachment and adjust its personality to that of the owner
through a learning/growth function.
Just a shame it looked so spooky, although Sega Toys’ Yume
Neko Smile (‘Dream Cat Smile’ in English) robot cat, which was
released this year, seems to have addressed the problems inherent
in the NeCoRo. Yume Neko Smile has similar functionality and cat-
like programming, plus it looks and feels so incredibly like the real
thing that people mistake it for a real cat! Owners report they just
want to keep on stroking it, which makes it purr and (very
realistically and exceptionally cutely) stretch out its toes when its
tummy is being stroked! Aargh…presumably it’s really aimed at all
those Japanese kids who are desperate to have a cat but can’t in
Japan’s tightly squeezed urban landscape, and it should be the
answer to their prayers as there’s definitely no spookiness about it
at all. Guaranteed.
But while the pets have been mesmerizing, it’s the humanoids that
have really clamoured for our collective attention. ASIMO is an
example of what we collectively respond to in a robot, as are Toyota’s
Partner Robots, which are designed to function as personal
assistants and are described by Toyota as being “agile, warm and
kind and also intelligent enough to skilfully operate a variety of
devices in the areas of personal assistance, care for the elderly,
manufacturing, and mobility.”
Interestingly they also have artificial lips that Toyota claims have
the same finesse as human lips, apparently developed so that the
robots can play musical instruments. The mind boggles
slightly at the idea of a band of musical instrument-playing Toyota
bots, but perhaps it’s no odder than the robotic ‘elderly companion’,
ifBot, which was the winner of the 2003 Good Design Award, and
whose software was developed by Dream Supply, a Nagoya-based
IT firm. Standing just 45cm tall, the ifBot looks like a little alien
spaceman and is “designed to provide hours of companionship to
lonely elderly folks who don’t have a loved one to speak with.” This
is a big area of concern in present day Japan, and this little bot has
been designed to fill the void. To achieve this, the ifBot is pre-
programmed with millions of word phrases at about the level of a 5-
year-old child, which apparently helps keep elderly folks from
becoming forgetful by keeping their minds sharp. In addition, the
ifBot plays puzzles and memory games, sings songs, comments on
the weather, offers advice, and does medical checks. Seems more
like a babysitter than a carer, really, and maybe that’s how they will
pitch it if and when it loses its Japanese exclusivity and is
eventually marketed to the West. Interestingly, the ‘Hello Kitty’
robot is produced by the same company, but this one is aimed more
at teenagers and children.
On a different stage, so to speak, is Takara Tomy’s Omnibot2007 i-
SOBOT. Certified by the Guinness Book of Records as the smallest
humanoid robot in production, it stands just 165mm tall, has an
LCD-equipped remote to control its programmed motions, and also
responds to voice control. And it is a bit of a mover, able to play
music, dance, and respond to applause and other external stimuli,
and can also make its own punching and kicking sound effects, so
there could be some entertaining robo-duels (or duets!) if you had a
pair of them. The Omnibot2007 i-SOBOT CAMVersion includes a
camera that can send pictures to your PC or phone, and its head
can swivel 60° in each direction. Yes, it’s very neat and highly entertaining.
In contrast to the stylised bots we have so far discussed, the robots
Repliee and Geminoid developed by Professor Hiroshi Ishiguro at
the ATR Intelligent Robotics and Communication Laboratories at
Osaka University, are what you would expect an AI robot to look like:
incredibly human.
Their skin is composed of silicone and appears highly realistic.
Internal sensors allow the Actroid models to react with a natural
appearance by way of air actuators placed at many points of
articulation in the upper body, so the robot looks like it is breathing.
The Actroid can also imitate human-like behaviour with slight shifts
in position, head and eye movements and the appearance of
breathing in its chest.
So far, movement in the lower body is limited, mainly because the
compressed air that powers the robot’s servo motors and most of
the computer hardware that operates the AI are external to the
unit, which is a contributing factor to the robot’s lack of locomotion
capabilities. The operation of the robot’s sensory system in tandem
with its air-powered movements makes it quick enough to react to or
fend off potentially damaging approaches, such as a slap or a poke,
and to respond differently to gentler kinds of
touch, such as a pat on the arm.
The robot can also learn to imitate human movements
by facing a person who is wearing reflective dots at key points on
their body. By tracking the dots with its visual system and then
computing limb and joint movements to match what it sees, the
individual motion can then be ‘learned’ by the robot and repeated.
The interactive Actroids can also communicate on a rudimentary
level with humans by speaking. Microphones within those Actroids
record human speech and this is then filtered to remove
background noise, including the sounds of the robot’s own
operation. Speech recognition software is used to convert the audio
stream into words and sentences, and when spoken to they also use
a combination of ‘floor sensors and omnidirectional vision sensors’
in order to maintain eye contact with the speaker – to see this in
action, you can watch the videos posted on YouTube. It’s quite
incredible how humanoid they are.
In addition, the Actroids can respond in limited ways to body
language and tone of voice by changing their own facial
expressions, stance, and vocal inflection. It’s all very realistic,
although what the long-term practical application of the Actroids will
be is difficult to predict at this point. The company suggests that the
demand will be for receptionists and, perhaps, nurses, although the
cost of hiring one is currently very high, and possibly only
something a very high-profile company would consider. Which is a
shame, as the science and technology that have gone into
developing the Actroids are immense.
Perhaps the best application, however, and one that would justify
the high cost of production, would be as a stand-in presenter,
speaker or lecturer. This is precisely why the Geminoid was
developed by Professor Hiroshi Ishiguro in the first instance, and it
stands in very successfully for its human double in lectures when
the professor is unable to travel in from his home an hour away.
The robot is loaded with the lecture and the students get the
information they need, albeit at one remove. Genius.
We shall see. But with each new wave of robotics (and trust us,
we have only scratched the surface in this article) it becomes
increasingly clear just how hard it has been for the Japanese to
transform their robot achievements into profit-making reality.
Interestingly, American robot firms, such as iRobot, while less
ambitious and cutting edge, have generally been more successful
than Japanese firms at marketing their non-industrial robots (like
their vacuum-cleaning robot, Roomba) simply because they are so
focused on practical applications rather than the fantasy of
humanoid or animal forms. But then fantasy is where
breakthroughs occur, so without the Japanese it is likely there
would be little in the way of development.
So what’s the future, then? It’s looking very Japanese.
© Claire Burdett. Please only reproduce this article with permission, in its entirety and with a hyperlink to www.claireburdett.com. Thank you.
First published in WTF magazine, December 2007