A Nice Gesture by Jeroen Arendsen

Various personal interests and public info, gesture, signs, language, social robotics, healthcare, innovation, music, publications, etc.

Month: December 2008

The Gruesome, Semi-Gestures of Politics

I came across an old draft that struck me. It was written about two years ago, when the IDF bombed Beirut. Back then I did not want to post it, because this is not a political blog, but today’s situation in Gaza is so sad. The greatest sadness of it all comes, for me, from realizing that these military actions are intended, in a cruel way, as gestures. The acts themselves are of course horrible, but the practical goals are insignificant in comparison to the ‘message behind the actions’. The point is that this ‘message behind the actions’ is not received, cannot be received, and what is left are gruesome ‘semi-gestures’. At best, these acts are politically motivated and appreciated by the home crowd…


Mama Appelsap, Perception and Phonetics


What you might hear in an English song if you are a Dutch native speaker.

In my own research, ambiguity in signs is a recurrent issue. Context can change the meaning of signs, and if you are unfamiliar with a sign you may project anything that comes to mind onto the incoming signal. These songs are great examples of such projections: Dutch listeners who have trouble deciphering the lyrics replace them with their own ‘Dutch phonetic interpretations’. DJ Timur collects such cases as ‘mama appelsap’ songs.

In a way this is quite similar to this ‘silly’ translation of the song ‘Torn’ (here) into makeshift ‘sign language’. Or perhaps that is only a vague association in my mind and not an actual similarity…

No wait, it wasn’t a translation from song to sign, but the other way around: from a signed news item to a silly voice over…

And even this thing does not really show a lot of similarity to the ‘mama appelsap’ phenomenon, because the ‘translator’ does not replace the correct phonology (BSL) with the phonology of another language (e.g. English); he just interprets the signs in the only way he can: through iconic strategies. You could call that the ‘universal language of gesture’, but that would be a bit lame, for there wouldn’t really be a proper phonology at work, I think (not being a real linguist, I am unsure). It does show the inherent ambiguity in gestural signs quite nicely, doesn’t it? And how it can be quite funny to ignore contextual clues, or even to supply an improper context. Ambiguity and context. A lovely pair.

My apologies to the Deaf readers who cannot bear to see these videos: I think my audience knows enough about signed languages to know that it is not really signed language nor a proper translation.

Jubilations with Mr. DJ

In celebration of getting a new job at TNO, with great opportunities to do all sorts of interesting stuff with gesture recognition and robots!

Also, a merry X-mas and a happy new year to you all! Thanks for the tip go to Björnd, my brother; he owns one of these lovely old robots, called Mr. DJ (created by Tomy in the eighties).

Furry Things that Purr are not Robots


It purrs, it’s furry, and it’s a bad example of a ‘robot’.

In the words of Wowwee, the creators of these cuddly toys:

WowWee Alive™ Cubs are life-like, huggable baby animals that feature plush bodies and animated facial and vocal expressions triggered by users’ touch. Forget trudging through a bamboo jungle or embarking on a safari to catch a glimpse of these wild animals — now, children and animal-lovers alike can nurture a lovable WowWee Alive Lion Cub, Panda Cub, Polar Bear Cub, and White Tiger Cub in their very own living rooms.

Is it important whether kids or other people call these Cubs robots or not? No, not really. But once in a while I feel the need to draw some lines in the dust. And for my definitions I like to follow categories that come naturally to the perception of the innocent. Therefore, if we ask ourselves ‘what are robots’, a good answer would be ‘the things that are called robots by kids’. Any category of objects has fuzzy boundaries (e.g. see Rosch, 1978, and Wittgenstein), yet some cases (examples) are more prototypical for the category than others. In this case we could say that a Cub might be considered a ‘robot tiger cub’ (a good case of ‘robot pets’, under ‘robots’…) but certainly not a prototypical ‘robot’.

Robot Man: Noel Sharkey

I read a news item about robots on the Dutch news site nu.nl (here) about the ethics of letting robots take care of people, especially kids and elderly people. The news item was based on this article in ScienceDaily. Basically it is a warning by ‘Top robotics expert Professor Noel Sharkey’. I looked him up and he appears to be a man to get in contact with. He has, for example, called for a code of conduct for the use of robots in warfare (here).

Noel Sharkey

According to his profile at the Guardian (for which he writes):

Noel Sharkey is a writer, broadcaster, and academic. He is professor of AI and Robotics and professor of public engagement at the University of Sheffield and currently holds a senior media fellowship from the Engineering and Physical Science Research Council. Currently his main interest is in ethical issues surrounding the application of emerging technologies.

I wholeheartedly agree with his views so far. He has a good grip on the current capabilities of machine vision and AI, neither of which I would trust when it comes to making important decisions about human life. At least in applications of speech and gesture recognition, with which I have had a lot of experience, they simply make too many errors, they make unpredictable errors, and they have lousy error recovery and error handling strategies. So far, I only see evidence that these observations can be generalized to just about any application of machine vision, when it concerns the important stuff.

It reminds me of an anecdote Arend Harteveld (may he rest in peace, see here) once told me. Some engineers once built a neural network to automatically spot tanks in pictures of various environments. As usual with such NNs, it was trained on a set of pictures with negative examples (no tank in the picture) and positive examples (a tank in the picture). After training, the NN was tested on a separate set of pictures to see how it would perform. And by golly, it did a perfect job. Even if nothing but the barrel of the tank’s gun stuck out of the bushes, it would spot it. And if there wasn’t a tank in the picture, the NN never made a mistake. I bet the generals were enthusiastic. A while later it occurred to someone that there was a pattern to the pictures: the pictures with tanks were all shot on a fairly sunny day (both in the training and the testing set) and the pictures without tanks were taken on a fairly dreary day. The NN was not spotting tanks; it was just looking at the sky…
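The anecdote can be sketched in a few lines of Python. Everything here is made up for illustration (tiny synthetic ‘images’ whose mean brightness encodes the weather, a simple brightness threshold standing in for the network): the point is only that a classifier trained on confounded data can score perfectly on a test set that shares the confound, while learning nothing about tanks.

```python
# Illustrative sketch of the 'tank detector' confound (not the original system).
import random

random.seed(42)

def make_image(sunny):
    # A fake 8x8 'photo' as 64 pixel values; brightness reflects the weather.
    base = 200 if sunny else 60
    return [min(255, max(0, base + random.randint(-20, 20))) for _ in range(64)]

# The confounded dataset: every tank photo happens to be sunny (label 1),
# every tank-free photo dreary (label 0).
train = [(make_image(sunny=True), 1) for _ in range(50)] + \
        [(make_image(sunny=False), 0) for _ in range(50)]

def brightness(img):
    return sum(img) / len(img)

# 'Learn' a threshold on mean brightness -- the only cue separating the classes.
threshold = (min(brightness(i) for i, y in train if y == 1) +
             max(brightness(i) for i, y in train if y == 0)) / 2

def predict(img):
    return 1 if brightness(img) > threshold else 0

# A test set with the same sunny/dreary confound: the detector looks perfect.
test = [(make_image(sunny=True), 1) for _ in range(20)] + \
       [(make_image(sunny=False), 0) for _ in range(20)]
accuracy = sum(predict(img) == label for img, label in test) / len(test)
print(accuracy)  # 1.0 -- a perfect score, learned entirely from the sky
```

Show this ‘detector’ a tank on a dreary day, though, and it will confidently report no tank: `predict(make_image(sunny=False))` returns 0 regardless of what is in the picture.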

University of Sheffield
Wikipedia

Crabfu’s Motion Control

There is a fun company called Crabfu, which is basically one guy called I-Wei. He creates great steam-powered robots, 3D art and animation, and all sorts of robots with cute motion control (SwashBots). The funny thing about his SwashBots is that he directly controls the actuators that create the movement; see for example this video of his R/C Tortoise:


It does remind me a bit of a tortoise

There is a more complete coverage of his work and an interview with him by Discovery channel:


Can someone please give him a job?

So, instead of just steering the robot ‘forward’, you need to work out on your R/C transmitter how to move the limbs, as if you are learning to walk all over again. Would people like to get down to this basic level of motion control? Would it feel funny to get your bot to go where you want it? Maybe. At the very least, his robots do make a cute impression.

Some of his robots almost feel a bit vulnerable or helpless, because they have such trouble moving forward. It reminded me of Hall Object (or Dibbes), the ‘gezellige robot’ (cosy robot) that was built to live in the hall of the NPS/Vara building and endear itself to the people who worked there.

Paro, the Mental Commitment Robot


Paro, a present for some Dutch elderly people (source)

In the local Dutch news I read that another Paro, a robot baby seal, had reached our shores. More specifically, a Paro seal entered the homes and hearts of the good people of the Van Wijckerslooth nursing home in Oegstgeest.

It is altogether fitting that Paro has come to Oegstgeest. Oegstgeest is a small and very old town near the coast that rose to fame as the setting of the novel ‘Return to Oegstgeest’ by Jan Wolkers. In the novel Wolkers writes a lot about his love for animals, both the cuddly ones and the less cuddly ones. It makes me wonder what Wolkers, may he rest in peace, would have had to say about Paro…

“Well, it is of course madness to pay 4000 euros for such a slapped-together piece of mechanics, when you only have to walk into the garden to find the loveliest little creatures. But I do understand it. People simply want to be pampered, and to pamper, without getting dirty. They would love nothing more than a robot that spends all day gently poking its hygienically clean little finger around in their arse or their cunt, without their having to get up for it.”

There is a good deal of thinking behind Paro. For example, the creators at AIST chose the form of a baby harp seal, and not of a cat or dog, because people will not compare Paro to their experience with a real seal (since they probably will not have had a real experience with a live baby seal). Robot cats tend to be perceived as less fun and less cuddly than real cats. I know from personal experience that many people, especially kids, are quite fond of baby seals. We once went to Pieterburen, home of the world’s foremost Seal Rehabilitation and Research Centre. Even though the kids were not allowed to touch any real baby seals, they came to love them just by looking at those big eyes and that innocent appearance. And now, with Paro, you can actually touch and even cuddle them without smelling fishy for a week. I guess all signs are ‘go’ for entering a loving ‘mental commitment’, which is what Paro is intended for, according to his own homepage:

“Mental Commitment Robots” are developed to interact with human beings and to make them feel emotional attachment to the robots. Rather than using objective measures, these robots trigger more subjective evaluations, evoking psychological impressions such as “cuteness” and comfort. Mental Commitment Robots are designed to provide 3 types of effects: psychological, such as relaxation and motivation, physiological, such as improvement in vital signs, and social effects such as instigating communication among inpatients and caregivers.

Rather grand claims for a robot that hardly does anything, but so far there have been reports in the news (e.g. here, here, or here) that it does have such positive effects to some extent. Yet Paro only has a few basic sensors (light, touch, microphone, orientation/posture, and temperature). He can only open or close his eyes, move his head and paws a bit, and ‘purr’ or ‘cry’. The solution, as always, comes from allowing the power of suggestion to work its magic: minimalistic functionality leaves room to project feelings, moods, even personality onto a robot.

Robot La Yang Che Flaps Ears and Rolls Eyes


Wu Yulu’s ‘talking and walking robot’

Wu Yulu is apparently a Chinese man who has built robots on his own. This fella, named La Yang Che (a translation, anyone?), can actually walk and sort of talk. Yulu’s visionary design manages to capture a hitherto disregarded aspect of human face-to-face interaction: the flapping of the ears!

While the lip synchronization only distracts from the message, it is clear to see that the flapping of the ears, rhythmically accompanying the spoken words, beats out the tempo and thereby diminishes the cognitive effort needed for speech perception. At the same time, the rolling or darting of the eyes seems to serve merely to enhance the overall aesthetic experience.

Spykee, Robotic Eyes and Ears

Spykee is a fairly new robot by Meccano, which explains why you need to assemble it from a bunch of plastic and metal parts. AvB describes the assembly quite well on his blog.


Does Spykee actually ‘do’ anything of its own? 

For me, it is a bit weird to see that Meccano somehow managed to transform the idea of a robot as a toy, a thing that ‘does stuff’ and responds to you in various ways, into a silly WiFi-controlled R/C extension of the owner. The camera only relays video to the owner, the sound is relayed, and even his master’s voice is relayed through the speakers.

It does not seem to have a voice of its own. I cannot even imagine facial expressions. And what of gestures? Spykee does have hands, which is an important requisite for gesturing. But they don’t do much yet, it appears. Perhaps this is where their ‘open source’ policy comes in. Maybe they expect me, or you, to program all sorts of interesting gestures for Spykee, a little bit like making gestures for Second Life. Hmmm, maybe the gesture databases for Second Life could somehow be ported to Spykee?

There is an interesting comparison here between Spykee and Rovio. Rovio is even worse at gesture, since it does not even have anything that could be interpreted as hands or arms. But it does have more autonomous navigation.

Rent a Robosaurus

This time the robot, Robosaurus, is not from WowWee, nor is it cute, small, and safe. There are quite a few robots that you can rent for shows or trade fairs and such: Honda’s Asimo, Titan (Cyberstein), Mico… feel free to add what you know here.

There is a small company called rentarobot. But their robots are quite dull, it appears. You can also rent a robot here, for $750 to $1500 a day (operator included). This company called entertainment robots is also in the rental business. They have quite a good collection and build custom robots for you.

