A Nice Gesture by Jeroen Arendsen

Various personal interests and public info, gesture, signs, language, social robotics, healthcare, innovation, music, publications, etc.

Category: Social Robotics Page 1 of 2

Item on Zora and other care robots on CampusTV Utrecht

Following a large trial with the Zora robot (essentially a NAO with some extra programming by a small Belgian company), CampusTV of the Hogeschool Utrecht ran an item on robots in healthcare.

I was invited as an expert to comment on robots in healthcare.



See Campustalk 07 Winter 2015-2016 https://youtu.be/qd8txYpq9GM (the action starts at 3:30). The story of the collecting expert (at the end) is also fun, by the way.


Why the Robonaut Handshake with ISS Captain Burbank is not a Greeting

On February 15, 2012, NASA made a little fuss about Robonaut (or R2) performing ‘the first human-humanoid handshake in space’. You can watch it in the video below.


‘Robonaut Shakes Hands’, uploaded by ReelNASA

I would like to use this occasion to enlighten both humans and humanoids about greetings. It is of course nice that Capt. Burbank has some kind words for the robot’s firm handshake, a nice bit of programming in itself. But the normal phases of a greeting, which is among the best-documented social interaction patterns of humans, are remarkably absent. A handshake is what is called the ‘close salutation’. Close salutations form the final phase of the entire greeting episode, which also includes sighting and announcement, distance salutations, and an approach phase (Kendon, 1990). Robonaut and Capt. Burbank skipped the first three steps entirely; the captain merely waited for Robonaut to try and grab his hand. This matters: if we want to create social robots, then programming a proper greeting may be one of the most important, and most manageable, things we can strive for.

Some references:
Arendsen, J. (2008). Greeting by gesture, a review. Gesture, 8(3), pp. 386-390, link, pdf
Kendon, A. (1990). A description of some human greetings. Conducting Interaction, pp. 153-207. Cambridge University Press.

Update: here is a nice example of a handshake that includes all the phases of a greeting. They make eye contact, nod or incline their heads slightly, approach each other, and shake hands.
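The phase sequence from Kendon could be sketched as a simple state machine that a social robot might follow. This is only an illustrative sketch; the phase names come from the literature cited above, but the class and method names are my own invention.

```python
from enum import Enum, auto

class Phase(Enum):
    SIGHTING = auto()              # notice the other party, announcement
    DISTANCE_SALUTATION = auto()   # wave, nod, or eye contact from afar
    APPROACH = auto()              # close the distance
    CLOSE_SALUTATION = auto()      # the handshake itself
    DONE = auto()

ORDER = [Phase.SIGHTING, Phase.DISTANCE_SALUTATION,
         Phase.APPROACH, Phase.CLOSE_SALUTATION, Phase.DONE]

class GreetingEpisode:
    """One greeting episode; phases must happen in order."""
    def __init__(self):
        self.phase = Phase.SIGHTING

    def advance(self):
        # Move to the next phase; skipping phases (as Robonaut did by
        # going straight for the handshake) is not possible here.
        self.phase = ORDER[ORDER.index(self.phase) + 1]

episode = GreetingEpisode()
while episode.phase is not Phase.CLOSE_SALUTATION:
    episode.advance()
print(episode.phase.name)  # CLOSE_SALUTATION
```

The point of the linear order is exactly the one made above: a robot that only implements the final state has not greeted anyone.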

Book Review of: Imitation and Social Learning in Robots, Humans and Animals

In 2007 an interesting book was published that I believe is also relevant to gesture researchers:

Imitation and social learning in robots, humans and animals: behavioural, social and communicative dimensions.
Chrystopher L. Nehaniv, Kerstin Dautenhahn (Eds.). Cambridge University Press, 2007 – 479 pages (available online in a limited way, here)

The book is an excellent volume with many interesting chapters, some with contributions by the editors themselves but also by many other authors. Personally, of the 21 chapters, I found the following most interesting:

  • 1. Imitation: thoughts about theories (Bird & Heyes)
  • 2. Nine billion correspondence problems (Nehaniv)
  • 7. The question of ‘what to imitate’: inferring goals and intentions from demonstrations (Carpenter & Call)
  • 8. Learning of gestures by imitation in a humanoid robot (Calinon & Billard)
  • 10. Copying strategies by people with autistic spectrum disorder: why only imitation leads to social cognitive development (Williams)
  • 11. A Bayesian model of imitation in infants and robots (Rao et al.)
  • 12. Solving the correspondence problem in robotic imitation across embodiments: synchrony, perception and culture in artifacts (Alissandrakis et al.)
  • 15. Bullying behaviour, empathy and imitation: an attempted synthesis (Dautenhahn et al.)
  • 16. Multiple motivations for imitation in infancy (Nielsen & Slaughter)
  • 21. Mimicry as deceptive resemblance: beyond the one-trick ponies (Norman & Tregenza)

I’ll probably update this post with more in-depth review remarks later… But at least chapter 21 has connections to earlier posts here regarding animal gestures, such as here.

What is Social Robotics?

Is it possible to give a good definition of social robotics? Is it a field of scientific study or is it only a catch phrase for exciting robot stuff? If it is a field of study, can we identify what belongs to it and what is outside of the field? Should we already set such boundaries or should we wait a while to give maximum growing room to the first seeds being planted by enthusiastic researchers and engineers around the world?

Instead of answering these questions here directly, I want to give you two answers of a different kind.

The first answer is that sometimes things can best be defined by identifying good examples (see explanation of Prototype Theory). If enough people can agree on good examples of social robotics then this defines the phrase ‘Social Robotics’ as a usable concept. This kind of definition plays an important role in the study of language and, given that the word ‘robot’ came from literature rather than science, it appears appropriate to try to define it in this way. Therefore, I collected the following videos that, in my opinion, each deal with one or more aspects of social robotics. They are all good examples of social robotics.

As a second answer, which may be more useful if you need more clarity fast, here is a reference to the call for participation of our recent workshop ‘Robots that Care’, which contains a description of the field of social robotics.

Keepon Rocks

After watching a bunch of Keepon movies on the tube I gotta say: Keepon Rocks. Great idea, great ‘minimal’ design, great manufacturing, great experience. Here’s a small collection…

See, for more info on the web:
beatbots.org
Hizook, robotics news portal
The Thinkers: Robotics developer helps studying autistic children
BotJunkie on Keepon
Q&A with Hideki Kozima – How Keepon was born and what comes next

One CareBot ™ One Family

At my new workplace, TNO, we had a modest celebration today: Two robot projects in which we will be cooperating have been approved by the EC (three cheers for the authors of the proposals RL and MN!). One of those is concerned with robotics in healthcare, which brings me to the next video:

From Gecko Systems (check out more movies) comes this would-be personal robot nurse. The people in this movie appear slightly naïve in their childish enthusiasm but it’s nevertheless good to have such glimpses of the future. Who knows, perhaps you and I will be nursed by such machines? A thought I find somewhat disturbing, I must confess.

One family’s experience with a robot companion for their Mother.

Also on Robots-Dreams
Gecko about Consumer Familycare
Gecko about Professional Healthcare

Ballroom Dance Robot

Video taken of the Ballroom dancing robot at WIRED Nextfest

Although they obviously spent a lot of time and energy on creating this robot, I can’t imagine that it will ever be a good dancer if it merely follows the motions, if it can only be led. There will inevitably be a short lag that prevents real synchrony in movement, which is exactly what you want to achieve during dancing. But then again, most people don’t get in full synchrony with each other either…

Robot Gestures need Robot Speech: Elmo Live


Elmo Live presented in February 2008 by 7x7toys

Please watch the gestures that Elmo makes. There are only a few basic gestures, but they are well connected to the speech. Gestures are often ambiguous and get their specific meaning through their interaction with speech. To some extent the same is true for words (their meaning sometimes relies on the accompanying gestures). In any case, combining speech and gestures creates a very lively impression. In my opinion, this is what is lacking in some of the RC-controlled robots, like the i-Sobot and the MechRC (here). They can do a couple of gestures, but without speech they are restricted to emblematic gestures that can be understood without any words. Add to this that context does not play a role either, and you get a very poor repertoire of gestures. To function properly, gestures need context, and they need words even more.

It should be noted that this entire episode is scripted. I do not know enough about Elmo Live but I would guess that all his stories and jokes are preprogrammed chunks.

Robot Man: Noel Sharkey

I read a news item about robots on the Dutch news site nu.nl (here) about the ethics of letting robots take care of people, especially kids and elderly people. The news item was based on this article in ScienceDaily. Basically it is a warning by ‘Top robotics expert Professor Noel Sharkey’. I looked him up and he appears to be a man to get in contact with. He has, for example, called for a code of conduct for the use of robots in warfare (here).

Noel Sharkey


According to his profile at the Guardian (for which he writes):

Noel Sharkey is a writer, broadcaster, and academic. He is professor of AI and Robotics and professor of public engagement at the University of Sheffield and currently holds a senior media fellowship from the Engineering and Physical Sciences Research Council. Currently his main interest is in ethical issues surrounding the application of emerging technologies.

I wholeheartedly agree with his views so far. He has a good grip on the current capabilities of machine vision and AI, neither of which I would trust when it comes to making important decisions about human life. At least in applications of speech and gesture recognition, with which I have had a lot of experience, the systems simply make too many errors, they make unpredictable errors, and they have lousy error recovery and error handling strategies. So far, I see every reason to believe that these observations generalize to just about any application of machine vision where the stakes are high.

It reminds me of an anecdote Arend Harteveld (may he rest in peace, see here) once told me. Some engineers built a neural network to automatically spot tanks in pictures of various environments. As is usual with such NNs, it was trained on a set of pictures with negative examples (no tank in the picture) and positive examples (a tank in the picture). After training, the NN was tested on a separate set of pictures to see how it would perform. And by golly, it did a perfect job. Even if nothing but the barrel of the tank’s gun stuck out of the bushes, it would spot it. And if there wasn’t a tank in the picture, the NN never made a mistake. I bet the generals were enthusiastic. A while later someone else noticed a pattern in the pictures: the pictures with tanks were all shot on a fairly sunny day (both in the training and the testing set), and the pictures without tanks were taken on a fairly dreary day. The NN was not spotting tanks; it was just looking at the sky…
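The confound in that anecdote is easy to reproduce. Below is a toy sketch (entirely made up, not the original experiment): the training data pairs “tank” with sunny skies, so a trivial “classifier” that only measures overall brightness scores perfectly without ever seeing a tank.

```python
import random

random.seed(0)

def make_image(sunny):
    # Each "image" is just 100 pixel intensities; sunny scenes are brighter.
    base = 180 if sunny else 70
    return [base + random.randint(-20, 20) for _ in range(100)]

# Mirror the anecdote: every tank photo was taken on a sunny day,
# every no-tank photo on a dreary one.
tanks = [make_image(sunny=True) for _ in range(50)]      # label: tank
no_tanks = [make_image(sunny=False) for _ in range(50)]  # label: no tank

def brightness_classifier(img, threshold=125):
    # "Detects a tank" whenever the scene is bright enough.
    return sum(img) / len(img) > threshold

# Perfect accuracy on this data, yet the model never looked at a tank.
accuracy = (sum(brightness_classifier(i) for i in tanks) +
            sum(not brightness_classifier(i) for i in no_tanks)) / 100
print(accuracy)  # 1.0
```

A real NN trained on such data can converge on exactly this shortcut, which is why held-out test pictures sharing the same confound told the engineers nothing.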

University of Sheffield
Wikipedia

Asimo dancing

Here, four Asimo robots dance a really nice choreography. Quite entertaining, but not because of how they interact with humans. It is entertaining to see how someone managed to build robots with the right movement parameters and then managed to program them to dance this way. One could also admire the aesthetics of the movements, or of the synchronization.

