A Nice Gesture by Jeroen Arendsen

Various personal interests and public info, gesture, signs, language, social robotics, healthcare, innovation, music, publications, etc.

Month: July 2007

Gesture Watch from Georgia Tech

Here is another aspiring HCI star in the gesture firmament: the Gesture Watch.

Gesture watch

Activate! The Gesture Watch has five infrared sensors, four of which sense any hand motion that occurs above the watch. If the user is wearing the watch on his left hand, he can move his right hand over the watch in an up or down, left or right, or circular motion. Different combinations of these movements communicate an action to the watch. (source)
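As a rough illustration of the idea described above, here is a minimal sketch of mapping motion combinations to device actions. The gesture vocabulary and action names are my own invented examples, not taken from the actual Georgia Tech implementation:

```python
# Hypothetical sketch: motions detected above the watch are combined
# into sequences, and each sequence maps to an action. All names and
# mappings here are assumptions for illustration only.

# Primitive motions the upward-facing IR sensors might report
MOTIONS = {"up", "down", "left", "right", "circle"}

# Invented example mapping of motion sequences to actions
GESTURE_ACTIONS = {
    ("up",): "volume_up",
    ("down",): "volume_down",
    ("circle",): "toggle_power",
    ("left", "right"): "next_track",
}

def recognize(sequence):
    """Return the action for a detected motion sequence, or None."""
    if not all(m in MOTIONS for m in sequence):
        raise ValueError(f"unknown motion in {sequence}")
    return GESTURE_ACTIONS.get(tuple(sequence))

print(recognize(["circle"]))         # toggle_power
print(recognize(["left", "right"]))  # next_track
```

The interesting engineering problems, of course, are in segmenting the sensor stream into discrete motions in the first place, not in this trivial lookup.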

Why do such applications receive so much credit in the various tech news sites and magazines? The only thing happening is that a couple of engineers have put together a neat device that can do a trick. It’s not commercially available, there are no real users yet, there is no positive market feedback. There is only a vague promise of solving a vague problem.

Discovery Channel: It won’t be long now before all electronic devices go “nano,” and shrink to the size of a frosted mini-wheat square. You won’t know whether to turn it on or eat it. But the real question is: How do you press those teeny buttons?

I know that writing an opening line can be hard, but this one has fallen straight from the sky on the willing imagination of Tracy Staedter (the reporter in question). Did she not notice the big display on the iPhone? People may not want tiny devices at all, because they need displays. And yes, they may also require decent buttons from their devices. In other words, the premises of the promises are promiscuous (sorry, couldn’t resist); reporters are trading in their objective reflection for a nice soundbite.

Carol Fowler

A scientist called Carol Fowler has apparently done research related to mine. I was told so by Dominic Massaro at the Visual Prosody workshop at the MPI in Nijmegen. He said that my findings on sign perception are similar to findings on speech perception. Specifically, the delay of about 90 ms between detection and recognition that I found for sign perception was also found for speech perception by Fowler. But which literature should I consult?

At the Haskins Laboratories she was part of a stream of research in the 1980s that treated speech as articulatory gestures:

Carol Fowler proposed a direct realism theory of speech perception: listeners perceive gestures not by means of a specialized decoder, as in the motor theory, but because information in the acoustic signal specifies the gestures that form it.

Gesture Remote Control for TV

Two guys in Australia have given humanity the ultimate killer application for gesture recognition:

Gesture Remote Control

Just the thing we needed, really. I am going to throw my remote away as soon as I can get this little gem of technology: something that solves the giant problems we are having with TV remote controls (and replaces them with a whole new set of problems).

I think about twenty problem scenarios popped up simultaneously in my head, fighting for priority. But I am just too lazy to type them all out. Instead I will just shrug this one off and save myself the calories.

Gesture Wellformedness

I am running an experiment on the acceptability of variation in sign language. One of the things that touches upon this matter is sign wellformedness, which presupposes a sign language phonology with rules that tell whether a sign is wellformed or not. I am not done thinking that one over, but it did get me thinking: is there such a thing as gesture wellformedness?

According to these guys, there definitely is a way to do a gesture and a way not to do it. Listen to the comments for details 🙂

Can I Learn How to Sign?

Here is a nice master’s thesis by Chantal Mülders called ‘Can I Learn How to Sign: Exploring aptitude for spoken language and visual stimuli in connection with sign language’ (Master of Arts, Radboud University Nijmegen, 2007). Usually these theses aren’t published, but since so little has been published on Sign Language of the Netherlands (SLN) I offered to publish it here.

Download the full pdf (138 pages, 3.8 Mb) here

Summary: This thesis explores the relationship between linguistic and visual aptitude and sign language learning, to see which abilities are necessary for sign language acquisition and whether these differ from those needed for spoken language acquisition. Twenty-nine students enrolled in a Sign Language minor took four tests before the start of their practical sign language course: a Sound Discrimination test, a Sign Language test, a Shape Discrimination test and a Visual Spatial Discrimination test. The Sign Language test was constructed for this thesis and focused on phonological and phonetic alterations of handshape; the other tests originated from existing aptitude test batteries. After seven weeks of sign language instruction, the students took a proficiency test of their receptive and productive sign language skills. This proficiency test was constructed by the sign language teacher and served as the students’ final exam for the course. Eighteen students remained who had taken all five tests. Correlations between the four tests and the proficiency test show that the Sign Language test has a decent but non-significant correlation with Reception. The Sound Discrimination test showed no relationship, and the Shape Discrimination test had a steady but low and non-significant relationship. The Visual Spatial Discrimination test correlated negatively with Reception; this was the only significant correlation between the four tests and the proficiency test. It is likely that the subject group was not varied enough and that all subjects performed above a certain critical level.
I conclude that sign language learning appears to require different abilities from spoken language acquisition, but the current subject group is too small for a definite answer.

Chantal’s spot on the web (in Dutch).
Some believe even dogs can learn sign language … (I disagree: they are learning hand signals, or gestures, but nothing beyond that)
