
MobileASL progress

A demo describing the MobileASL research project

The MobileASL research group at the University of Washington is featured in a local news bulletin. They have been working for a few years now on the efficient transmission of ASL video over a channel with limited bandwidth. The idea is to enable mobile videophony, which has been the holy grail of mobile applications for quite some time already.

Personally, I am not convinced that technology specific to the transmission of sign language video will really have an impact, for a few reasons. Bandwidth will increase anyway, with costs going down. Processing capacity in phones will increase. Videophony is an application that is desirable for many, not just signers; in other words, there is already a drive towards videophony that will meet the requirements for signing. Furthermore, I am not sure which requirements are specific to sign language. People talk and gesture too, and I imagine they would want that to come across in videophony as well. Finally, signers can and do adjust their signing to, for example, webcams. Does the technology address a real problem?

“The team tried different ways to get comprehensible sign language on low-resolution video. They discovered that the most important part of the image to transmit in high resolution is around the face. This is not surprising, since eye-tracking studies have already shown that people spend the most time looking at a person’s face while they are signing.”

Would this not be true for any conversation between people?
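The region-of-interest idea in the quote can be sketched in a few lines: keep the block of pixels around the face at full resolution and average-pool everything else before encoding, as a stand-in for spending fewer bits on the background. A minimal pure-Python sketch (the frame values, the face box, and the pooling factor are all made up for illustration; a real codec would adjust quantization per macroblock instead):

```python
def encode_with_face_roi(frame, face_box, pool=2):
    """Keep the face region at full resolution; average-pool the rest.

    frame: 2D list of grayscale values.
    face_box: (top, left, bottom, right), half-open pixel coordinates.
    Returns a new frame of the same size where pixels outside the face
    box are replaced by their pool x pool block average.
    """
    top, left, bottom, right = face_box
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(0, h, pool):
        for x in range(0, w, pool):
            block = [(yy, xx)
                     for yy in range(y, min(y + pool, h))
                     for xx in range(x, min(x + pool, w))]
            avg = sum(frame[yy][xx] for yy, xx in block) // len(block)
            for yy, xx in block:
                # only background pixels get the coarse block average
                if not (top <= yy < bottom and left <= xx < right):
                    out[yy][xx] = avg
    return out
```

On a 4x4 frame with the face box covering the top-left 2x2 corner, those four pixels survive untouched while each remaining 2x2 block collapses to its average.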

On the positive side: perhaps this initiative for signers will pay off for everyone. It wouldn’t be the first time that designs for people with specific challenges actually addressed problems everyone had to some degree.

Ninja Strike Airtime

Ninja Strike, a killer application for gesture recognition?

This is certainly an interesting development. Previously we have seen mobile phones using motion and acceleration sensors for gesture control (see here and here). There have also been applications where the camera was used simply to capture optical flow: something in front of the camera is moving or turning in direction A, therefore the phone is moving or turning in direction A + 180 degrees (here). In this case the gesture recognition appears to go a step further: at least the hand appears to be extracted from the image. Or does it simply assume all movement is the hand? And is the position of the motion then perhaps categorized into left-middle-right? Maybe the velocity is calculated, but I don’t think so.
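A crude version of that "assume all movement is the hand, then bin it into left-middle-right" guess can be done with plain frame differencing: sum the per-pixel change in each horizontal third of the image and pick the third with the most motion. A minimal sketch, purely my own illustration of the guess rather than anything eyeSight has described (frames are 2D lists of grayscale values; the threshold is arbitrary):

```python
def motion_zone(prev, curr, threshold=10):
    """Classify where motion happened between two equal-sized frames.

    Returns 'left', 'middle' or 'right' -- the horizontal third of the
    image with the largest summed pixel change -- or None when no pixel
    changed by more than `threshold` (i.e. nothing moved).
    """
    w = len(prev[0])
    thirds = [0, 0, 0]  # summed motion energy per horizontal third
    moved = False
    for prev_row, curr_row in zip(prev, curr):
        for x, (a, b) in enumerate(zip(prev_row, curr_row)):
            diff = abs(a - b)
            if diff > threshold:
                moved = True
                thirds[min(3 * x // w, 2)] += diff
    if not moved:
        return None
    return ('left', 'middle', 'right')[thirds.index(max(thirds))]
```

Feed it two consecutive camera frames and a bright blob appearing near the right edge comes back as 'right'; identical frames come back as None. Velocity would need the zone's centroid tracked across more than two frames, which is exactly the part I doubt they do.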

Update: I do like the setup: people hold the phone with the camera in one hand, throw with the other, and check their virtual throw on the display. The virtual throwing hand on the display is more or less in the same position as your physical hand, which I think is nice.

EyeSight is a tech start-up, founded in 2004 in the Kingdom of Heaven (Tel Aviv), aspiring to use nothing but Air and a Camera to achieve a divine interaction between true techno-believers and their mobile phones. They prophesy that their technology will ‘offer users, including those who are less technologically-adept, a natural and intuitive way to input data, play games and use their mobile phone for new applications’. Heaven on Earth. Mind you, nothing is carved in stone these days. Besides, human nature and intuition are all too often deified these days anyway. Human nature is what usually gets us into trouble (not least in the Middle East).

Anyway, one of their angels called Amnon came to me in the night bearing the following message:

Hello Jeroen,
First Allow me to introduce myself. I’m Amnon Shenfeld, RND projects manager for eyeSight Mobile Technologies.
I’ve been following (and enjoying) your BLOG reports for a while, and I thought that the following news from my company, eyeSight Mobile Technologies, may make for an interesting post.
eyeSight has just launched “Ninja Strike”, an innovative mobile game featuring a unique touch-free user interface technology we call eyePlay™. Allow me to provide some background information about eyeSight, eyePlay and Ninja Strike: I’m sure you are aware of the popularity and attention innovative user interfaces are getting since the introduction of Apple’s iPhone and Nintendo’s Wii… My company’s vision is to bring this technology into the mobile market, and our first products are focused on changing the way mobile gamers play. Our new game, “Ninja Strike”, does exactly this.
You play a ninja warrior with Ninja Stars as your primary weapon. Your stars are thrown by making a throwing motion in front of the phone’s camera. Much like training in real life, during the game you will learn how to throw your weapon correctly, and improve your aim. Your enemies, the evil Kurai ninjas, will also gain strength as the game advances…
Looking forward to hear from you, I hope to see a new post in your blog soon, you’ve been quiet for a while… J
Amnon Shenfeld

Amnon, will you heed my calls? Have you answers to my burning questions above?

game demo
