A Dutch local newspaper, Leidsch Dagblad, has written a good report of the annual holiday gathering organized by the national foundation for the Deafblind (De Nederlandse Stichting voor Doofblinden). About 70 deafblind people (and their interpreters) apparently had a good time there.
Impressions of the gathering (source)
So, wouldn’t it be great to set up some research on this haptic sign language? There are plenty of people interested in sign language because it provides insight into the human language capacity. They compare how people listen and talk (and gesture) to how they watch and sign (and gesture). General human language processing must be separated from modality-dependent processing (though the contrast is actually more like oral/auditory + visual/gestural vs. visual/gestural). Very interesting nevertheless. Lots of brain research with fMRI scanners…
Just imagine what we could learn by studying deafblind people while letting them ‘talk’ or ‘listen’ in haptic sign language. They would probably have to work in pairs? Or else, what would be the stimulus material to which they must respond? Prepared haptic sign language material? Hmm, maybe some observations should be the first step, or recordings using video or perhaps data gloves?
Anyway, I would love to see more of it: investigate how deafblind people manage to defy the odds and together create a language of their own. They are apparently already telling jokes. When shall we see/feel the first haptic sign language poem? And how can it be captured, transcribed, or annotated? What sort of grammar does it have? Does iconicity play a role in sign formation and language use? Is iconicity achieved using strategies similar to those in gesture and sign language? An ambitious researcher could write a proposal for a nice post-doc position on the topic.