Here is a nice video of a performance by Johann Lippowitz of his ‘signed’ version of Torn. It is a classic routine of which many videos have circulated before. Only this time, Natalie Imbruglia joins him halfway through, and the two add some nice touches to the act.
Johann Lippowitz (real name David Armand) performs his mime version of Natalie Imbruglia’s ‘Torn’.
Yes, we all know that he does the guitar slide wrong. Get over it. It’s still really funny.
Without a doubt, his quasi-signing has pissed off many a serious singing-to-signing translator, because, needless to say, it is not any real sign language that he uses. So, is he making a mockery of signing? Could be. Is that a bad thing? Only if you think that ASL or any other sign language needs to be put on a pedestal and glorified. In general, as any politician will tell you, being the butt of a joke is something to take in stride. Just laugh along with everyone else, and if you can, play along and take the joke to the next level. Mind you, I am not saying it is weird to take offense at the joke if you are Deaf and proud of your sign language. But if you can’t beat the joke, join the laughter. It is the only effective strategy, really.
It turns out, after a bit of ‘tubing’, that Johann Lippowitz (real name David Armand) has done quite a few songs in this way:
A major issue in the teleoperation of robots (e.g. UGVs) is the idea that teleoperation can be made easier by creating telepresence. Telepresence is not limited to teleoperation, and the term appears to originate from work on teleconferencing. Below is an illustrative video about telepresence. Further down are a few more videos that give an impression of the sort of camera images an operator has at his or her disposal when teleoperating a robot.
POC: Kyle D. Fawcett, firstname.lastname@example.org
Telepresence technologies use interfaces and sensory input to mimic interaction with a remote environment, tricking your brain into thinking you’re actually in that environment. Visual telepresence tricks your eyes into thinking they’ve been transported into a remote environment. This unlocks the brain’s natural spatial mapping abilities and thus enhances operation of closed-cockpit armored vehicles and teleoperation of unmanned vehicles. The MITRE Immersive Vision System (MIVS) is a highly responsive head-aimed vision system used for visual telepresence. Videos of MIVS experiments show the effectiveness of the system for robot teleoperation and virtually see-through cockpits in armored vehicles.
Lately, I have been studying the teleoperation of UGVs. Many people have tried various forms of gestural interaction to ‘teleoperate’ robots or robotic arms. Here is a nice, larger-than-life example where they control a robotic arm with a Wiimote.
15 tonnes of steel, 200 bar of hydraulic pressure and a control system written in Python. Oh, and a Wiimote.
To be quite honest, I do not think that using a Wiimote for teleoperation is a good idea at all. The only immediate advantage of a Wiimote over a more elaborate manual controller may well be better ‘walk-up-and-use’ intuitiveness, although one still has to learn how the Wiimote ‘commands’ are mapped to the robotic arm’s motions, much as one needs to learn this with any other controller. A disadvantage may lie in the limited precision and the limited number of commands that the Wiimote offers. I think it all boils down to basic ergonomic design of a manual controller for teleoperation. Operators must be able to (learn to) map the controller’s options (degrees of freedom and commands) to the robot’s options (degrees of freedom and functions). This will likely involve a lot of prototyping and user testing to see what works best, but there is also quite a large literature on this topic (some of which originates from my current workplace at TNO, for example by Van Erp and by De Vries).
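To make the mapping problem concrete, here is a minimal sketch in Python (the language the video’s control system reportedly uses) of how tilt readings from a motion controller might be mapped to joint-speed commands for an arm. All names, gains, and limits here are hypothetical illustrations, not details from the actual system; the point is that the designer must choose a dead zone, a gain, and a safety cap for each axis-to-joint mapping.

```python
# Hypothetical mapping from a motion controller's tilt axes to
# joint-velocity commands for a robotic arm. The joint names, the
# dead zone, the gains and the speed limit are illustrative choices.

DEAD_ZONE = 0.1          # ignore small tilts so the arm does not drift
MAX_JOINT_SPEED = 0.5    # rad/s, a safety cap on any commanded speed

def tilt_to_joint_speed(tilt, gain=1.0):
    """Map a normalized tilt reading (-1..1) to a joint-speed command.

    Readings inside the dead zone produce no motion; everything else
    is scaled by `gain` and clamped to the safety limit.
    """
    if abs(tilt) < DEAD_ZONE:
        return 0.0
    speed = gain * tilt
    return max(-MAX_JOINT_SPEED, min(MAX_JOINT_SPEED, speed))

def map_controller_to_arm(pitch, roll):
    """Map two controller axes to two (hypothetical) arm joints."""
    return {
        "boom_joint": tilt_to_joint_speed(pitch),
        "swing_joint": tilt_to_joint_speed(roll, gain=0.5),
    }
```

Even this toy version shows where the ergonomics questions live: which axis drives which joint, how sensitive each mapping should be, and what hard limits protect the hardware. Those are exactly the parameters that prototyping and user testing would have to settle.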