Here is a nice video of a performance by Johann Lippowitz of his ‘signed’ version of Torn. It is a classic routine of which many recordings have circulated before. Only this time, Natalie Imbruglia joins him halfway through, and the two add some nice touches to the routine.
Johann Lippowitz (real name David Armand) performs his mime version of Natalie Imbruglia’s ‘Torn’.
Yes, we all know that he does the guitar slide wrong. Get over it. It’s still really funny.
Without a doubt, his quasi-signing has pissed off many a serious singing-to-signing translator, because, needless to say, it is not any real sign language that he uses. So, is he making a mockery of signing? Could be. Is that a bad thing? Only if you think that ASL or any other sign language needs to be put on a pedestal and glorified. In general, as any politician will tell you, being the butt of a joke is something to take in stride. Just laugh along with everyone else, and if you can, play along and take the joke to the next level. Mind you, I am not saying it is weird to take offense at the joke if you are Deaf and proud of your sign language. But if you can’t beat the joke, join the laughers. It is really the only effective strategy.
It turns out, after a bit of ‘tubing’, that Johann Lippowitz (real name David Armand) has done quite a few songs in this way:
A major theme in the teleoperation of robots (e.g. UGVs) is the idea that teleoperation can be made easier by creating telepresence. Telepresence is not limited to teleoperation, and the term appears to originate from work on teleconferencing. Below is an illustrative video about telepresence. Further down are a few more videos that provide an impression of the sort of camera images an operator has at his or her disposal for teleoperation of a robot.
POC: Kyle D. Fawcett, email@example.com
Telepresence technologies use interfaces and sensory input to mimic interaction with a remote environment, tricking your brain into thinking you’re actually in the remote environment. Visual telepresence tricks your eyes into thinking they’ve been transported into a remote environment. This unlocks the brain’s natural spatial mapping abilities and thus enhances the operation of closed-cockpit armored vehicles and the teleoperation of unmanned vehicles. The MITRE Immersive Vision System (MIVS) is a highly responsive head-aimed vision system used for visual telepresence. Videos of MIVS experiments show the effectiveness of the system for robot teleoperation and for virtually see-through cockpits in armored vehicles.
Lately, I have been studying the teleoperation of UGVs. Many people have tried various forms of gestural interaction to ‘teleoperate’ robots or robotic arms. Here is a nice, larger-than-life example where they control a robotic arm with a Wiimote.
15 tonnes of steel, 200 bar of hydraulic pressure and a control system written in Python. Oh, and a Wiimote.
To be quite honest, I do not think that using a Wiimote for teleoperation is a good idea at all. The only immediate advantage of using a Wiimote instead of a more elaborate manual controller may well be better ‘walk-up-and-use’ intuitiveness, although one still has to learn how the Wiimote ‘commands’ are mapped to the robotic arm’s motions, much as one needs to learn this with any other controller. A disadvantage may lie in the limited precision and the limited number of commands that the Wiimote offers. I think it all boils down to the basic ergonomic design of a manual controller for teleoperation. Operators must be able to (learn to) map the controller’s options (degrees of freedom and commands) to the robot’s options (degrees of freedom and functions). This will likely involve a lot of prototyping and user testing to see what works best, but there is also quite a large literature on this topic (some of which originates from my current workplace at TNO, for example by Van Erp and by De Vries).
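To make the mapping question concrete, here is a purely illustrative sketch of the kind of controller-to-robot mapping involved. It assumes a hypothetical 3-DOF arm and Wiimote-style tilt readings; the function name, joint names, and limits are all my own inventions, not taken from the video or from any real system.

```python
# Purely illustrative: map Wiimote-style tilt readings to joint-velocity
# commands for a hypothetical 3-DOF arm, clamping everything to safe limits.
def tilt_to_joint_velocities(roll, pitch, trigger, max_vel=0.5):
    """roll/pitch in degrees (-90..90), trigger in 0..1; returns rad/s per joint."""
    def clamp(x, lo, hi):
        return max(lo, min(hi, x))
    return {
        "base":     clamp(roll / 90.0, -1.0, 1.0) * max_vel,   # roll steers the base
        "shoulder": clamp(pitch / 90.0, -1.0, 1.0) * max_vel,  # pitch raises the arm
        "gripper":  trigger * max_vel,                          # trigger closes gripper
    }

print(tilt_to_joint_velocities(45, -90, 0.5))
# → {'base': 0.25, 'shoulder': -0.5, 'gripper': 0.25}
```

Even this toy version shows the design problem: the assignment of roll to ‘base’ and pitch to ‘shoulder’ is arbitrary, and the operator has to learn it, exactly as with any other controller.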
At my new workplace, TNO, we had a modest celebration today: Two robot projects in which we will be cooperating have been approved by the EC (three cheers for the authors of the proposals RL and MN!). One of those is concerned with robotics in healthcare, which brings me to the next video:
From Gecko Systems (check out more movies) comes this would-be personal robot nurse. The people in this movie appear slightly naïve in their childish enthusiasm but it’s nevertheless good to have such glimpses of the future. Who knows, perhaps you and I will be nursed by such machines? A thought I find somewhat disturbing, I must confess.
One family’s experience with a robot companion for their Mother.
More info: http://chrisharrison.net/projects/scratchinput
Scratch Input: Creating Large, Inexpensive, Unpowered and Mobile Finger Input Surfaces
We present Scratch Input, an acoustic-based input technique that relies on the unique sound produced when a fingernail is dragged over the surface of a textured material, such as wood, fabric, or wall paint. We employ a simple sensor that can be easily coupled with existing surfaces, such as walls and tables, turning them into large, unpowered and ad hoc finger input surfaces. Our sensor is sufficiently small that it could be incorporated into a mobile device, allowing any suitable surface on which it rests to be appropriated as a gestural input surface. Several example applications were developed to demonstrate possible interactions. We conclude with a study that shows users can perform six Scratch Input gestures at about 90% accuracy with less than five minutes of training and on a wide variety of surfaces.
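The abstract describes recognizing gestures from the sound of a fingernail on a surface. As a toy illustration of the kind of amplitude-envelope segmentation such a system might start from (the function, thresholds, and window sizes are my own assumptions, not taken from the paper), here is a minimal sketch that counts scratch strokes in an audio signal:

```python
import numpy as np

def count_scratch_strokes(signal, rate, threshold=0.1, min_gap=0.05):
    """Count energy bursts ('scratch strokes') in an audio signal.

    A stroke is a run of samples whose smoothed energy exceeds `threshold`;
    runs that start within `min_gap` seconds of the previous one are merged.
    """
    win = max(1, int(rate * 0.01))                       # 10 ms energy window
    energy = np.convolve(signal**2, np.ones(win) / win, mode="same")
    active = energy > threshold
    gap = int(rate * min_gap)
    strokes, last_end, i = 0, -gap, 0
    while i < len(active):
        if active[i]:
            start = i
            while i < len(active) and active[i]:
                i += 1
            if start - last_end >= gap:                  # far enough from last burst
                strokes += 1
            last_end = i
        else:
            i += 1
    return strokes

# Synthetic check: two noise bursts separated by silence → two strokes.
rate = 8000
rng = np.random.default_rng(0)
sig = np.zeros(rate)
sig[1000:2000] = rng.normal(0, 1, 1000)                  # stroke 1
sig[5000:6000] = rng.normal(0, 1, 1000)                  # stroke 2
print(count_scratch_strokes(sig, rate))                  # → 2
```

A real Scratch Input sensor would of course classify gesture shapes, not just count bursts, but segmenting the scratch sounds out of the background is the natural first step.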
This one is about six months old, but I only just saw it. Definitely good for a laugh. And it goes to show that the iPhone is not so much something that does ‘gesture recognition’. Rather, it has a really sensitive touchscreen and the ability to create various interactions using that touchscreen. Snorting coke is not a gesture; it is a recognizable ‘practical action’ (if you will allow the term here). So anyone thinking of iPhone apps can consider going in the direction of any well-defined action. I just can’t wait to see who will be the first to build iSwaffel.
Download The iSnort v0.1 NOW at www.TheiSnort.com. This is a demonstration of a simulated Apple iPhone / iPod Touch application, directed and produced by Irish (Belfast-based) filmmaker / action artist Peter ‘Magic’ Johnston. Animation and a co-directing credit go to Steven Henry.
I’m dreaming of a WHITE Christmas… just like the ones I used to BLOW…
The ultimate in last-minute Christmas (or New Year) gifts for iPhone and iPod Touch users.
Forget the Virtual Pint – it’s piss. What YOU need is an unlimited supply of ‘Class A’ Virtual Narcotics.
Be the envy of the in-crowd. Get ejected from pubs and nightclubs. Shock and amaze your so-called friends. Get oral sex from Z-list celebrities.
Introducing The iSnort – an ultra-edgy simulated iPhone / iPod Touch application.
Go on… give it a toot… it’s virtually addictive. I Can’t Believe It’s Not Cocaine.
Download The iSnort v0.1 NOW at www.TheiSnort.com for £10 – all future versions and updates are included in this one-off subscription.
Please pass this on. Going Viral on this would be naughty, but nice…