Lately, I have been studying the teleoperation of UGVs. Many people have tried various forms of gestural interaction to ‘teleoperate’ robots or robotic arms. Here is a nice, larger-than-life example where they control a robotic arm with a Wiimote.

15 tonnes of steel, 200 bar of hydraulic pressure and a control system written in Python. Oh, and a Wiimote.

To be quite honest, I do not think that using a Wiimote for teleoperation is a good idea at all. The only immediate advantage of a Wiimote over a more elaborate manual controller may well be better ‘walk-up-and-use’ intuitiveness, although one still has to learn how the Wiimote’s ‘commands’ map to the robotic arm’s motions, just as with any other controller. A clear disadvantage lies in the Wiimote’s limited precision and limited number of commands. I think it all boils down to the basic ergonomic design of a manual controller for teleoperation: operators must be able to (learn to) map the controller’s options (degrees of freedom and commands) onto the robot’s options (degrees of freedom and functions). Finding out what works best will likely involve a lot of prototyping and user testing, but there is also quite a large literature on this topic (some of which originates from my current workplace at TNO, for example by Van Erp and by De Vries).
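To make the mapping problem concrete, here is a minimal Python sketch of how one might map a Wiimote-like reading onto arm commands. All the names (`WiimoteState`, `map_to_arm`, the joint names) are hypothetical illustrations, not a real Wiimote driver API; the point is only that each controller degree of freedom must be assigned to a robot degree of freedom, with a deadzone and a velocity cap for usability, and that buttons yield only a small discrete command set — the limitation noted above.

```python
from dataclasses import dataclass

# Hypothetical Wiimote sample: two tilt angles plus two buttons.
# Names are illustrative, not taken from any real driver.
@dataclass
class WiimoteState:
    roll: float    # rotation about the long axis, radians
    pitch: float   # tilt toward/away from the operator, radians
    button_a: bool
    button_b: bool

DEADZONE = 0.1   # radians; ignore small tilts to reduce jitter
MAX_RATE = 0.5   # rad/s; cap the commanded joint velocity

def _scale(angle: float) -> float:
    """Map a tilt angle to a joint rate with a deadzone and saturation."""
    if abs(angle) < DEADZONE:
        return 0.0
    return max(-MAX_RATE, min(MAX_RATE, angle))

def map_to_arm(state: WiimoteState) -> dict:
    """Map one controller sample onto arm commands: two joint rates and a gripper."""
    return {
        "shoulder_rate": _scale(state.pitch),
        "wrist_rate": _scale(state.roll),
        # Buttons only provide discrete commands -- part of the limited
        # command vocabulary discussed above.
        "gripper": "close" if state.button_a
                   else ("open" if state.button_b else "hold"),
    }

# A tilt inside the deadzone produces no motion on that joint:
cmd = map_to_arm(WiimoteState(roll=0.05, pitch=0.3, button_a=True, button_b=False))
print(cmd)  # → {'shoulder_rate': 0.3, 'wrist_rate': 0.0, 'gripper': 'close'}
```

Even in this toy version, the design choices (which axis drives which joint, how large the deadzone is, how tilt scales to speed) are exactly the kind of thing that prototyping and user testing would have to settle.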