vol 9, issue 2

Motion control - wave to the future

10 February 2014
By Kris Sangani

Devices have been moving towards gesture control for some time: has the technology reached fruition?

'Fruit Ninja' was a major hit with smartphone users who enjoyed making the most of their touchscreens

Graphic: image-based motion control, gyroscopic motion tracking and sonar-based motion control

The Xbox Kinect gesture sensor has transformed on-screen gaming into a much more physical pastime


A new eye-monitoring system is looking to combat fatigue among professional drivers

Stephen Hawking's language composition system uses gesture control and is updated every few years

Motion control has come on in leaps and bounds (and swipes and punches) in recent years, and hardware companies are getting all e-motional in a bid to engage with users. But do we really know what to use it for yet?

Smartphones and tablets have led us into a touchable digital world. Gone are the days of pressing keys and moving a scroll wheel. Now we expect instant feedback via touch controls, and air-based gestures are the next natural step for this type of interaction.

Companies like Leap Motion are looking to make waves in the human-computer interface market with a product that enables users to manipulate on-screen items not with a joystick or a mouse or a gadget of any kind, but directly with hand gestures.

Last year, Hewlett-Packard announced an expanded partnership with Leap Motion that saw gesture-based controllers embedded in a range of HP laptops. And with Apple patenting a touchless control interface, it's time for manufacturers to start thinking about the next steps in user interaction.

Norwegian startup Elliptic Labs has made an ultrasound software development kit (SDK) that uses sound waves to interpret hand movements. Then there's PrimeSense, the company behind gesture control for Microsoft's Xbox Kinect, which demonstrated a smaller version of the Kinect sensor working with a Nexus 10 tablet at a Google developers' conference last May.

However, it's Leap Motion that has arguably the biggest potential to reach users.

Founded as OcuSpec in 2010, Leap Motion has amassed a global developer base tasked with creating apps for its gesture-based controller, which can be bought as a plug-and-play USB device or picked up as an embedded peripheral in laptops and keyboards.

CEO and co-founder Michael Buckwald has been busy promoting the company's application store, where developers have created apps for the device.

"One of the things we've done over the past few months is create a new module reference design that takes the module at the top of the peripheral, which is about 10mm, and reduces it to 3.5mm so that it's easy to embed in things like laptops and ultra-thin keyboards," he says. The company expects to launch a version for tablets and smartphones later this year.

"We've been saying for quite a while that our goal is to have Leap Motion in everything that's a computer or has a computer, so I think that tablets and phones are a natural next step. Our conversations are ongoing, but that could happen as early as this year – not only integration into tablets and phones but we'd also like to see additional original equipment manufacturer (OEM) partners on the PC side," adds Buckwald.

In fact, Buckwald sees professional applications as an important growth area on the app store. He explains how he was frustrated that a toddler can make something with modelling clay, but it takes a professional engineer hours to make a model using professional tools. "There are lots of examples of that, whether it's 3D creation or sculpting, education or exploration," he says. "Those experiences just don't happen on computers today, and our mission is to bring them to PC and to bring them to tablets."

Leap Motion's flagship software, the Airspace app store, comes bundled with units such as the HP Envy 17 Leap Motion Edition laptop, which is likely to encourage developer growth. HP is also embedding Leap Motion technology into a keyboard that is included with 11 new stock-keeping units (SKUs) – desktop and all-in-one computers – bringing the total number of Leap-enabled SKUs to 12.

One of the company's developers is Autodesk, which has a plugin for Maya, a $5,000 to $10,000 3D modelling and animation suite used by professional engineers and designers. Another is the Cubase iC Pro music app, and Leap Motion is even being implemented in surgical settings and other environments in an attempt to reduce the possibility of infection.
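To give a flavour of what those developers are writing, the snippet below is a minimal Python sketch of a hand-tracking polling loop: palm position is mapped to an on-screen cursor and a pinch acts as a click. The HandTracker class and its latest_frame() method are hypothetical stand-ins rather than Leap Motion's actual SDK, and the numbers are placeholders.

    # Illustrative only: HandTracker is a hypothetical stand-in for a
    # hand-tracking SDK, not Leap Motion's actual API.
    import time

    class HandTracker:
        """Hypothetical device wrapper returning one frame of tracking data."""
        def latest_frame(self):
            # A real SDK would return live data; here we fake a single hand.
            return {"hands": [{"palm_mm": (12.0, 180.0, -30.0), "pinch": 0.85}]}

    def palm_to_screen(palm_mm, screen=(1920, 1080), reach_mm=200.0):
        """Map palm x/y (millimetres, device-centred) to pixel coordinates."""
        x_mm, y_mm, _ = palm_mm
        px = int((x_mm / reach_mm + 0.5) * screen[0])
        py = int((1.0 - y_mm / (2 * reach_mm)) * screen[1])
        return max(0, min(screen[0] - 1, px)), max(0, min(screen[1] - 1, py))

    def run(tracker, seconds=0.1):
        """Poll the tracker and translate hands into cursor moves and clicks."""
        end = time.time() + seconds
        while time.time() < end:
            frame = tracker.latest_frame()
            for hand in frame["hands"]:
                x, y = palm_to_screen(hand["palm_mm"])
                print(f"move cursor to ({x}, {y})")
                if hand["pinch"] > 0.8:   # thumb-index pinch acts as a click
                    print("click")
            time.sleep(0.02)              # roughly 50 frames per second

    if __name__ == "__main__":
        run(HandTracker())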

Wand TV control

At the annual Consumer Electronics Show in Las Vegas in January, Samsung took the wraps off its brand-new TV remote control. It has every control method imaginable baked in, from buttons to a touch pad to motion control and even voice recognition.

"As more and more functions are added to TVs, the remote control is also evolving to accommodate the added functions," says KwangKi Park, Samsung's executive vice president of visual display sales and marketing. "We will continue to work so that our customers can use remote control more intuitively and easily."

Other features on the new remote include a 'Soccer Mode' button, which apparently flicks your TV over to a set of interface and screen modes custom-built to best show off the football, and 'Multi-Link Screen', a fancy picture-in-picture mode. The new controller was on display with Samsung's new 110in Ultra HD TV at the show.

As ever, Samsung will want to steal a march on any motion-control interface that rival Apple may introduce in the near future. But motion control has been touted for TV for several years. In fact, Philips has long been demonstrating a gesture-control device – but so far there has been little uptake of its technology by other TV manufacturers.

Television remote controls haven't changed a great deal since they were invented. The first generation looked more or less as they do now, but were often connected to your equipment with a cable. Philips's idea is a gesture-control device called uWand. It's exactly that: a wand, which you use to move through your TV's menu systems.

It is reminiscent of the Nintendo Wii remote, with the difference being that the Wii controller is designed for playing games – something that's obvious when you try to use it to press buttons in the console's menus, which is fiddly and annoying. The uWand is much smoother. The cursor doesn't jump about all over the place, and grabbing hold of things in the menus is simple.

Perhaps now is finally the right time for the uWand. Modern televisions are dominated by menu systems, especially for using the programme guides. On a standard remote, navigating through listings requires countless button presses and is quite a pain. The uWand simplifies that by letting you scroll with a twist of the controller and select items to see in more detail with a large 'OK' button, one of only three physical controls on the remote.

What's more, you can record a programme by dragging the name of the show from the electronic programme guide (EPG) to a drop zone. This makes recording things much faster than messing around with arrow keys and buttons.
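Philips has not said how the uWand converts a twist into scrolling, but the basic mapping is easy to sketch: take the controller's roll angle, ignore a small dead zone around level, and scale the remainder into a scroll speed. The Python function below is a hypothetical illustration of that idea, with made-up angles and limits, not Philips code.

    # Hypothetical twist-to-scroll mapping; not Philips's uWand firmware.
    def twist_to_scroll(roll_degrees, dead_zone=10.0, max_roll=60.0, max_rows_per_s=12.0):
        """Map wrist roll (degrees from level) to an EPG scroll speed in rows per second."""
        magnitude = abs(roll_degrees)
        if magnitude <= dead_zone:          # small accidental twists do nothing
            return 0.0
        scale = min(magnitude, max_roll) - dead_zone
        speed = max_rows_per_s * scale / (max_roll - dead_zone)
        return speed if roll_degrees > 0 else -speed

    if __name__ == "__main__":
        for roll in (0.0, 15.0, 35.0, -60.0):
            print(f"roll {roll:+.0f} deg -> {twist_to_scroll(roll):+.1f} rows/s")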

Early versions of the controller employed a Wii-style sensor bar, which fed positioning information back to the remote. Philips says that when the technology makes its way into TVs, the sensor will simply be integrated into the bezel. The remote communicates via radio frequency at the moment, but Bluetooth connectivity is in the pipeline.

Philips won't be keeping this technology to itself; any TV manufacturer can license it, with the hope that viewers will soon be able to interact with their new TVs in an entirely new and more logical manner. Whilst Philips has developed the hardware, user interface design will be handled by the TV manufacturers. Philips is also keen to point out that the uWand is well suited to navigating around a 3D environment.

Sonar

All the gesture and motion control technologies currently being promoted are line-of-sight or gyroscope-based – except one. Elliptic Labs is a Norwegian company that has developed gesture-recognition technology that uses ultrasound for touchless gesturing on smartphones and Windows PCs.

Laila Danielsen, the company's CEO, says its software "delivers touchless gesturing in a natural way all around the screen of a smartphone, tablet or laptop at 180 degrees, using the movements you use in daily life.

"It enhances the way you interact with applications, such as browsing through pictures, social media and games like Fruit Ninja or Subway Surfer; it uses little power, and works in low light or in the dark."

The software can recognise gestures within 180 degrees from the front of a display at a range of 50cm to a metre away from it. Developers or users can create their own gestures in ways that help them navigate quickly.

The Elliptic SDK is currently being licensed to mobile phone original design manufacturers (ODMs) in collaboration with Wolfson Microelectronics, for use with its Audio Hub chip and MEMS microphone technology operating at around 40kHz – although it can operate at higher and lower frequencies – explains Haakon Bryhni, chief technical officer for Elliptic Labs, who is based at the company's Norwegian R&D centre.

"Along with voice control, touchless gesture control is fast becoming one of the next-generation human/machine interfaces for mobile and wearable devices," says Andy Brennan, commercial director for Wolfson.

The main push for Wolfson and Elliptic appears to be mobile. Elliptic Labs will be at the upcoming Mobile World Congress, the mobile industry's annual tech jamboree, to demonstrate its technology to potential ODM partners. No mobile phone company has yet announced that it is to incorporate the Wolfson/Elliptic Labs touchless gesture platform, but Danielsen claims that we will see devices with the technology later this year.
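Whatever form those devices take, the ranging principle behind the ultrasonic approach is simple: a pulse reflected by the hand returns after a delay proportional to twice the hand's distance, and watching that distance change over successive pings reveals gestures such as a push towards the screen. The short Python sketch below illustrates only that physics, with invented echo delays; it makes no assumptions about how the Elliptic SDK itself is structured.

    # Illustrative ultrasonic ranging; not the Elliptic Labs SDK, just the physics.
    SPEED_OF_SOUND = 343.0  # metres per second in air at about 20 degrees C

    def echo_delay_to_range(delay_s):
        """Convert a round-trip echo delay into a one-way distance to the hand."""
        return SPEED_OF_SOUND * delay_s / 2.0

    def detect_push(ranges_m, min_travel=0.15):
        """Flag a 'push' gesture if the hand has closed on the screen by min_travel metres."""
        return len(ranges_m) >= 2 and (ranges_m[0] - ranges_m[-1]) >= min_travel

    if __name__ == "__main__":
        # Hypothetical echo delays from four successive ultrasonic pings (seconds).
        delays = [5.2e-3, 4.1e-3, 3.0e-3, 1.9e-3]
        ranges = [echo_delay_to_range(d) for d in delays]
        print(["%.2f m" % r for r in ranges])   # roughly 0.89m down to 0.33m
        print("push gesture detected:", detect_push(ranges))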

Gesture controls certainly have the potential to improve the existing interface experience, and there is a huge opportunity for a number of tech companies to define the future of the technology. The marketplace is ready and waiting for that one killer application to take us to the next level of human/machine interaction.

 


Information Technology: Gesture Control Keeps Hawking Talking

Stephen Hawking has been afflicted by motor neurone disease for decades - but this has not prevented him from continuing his innovative research on black holes and other phenomena. He has gained respect as the foremost expert on the cosmos of his generation.

In order to enable Hawking to continue his discoveries, Intel Corporation has for almost 20 years custom-built the computer that he relies on to communicate.

Last year, the company announced that it will develop a groundbreaking system to give him the ability to communicate faster, even as his disease progresses. Hawking controls his current computer through cheek movements: the motion triggers an infrared switch attached to his glasses, which allows him to select the characters of individual words on the screen in front of him. But this laborious process only lets him create a word one letter at a time - an inefficient method even with predictive text.

Responding to a request from Hawking, Intel is currently working on a new system that will incorporate the company's latest gesture and facial-recognition technology. Showcased at CES in January, Intel's perceptual computing initiative is developing new ways to interact with computers using speech, eye-tracking, gestures and facial expressions.

The new system will be able to measure mouth and eyebrow movements, as well as using facial recognition and better text prediction. The plan is to increase dictation speeds by a factor of ten.
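Intel has not published how the new system will work, but the bottleneck it is attacking is easy to illustrate. A single-switch interface has to scan through candidates until the user triggers a selection, so better word prediction directly cuts the number of switch activations needed per word. The toy Python sketch below makes that point with an invented letter ordering and a tiny dictionary; it does not reflect Intel's or Hawking's actual software.

    # Toy single-switch text entry with prefix-based word prediction.
    # Purely illustrative; not Intel's or Hawking's actual system.
    ALPHABET = "etaoinshrdlucmfwypvbgkjqxz"   # letters in rough frequency order
    DICTIONARY = ["the", "theory", "thermal", "black", "hole", "horizon", "radiation"]

    def predict(prefix, dictionary=DICTIONARY, limit=3):
        """Return up to `limit` dictionary words starting with `prefix`."""
        return [w for w in dictionary if w.startswith(prefix)][:limit]

    def activations_to_select(target, options):
        """Scan steps needed before the switch is triggered on `target`."""
        return list(options).index(target) + 1

    def cost_letter_by_letter(word):
        """Switch activations needed to spell a word one letter at a time."""
        return sum(activations_to_select(ch, ALPHABET) for ch in word)

    def cost_with_prediction(word):
        """Spell letters until the word shows up in the predictions, then pick it."""
        total = 0
        for i, ch in enumerate(word):
            total += activations_to_select(ch, ALPHABET)
            suggestions = predict(word[: i + 1])
            if word in suggestions:
                return total + activations_to_select(word, suggestions)
        return total

    if __name__ == "__main__":
        word = "theory"
        print("letter by letter:", cost_letter_by_letter(word))   # 41 activations
        print("with prediction: ", cost_with_prediction(word))    # 4 activations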

Transport: Driver Fatigue Detection

European legislation around the subject of driver fatigue is, arguably, far stricter than anywhere else in the world. But complying with the legislation is tough when you have to monitor the health and safety of your drivers and, in the case of European coach operators, passengers as well.

One such operator is Royal Beuk, which has installed Seeing Machines' fatigue-monitoring systems in its own fleet of coaches, and in the coaches and trucks of partner companies, to ensure that drivers remain alert and passengers stay safe.

Some 20 vehicles selected by Royal Beuk were initially equipped, with plans to roll the system out across the entire fleet of more than 60 coaches after the initial evaluation.

A camera in the driver's compartment uses eye-tracking technology to detect if the driver is distracted or about to fall asleep. Using sensing equipment that requires no recalibration between different drivers, the system tracks head alignment while simultaneously tracking and analysing eye behaviour. This enables warnings to be given through in-cab alerts, or for alerts to be provided to operations management for direct intervention.
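Seeing Machines does not publish its algorithms, but a widely used measure in fatigue research is PERCLOS: the proportion of time over a rolling window for which the eyes are substantially (typically at least 80 per cent) closed. The Python sketch below shows that general idea with made-up thresholds and a toy window size; it is not the Seeing Machines product.

    # Generic PERCLOS-style drowsiness check; thresholds are illustrative only.
    from collections import deque

    class PerclosMonitor:
        """Rolling estimate of the fraction of frames in which the eyes are mostly closed."""
        def __init__(self, window_frames=1800, closed_threshold=0.8, alert_level=0.15):
            self.samples = deque(maxlen=window_frames)   # e.g. 60 seconds at 30 frames/s
            self.closed_threshold = closed_threshold     # closure above this counts as 'closed'
            self.alert_level = alert_level               # PERCLOS above this raises an alert

        def update(self, eyelid_closure):
            """eyelid_closure: 0.0 = wide open, 1.0 = fully closed, one value per video frame."""
            self.samples.append(1 if eyelid_closure >= self.closed_threshold else 0)
            return self.perclos()

        def perclos(self):
            return sum(self.samples) / len(self.samples) if self.samples else 0.0

        def drowsy(self):
            return self.perclos() >= self.alert_level

    if __name__ == "__main__":
        monitor = PerclosMonitor(window_frames=10)       # tiny window just for the demo
        for closure in [0.1, 0.2, 0.9, 0.95, 0.9, 0.85, 0.1, 0.9, 0.95, 0.9]:
            monitor.update(closure)
        print("PERCLOS: %.2f" % monitor.perclos())       # 0.70 for this toy sequence
        print("raise in-cab alert:", monitor.drowsy())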

"Eye-tracking technology has a major part to play in keeping drivers and passengers safe on the roads," explains Ken Kroeger, CEO of Seeing Machines. "The technology has already been proven in extreme environments such as open-cut mines and now promises to bring the same benefits to public roads."

Kroeger explains that the system was chosen after Royal Beuk spent several years testing competing technologies. The Seeing Machines eye-tracking technology, which was originally designed for mining trucks, is also used in the mining sector by Caterpillar and BHP. Royal Beuk was active in adapting the Seeing Machines technology for the public automotive sector.

Timeline: Motion Control

1928

The Theremin is an early electronic musical instrument which is controlled without physical contact by the performer.

1993

The Sega Activator is the first controller to allow full-body sensing; it is based on Assaf Gurner's light harp and was demonstrated at the Consumer Electronics Show in January.

2006

Nintendo Wii Remote. The device senses gestures using a combination of accelerometer and optical sensor technology.

2010

Microsoft launches Xbox Kinect, a motion-control accessory for the Xbox 360, which provides full-body 3D motion capture, facial recognition and voice recognition.

2013

Leap Motion starts shipping The Leap - a small USB peripheral device designed for Windows PCs and Macs.
