eyeSight CEO Gideon Shmuel: The Company Making Minority Report A Reality (Finally)

Along with jetpacks, moon bases and robot dogs that can fetch you a cup of tea, one of the pieces of 'future tech' we get asked about most often is 'Minority Report'.

Nope, not the movie. We've had about as much of Tom Cruise as we can stand for one year.

What people mean when they say Minority Report is the movement-based 'wavy hands' control scheme shown in the film.

But while the advent of the Nintendo Wii, and latterly the Microsoft Kinect system for Xbox, suggested we weren't far off, progress has since stalled a bit.

Even Flutter - a cool Mac OS X app which lets you control iTunes with gestures and a webcam - hasn't delivered enough to realise its promise. Yet.

Then we saw eyeSight.

eyeSight is a leading maker of just that kind of 'touch free' interface, allowing users to control every function of computers, TVs and even mobile devices with a flick of the wrist.

Yes, it works (and is already available on a smartphone in Korea). They recently announced a full gesture solution for Windows 8, but what really intrigued us were their demo videos from Computex. Take a look below.

So while eyeSight are not the only company going after the new gesture-control market, they're one of the few with working tech usable on current-gen devices.

After watching their demos, we caught up over email with eyeSight's CEO Gideon Shmuel, and asked him how the tech works, how soon we'll be using it - and what the company's doing about a little thing called Microsoft Kinect.

What is the evidence people want to move to gesture-based controls?

User feedback has been amazing. Features such as call control, which lets a user answer a call on loudspeaker with a wave of a hand, were initially expected to be used in situations like those shown in the demo clips, where touching the device is not possible. But user feedback showed that people enjoy using this feature even when they can touch the device, because it is more comfortable and fun for them to use it touch-free.

In addition to this, the success of other gesture-recognition technology such as the Microsoft Kinect, and research done by our partners in the chipset and camera-module fields, shows that gesture-based control is "the next big thing" in user interfaces.

Lastly, the best piece of evidence is that all major device manufacturers - of mobile phones, tablets and PCs as well as smart TVs - are looking at gesture-based controls for their devices. We are currently in talks with tier-one manufacturers who want to integrate our technology into their devices; this would not have happened if they did not believe there is a real need and desire among end users for this new kind of interaction.

What's the learning curve - and how does it compare to mouse/keyboard and touch?

eyeSight's language of gestures was designed from day one to be intuitive and simple to use. The guideline when designing the technology is to keep the language of gestures simple, so that it stays relevant for the mass market. A good question we like to ask ourselves is “Will our mothers be able to use this easily?”

Above: eyeSight admit most people still think of Minority Report when they see their product

What are the technical hurdles still in place?

The technology is already commercial and product-ready.

During the research and development stage we had to overcome many limitations, such as "noisy" or moving backgrounds and varying lighting conditions - something many of our competitors still haven’t mastered.

As eyeSight's technology uses complex algorithms developed to work with any 2D camera already integrated into a device, there is no technological limitation and no need for special sensors.
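
To give a sense of what "any 2D camera" means in practice, here is a minimal sketch - not eyeSight's code, whose algorithms are proprietary - that grabs frames from a standard webcam with OpenCV and flags coarse motion by simple frame differencing, the kind of raw input a gesture engine would then go on to classify:

```python
# Hypothetical illustration: ordinary 2D webcam input is enough to detect
# coarse motion; no depth sensor or special hardware is involved.
import cv2

cap = cv2.VideoCapture(0)              # any standard 2D webcam
ok, prev = cap.read()
if not ok:
    raise SystemExit("no camera found")
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)                        # change between frames
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 5000:                     # arbitrary "wave" threshold
        print("motion detected - a real engine would classify the gesture here")
    prev = gray
    cv2.imshow("camera", frame)
    if cv2.waitKey(30) == 27:                             # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```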

Is Windows 8 'good enough' for eyeSight to be as useful as it wants/needs to be?

Yes, the new Metro UI of Windows 8 is perfect for eyeSight's touch-free solution. On devices with no touch screen, or when touching the device is not ideal, eyeSight's solution fits perfectly with the UI to offer a seamless interaction with this new OS.

What's the first reaction people have when using the eyeSight control method?

"WOW" obviously. I think people are usually so surprised how easy it is to use. We’re asked a lot how can I get that for my phone, tablet, PC etc? The usual suspect is people comparing our technology to that used in Minority Report.

How do you avoid 'accidental use' - i.e. making a gesture without realising?

There are various ways to ensure that only intentional motions are detected, such as activating the gesture-control engine only within the apps and features that are intended for gesture control.

We have also developed face detection and tracking capabilities that allow single or multiple users to control the system. An example would be a family watching TV in a living room: our system will detect that there are x people in the room, and it can be set to start detecting gestures only once a user "asks" for control (for instance by performing a wave-hello motion). The system will then know to focus on that specific user and allow him or her to change channels, adjust the volume and more.

So while the system is focused, other users can move freely without having their motions falsely detected. This also ensures that if no one "asks" the system for control, all gestures are ignored.
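
As a rough illustration of the control-acquisition flow Shmuel describes - gestures ignored until one tracked user performs a wave-hello, after which only that user's gestures are acted on - here is a minimal Python sketch. The Gesture type, the gesture names and the GestureRouter class are hypothetical stand-ins of our own, not eyeSight's API:

```python
# Hypothetical sketch of "ask for control" gesture routing, assuming a face
# tracker has already tagged each gesture with the user who performed it.
from dataclasses import dataclass

@dataclass
class Gesture:
    user_id: int      # which tracked face produced the gesture
    kind: str         # e.g. "wave_hello", "swipe_left", "palm_up"

class GestureRouter:
    def __init__(self):
        self.active_user = None          # nobody has control yet

    def handle(self, g: Gesture) -> str:
        if self.active_user is None:
            # No one has asked for control: only a wave-hello is honoured.
            if g.kind == "wave_hello":
                self.active_user = g.user_id
                return f"user {g.user_id} now has control"
            return "ignored"
        if g.user_id != self.active_user:
            return "ignored"             # others in the room can move freely
        if g.kind == "palm_up":          # hypothetical 'release control' gesture
            self.active_user = None
            return "control released"
        return f"execute {g.kind}"       # e.g. change channel or volume

router = GestureRouter()
print(router.handle(Gesture(2, "swipe_left")))   # ignored - no one has control
print(router.handle(Gesture(2, "wave_hello")))   # user 2 now has control
print(router.handle(Gesture(1, "swipe_left")))   # ignored - not the active user
print(router.handle(Gesture(2, "swipe_left")))   # execute swipe_left
```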

What about Kinect? What are you doing different?

Kinect is hardware-based, requires special sensors and is expensive compared to a software solution. It cannot be embedded in a device, as the specialist sensor requires space and a lot of extra processing power. Kinect is great for advanced full-body gaming, but it captures far more information than most uses need - simply changing the channel or volume, for example. There’s no need for full-body control for those actions.

eyeSight’s solution, on the other hand, is software-based and simple to integrate, with an SDK for all the main operating systems. Our technology has a small footprint and demands minimal resources from the processor.
