Touch-free gestures for tablet devices devised by eyeSight

Apparently, we’re getting one step closer to Minority Report territory. eyeSight has just introduced a new software-based technology that lets you use your tablet without ever touching it; instead, it recognizes your hand gestures.

Aside from the OS compatibility issues, this won’t work on the current Apple iPad because it does not have a front-facing camera. The tech from eyeSight uses the built-in camera on Android tablets (and Windows-based portable computers) to track and recognize hand gestures from the user. Instead of actually swiping across the touchscreen, you can just wave your hand.

This may sound like a fun but mostly useless feature, but consider this: what if you work in a lab of some kind and have toxic material on your gloves? You can’t smear that toxic ooze all over your tablet’s touchscreen, but you can wave your hands around like you just don’t care.

“Users can remotely control their music and video player, browse through eBooks, manage presentations, play games, control PC apps and carry out many other tasks without touching the keyboard or touchscreen,” said Gideon Shmuel, eyeSight’s CEO. “It is ideal for functions that do not require hands-on management and offers a new and improved user experience.”

Elliptic Labs offers something seemingly similar, but it relies on ultrasound rather than the built-in camera.


eyeSight Introduces Gesture Recognition Technology for Android Tablets and Windows-based Portable Computers

eyeSight’s software-based technology uses the device’s standard built-in camera to track the user’s hand gestures and convert them into commands, offering a touch-free experience.

HERZLIYA, Israel, Feb. 3, 2011 /PRNewswire/ — eyeSight Mobile Technologies, a developer of Touch Free Interfaces for consumer electronics, launched a software-based gesture recognition technology for portable computer devices.

The technology allows users to control applications, programs, and tools on devices such as Android tablets and Windows-based notebooks and netbooks using simple hand gestures. Moreover, eyeSight’s solution for Windows enables seamless integration with Windows applications.

Last year, eyeSight introduced its Natural User Interface for Android mobile devices, and it is now releasing its hand gesture interface solution for Android- and Windows-based computing platforms.

eyeSight’s Hand Gesture Recognition Technology uses the existing standard built-in 2D camera and does not require any hardware changes or an expensive 3D camera.
“Users can remotely control their music and video player, browse through eBooks, manage presentations, play games, control PC apps and carry out many other tasks without touching the keyboard or touchscreen,” said Gideon Shmuel, eyeSight’s CEO. “It is ideal for functions that do not require hands-on management and offers a new and improved user experience.”

The company’s groundbreaking Touch Free user interface uses advanced real-time image processing and machine vision algorithms. By using eyeSight’s user interface, manufacturers of portable devices with different types of operating systems (OS), such as Microsoft Windows 7 and Android, can offer their customers an entirely new user experience. Users can remotely control applications on a wide variety of devices, including tablets, notebooks, netbooks, All-In-One PCs, portable computers, mobile phones, and more.

The Touch Free technology is a pure software solution, highly optimized for mobile platforms, with low CPU and memory requirements. It is independent of the underlying processor and camera hardware, and it produces high-quality gesture recognition using the standard VGA cameras built into these devices.
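The press release doesn’t disclose how the recognition actually works, but a camera-only approach like this typically starts from simple motion analysis. As a purely illustrative sketch in Python (not eyeSight’s algorithm; every name and threshold here is invented), the following toy thresholds the pixel differences between consecutive grayscale frames and tracks the horizontal centroid of the moving region to classify a left or right swipe:

```python
# Hypothetical frame-differencing swipe detector, NOT eyeSight's method.
# Frames are 2D lists of grayscale values (0-255).

def motion_centroid(prev, curr, threshold=30):
    """Mean x position of pixels that changed between two frames, or None."""
    xs = []
    for row_p, row_c in zip(prev, curr):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(p - c) > threshold:
                xs.append(x)
    return sum(xs) / len(xs) if xs else None

def detect_swipe(frames, min_travel=4):
    """Classify a frame sequence as 'left', 'right', or None (no swipe)."""
    centroids = [motion_centroid(a, b) for a, b in zip(frames, frames[1:])]
    centroids = [c for c in centroids if c is not None]
    if len(centroids) < 2:
        return None
    travel = centroids[-1] - centroids[0]  # net horizontal movement
    if travel >= min_travel:
        return "right"
    if travel <= -min_travel:
        return "left"
    return None

# Synthetic 8x8 frames: a bright one-pixel-wide "hand" moving left to right.
def frame(hand_x, w=8, h=8):
    return [[255 if x == hand_x else 0 for x in range(w)] for _ in range(h)]

print(detect_swipe([frame(x) for x in range(1, 7)]))  # prints "right"
```

A production system would of course need far more than this (hand segmentation, robustness to lighting and background motion, per-gesture classifiers), which is presumably where the company’s “machine vision algorithms” come in; the point of the sketch is only that a standard 2D camera feed contains enough signal for coarse directional gestures.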

About eyeSight
eyeSight Mobile Technologies is a leader in Touch Free Interfaces for consumer electronics. Its technology allows users to control mobile and portable devices with simple hand gestures, using the built-in camera together with advanced real-time image processing and machine vision algorithms.
For more information about eyeSight, visit http://www.eyesight-tech.com. To set up a demo at Mobile World Congress (MWC) 2011, please contact us.

