Eye-tracking devices are becoming more common than you might think. In 2013 the Samsung Galaxy S4 touted an eye-reading feature that would scroll through articles and websites according to where you were looking. The S4 also used eye tracking to keep the display turned on while the user was focused on the screen, then pause or turn off the display when the user looked away.
It appears Samsung will be taking that technology even further in 2014. A recent rumor in the tech industry says that the Samsung Galaxy S5 — expected to be released in mid-2014 — will use an iris scanner for user verification. It certainly sounds futuristic, but it could become a reality for consumers very soon.
Eye scanning doesn’t stop at security applications. Many believe the humble computer mouse may eventually be replaced by your gaze. Eye-tracking company Tobii Technology showcased a device at this month’s Consumer Electronics Show (CES) that lets users move the cursor simply by looking around their screen and click by blinking.
Tobii demonstrated how their technology allows users to browse the Internet on a laptop or desktop PC using only their eyes. The company claims their technology is just as accurate as a touchscreen. “If users can hit a link or an object with their finger on the screen, they can also hit it with their eyes,” says Tobii strategic business development manager, Peter Tiberg.
The technology even extends to gaming, with the Tobii EyeX allowing eye “gestures,” like closing one eye to aim through the sights of a gun in a first-person shooter. Tobii is confident their technology can replace the mouse once users feel comfortable cutting the cord between themselves and the antiquated technology they’ve been conditioned to accept.
Another major implementation of eye tracking in mobile will eventually be found in Google Glass. A patent filed by Google in 2013 showed a proprietary technology that allows users to unlock Google Glass by tracking an image of a bird flying across the screen. It would also allow the user to scroll text on the device without having to continuously rub the side of their head.
Another developer used the sensors in the current version of Google Glass to help wheelchair users control their chairs by looking in the direction they want to go. Users can increase or decrease their speed by looking at on-screen controls, and can even identify obstacles as they approach them using only their eyes.
A third-party developer showed off DriveSafe, a prototype app for Google Glass that can detect when a driver becomes drowsy and direct them to the nearest rest stop. The company hinted at extending the technology to make Glass vibrate if the driver’s eyes begin closing completely. Glass could also make using the controls in the center console much safer, allowing drivers to adjust temperatures and change songs with a subtle eye movement rather than reaching over to adjust dials or press buttons.
Retailers also have a lot to gain from eye-tracking technology. People analytics is a rapidly emerging space that lets businesses gauge how users and visitors interact with a product or service according to a number of factors, including what their eyes are focusing on. Some online providers let businesses use eye tracking to gather people analytics, revealing which areas of a website visitors view first and which they avoid. Services such as Crazy Egg help retailers figure out why a potential customer left a website without buying something or why visitors quickly abandoned a page.
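The kind of aggregation such a service performs can be sketched in a few lines — a minimal illustration, where the region names, coordinates, and the `region_views` helper are all hypothetical rather than any vendor's actual API:

```python
from collections import Counter

def region_views(fixations, regions):
    """Count gaze fixations falling inside each named page region.

    fixations: list of (x, y) gaze coordinates in page pixels.
    regions: dict mapping region name -> (left, top, right, bottom) box.
    """
    counts = Counter()
    for x, y in fixations:
        for name, (left, top, right, bottom) in regions.items():
            if left <= x < right and top <= y < bottom:
                counts[name] += 1
    return counts

# Hypothetical page layout and a handful of recorded gaze points
regions = {"headline": (0, 0, 800, 100), "buy_button": (600, 400, 760, 450)}
fixations = [(120, 40), (640, 420), (700, 410), (400, 50)]
print(region_views(fixations, regions))  # headline: 2, buy_button: 2
```

Binning raw gaze points into page regions like this is what produces the familiar attention "heat maps" retailers use to decide what visitors look at first.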
Similar technology could be used in brick-and-mortar stores, allowing retailers to quickly evaluate which displays are performing best and what adjustments can be made to improve their laggards. They could also determine how long customers linger in front of products and where they look first, helping decide where to place a product on a shelf or how much to charge in shelving fees.
Other applications include building interactive displays that allow customers to view inventory in new ways. Rather than using a touch screen to flick through a catalog, users can stand in front of a screen and look at what the store has to offer without lifting a finger.
Microsoft isn’t taking advantage of this yet, but the Kinect that ships with the Xbox One has the hardware capability to track the eye movements of users looking at the TV. That opens up possibilities for advertisers to analyze the effectiveness of product placements on users who stream TV through the device.
The new Kinect drew significant backlash from gamers because it must always be on, leaving users wary of eavesdropping with no way to tell when it’s happening. Perhaps over time, as eye-tracking technology becomes more commonplace in consumer applications, people will embrace it. In the meantime, however, it’s not a good idea to track eye movement without users’ consent or knowledge.
Because consumers can’t easily determine whether they are being tracked, it’s even more important to strike a balance between privacy and commercial use. If consumers are made aware that interacting with a TV or other device may track their eye movements, and the benefits are made clear, they may be more willing to use it.
If eye tracking is automatically used without consent, it could be seen as a breach of privacy and swiftly disabled. It’s important to ensure that any application being built is clear about how the data will be used before it’s enabled, outlining the benefits for the user in an easy-to-understand manner.
Eventually, eye-tracking technology will likely be built into almost every camera, from the ones that come in your computer to add-ons like the Xbox Kinect. Removing the need to touch a screen to interact with it frees developers to imagine new, simpler ways to interact with their applications. Whether it’s tracking a user’s reading habits or using eye movements to play games without holding a controller or dancing around the living room, developers can now create multimedia experiences that were previously reserved for sci-fi flicks.