It’s hard sending a text message with arms full of groceries or while wearing winter gloves. Voice control is one alternative to using your fingers, but researchers are also working on other hands-free ways to control mobile devices. A team at Dartmouth College has now created an eye-tracking system that lets a user operate a smart phone with eye movement.

Mobile moves: EyePhone, developed at Dartmouth College, tracks a person’s eye relative to a phone’s screen, letting users activate applications by blinking.

Eye tracking has been used for years, primarily as a way for people with disabilities to use computers and to enable advertisers to track a person’s focus of interest. “The naturalness of gaze interaction makes eye tracking promising,” says John Hansen, an associate professor at the IT University of Copenhagen in Denmark who works on gaze tracking. “Most of the time we are looking at the information we find most interesting.”

Mobile eye tracking could be useful for all mobile phone users, says Dartmouth professor Andrew Campbell, who led the development of the new system, called EyePhone. But so far, little work has been done on eye tracking on mobile phones. This isn’t surprising: keeping track of a gaze via a mobile phone is much more challenging than on a desktop computer, because both the user and the phone are moving and the surrounding environment is so changeable.

“Existing algorithms were highly inaccurate in mobile conditions. Even if you are standing and there’s a small movement in your arm, you’d get a large amount of blurring and error,” says Campbell. The Dartmouth researchers got around this with an algorithm that learns to identify a person’s eye under different conditions. During a learning phase, the system is trained to identify a person’s eye at varying distances and under different lighting.

EyePhone runs on a Nokia 810 smart phone. A user must calibrate the system by taking a picture of the left or right eye, both indoors and outdoors. The program tracks the position of an eye relative to the screen (rather than where a person is looking). A user must move the phone slightly so the icon is directly in front of her eye, and then select an application by blinking. The program places a virtual “error box” around an eye, and can recognize the eye as long as it doesn’t move outside of this box. The phone app divides the camera frame into nine regions and looks for the eye in one of these regions.

While the eye-tracking approach is rudimentary, the researchers hope to develop more sophisticated methods soon. The system is at least 76 percent accurate in daylight while the user is standing still, and 60 percent accurate when a person is walking, says Campbell.

“This is a good step forward,” says Robert Jacob, a professor of computer science at Tufts University who also works on eye tracking. “One of the problems with the cell phone is that there’s no place for the user interface. Eye tracking seems like a very clever idea.” However, Jacob points out that tracking a user’s gaze will be more challenging on a cell phone.
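The coarse tracking scheme the article describes, a camera frame split into nine regions plus a small “error box” that tolerates slight movement, can be sketched roughly as follows. This is an illustrative sketch only: the function names, frame size, and box size are hypothetical and are not taken from the EyePhone implementation.

```python
# Hypothetical sketch of the tracking scheme described above: the
# camera frame is divided into a 3x3 grid of regions, and an "error
# box" around the last detected eye position tolerates small movement.
# All names and numbers here are illustrative assumptions.

FRAME_W, FRAME_H = 480, 640  # assumed camera frame size in pixels

def region_of(x, y, w=FRAME_W, h=FRAME_H):
    """Map an (x, y) eye position to one of nine regions, numbered 0-8."""
    col = min(2, x * 3 // w)
    row = min(2, y * 3 // h)
    return row * 3 + col

def inside_error_box(x, y, box_center, half=40):
    """True while the eye stays within the error box around box_center."""
    cx, cy = box_center
    return abs(x - cx) <= half and abs(y - cy) <= half

# Example: an eye is detected near the top-left of the frame, then
# drifts slightly; tracking holds until it leaves the error box.
eye = (70, 90)
box = eye  # the error box is anchored where the eye was detected
print(region_of(*eye))                  # 0 (top-left region)
print(inside_error_box(100, 110, box))  # True: small drift, still tracked
print(inside_error_box(200, 300, box))  # False: eye left the box
```

In this sketch, once the eye leaves the box the detector would have to re-search the nine regions for it, which is consistent with the article's description of recognition holding only while the eye stays inside the box.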