- So far, the researchers have been able to train software to identify where a person is looking with an accuracy of about a centimetre on a mobile phone and 1.7 centimetres on a tablet.
- “It’s still not exact enough to use for consumer applications,” said a graduate student at MIT.
- The study was conducted by researchers at Max Planck Institute for Informatics in Germany, Massachusetts Institute of Technology (MIT) and University of Georgia in the US.
A team of scientists, including one from India, has revealed that it is working on a new mobile application that will help users control their smartphones. The app accurately identifies where the user is looking in real time, an advance that may allow smartphones, tablets and other mobile devices to be controlled by eye movements.
In addition, for a price-sensitive market like India, it is critical to keep the price tag low for better earnings and user affordability. To make the eye-tracking app compact and accurate enough to be included in smartphones, the researchers are crowdsourcing the collection of gaze data and using it to teach mobile software how to figure out where a person is looking.
So far, the researchers at the Max Planck Institute for Informatics in Germany, the Massachusetts Institute of Technology (MIT) and the University of Georgia in the US have been able to train software to identify where a person is looking with an accuracy of about a centimetre on a mobile phone and 1.7 centimetres on a tablet.
“It’s still not exact enough to use for consumer applications,” said Aditya Khosla, a graduate student at MIT. However, he believes the system’s accuracy will improve with more data.
Eye tracking has so far required specialised hardware, which has made it tricky and expensive to implement. Adding the capability to everyday gadgets like phones and tablets through software alone could make eye tracking much more widespread, and it could let you play games or navigate your smartphone without having to tap or swipe.
The researchers started out by building an app called GazeCapture that gathered data about how people look at their phones in different environments outside the confines of a lab, 'MIT Technology Review' reported.
Users’ gaze was recorded with the phone’s front camera as they were shown pulsating dots on the smartphone screen. To make sure they were paying attention, they were then shown a dot with an “L” or “R” inside it, and they had to tap the left or right side of the screen in response.
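In outline, each trial pairs a camera frame with the known dot position, and only trials that pass the attention check are kept as labelled training data. The sketch below illustrates that logic in Python; the camera, display and touch calls are stubbed stand-ins added for illustration, not the researchers' actual GazeCapture code.

```python
import random

# Illustrative sketch only: the real GazeCapture app runs on a phone.
# The camera/UI functions below are stand-ins so the collection logic
# can be shown end to end.

SCREEN_W, SCREEN_H = 1080, 1920  # assumed portrait phone resolution, in pixels

def capture_frame():
    """Stand-in for grabbing a front-camera frame."""
    return "frame-bytes"

def show_dot(x, y, label=None):
    """Stand-in for drawing a pulsating dot, optionally labelled 'L' or 'R'."""
    pass

def get_tap_side():
    """Stand-in for reading which half of the screen the user tapped."""
    return random.choice(["L", "R"])

def collect_samples(n_dots=50):
    samples = []
    for _ in range(n_dots):
        # Show a dot at a random position and record a frame while the
        # user is (presumably) looking at it.
        x, y = random.uniform(0, SCREEN_W), random.uniform(0, SCREEN_H)
        show_dot(x, y)
        frame = capture_frame()

        # Attention check: the dot displays "L" or "R"; keep the sample
        # only if the user taps the matching side of the screen.
        expected = random.choice(["L", "R"])
        show_dot(x, y, label=expected)
        if get_tap_side() != expected:
            continue

        samples.append((frame, (x, y)))  # labelled (image, gaze target) pair
    return samples

if __name__ == "__main__":
    data = collect_samples()
    print(f"kept {len(data)} labelled gaze samples")
```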
GazeCapture information was then used to train software called iTracker. The handset’s camera captures your face, and the software considers factors like the position and direction of your head and eyes to figure out where your gaze is focused on the screen.
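As an illustration of the training idea, the sketch below regresses a face image onto an on-screen gaze point with a small PyTorch network, using the recorded dot positions as targets. It is a minimal, assumption-laden stand-in, not the published iTracker model, which also uses separate eye crops and a face-grid input; the layer sizes and data here are placeholders.

```python
import torch
import torch.nn as nn

# Minimal gaze-regression sketch in the spirit of iTracker (illustrative only).
class TinyGazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 128), nn.ReLU(),
            nn.Linear(128, 2),  # predicted (x, y) gaze location on the screen
        )

    def forward(self, face):
        return self.head(self.features(face))

# One training step: minimise mean squared error between the predicted gaze
# point and the dot position shown during data collection.
model = TinyGazeNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

faces = torch.randn(8, 3, 128, 128)   # placeholder batch of face crops
targets = torch.rand(8, 2)            # placeholder on-screen dot positions

loss = loss_fn(model(faces), targets)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```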
“About 1,500 people have used the GazeCapture app so far,” Khosla said, adding that if the researchers can get data from 10,000 people, they will be able to reduce the software’s error rate to half a centimetre, which should be good enough for a range of eye-tracking applications.
With inputs from PTI