Tell The Algorithm When To Care

Eye-tracking technology provides an advanced interface that could benefit several industries, from marketing to healthcare. In recent years, Apple and Google have acquired companies developing this technology (SMI and Eyefluence, respectively). These acquisitions likely point to a push to optimize experiences in the VR/AR space, but eye-tracking can expand to many other use cases.

Eye-tracking technology allows a developer to track where a user is looking and for how long. Eye-tracking is already in use as an interface. For example, the start-up xLabs provides real-time eye tracking with any webcam, so users can look in a particular direction and trigger an interaction with an application. In VR/AR, that may mean designing experiences that respond to gaze to deepen the sense of immersion.
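As a minimal sketch of what gaze-triggered interaction can look like, the snippet below maps a stream of gaze samples to a UI action once the user has dwelled on a target region long enough. The GazeSample structure, the 500 ms dwell threshold, and the target region are all hypothetical, not part of any particular eye-tracking SDK.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float            # normalized screen coordinate, 0..1 (hypothetical format)
    y: float
    timestamp_ms: float

def dwell_activated(samples, region, dwell_ms=500):
    """Return True if consecutive gaze samples stay inside `region`
    (x_min, y_min, x_max, y_max) for at least `dwell_ms` milliseconds."""
    start = None
    for s in samples:
        inside = region[0] <= s.x <= region[2] and region[1] <= s.y <= region[3]
        if inside:
            if start is None:
                start = s.timestamp_ms
            elif s.timestamp_ms - start >= dwell_ms:
                return True
        else:
            start = None  # gaze left the region; reset the dwell timer
    return False

# Example: trigger a button press when the user fixates the top-right corner.
samples = [GazeSample(0.9, 0.1, t) for t in range(0, 700, 50)]
if dwell_activated(samples, region=(0.8, 0.0, 1.0, 0.2)):
    print("button activated by gaze")
```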

In the world of machine learning, eye-tracking provides a unique opportunity to make our algorithms even better. Barrett et al. show that human attention, as measured by eye-tracking, can be used to improve performance on several standard natural language processing tasks.

The authors develop an RNN model that integrates text with eye-tracking measures (e.g., fixation duration) and show significant improvements on several NLP tasks: sentiment analysis, abusive language detection, and grammatical error detection. They train the neural network to learn features from both gaze and text, and the resulting model can then classify text alone. The reading behavior essentially weights the words, telling the algorithm where to focus.
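A minimal sketch of that idea is below, assuming a PyTorch-style bidirectional RNN classifier whose attention weights are additionally trained to match normalized per-token fixation durations; the exact architecture, tasks, and loss weighting in the paper differ, and the gaze_weight value here is an arbitrary placeholder.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GazeAttentionClassifier(nn.Module):
    """RNN text classifier whose attention is regularized by gaze during training."""

    def __init__(self, vocab_size, emb_dim=100, hidden=128, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)   # one attention score per token
        self.out = nn.Linear(2 * hidden, n_classes)

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens))                # (batch, seq, 2*hidden)
        attn = F.softmax(self.attn(h).squeeze(-1), dim=1)  # (batch, seq)
        context = torch.bmm(attn.unsqueeze(1), h).squeeze(1)
        return self.out(context), attn

def joint_loss(logits, attn, labels, fixations, gaze_weight=0.1):
    """Task loss plus a term pushing attention toward normalized fixation times.
    `fixations` holds per-token fixation durations and is only needed in training."""
    task = F.cross_entropy(logits, labels)
    gaze_target = fixations / fixations.sum(dim=1, keepdim=True).clamp(min=1e-8)
    gaze = F.mse_loss(attn, gaze_target)
    return task + gaze_weight * gaze

# Toy training step: gaze supervises the attention, but prediction needs text alone.
model = GazeAttentionClassifier(vocab_size=5000)
tokens = torch.randint(0, 5000, (4, 12))   # batch of 4 sentences, 12 tokens each
labels = torch.randint(0, 2, (4,))
fixations = torch.rand(4, 12)              # fixation duration per token
logits, attn = model(tokens)
loss = joint_loss(logits, attn, labels, fixations)
loss.backward()
```

The key point is that gaze only supplies a training signal; once the attention has learned to mimic human reading behavior, classification at test time uses the text by itself.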

Barrett et al. also cite several other studies that have improved accuracy with gaze tracking. The paper is worth adding to your reading list as an inspiring example of how to integrate new technology into the machine learning process.
