Our client needed to assess gaze tracking as part of their cognitive assessment platform. Specifically, they needed a mobile application that could assess cognitive functioning in real time, distinguishing healthy from impaired behavior across dementia, concussion, and intoxication patient cohorts.
Our team designed a tablet-based application that records a person’s ability to track a visual stimulus (a nystagmogram), benchmarked against healthy, non-concussed individuals. We performed 2D and 3D feature engineering on the videos to extract facial landmarks, including corrections for head position and orientation. We then trained a convolutional neural network to learn the focus of the gaze and to score a new individual’s ability to track a visual stimulus. Performance-tuned models delivered real-time responses on a mobile platform, accurate to within six degrees of visual angle.
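The two numerical pieces of this pipeline, removing head orientation before gaze scoring and measuring accuracy in degrees, can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions: the head rotation is represented as a yaw/pitch/roll matrix, and the function names (`rotation_matrix`, `normalize_gaze`, `angular_error_deg`) are hypothetical, not from the actual application; the landmark model and CNN architecture used in the project are not shown.

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Build a head rotation matrix from yaw/pitch/roll angles (radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # roll about z
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw about y
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch about x
    return Rz @ Ry @ Rx

def normalize_gaze(gaze_cam, head_rot):
    """Rotate a camera-frame gaze vector into a head-centric frame,
    cancelling the effect of head orientation on the measured gaze."""
    v = head_rot.T @ gaze_cam
    return v / np.linalg.norm(v)

def angular_error_deg(v1, v2):
    """Angle in degrees between two gaze directions -- the accuracy
    metric behind a claim like 'within six degrees'."""
    c = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))
```

For example, a gaze vector observed while the head is turned is mapped back to the same direction it would have in a neutral head pose, so the model scores eye movement rather than head movement.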