Synchronized eye movements predict test scores in online video education
Edited by David J. Heeger, New York University, New York, NY, and approved November 9, 2020 (received for review August 10, 2020)

Significance
Education is increasingly delivered online, but are students actually paying attention? Here we demonstrate that the efficacy of video instruction can be assessed remotely with standard web cameras. Specifically, we show that attentive students have similar eye movements when watching instructional videos and that synchronization of eye movements is a good predictor of individual learning performance. Measuring synchronization of eye movements while preserving privacy, as we have shown here, has the potential to make online education adaptive to attentional state and to advance mechanistic studies on the efficacy of different online education formats. Attention has become a commodity online. With the increasing abundance of video content, remote sensing of attention at scale may be relevant beyond education, including in entertainment, advertising, and politics.
Abstract
Experienced teachers pay close attention to their students, adjusting their teaching when students seem lost. This dynamic interaction is missing in online education. We hypothesized that attentive students follow videos similarly with their eyes. Thus, attention to instructional videos could be assessed remotely by tracking eye movements. Here we show that intersubject correlation of eye movements during video presentation is substantially higher for attentive students and that synchronized eye movements are predictive of individual test scores on the material presented in the video. These findings replicate for videos in a variety of production styles, for incidental and intentional learning, and for recall and comprehension questions alike. We reproduce the result using standard web cameras to capture eye movements in a classroom setting, and with over 1,000 participants at home without the need to transmit user data. Our results suggest that online education could be made adaptive to a student's level of attention in real time.
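The intersubject correlation (ISC) of eye movements referred to above can be illustrated with a short sketch. This is not the authors' exact pipeline; it assumes the common leave-one-out definition of ISC, in which each subject's gaze trace is correlated with the average trace of all other subjects, and the result is averaged over the horizontal and vertical gaze dimensions.

```python
import numpy as np

def isc(gaze, subject):
    """Leave-one-out intersubject correlation for one subject.

    gaze: array of shape (n_subjects, n_samples, n_dims), where the
          last axis holds gaze dimensions (e.g., horizontal and
          vertical screen position over the course of a video).

    Returns the Pearson correlation between this subject's trace and
    the mean trace of all other subjects, averaged over dimensions.
    """
    # Mean gaze trace of everyone except the held-out subject.
    others = np.delete(gaze, subject, axis=0).mean(axis=0)
    corrs = [np.corrcoef(gaze[subject, :, d], others[:, d])[0, 1]
             for d in range(gaze.shape[2])]
    return float(np.mean(corrs))

# Toy example: three "attentive" subjects whose gaze follows the same
# underlying trajectory with small independent noise yield high ISC.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
trajectory = np.stack([np.sin(t), np.cos(t)], axis=1)          # (500, 2)
gaze = trajectory + 0.1 * rng.standard_normal((3, 500, 2))     # (3, 500, 2)
print(isc(gaze, 0))  # close to 1 for synchronized gaze
```

In this toy setup, an inattentive subject would correspond to a gaze trace decoupled from the shared trajectory, driving their ISC toward zero; the paper's claim is that this per-subject score tracks attention and, in turn, test performance.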
Footnotes
- ↵1To whom correspondence may be addressed. Email: jmadsen@ccny.cuny.edu.
Author contributions: J.M. and L.C.P. designed research; J.M., S.U.J., P.J.G., R.S., and L.C.P. performed research; L.C.P. contributed new reagents/analytic tools; J.M. analyzed data; and J.M. and L.C.P. wrote the paper.
Competing interest statement: L.C.P. and J.M. are listed as inventors in a related pending patent application. L.C.P. was affiliated with Neuromatters Corp., KCore Analytics Inc., and 3Finches Inc. at the time of the study.
This article is a PNAS Direct Submission.
This article contains supporting information online at https://www.pnas.org/lookup/suppl/doi:10.1073/pnas.2016980118/-/DCSupplemental.
Data Availability.
Anonymized data used to produce each figure are available in MATLAB format on the Open Science Framework (https://osf.io/m7gj4/). A full list of questions and answer options can be found on the Open Science Framework (https://osf.io/fjxaq/). The code used to carry out the online experiment is available on GitHub (https://github.com/elicit-experiment).
Published under the PNAS license.
Article Classifications
- Biological Sciences
- Psychological and Cognitive Sciences