Research Article

Synchronized eye movements predict test scores in online video education

Jens Madsen a, Sara U. Júlio a, Pawel J. Gucik a, Richard Steinberg b,c, and Lucas C. Parra a
  • a Department of Biomedical Engineering, City College of New York, New York, NY 10031
  • b School of Education, City College of New York, New York, NY 10031
  • c Department of Physics, City College of New York, New York, NY 10031


PNAS February 2, 2021 118 (5) e2016980118; https://doi.org/10.1073/pnas.2016980118
Edited by David J. Heeger, New York University, New York, NY, and approved November 9, 2020 (received for review August 10, 2020)


Significance

Education is increasingly delivered online, but are students actually paying attention? Here we demonstrate that efficacy of video instruction can be assessed remotely with standard web cameras. Specifically, we show that attentive students have similar eye movements when watching instructional videos and that synchronization of eye movements is a good predictor of individual learning performance. Measuring synchronization of eye movements while preserving privacy, as we have shown here, has the potential to make online education adaptive to attentional state and advance mechanistic studies on the efficacy of different online education formats. Attention has become a commodity online. With the increasing abundance of video content, remote sensing of attention at scale may be relevant beyond education, including entertainment, advertising, and politics.

Abstract

Experienced teachers pay close attention to their students, adjusting their teaching when students seem lost. This dynamic interaction is missing in online education. We hypothesized that attentive students follow videos similarly with their eyes. Thus, attention to instructional videos could be assessed remotely by tracking eye movements. Here we show that intersubject correlation of eye movements during video presentation is substantially higher for attentive students and that synchronized eye movements are predictive of individual test scores on the material presented in the video. These findings replicate for videos in a variety of production styles, for incidental and intentional learning and for recall and comprehension questions alike. We reproduce the result using standard web cameras to capture eye movements in a classroom setting and with over 1,000 participants at home without the need to transmit user data. Our results suggest that online education could be made adaptive to a student’s level of attention in real time.

  • online education
  • eye tracking
  • intersubject correlation
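The abstract's central measure, intersubject correlation (ISC) of eye movements, can be illustrated with a minimal sketch: correlate each viewer's gaze trace against the mean trace of all other viewers, then average those correlations over the group. Attentive viewers who follow the video similarly yield a high ISC; distracted viewers yield a low one. The synthetic gaze data and the one-dimensional, equal-length traces below are illustrative assumptions, not the paper's exact pipeline (which the full text, not this page, describes).

```python
from statistics import fmean
import random

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    mx, my = fmean(x), fmean(y)
    dx = [a - mx for a in x]
    dy = [b - my for b in y]
    num = sum(a * b for a, b in zip(dx, dy))
    den = (sum(a * a for a in dx) * sum(b * b for b in dy)) ** 0.5
    return num / den if den else 0.0

def isc(gaze_traces):
    """Intersubject correlation: each subject's gaze trace correlated
    with the mean trace of all *other* subjects, averaged over subjects.
    gaze_traces: list of equal-length 1-D gaze position series."""
    scores = []
    for i, trace in enumerate(gaze_traces):
        others = [t for j, t in enumerate(gaze_traces) if j != i]
        mean_other = [fmean(col) for col in zip(*others)]  # transpose, then average
        scores.append(pearson(trace, mean_other))
    return fmean(scores)

# Attentive viewers: similar traces with small idiosyncratic noise -> high ISC.
attentive = [[t + 0.1 * ((i * 7 + t) % 3) for t in range(50)] for i in range(4)]
# Inattentive viewers: unrelated gaze wandering -> low ISC.
rng = random.Random(0)
inattentive = [[rng.random() for _ in range(50)] for _ in range(4)]
print(isc(attentive) > isc(inattentive))  # expect True
```

In practice, horizontal and vertical gaze (and, per the abstract, webcam estimates of them) would each contribute a trace per viewer; the leave-one-out-versus-group-mean structure shown here is the standard way ISC-style measures are computed.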

Footnotes

  • 1 To whom correspondence may be addressed. Email: jmadsen@ccny.cuny.edu.
  • Author contributions: J.M. and L.C.P. designed research; J.M., S.U.J., P.J.G., R.S., and L.C.P. performed research; L.C.P. contributed new reagents/analytic tools; J.M. analyzed data; and J.M. and L.C.P. wrote the paper.

  • Competing interest statement: L.C.P. and J.M. are listed as inventors in a related pending patent application. L.C.P. was affiliated with Neuromatters Corp., KCore Analytics Inc., and 3Finches Inc. at the time of the study.

  • This article is a PNAS Direct Submission.

  • This article contains supporting information online at https://www.pnas.org/lookup/suppl/doi:10.1073/pnas.2016980118/-/DCSupplemental.

Data Availability.

Anonymized data used to produce each figure are available in MATLAB format on the Open Science Framework (https://osf.io/m7gj4/). A full list of questions and answer options can be found on the Open Science Framework (https://osf.io/fjxaq/). The code used to run the online experiment is available on GitHub (https://github.com/elicit-experiment).

Published under the PNAS license.



Article Classifications

  • Biological Sciences
  • Psychological and Cognitive Sciences