Teaching critical thinking

Edited by Samuel C. Silverstein, College of Physicians and Surgeons, New York, NY, and accepted by the Editorial Board July 14, 2015 (received for review March 17, 2015)
August 17, 2015
112 (36) 11199-11204

Significance

Understanding and thinking critically about scientific evidence is a crucial skill in the modern world. We present a simple learning framework that employs cycles of decisions about making and acting on quantitative comparisons between datasets, or between data and models. With opportunities to improve the data or models, this structure is appropriate for use in any data-driven science-learning setting. It led to significant and sustained improvement in students’ critical thinking behaviors compared with a control group, with effect sizes far larger than needed for mere statistical significance.

Abstract

The ability to make decisions based on data, with its inherent uncertainties and variability, is a complex and vital skill in the modern world. The need for such quantitative critical thinking occurs in many different contexts, and although it is an important goal of education, that goal is seldom achieved. We argue that the key element for developing this ability is repeated practice in making decisions based on data, with feedback on those decisions. We demonstrate a structure for providing suitable practice that can be applied in any instructional setting that involves acquiring data and relating those data to scientific models. This study reports the results of applying that structure in an introductory physics laboratory course. Students in an experimental condition were repeatedly instructed to make and act on quantitative comparisons between datasets, and between data and models, an approach common to all science disciplines. These instructions were gradually faded across the course. After the instructions had been removed, students in the experimental condition were 12 times more likely than a control group performing traditional experimental activities to spontaneously propose or make changes to improve their experimental methods. They were also four times more likely to identify and explain a limitation of a physical model using their data, and they showed much more sophisticated reasoning about their data. These differences between the groups persisted into a subsequent course taken the following year.
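
The core activity described in the abstract, deciding whether two measurements agree within their uncertainties, can be sketched in a few lines of Python. This is an illustrative sketch, not code from the study; the function name, example numbers, and decision thresholds are assumptions introduced here for clarity.

```python
import math

def t_score(a, sa, b, sb):
    """Difference between measured values a and b, in units of their
    combined uncertainty (sa and sb are the standard uncertainties)."""
    return abs(a - b) / math.sqrt(sa**2 + sb**2)

# Example: two pendulum-period measurements (illustrative numbers)
t = t_score(1.02, 0.01, 1.05, 0.02)

if t < 1:
    verdict = "consistent"
elif t > 3:
    verdict = "inconsistent"
else:
    verdict = "inconclusive -- improve the measurement"

print(f"t = {t:.2f}: {verdict}")  # → t = 1.34: inconclusive -- improve the measurement
```

An intermediate score like this is what drives the cycles described above: rather than stopping at "the values roughly agree," students act on the comparison, improving their measurements or refining the model and then comparing again.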

Acknowledgments

We acknowledge the support of Deborah Butler in preparing the manuscript. We also thank Jim Carolan for the diagnostic survey data about the study participants. This research was supported by the University of British Columbia’s Carl Wieman Science Education Initiative.

Supporting Information

Supporting Information (PDF)


Information & Authors

Information

Published in

Proceedings of the National Academy of Sciences
Vol. 112 | No. 36
September 8, 2015
PubMed: 26283351

Submission history

Published online: August 17, 2015
Published in issue: September 8, 2015

Keywords

critical thinking, scientific reasoning, scientific teaching, teaching experimentation, undergraduate education

Notes

This article is a PNAS Direct Submission. S.C.S. is a Guest Editor invited by the Editorial Board.

Authors

Affiliations

N. G. Holmes1 [email protected]
Department of Physics, Stanford University, Stanford, CA 94305;
Carl E. Wieman
Department of Physics, Stanford University, Stanford, CA 94305;
Graduate School of Education, Stanford University, Stanford, CA 94305;
D. A. Bonn
University of British Columbia, Vancouver, BC, Canada

Notes

1
To whom correspondence should be addressed. Email: [email protected].
Author contributions: N.G.H., C.E.W., and D.A.B. designed research; N.G.H. and D.A.B. performed research; N.G.H. analyzed data; and N.G.H., C.E.W., and D.A.B. wrote the paper.

Competing Interests

The authors declare no conflict of interest.

