How Research on Student Learning Explains the Effectiveness of Empirically Driven Classroom Activities
by Elise Piazza, Vision Science
Recipient of the Teagle Foundation Award for Excellence in Enhancing Student Learning, 2014
Related Teaching Effectiveness Award essay: Achieving Widespread Participation through Evidence-Based Classroom Discourse
As a GSI for Introduction to Cognitive Science, I developed several empirically driven activities to increase student participation by engaging the entire class in practicing scientific methodology. In her talk for the GSI Center’s “How Students Learn” project, Professor Kathleen Metz gave a presentation on how active engagement in the scientific method improves students’ conceptual understanding; her analysis of how students learn was borne out in the ways students engaged in the activities for my section.
The topic of human cognition was ideally suited for such in-class experiments, since students could act as both scientists and participants. At the outset of the course, I collected survey data on the students’ initial opinions and biases (about concepts like consciousness and artificial intelligence), the results of which served as a philosophical icebreaker and stimulated discussion. I reenacted classic experiments on moral decision-making and exposed the students to perceptual illusions that reveal the brain’s efficient strategies for understanding the world. We also conducted our own Turing Test (a test of a machine’s ability to exhibit human-like behavior), in which I challenged students to stump chatbots (artificial conversational agents) with creative questions; the results revealed how the ambiguities of human language make it so difficult to implement in an artificial system.
Building upon previous work (Brown et al. 1989; NRC 2007), Metz asserts that productive participation in scientific practices (such as prediction, hypothesis testing, and modeling) and discourse stimulates a deeper conceptual understanding than memorization of facts. In my discussion sections, instead of simply reiterating the myriad theories of cognitive science introduced during lecture, I conducted in-class experiments to directly engage students with the scientific process. I also held several debates on controversial course topics (e.g., “Do we have free will?”), in which teams of students argued a position using empirical evidence to support their claims. Through these activities, they acquired skills like conducting literature searches, sharing hypotheses, and interpreting data.
Metz also highlights how scientific inquiry invites students to critique common assumptions and biases — for instance, by comparing their intuitions during thought experiments to empirical data. Cognitive science, which spans a range of disciplines (from theoretical to empirical), lends itself well to this practice. After covering classical philosophical theories and thought experiments that make predictions about the nature of the mind, I introduced actual experimental paradigms that have been used to test those predictions. For example, after learning the concept of “qualia,” we discussed variation in human color sensitivity as evidence that each person’s experience of the world is unique.
Metz stresses that students who “actively construct their own knowledge” learn best. This aspect of the course could be further extended. To expand the class’s participation in scientific culture, I could ask students to design their own experiment to test a certain theory, write a report explaining the results, and even present their findings at an undergraduate research conference. Metz (2004) also emphasizes the importance of identifying and resolving uncertainty in scientific data, so I would encourage students to critique each other’s results and interpretations. To further expose them to scientific practices, I could incorporate computational tools (e.g., software used to model, analyze, and visualize data) into assignments (e.g., write a game-playing program).
I evaluated the success of these methods in several ways. In section, I observed that the activities increased participation to nearly 100% — compared to about 20% during our more traditional discussions — and that previously reticent students flourished within the structured format of the debates. Students began coming to office hours to discuss the research covered in section, relevant scientific literature they had found, or my own research. Over the course of the semester, students’ essays showed major improvements in the use of scientific evidence to defend arguments and in illustrating concepts with vivid, real-life examples, indicating a deeper understanding of the material than at the start of the course. In the teaching evaluations, students appreciated the interactive style and empirical evidence that supplemented the more philosophically oriented lecture material. Several students were so engaged that they sought research assistantships on campus, which I was able to help them secure. To add a more scientific assessment in a future course, I could quantitatively compare scientific reasoning skills (Bao et al. 2009) at the beginning and end of the course, using quizzes that test conceptual knowledge as well as the ability to generate hypotheses, design experiments, and evaluate interpretations.
Bao, Lei, et al. 2009. “Learning and Scientific Reasoning.” Science 323:586–587.
Brown, John S., Allan Collins, and Paul Duguid. 1989. “Situated Cognition and the Culture of Learning.” Educational Researcher 18:32–42.
Metz, Kathleen E. 2004. “Children’s Understanding of Scientific Inquiry: Their Conceptualization of Uncertainty in Investigations of Their Own Design.” Cognition and Instruction 22:219–290.
National Research Council. 2007. Taking Science to School: Learning and Teaching Science in Grades K–8. Washington, DC: The National Academies Press.