As science teachers, specialists, or supervisors, we are often asked to explain to our principals why students are struggling with the EOC (End-of-Course) science assessments. When asked to account for disappointing passing rates, we often shake our heads and attribute them to student motivation at the sophomore level: students didn’t have a need to pass, so they didn’t try.
When I faced the same situation as a science supervisor, I began to investigate this phenomenon and found my district was not alone in the challenge of raising scores on our high school exams. Surrounding districts had the same disappointing passing rates. I interviewed teachers, parents, teacher specialists, and students, and found they all made the same comments, often describing the EOC as irrelevant, too difficult, or not aligned with what was being taught.
To gather further evidence, I held focus groups with students, the majority of whom had not passed the EOC. With each group, I asked the students to work through 10 items from the released EOC with a partner and to discuss their reasoning and solutions aloud. I recorded each session and collected their discussions as data, including comments, concerns, questions, and misconceptions. As I observed and recorded, I began to understand the value of direct observation in constructing explanations from data and evidence.
When students discussed the test items, teams began by breaking down the question, looking for clues in the prompt, working through diagrams, doing calculations, eliminating answer choices, and then selecting an answer. They often discussed the labs they had done, the lectures they had heard, and the textbook chapters they had read. I even asked them to record their scores on chapter tests covering the same scientific material, which were usually passing grades.
I then met with groups of teachers to watch the videos and review the data I had collected. The teachers were impressed with the effort each student put into analyzing and solving each test item. They seemed surprised that students recalled classroom discussions and lab experiences associated with each concept. But they were dismayed when misconceptions began to appear, when team discussions broke down during the analysis of evidence, and when wrong answers were selected based on distractors, or “red herrings,” within the diagrams, answer choices, or prompts.
The teacher groups used this new evidence to create explanations for the EOC scores of their own students. Several of these explanations were that their students:
- Lacked experience drawing conclusions based on lab observations.
- Were not given opportunities to deal with distracting evidence during lab experiences.
- Were not required to provide scientific reasoning for the conclusions drawn during investigations.
- Were never given an opportunity to reveal their misconceptions prior to instruction about scientific phenomena.
- Did not engage in discussions or argumentation about their data after lab investigations.
- Were provided lab experiences that did not promote inquiry, asking questions, or drawing conclusions based on analysis of evidence.
- Did not have experience writing scientific explanations that incorporated the data collected in lab investigations and the scientific reasoning, with scientific vocabulary, taught during classroom instruction.
Armed with these explanations, and with the fact that students drew on their lab experiences to respond to EOC test items, a team was formed to seek strategies and resources to change the way science investigations were conducted within the district science curricula. The goal was to make labs more meaningful to students while keeping the workload reasonable for teachers.
One specific strategy found to be highly effective was based on the research of Drs. Joe Krajcik and Katherine McNeill on scientific explanations. Their research showed how investigations that required data collection followed by scientific explanations of the evidence, using the format of “Claim, Evidence, and Reasoning” (CER), changed students’ understanding of scientific phenomena and resulted in greater achievement.
This strategy was chosen because teachers were interested in trying it. Professional Development and Professional Learning Communities focused on the strategy to help facilitate the changes needed in instructional practice within the district science community. As a result of integrating this strategy into district curriculum and assessments for grades K-12 and in the AP programs, teachers found it easy to incorporate, and students were able to apply and practice their skills in data analysis and drawing conclusions.
As a result, teachers and students began to look at lab investigations differently. Lab investigations no longer had a “cookbook” quality. Labs were conducted to collect data and examine evidence. Students discussed the evidence they collected through discourse and argumentation. Scientific explanations were generated, and rebuttals of those explanations became part of every lab investigation and conclusion.
Changing the way investigations were set up and assessed through CER met the needs identified by the focus groups. Students applied these same strategies to unit and EOC test items, which resulted in increased scores on unit tests and the EOC assessment in the following years.
As a professional science community, the district was able to use the CER format to determine an accurate explanation for the EOC data, which led to more questions and to the implementation of a strategy that changed the data.
As a classroom strategy, CER led students to a deeper understanding of science and of the value of scientific investigations for increased achievement. Incorporating the practice of scientists and engineers, engaging in scientific discourse grounded in evidence and data analysis, is authentic to both teaching and the scientific community.
By Dr. Terry Talley,
National Institute for STEM Education