For over 40 years, the National Assessment of Educational Progress (NAEP) tests have assessed how well students can calculate and what they know. However, according to a recent report, two new additions to the test show that while students may understand how to complete scientific experiments, they struggle to explain why they got those results.
“It’s tragic that our students are only grasping the basics and not doing the higher-level analysis and providing written explanations needed to succeed in higher education and compete in a global economy,” NAEP chairman David Driscoll said in a press statement announcing The Nation's Report Card: Science in Action: Hands-On and Interactive Computer Tasks From the 2009 Science Assessment.
The report shows that students were challenged by parts of investigations that required manipulating more variables, making strategic decisions while collecting data, and explaining why certain results supported their conclusions.
Originally administered as a part of the 2009 NAEP tests, the new portions use a combination of interactive computer tasks and hands-on tasks to determine whether students are able to solve problems as scientists do.
“This innovative format allows for a richer analysis than a paper-and-pencil test,” Driscoll said. “Interactive computer tasks allow us to more deeply examine students' abilities to solve problems because the tasks generate much more data.”
The first addition to the science tests, Hands-On Tasks (HOTs), while not necessarily a new technique in NAEP tests, gave students 40 minutes to conduct experiments with physical objects. Alan J. Friedman, a member of the National Assessment Governing Board, explained to The Washington Post that the second addition, Interactive Computer Tasks (ICTs), goes beyond the HOTs portion of the test. The ICTs measured how students ran their own experiments in simulated natural or laboratory environments on a computer, with the ability to go back, adjust variables and correct their mistakes.
“Increasingly, graduates are called on to do things in today’s world that require more than rote memory and how to follow instructions,” Friedman said. “There was no way to memorize for this test and no amount of rote drill and practice that could prepare students for it; these tests test what students can do in more complex environments and the richness of what students can do with real stuff.”
However, the new report explains that 42 percent of 4th graders, 41 percent of 8th graders and 27 percent of 12th graders gave correct answers on the steps they attempted but struggled to effectively explain their conclusions using the evidence they had just collected.
“Having something that is more than just anecdotes, that is rigorous research across a wide range of students, is very helpful, because it’s a better form of evidence on which to make decisions,” Chris Dede, a professor at the Harvard Graduate School of Education, told The Washington Post.
This report comes just months after another report showed that students have made only slight gains in their performance in STEM (Science, Technology, Engineering and Mathematics) subjects, echoing calls for educators to strengthen and expand their STEM programs.
“That tells us that our science teaching isn’t getting us as far as we need to go,” Dede continued.
James Dugan is a writer for 360 Education Solutions.