Issues in Earth Science
“Eww, There’s Some Geology in my Fiction!”
Issue 8, Nov 2017
Suggestions for Activities and Discussions to accompany a Reading of
Breaking the Ice by Justin Short
Designing an Experiment
One of the least practiced (in the classroom) of the practices of science is experimental design. We may give students a pre-made experiment to follow like a recipe, or experimental data to interpret, or even some materials to ‘play around with’ experimentally, but we don’t often help them figure out how to invent an experiment to answer a question.
One reason for this lack of experimental design in the classroom is that students’ experimental designs usually don’t ‘work,’ and so the activity does not reinforce the ‘content’ that we want them to learn. However, learning how to invent a science experiment is at least as important as simply learning the theories—the ‘content’—of the discipline. Because we have given students few opportunities to do experimental design in the past (and have not guided them toward a better final outcome), they don’t know how to constrain their variables, address experimental challenges, or even figure out whether or not their experimental results make sense.
Interpreting Experimental Results
What do we mean by “the experiment worked”? Most often, we mean that the experiment yielded the result that we already expected, the result that the book, or the teacher, or the lab manual says we should get. However, authentic science researchers don’t know the results of their experiments ahead of time, and leading classroom students to think that research results should be judged by how well they match the teacher’s answer key undermines our ability to engage them in the practices of science. Judging experimental results by how well they match an answer key also undermines our efforts to teach the nature of science: a knowledge-building enterprise that relies on empirical evidence.
Typically, in the college classroom, student-written reports include phrases like “the experiment failed due to human error.” This failure to analyze the experimental results more thoroughly and consider what the results might mean, or exactly what went ‘wrong,’ reflects an answer-key-driven experience with science in middle and high school. Real science is usually quite confusing and foggy at first, and there are no answer keys. The researcher needs to figure out what the experimental results mean from the results themselves, not from a teacher with an answer key in their back pocket. How can we engage students in similar experiences?
One place to start is to push students beyond “failed due to human error” to examine the actual results of their experiments—not what they think the results should have been. In every experiment, what happens is what happens. It certainly reveals something about the universe and the results are certainly consistent with the laws of nature. What exactly went ‘wrong’ with the experiment? What do the actual results tell us?
Choosing a Question for Experimentation
In this issue’s activity, we put these two gaps in educational preparation (designing an experiment and evaluating whether or not it ‘worked’) together to create an activity in which students design their own experiment and then evaluate what the results mean. They further consider whether or not the results are consistent enough (high enough precision) to yield real and meaningful understanding about the universe.
This practice of designing experiments and figuring out what the results mean is, arguably, the most important practice of science. It is also one with which students have very little, if any, experience. So, let’s dive into some real science!
One element of the story “Breaking the Ice” turned on the idea that ice on Earth is more likely to break than the much colder ice at the rings of Saturn where the Ringlets came from. Is this implied decrease in strength with increasing temperature true? Can we test it experimentally? How can we design an experiment, using commonly available materials, to test this possibility?
Developing an experimental plan and testing ideas to see if they work as we expect
A creative activity like inventing an experiment is not something that can be done ‘on demand’ in a classroom, like you might ask students to explain a theory or recall information. Creative activities take time to think about, to pose and test, and then to develop solutions to experimental challenges (perhaps another reason it is done so little in the classroom). Thus, you can’t expect students to design and do an experiment in a single class period. Instead, you might start the investigation by having students spend a few minutes over several days talking about possible ways to develop the experiment. As possible approaches begin to jell, you can encourage students (or help them) gather materials to test whether some of the ideas will work. Students can then revise ideas (again, perhaps using a few minutes over several class periods). After a couple of iterations like this, students can plan their experiment and carry it out using larger class-time blocks.
To design an experiment, students will need to work through a number of ideas, including the following.
What are the independent and dependent variables implicit in the experimental question? (In this case, temperature of ice and the force at which that ice will fail, or break.)
How can the independent variable be experimentally controlled (in this case, how can the temperature of the ice be changed and controlled), and how can the dependent variable be measured (in this case, how can the force at which the ice breaks be measured)?
How can other variables and factors be held constant? (In this case, students need to identify other factors that might affect the strength of the ice, like the thickness of the ice, its width, impurities and defects, etc., and figure out how those can be held constant over several experiments.)
An Example Experimental Design (using commonly available materials)
Temperature control—the independent variable:
If students are in a region where temperature falls significantly below freezing, or if they have access to a walk-in freezer with temperature control, they might be able to design an experiment using those to control temperature. However, if those aren’t available, they will need to design an ‘experimental chamber’ in which temperature can be varied. Variable temperature might be achieved, for example, by ice mixtures with varying amounts of salt. Adding salt to ice lowers the freezing temperature (which allows us to melt road ice, and to freeze ice cream in homemade ice cream freezers).
One example experimental chamber is shown in the picture below, in which salt and ice are added to a Tupperware container inside a Styrofoam box (which provides some insulation from the warm room conditions). A metal can placed inside the salt-ice mixture provides a dry area in which experiments can be done. The metal is a good heat conductor and thus allows the cold conditions of the salt-ice mixture to lower the temperature inside the can. A Styrofoam top to the can limits mixing with warmer air in the room. The salt used for the example experimental design described here and below was a mixture of calcium chloride and magnesium chloride purchased at Walmart (Prestone Driveway Heat) and used to melt driveway ice down to -32°C (meaning the equilibrium temperature for a mix of this salt and ice is about this temperature). If you use this with your students, take note that this is not edible table salt and there are safety considerations listed on the container. Temperature was monitored inside the experimental chamber prior to each experiment using a digital thermometer as shown in the second picture below.
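The effect of dissolved salt on the freezing temperature can be estimated with the textbook colligative-property formula ΔT = i·Kf·m. The sketch below is a rough back-of-the-envelope check, not a precise model: the formula strictly applies only to dilute solutions, while an ice bath operates near saturation, where the true (eutectic) limits govern. The near-saturation molality used here is an illustrative assumption.

```python
# Rough estimate of freezing-point depression using the dilute-solution
# colligative formula dT = i * Kf * m. At ice-bath concentrations the
# real limits are the eutectic temperatures (about -21.1 C for table
# salt, about -32 C for the calcium/magnesium chloride blend above),
# but the estimate lands in the right neighborhood.
KF_WATER = 1.86  # cryoscopic constant of water, C·kg/mol

def freezing_point_depression(molality, ions_per_formula_unit):
    """Return the estimated drop in freezing temperature, in degrees C."""
    return ions_per_formula_unit * KF_WATER * molality

# Near-saturated table salt (NaCl): roughly 6.1 mol per kg of water
# (an illustrative value), dissociating into 2 ions (Na+ and Cl-).
dT = freezing_point_depression(6.1, 2)
print(f"Estimated depression: {dT:.1f} C")  # ~22.7 C, close to the
# observed NaCl eutectic of 21.1 C below freezing
```

Students comparing this estimate to the -21.1°C table-salt limit can see both why salt-ice baths get cold and why simple formulas only approximate concentrated solutions.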
Force measurement—the dependent variable:
We had to “try out” a few ideas before finding a solution to measuring force, and your students will likely need to do this as well. Our first experimental vision involved using thin slices of ice (so thin that we discovered we couldn’t manufacture them with reproducible thicknesses) with correspondingly small amounts of force needed to break the ice. We thought that a small 16 oz food scale should suffice. After some trial and error, we discovered that 16 oz of force was nowhere near enough to break the slices of ice we were able to manufacture. Thus, we gravitated to a mechanical bathroom scale allowing measurement of forces up to 300 pounds. The bathroom scale was placed against the wooden piston extending into the experimental chamber as shown in the picture below. Force could be monitored as it was gradually increased up to the point of ice breakage, and the force at breakage recorded.
Some mechanism is needed to convey the force to the ice chunk inside the experimental chamber. We chose a quarter-inch wooden dowel rod for the piston to convey the force (again, this was a revision from a previous idea using a lollipop stick, which proved not to be strong enough).
Control thickness and other dimensions of ice
As mentioned above, we initially thought we could produce thinner, weaker pieces of ice. Our initial plan was to freeze ice in an ice cube tray and then cut thin slices, or perhaps slivers, with a hacksaw. However, the brittle ice fractured unreproducibly as we cut it with a hacksaw and this approach had to be abandoned.
Instead, we purchased an ice cube tray with a flat bottom (so ice thickness would remain uniform across the ice sheet) and only partially filled it with water, creating thin layers of ice of uniform width, length, and thickness. To keep thickness uniform, we measured 2 tsp water into each ice cube block (each with dimensions of 1.75” x 1.75”), yielding, once frozen, ice sheets that were 0.2” in thickness.
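The reported 0.2” thickness follows from the measured volume and the tray dimensions, and students can verify it with a quick calculation like the sketch below (the ~9% expansion of water on freezing is a standard textbook figure, included here as an assumption).

```python
# Quick check that 2 tsp of water in a 1.75" x 1.75" tray cell should
# freeze into a sheet roughly 0.2" thick, as reported above.
TSP_TO_ML = 4.929      # 1 US teaspoon in milliliters
IN_TO_CM = 2.54        # inches to centimeters
ICE_EXPANSION = 1.09   # water expands roughly 9% on freezing

water_ml = 2 * TSP_TO_ML                  # ~9.86 mL, i.e. ~9.86 cm^3
cell_area_cm2 = (1.75 * IN_TO_CM) ** 2    # ~19.8 cm^2 tray-cell footprint
ice_thickness_in = water_ml * ICE_EXPANSION / cell_area_cm2 / IN_TO_CM
print(f"Predicted ice thickness: {ice_thickness_in:.2f} in")  # ~0.21 in
```

A check like this is itself a small exercise in experimental design: if the measured thickness disagreed noticeably with the prediction, that would flag a problem with either the volume measurement or the tray dimensions.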
Control the leverage that the force has on the ice.
The thickness of ice is not the only factor that influences its break-strength. The geometry of how the force is applied and the leverage on the ice also play a part. For example, imagine that you have a slab of ice lying across a 1” gap, and then the same thickness slab lying across a 12” gap. Clearly, the slab across the 12” gap will be easier to break.
To keep the gap constant, we laid the ice slab across a 1 1/8”-wide circular hole drilled in a wooden block. The ice was centered over this hole, and the piston conveying the force to the ice was centered above the hole and the ice. The hole drilled partway through the wooden block at the bottom of the experimental chamber is seen in the picture below.
Results and Interpretation
Once the experiments are completed, students need to tabulate and/or graph the results to aid in interpretation. The results for the example experimental design reported here are given above.
Specific questions that can be addressed with these results include the following.
1) Do the results indicate that the strength of the ice increases with decreasing temperature as implied by the story?
In this case, no. As seen in the graph, the difference between the strength of the ice a little bit above freezing and a little bit below freezing (the averages differ by about 1.7 pounds) is less than the variation from one experiment to the next at the same temperature (which varies by more than 40 pounds at each temperature). Take note that this analysis requires that experiments be repeated to establish a baseline uncertainty in the measurements. Evaluation of experimental and measurement uncertainty (which can be inferred from the variation in a set of independent measurements) is a fundamental approach in all science. Since we can’t compare our results to an answer key, we have to let the experiments themselves inform us about how precise they are. The range of variation, reflecting the uncertainty, can be seen graphically in the image above.
2) Do the results indicate that the strength of the ice decreases with decreasing temperature, the opposite of what is implied in the story?
Again, no. The average breaking force was indeed lower at the lower temperature (65 pounds versus 66.7 pounds). However, as can be seen from the graph, the variation in measurements is greater than this difference, meaning that the difference is not statistically significant. For those who prefer numerical statistics to graphical statistics, the standard deviation of the average at the lower temperature is 6.6 pounds and at the higher temperature is 6.5 pounds. A difference of 1.7 pounds is thus significantly less than the standard deviation of the measurements.
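Students with some programming experience can carry out this comparison with a few lines of Python. The two data sets below are hypothetical stand-ins chosen only to illustrate the logic (the actual measurements appear in the graph); the procedure, comparing the difference of the means to the uncertainty of the means, is the one described above.

```python
# Sketch of the significance test described above, using Python's
# statistics module. The breaking-force lists are hypothetical
# illustrative values, not the actual measurements.
import math
import statistics

warmer = [45, 62, 70, 74, 82.5]   # breaking force (lb), hypothetical
colder = [41, 60, 68, 73, 83]     # breaking force (lb), hypothetical

diff = statistics.mean(warmer) - statistics.mean(colder)
# Standard deviation of each mean = sample std dev / sqrt(n)
sem_warm = statistics.stdev(warmer) / math.sqrt(len(warmer))
sem_cold = statistics.stdev(colder) / math.sqrt(len(colder))
combined = math.hypot(sem_warm, sem_cold)  # add uncertainties in quadrature

print(f"difference of means: {diff:.1f} lb")
print(f"combined uncertainty: {combined:.1f} lb")
# A common rule of thumb: the difference is significant only if it
# exceeds about twice the combined uncertainty.
print("significant" if abs(diff) > 2 * combined else "not significant")
```

For these illustrative values the script prints “not significant,” mirroring the conclusion drawn from the graph: a 1.7-pound difference cannot be distinguished from the scatter in the measurements.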
3) So, can we say anything at all based on our experimental results?
Of course. We can say that any change in strength with temperature, if it exists, has to lie within certain bounds. The slope of the change in strength with temperature must lie within the approximate bounding slopes shown in the graph below. We can also say that our experiments did not confirm that strength increases as temperature falls (although that might still be true within the limits shown in the graph below).
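The bounding slopes can also be estimated numerically: tilt the trend line as steeply as the error bars allow in each direction. The temperatures, forces, and uncertainties below are hypothetical placeholders (the real values appear in the graphs), shown only to make the geometry of the bounds concrete.

```python
# Sketch of estimating bounding slopes from two average measurements
# with uncertainties. All numbers are hypothetical placeholders.
t_warm, f_warm, u_warm = 2.0, 66.7, 9.5    # deg C, lb, lb
t_cold, f_cold, u_cold = -7.0, 65.0, 9.5   # deg C, lb, lb

dt = t_warm - t_cold                       # temperature span, 9 degrees
slope_best = (f_warm - f_cold) / dt        # best-estimate trend
# Steepest slope consistent with the error bars: warm end high, cold end low
slope_max = ((f_warm + u_warm) - (f_cold - u_cold)) / dt
# Opposite extreme: warm end low, cold end high
slope_min = ((f_warm - u_warm) - (f_cold + u_cold)) / dt

print(f"best estimate: {slope_best:.2f} lb per deg C")
print(f"slope bounds: {slope_min:.1f} to {slope_max:.1f} lb per deg C")
```

Because the bounds straddle zero, the experiment constrains how strongly temperature could affect strength without establishing that it does, exactly the conclusion stated above.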
General questions your students can ask of their experimental results include the following.
Does the variation in your results indicate that your precision is high enough to distinguish a ‘real’ effect of the independent variable (temperature) on the dependent variable (strength of the ice)?
If there is a significant change in strength with temperature, does that effect indicate that strength increases with decreasing temperature or that it decreases with increasing temperature?
If there is a significant change in strength with temperature, do the results make sense? Are there any other factors that might be causing the results to ‘seem’ right but not be? (For example, if the warmer ice block is melting during the course of the experiment, it would get thinner, and thus easier to break, which gives the ‘expected’ result but for the wrong reasons.)
If there is no significant change in strength with temperature, can you think of a way that the next researcher might improve the experiment to get a more precise result? The next researchers to pursue this study can build on your results! As Isaac Newton said, we stand on the shoulders of giants—we build on the results of the people who came before us. In the example experimental design reported above, improvements might include the following.
1) Finding a way to increase the difference in temperature. A greater temperature difference increases the chance that variation in strength due to temperature can be detected above the uncertainty in the measurements. Better insulation, especially over the top, would make colder temperatures possible using the same salt-ice mix as used above, down to a theoretical limit of -32°C. The theoretical cold limit for ice and table salt is -21.1°C.
2) More time might be allowed for the ice to equilibrate with the air temperature inside the experimental chamber (we only allowed 2-5 minutes, and in many cases the chamber temperature may not have reached equilibrium).
3) Better control of where the piston strikes the ice. If the piston was not exactly centered above the 1 1/8” hole, and the ice centered over the hole, the ‘leverage’ on the ice would be different, possibly accounting for some of the variation in measurements observed.
Comparing our authentic classroom research with “real science.”
The shear strength of ice at various temperatures has been reported by Shun-ying Ji, Hong-liang Liu, Peng-fei Li, and Jie Su in the article “Experimental Studies on the Bohai Sea Ice Shear Strength”, published in the Journal of Cold Regions Engineering in December 2013. Their results are shown in the graph below.
These researchers find a statistically significant effect of temperature on ice strength. However, notice that their results also do not confirm that lower temperature results in greater ice strength at the temperature range of our experiments above (the trend of their data is flat in this temperature region, and their results do not show a statistically significant increase in strength until the temperature is below -8°C). Notice also that their statistical uncertainty is as large as (or larger than) that of our experiments done with a bathroom scale! For example, their measurements at -6°C range from less than 0.4 to 1 megapascal, a range of more than a factor of 2, similar to the variation from 40 to a little over 80 pounds in our experiments at -7°C. The researchers overcame their uncertainty by making many measurements over a wide range of temperatures, which allowed them to discern a statistically significant trend of greater strength of ice at lower temperatures. So, the story, “Breaking the Ice,” got it right after all!
Connecting to the Next Generation Science Standards.
The investigative and experimental activities above support the following NGSS performance expectations: HS-ESS2-5, MS-ESS2-1, MS-ESS3-2, MS-PS1-4, MS-PS1-6, MS-PS3-3; HS-PS1-3, HS-PS3-4
Students can exercise skills in the practices of 1) Planning and Conducting Investigations, especially deciding on the types, amount, and accuracy of data needed to produce reliable measurements, considering limitations on the precision of the data, and refining the design accordingly; and 2) Analyzing and Interpreting Data, especially considering the uncertainty of the measurements.
Students experience the crosscutting concepts of 1) Patterns, 2) Cause and effect: mechanism and explanation, 3) Systems and system models, and 4) Energy and matter: flows, cycles and conservation.
The Teacher Resources for Breaking the Ice are written by Russ and Mary Colson, authors of Learning to Read the Earth and Sky.
©2017 Issues in Earth Science