
Classroom Assessment Techniques 'Plausible Estimation' Tasks


Malcolm Swan
Mathematics Education, University of Nottingham
Malcolm.Swan@nottingham.ac.uk

Jim Ridgway
School of Education, University of Durham
Jim.Ridgway@durham.ac.uk

Plausible Estimation tasks involve students in an activity central to modelling in science, other areas of intellectual activity, and everyday life. The core skill is to create (or check) estimates of quantities that, at first glance, seem unknowable. Students are also required to communicate their assumptions and results and to check the plausibility of their answers - important thinking processes for all science, math, and engineering fields. These tasks show students that many seemingly impossible-to-estimate quantities (such as the number of new cars sold in the United States) can be reasonably derived from basic, known quantities and simple reasoning. In addition, Plausible Estimation tasks exercise arithmetic fluency, the ability to handle large numbers, and the conversion of units.

A 'Plausible Estimation' task consists of one or two easily-stated questions which at first glance seem impossible to answer without reference material, but which can be reasonably estimated by following a series of simple steps that use only common sense and numbers that are generally known or are amenable to estimation. One such example is, "How many babies are born in the United States each minute?" To answer this question, students must identify and estimate the relevant pieces of information, such as the U.S. population and the age distribution of the U.S. population. With these estimates, and simple multiplication (mindful of units), students can make a reasonable estimate of this quantity.
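One way to carry out the chain of reasoning for the babies question can be sketched as follows. Every number in the sketch is a deliberately rough assumption chosen for illustration, not a looked-up statistic:

```python
# Rough Fermi-style sketch of "How many babies are born in the U.S. each minute?"
# All inputs below are illustrative assumptions, not checked statistics.

us_population = 300e6          # assume roughly 300 million people
life_expectancy_years = 75     # assume a roughly uniform age distribution

# In a steady-state population, births per year is about
# population / life expectancy (one "cohort" replaced each year).
births_per_year = us_population / life_expectancy_years    # ~4 million

minutes_per_year = 365 * 24 * 60
births_per_minute = births_per_year / minutes_per_year

print(f"Rough estimate: {births_per_minute:.1f} births per minute")
```

The point is not the particular numbers but the chain: each line turns a known or guessable quantity into the next one, with the units kept straight.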

WHAT IS INVOLVED?

• Instructor Preparation Time: Minimal if using existing tasks.
• Preparing Your Students: Students will need some coaching on their first task.
• Class Time: 45 minutes.
• Disciplines: Appropriate for all; requires very little mathematical knowledge beyond fluency in basic skills.
• Class Size: Any.
• Special Classroom/Technical Requirements: None.
• Individual or Group Involvement: Either.
• Analyzing Results: Intensive if formally scoring for large classes; best used as an informal way to get your students thinking mathematically.
• Other Things to Consider: A fairly demanding task for students who are unfamiliar with open-ended problems.

Description
Plausible Estimation tasks are sometimes called 'Fermi' problems after the physicist Enrico Fermi (1901-1954). One of his favorite problems was, "How many piano tuners are there in Chicago?" Fermi problems have the following characteristics:

• An interesting estimation problem is posed in a simple way.
• Most people instantly respond by saying that it is a problem they could not possibly solve without recourse to reference material.
• An estimate of the solution may be found by a series of simple steps that use only common sense and numbers that are either generally known or are amenable to estimation.

Thus, one way we could estimate an answer to Fermi's question about how many piano tuners are in Chicago is to:

• estimate the size of the population
• estimate the number of households in the population
• estimate the proportion of households with a piano (for instance, from the pianos in one's own class, family, street, or church)
• estimate the frequency of tuning
• estimate the time it takes to tune a piano
• estimate the number of piano tuners.
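The chain of steps above can be sketched as a short calculation. Every figure here is an assumption made for illustration; plugging in your own guesses is the whole exercise:

```python
# Fermi's piano-tuner estimate as a chain of rough, illustrative assumptions.

chicago_population = 3e6         # assume ~3 million people
people_per_household = 3
households = chicago_population / people_per_household      # ~1 million

fraction_with_piano = 0.1        # assume 1 household in 10 owns a piano
pianos = households * fraction_with_piano                   # ~100,000

tunings_per_piano_per_year = 1   # assume a piano is tuned about once a year
tunings_needed = pianos * tunings_per_piano_per_year

# Assume a tuner manages ~4 tunings a day over ~250 working days a year.
tunings_per_tuner_per_year = 4 * 250                        # 1,000

tuners = tunings_needed / tunings_per_tuner_per_year
print(f"Rough estimate: {tuners:.0f} piano tuners")
```

The answer is only good to an order of magnitude - which is exactly what a Fermi estimate claims.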
The downloadable materials for students begin with the sample task and solution that appears below. When we use these tasks for assessment, we are looking for:
• sensible assumptions
• careful reasoning which is carefully communicated, and
• sensible use of units.

Example of a Task and Solution
Plausible estimation tasks are designed to see how well you can develop a chain of reasoning that will enable you to estimate an unknown quantity from things that you already know or can easily guess at. The best way of explaining this is to give an example.

 How much will you drink in your lifetime? How many baths would this fill?
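A solution to this task might run as follows. As with all of these sketches, each input is an assumption you would ask students to defend, not a fact:

```python
# Illustrative sketch of the bathtub task; every figure is an assumption.

liters_per_day = 2          # assume ~2 liters of liquid drunk per day
lifetime_years = 75         # assume a ~75-year lifetime
lifetime_liters = liters_per_day * 365 * lifetime_years   # ~55,000 L

bath_liters = 150           # assume a bathtub holds ~150 liters
baths = lifetime_liters / bath_liters

print(f"About {lifetime_liters:.0f} liters, roughly {baths:.0f} bathfuls")
```

Notice how the units drive the chain: liters per day times days gives liters, and liters divided by liters per bath gives baths.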

Providing guidance
Whether your students work in groups or individually, many will ask for guidance while doing the tasks. The amount of guidance that students need should decline as they become familiar with this type of problem. Early in class, you are likely to provide guidance in the form of questions directly related to each stage of the solution process:

• What do you know that is relevant?
• What assumptions can you make?
• How plausible are your assumptions?
• Is your chain of reasoning accurate?
• Can you do the problem another way and see if the result is the same?
The amount and type of help you provide the students depends upon your goals for the task. Later, if your primary goal is to encourage students to struggle with solving the problems on their own (and learn that they can "do estimation"), you may choose to provide very little assistance.

Reporting out of individual or group work
If you decide to come together as a large group to discuss what students came up with (or report out), it is again helpful to decide the degree to which you will participate in these discussions (which will depend upon your goals for the session). For instance, you can facilitate the students' discussion by having them defend their ideas and write them on the board, while adding almost none of your own. This approach of focussing on critical questions can direct students away from viewing you as the authority. Alternatively, you might model your own estimation skills for students by leading the discussion, soliciting student comments, organizing them in a useful manner, and adding comments to guide students toward an understanding of the problem.

Formal and informal use
These tasks can be used formally or informally. In formal assessment (where you grade the assignment as an examination), do not intervene except where specified. Even modest interventions - reinterpreting instructions, suggesting ways to begin, offering prompts when students appear to be stuck - have the potential to alter the task for the student significantly.

In informal assessment (an exercise, graded or non-graded), you may want to be less rigid in giving the students help. Under these circumstances, you may reasonably decide to do some coaching, talk with students as they work on the task, or pose questions when they seem to get stuck. In these instances you may be using the tasks for informal assessments - observing what strategies students favor, what kinds of questions they ask, what they seem to understand and what they are struggling with, and what kinds of prompts get them unstuck. This can be extremely useful information in helping you make ongoing instructional and assessment decisions. As students have more experiences with these kinds of tasks, the amount of coaching you do should decline and students should rely less on this kind of assistance. Evidence that students are learning from these activities comes in two different forms. First, the quality of the solutions they produce will improve. Second, they will use the key questions (are the assumptions reasonable? is the logic correct? is the answer plausible?) as they try to estimate a solution.

Group work versus individual work
The open-ended nature of these tasks makes for great group work problems. Students can discuss various measures and their merit and are likely to come up with many more ideas than if they worked alone. The CL-1 Collaborative Learning site can provide instructions on how to use group work effectively within the classroom. However, individual work may give you more clues as to each student's sophistication with this type of problem.

Presumed background knowledge
One nice attribute of 'Plausible Estimation' tasks is that they require almost no advanced mathematical knowledge. Students do need a basic knowledge of geometry concepts (area, perimeter, length), basic numeric skills (multiplication, addition, subtraction, and division with large numbers), and an understanding of units and unit conversion.

Step-by-step instructions
1. Select (or create) a 'Plausible Estimation' task appropriate for your class.
2. Hand out copies of the task to students, either working individually or in groups.
3. State your goals for the 'Plausible Estimation' task, emphasizing that they should be able to defend both their assumptions and the reasoning which leads to their answer.
4. Walk around and listen to students as they discuss and work through the problems, providing guidance as necessary.
5. Have students present their solutions, either in written or verbal form.

Variations
The tasks included in this site can be adapted to your own discipline, or you can create new ones:

• First, select a question whose quantitative answer you can look up in a reference manual (as a check against the students' answers), and which can be derived from fairly simple assumptions or estimates. If the solution process needs a little-known estimate, provide that to the students.

• Then, ask the students to estimate the value, placing an emphasis on their assumptions and chain of reasoning.
An extension to Plausible Estimation tasks is to ask for 'bounded estimates' - what range of values would you give in order to be pretty certain that you have included the true value being estimated? One approach is to consider the effects of taking lower and upper bounds on every estimate made in the calculation, and to see the effects on the final estimate. In the bathtub problem, for example, you could explore the effect on the final estimate of smaller and larger tubs, and smaller and larger liquid intake. Asking about the sensitivity of solutions to initial assumptions is an important thinking skill in science.
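For the bathtub problem, the bounded-estimate approach can be sketched like this. The ranges are illustrative assumptions; the key observation is that because every step multiplies or divides positive quantities, taking the low end of every favorable input (and the high end of every unfavorable one) bounds the final answer:

```python
# Propagating lower/upper bounds through the bathtub estimate.
# All ranges below are illustrative assumptions, not measured values.

def lifetime_baths(liters_per_day, years, bath_liters):
    """Bathfuls drunk in a lifetime under the given assumptions."""
    return liters_per_day * 365 * years / bath_liters

# Lower bound: drink little, die young, fill a big tub.
low = lifetime_baths(liters_per_day=1.5, years=60, bath_liters=200)

# Upper bound: drink a lot, live long, fill a small tub.
high = lifetime_baths(liters_per_day=3.0, years=90, bath_liters=100)

print(f"Bounded estimate: between {low:.0f} and {high:.0f} bathfuls")
```

Comparing the width of the interval to the point estimate shows students which assumptions the answer is most sensitive to.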

The following examples from various disciplines may provide some insight into creating your own variations.

Discipline: Example 'Plausible Estimation' tasks

• Astronomy: How many stars are visible to the naked eye in the entire sky?
• Biology: How many raccoons are killed by cars in the US each year? How many corn kernels are in an acre of corn? How many fish are in a nearby lake? How many geese migrate through your city each fall?
• Chemistry: How many atoms are in a grain of sand? How high is a stack of one mole's worth (6 × 10^23) of pennies? How many oxygen molecules are present in the air in this room? Suppose that your great, great, great grandmother poured a glass of water into the Atlantic Ocean, and that you dip a glass of water from the ocean tomorrow. How many molecules from the first glass of water are in the second glass of water?
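To see how quickly such a task resolves, here is a sketch of the mole-of-pennies question. The penny thickness is an assumed round figure; the light-year conversion is a known constant:

```python
# Sketch: how high is a stack of one mole of pennies?
# Assumes a penny is ~1.5 mm thick (an estimate, not a measured value).

AVOGADRO = 6.022e23           # pennies in the stack
penny_thickness_m = 1.5e-3    # assumed thickness in meters

stack_height_m = AVOGADRO * penny_thickness_m   # ~9e20 m

light_year_m = 9.46e15        # meters in a light-year
print(f"~{stack_height_m:.1e} m, about {stack_height_m / light_year_m:.0f} light-years")
```

The result is on the order of 10^5 light-years, roughly the diameter of the Milky Way - a vivid way for students to grasp the size of Avogadro's number.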
Category of performance: Typical response

• The student needs significant instruction: Student is unable to design a reasonable chain of operations.
• The student needs some instruction: Student designs a reasonable chain of operations with the desired results, although there may be computational errors. There may be no attempt to round the results to an appropriate number of significant digits, and the answer may be given without units.
• The student's work needs to be revised: Student designs a reasonable chain of operations and reaches the desired results with minimal computational errors and appropriate precision. Assumptions are communicated, but may need to be expanded or clarified.
• The student's work meets the essential demands of the task: Student designs a computational strategy that logically estimates the desired quantity from the given information. The reasonableness of the computed estimate is ascertained, and the assumptions upon which the estimate is based are clearly communicated.

Most assessment practices seem to emphasise the reproduction of imitative, standardised techniques. I want something different for my students. I want them to become mathematicians - not rehearse and reproduce bits of mathematics.

I use the five 'mathematical thinking' tasks to stimulate discussion between students. They share solutions, argue in more logical, reasoned ways and begin to see mathematics as a powerful, creative subject to which they can contribute. It's much more fun to try to think and reach solutions collaboratively. Assessment doesn't have to be an isolated, threatening business.

Malcolm Swan is a lecturer in Mathematics Education at University of Nottingham and is a leading designer on the MARS team. His research interests lie in the design of teaching and assessment. He has worked for many years on research and development projects concerning diagnostic teaching (including ways of using misconceptions to promote long term learning), reflection and metacognition and the assessment of problem solving. For five years he was Chief Examiner for one of the largest examination boards in England. He is also interested in teacher development and has produced many courses and resources for the inservice training of teachers.

Thinking mathematically is about developing habits of mind that are always there when you need them - not in a book you can look up later.

For me, a big part of education is about helping students develop uncommon common sense. I want students to develop ways of thinking that cross boundaries - between courses, and between mathematics and daily life.

People should be able to tackle new problems with some confidence - not with a sinking feeling 'we didn't do that yet'. I wanted to share a range of big ideas concerned with understanding complex situations, reasoning from evidence, and judging the likely success of possible solutions before they were tried out. One problem I had is that my students seemed to learn things in 'boxes' that were only opened at exam time.

You can tell the teaching is working when mathematical thinking becomes part of everyday thinking. Sometimes it is evidence that the ideas have become part of the mental toolkit used in class - 'let's do a Fermi [make a plausible estimate] on it'. Sometimes it comes out as an anecdote. One graduate told me a story of how my course got him into trouble. He was talking with a senior clinician about the incidence of a problem in child development, and the need to employ more psychologists to address it. He 'did a Fermi' on the number of cases (wildly overestimated) and the resource implications (impossible in the circumstances). He said there was a silence in the group... you just don't teach the boss how to suck eggs, even when he isn't very good at it. He laughed.

Jim Ridgway is Professor of Education at the University of Durham, and leads the MARS team there. Jim's background is in applied cognitive psychology. As well as kindergarten-to-college-level-one assessment, his interests include the uses of computers in schools, fostering and testing higher order skills, and the study of change. His work on assessment is diverse, and includes the selection of fast jet pilots and cognitive analyses of the processes of task design. In MARS he has special responsibility for data analysis and psychometric issues, and for the CL-1 work.

The Mathematics Assessment Resource Service, MARS, offers a range of services and materials in support of the implementation of balanced performance assessment in mathematics across the age range K to CL-1. MARS is funded by the US National Science Foundation, and builds on earlier funding which began in 1992 for the Balanced Assessment Project (BA) from which MARS grew.

MARS offers effective support in:

The Design of Assessment Systems: assessment systems are tailored to the needs of specific clients. Design ranges from the contribution of individual tasks, through to full scale collaborative work on test development, scoring and reporting. Clients include Cities, States, and groups concerned with educational effectiveness, such as curriculum projects and professional development initiatives.

Professional Development for Teachers: most teachers need help in preparing their students for the much wider range of task types that balanced performance assessment involves. MARS offers professional development workshops for district leadership and 'mentor teachers', built on materials that are effective when used later by such leaders with their colleagues in school.

Developing Design Skills: many clients have good reasons to develop their own assessment, either for individual student assessment or for system monitoring. Doing this well is a challenge. MARS works with design teams in both design consultancy and the further development of the team's own design skills.

To support its design team, MARS has developed a database, now with around 1000 interesting tasks across the age range, on which designers can draw, modify or build, to fit any particular design challenge.