Classroom Assessment Techniques
Conceptual Diagnostic Tests


Theory and Research
Effective conceptual diagnostic tests are grounded in the research on alternative conceptions in science, which scientists commonly regard as misconceptions (see the extensive references in Wandersee, Mintzes, and Novak, 1994). The latest bibliography of research papers (Pfundt and Duit, 1994) contains some 3,600 entries, of which about 66% relate to physics, 20% to biology, and 14% to chemistry. The bulk of the physics work has been done in classical mechanics and electricity.

Many of the classic papers in misconceptions research deal with students younger than those in universities, but they are far from irrelevant in a higher-education context: you will discover that a large percentage of your students hold these alternative frameworks about the workings of the natural world. In the Sources below, I have listed a selection of discipline-specific papers.

Force Concept Inventory
The Force Concept Inventory is the best developed and most widely used diagnostic test in physics (Hestenes, Wells, and Swackhamer, 1992; Hake, 1998). Ibrahim Halloun, Richard Hake, Eugene Mosca, and David Hestenes revised the test in 1995, and that revision is the current version. The FCI has 30 qualitative items, grouped into subscales, that deal only with the Newtonian concept of force. It is extremely effective in eliciting students' "commonsense" notions about motion, and its questions were designed to be meaningful to students without formal training in mechanics.
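Hake's (1998) six-thousand-student survey compares courses by the class-average normalized gain on the FCI, that is, the actual pre-to-post gain divided by the maximum possible gain. The sketch below (in Python) shows that calculation; the score values are invented for illustration and are not real FCI data.

    # Class-average normalized gain <g>, as defined in Hake (1998):
    #   <g> = (post% - pre%) / (100 - pre%)
    # The score lists below are invented illustrative values, not real FCI data.

    def normalized_gain(pre_percent, post_percent):
        """Return the normalized gain for class-average pre/post percentages."""
        if pre_percent >= 100:
            raise ValueError("pre-test average must be below 100%")
        return (post_percent - pre_percent) / (100.0 - pre_percent)

    ITEMS = 30  # the FCI has 30 items
    pre_correct = [12, 15, 9, 18, 14]    # hypothetical per-student pre-test scores
    post_correct = [21, 24, 18, 26, 22]  # hypothetical per-student post-test scores

    pre_avg = 100.0 * sum(pre_correct) / (ITEMS * len(pre_correct))
    post_avg = 100.0 * sum(post_correct) / (ITEMS * len(post_correct))

    print("class <g> = %.2f" % normalized_gain(pre_avg, post_avg))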

Mechanics Baseline Test
The Mechanics Baseline Test is more difficult than the FCI (Hestenes and Wells, 1992). It focuses on concepts that can be understood only after formal training in Newtonian mechanics. It contains 26 items, some of which involve simple calculations (but none that require a calculator). I have found the MBT challenging even for new graduate students in physics, who would be expected to do well on such a test.

Conceptual Surveys in Physics
The Two-Year College Physics Workshop is developing a set of conceptual surveys: Conceptual Survey in Electricity (CSE), Conceptual Survey in Magnetism (CSM), and Conceptual Survey in Electricity and Magnetism (CSEM). The CSEM is a shorter, combined subset of the CSE and CSM. The goal is to provide conceptual tests for common physics topics other than mechanics. The latest versions are Form G (7/98). The CSE has 32 items, CSM 21, and CSEM 32.

Conceptual Learning Assessments for Workshop Physics
Three tests have been used to assess conceptual learning gains in Workshop Physics courses: Force and Motion, Heat and Temperature, and Electric Circuits. They are available to educators, with a password, as Microsoft Word files.

Astronomy Diagnostic Test
The Astronomy Diagnostic Test (version 2) originated as an assessment for the conceptual astronomy research project (Zeilik et al., 1997). Early versions relied heavily on a Project STAR assessment containing 60 physics and astronomy items; Lightman and Sadler (1993) give a 16-item version of that STAR assessment. Zeilik, Schau, and Mattern (1998) presented 15 central items from the ADT (version 1). Although that version was subjected to small-group interviews, it was never validated by extensive individual interviews. Such interviews have since been carried out at Montana State University (Adams and Slater) and at the University of Maryland (Hufnagel and Deming), and they formed the basis for the ADT (version 2).

California Chemistry Diagnostic Test (CCDT)
This 44-item test was developed by a 16-member team in response to a need in California for a "Freshman Chemistry Placement Test" to be used at all levels of institutions (Russell, 1994). The CCDT was given a trial run in 1987 in 53 schools with over six thousand students; a revised version was tested in 1988. Its correlation coefficient with final grades in first-term general chemistry was 0.42. The CCDT is available through the American Chemical Society's Division of Chemical Education: ACS DivCHED Examinations Institute, 223 Brackett Hall, Clemson University, Clemson, SC 29634-1913.
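The 0.42 figure is a simple correlation coefficient (presumably a Pearson product-moment correlation) between diagnostic-test score and final course grade. As a minimal sketch of how such a validity coefficient is computed, the Python snippet below uses invented placeholder scores, not the actual CCDT trial data.

    # Pearson correlation between placement-test scores and final course grades.
    # The paired values below are invented placeholders, not the CCDT trial data.
    import math

    def pearson_r(xs, ys):
        """Return the Pearson correlation coefficient of two equal-length lists."""
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        var_x = sum((x - mean_x) ** 2 for x in xs)
        var_y = sum((y - mean_y) ** 2 for y in ys)
        return cov / math.sqrt(var_x * var_y)

    ccdt_scores  = [22, 30, 35, 18, 40, 27, 33]          # items correct, out of 44
    final_grades = [2.0, 2.7, 3.3, 1.7, 3.7, 2.3, 3.0]   # first-term grade points

    print("r = %.2f" % pearson_r(ccdt_scores, final_grades))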


Sources
Misconceptions research
General
Driver, R. (1993). The pupil as scientist? Milton Keynes: Open University Press.

Pfundt, H. and Duit, R. (1994). Bibliography: Students' Alternative Frameworks and Science Education, 4th edition. Kiel, Germany: IPN.

Astronomy
Nussbaum, J. (1979). Children's conception of the earth as a cosmic body: A cross-age study. Science Education, 63, 83-93.

Sneider, C. and Pulos S. (1983). Children's cosmographies: Understanding the earth's shape and gravity. Science Education, 67, 205-221.

Vosniadou, S. (1990). Conceptual development in astronomy. In S. Glynn, R. Yeany, and B. Britton (eds.), The psychology of learning science (pp. 149-177). Hillsdale, NJ: Lawrence Erlbaum.

Biology
Arnaudin, M. W. and Mintzes, J. J. (1985). Students' alternative conceptions of the circulatory system: A cross-age study. Science Education, 69, 721-733.

Bell, B. (1981). When is an animal not an animal? Journal of Biological Education, 15, 213-218.

Wandersee, J. H. (1986). Can the history of science help science educators anticipate students' misconceptions? Journal of Research in Science Teaching, 23, 581-597.

Chemistry
Ben-Zvi, N. and Gai, R. (1994). Macro- and micro-chemical comprehension of real-world phenomena. Journal of Chemical Education, 71, 730-732.

Hackling, M. and Garnett, D. (1985). Misconceptions of chemical equilibria. European Journal of Science Education, 7, 205-214.

Nakhleh, M. B. (1992). Why some students don't learn chemistry: Chemical misconceptions. Journal of Chemical Education, 69, 191-196.

Novick, S. and Menis, J. (1976). A study of student perceptions of the mole concept. Journal of Chemical Education, 53, 720-722.

Stavy, R. (1988). Children's conception of gas. International Journal of Science Education, 10, 553-560.

Physics
Champagne, A., Klopfer, L. and Anderson, J. (1980). Factors influencing the learning of classical mechanics. American Journal of Physics, 48, 1074-1079.

Clement, J. (1982). Studies of preconceptions in introductory mechanics. American Journal of Physics, 50, 66-71.

Fredette, N. and Clement, J. (1981). Student misconceptions of an electric current: What do they mean? Journal of College Science Teaching, 10, 280-285.

Watts, D. M. (1985). Students' conceptions of light-A case study. Physics Education, 20, 183-187.

Diagnostic tests
Bisard, Walter and Zeilik, Michael (1998). Conceptually centered astronomy with actively engaged students. Mercury, 27 (4), 16-19.

Hake, Richard R. (1998). Interactive engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66 (1), 64-74.

Hestenes, David and Wells, Malcolm (1992). A mechanics baseline test. The Physics Teacher, 30, 159-166.

Hestenes, David, Wells, Malcolm, and Swackhamer, Gregg (1992). Force concept inventory. The Physics Teacher, 30 (3), 141-151.

Lightman, Alan and Sadler, Philip (1993). Teacher predictions versus actual student gains. The Physics Teacher, 31 (3), 162-167.

Odom, A. L. and Barrow, L. H. (1995). Development and application of a two-tier diagnostic test measuring college biology students' understanding of diffusion and osmosis after a course of instruction. Journal of Research in Science Teaching, 32 (1), 45-61.

Russell, A. A. (1994). A rationally designed general chemistry diagnostic test. Journal of Chemical Education, 71 (4), 314-317.

Treagust, D. F. (1988). Development and use of diagnostic tests to evaluate students' misconceptions in science. International Journal of Science Education, 10 (2), 159-169.

Wandersee, James H., Mintzes, Joel J., and Novak, Joseph D. (1994). Research on alternative conceptions in science. In Dorothy L. Gabel (ed.), Handbook of Research on Science Teaching and Learning (pp. 177-210). New York: Macmillan.

Zeilik, Michael, Schau, Candace, and Mattern, Nancy (1998). Misconceptions and their change in university-level astronomy courses. The Physics Teacher, 36, 104-107.

Zeilik, M., Schau, C., Mattern, N., Hall, S., Teague, K., and Bisard, W. (1997). Conceptual astronomy: A novel model for teaching postsecondary science courses. American Journal of Physics, 65 (10), 987-996.

