Introduction
Joan Lucariello, PhD, City University of New York

Students do not come to school as blank slates to be filled with instruction. Rather, they come to school with considerable knowledge, some correct and some not. Either way, that knowledge is based on intuition, everyday experience, and what they have been taught in other settings.

Teachers and researchers generally refer to preinstructional knowledge as preconceptions. Because a considerable amount of our knowledge is organized by subject matter (mathematics, science, etc.), our preconceptions are organized that way too. Before beginning instruction on any new topic, teachers need to know their students’ preconceptions, because learning, and therefore instruction itself, varies depending on whether students’ preconceptions agree with the concepts being taught or contradict those concepts.

When preconceptions are consistent with the concepts in the assigned curriculum, student preconceptions are called anchoring conceptions. Learning, in such cases, is much easier. It becomes a matter of conceptual growth, enrichment, or adding to student knowledge. More often, teachers find themselves teaching concepts that are difficult for their students to learn because students’ preconceptions are inconsistent with the concepts being taught. In these cases, preconceptions are termed alternative conceptions or misconceptions.

There are several reasons that teachers need to figure out students’ preconceptions before they begin instruction on a new topic. First (as noted above), learning and teaching vary considerably, depending on whether the teacher is confronting alternative conceptions (misconceptions) or correct (anchoring) conceptions about the concepts or theories to be taught. Second, any pre-assessment of student academic skills and student knowledge may not accurately reflect actual pupil skill and knowledge. In addition, student learning and achievement can increase when teachers better understand their students’ thinking about a concept. Furthermore, when teachers listen to and comprehend student thinking, it can expand their understanding of the subject matter, change their beliefs on how to teach, modify their practices and contribute to their professional growth.

The first phase of instruction should be “diagnostic.” Teachers should employ diagnostic techniques not only to find out what students know or understand about a topic (“preconceptions”), but also to find out about other aspects of student thinking. These include student-learning processes, those things that are either hard or easy for students to grasp, and the errors that students commonly make (see Marks, 1990, for further discussion). The problem is that even when teachers are interested in diagnosing student ideas and responses, they often use inappropriate methods to do so (Morrison & Lederman, 2003). Therefore, recommended diagnostic teaching practices — “Dos” — are provided in this module.

Dos and don'ts
Dos

Strategies teachers can use prior to instruction to determine what knowledge (correct or incorrect) students have about concepts or theories to be taught

  1. Conduct quizzes (pretests).

  2. Try to learn HOW students are going about solving problems or arriving at their answers. Knowing the ideas and/or strategies that students are using to arrive at an answer (whether the answer is right or wrong) can be very informative for the purposes of making instruction more effective.

  3. Encourage your students to ask questions and then examine the content of those questions.

  4. Ask students to define major concepts, then analyze their definitions for the errors in logic or strategy (the rule) that are causing the students to make mistakes.

  5. Use the technique of "differential diagnosis" (Brown & Burton, 1978).

  6. Use the technique of DFA (Difficulty Factors Assessment) (see Rittle-Johnson & Koedinger, 2005) to identify which features of a problem are either causing student difficulty or facilitating student learning.

  7. Present problems to students in which you systematically vary important features of the problem (Brown & Burton, 1978).

Strategies 8–13 are designed to analyze student thinking in ways that are not dependent on their language and verbal skills. These approaches should be considered for evaluating students whose verbal skills are not strong:

  8. Include common misconceptions among the answer choices on a quiz (i.e., multiple choice) or in response to classroom questions.

  9. Give “implicit-confidence” tests. These tests involve a simple modification of the traditional multiple-choice test (see Klymkowsky, Taylor, Spindler, & Garvin-Doxas, 2006).

  10. Administer questionnaires in specific subject matter areas in order to identify whether a misconception is unique to a particular context or is due to a problem with general reasoning or language.

  11. Administer a categorization/sorting task.

  12. Have students model their solutions to a problem, by drawing or using props, and analyze their models for pattern or strategy errors (Bright, 1996).

  13. Use student memory/recall of problems and definitions as a window into their thinking.

Don'ts
  1. Do not jump too quickly to one hypothesis about student thinking processes or reasoning. Prematurely focusing on one hypothesis about what a student may be thinking can cause you to be unaware of competing hypotheses that may provide a better explanation (Brown & Burton, 1978).

  2. Do not stick with or lock onto only one type of response behavior a student may display. Vary problem features to determine whether the same response behavior is displayed under different problem conditions.

  3. Do not make a diagnosis about students’ logic, reasoning or thinking processes on the basis of their performance on one problem or one kind of problem.

  4. When probing students’ understanding, try to avoid asking only general, open-ended questions. And don’t stop probing after asking one question. Followup questions can reveal the source of a misconception, or an error in reasoning or strategy selection. Followup questions are also important when a student gives the right answer. Remember that faulty thinking can occasionally lead to correct answers.

How to use information gathered from pre-instruction quizzes

Analyzing responses to pretest questions can provide you with a good understanding of what your students are thinking:

  1. Analyze correct responses, keeping in mind that faulty thinking can occasionally lead to correct answers.

  2. Analyze errors by looking for patterns. Patterns can reveal the thinking (or rules) that students use that lead them to their mistakes (see the sketch below).
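
As a small illustration of pattern-hunting, the Python sketch below (with made-up pretest data) tallies the most common wrong answer on each item. A wrong answer shared by many students is a candidate systematic misconception rather than a random slip:

    from collections import Counter

    # Hypothetical pretest data: item -> students' answers.
    answers = {
        "19 + 9": ["28", "199", "199", "28", "199"],
        "23 + 6": ["29", "236", "236", "29", "236"],
    }
    correct = {"19 + 9": "28", "23 + 6": "29"}

    for item, responses in answers.items():
        # Tally only the wrong answers for this item.
        wrong = Counter(r for r in responses if r != correct[item])
        if wrong:
            error, count = wrong.most_common(1)[0]
            print(f"{item}: most common error is {error} ({count} students)")

Here the repeated error 199 on 19 + 9 would prompt a hypothesis about the rule producing it, which the interviewing and problem-variation strategies below can then test.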

Assessing how students are arriving at their answers

Information on student strategies can be gleaned in several ways:

  1. Use students’ own verbal self-reports.

  2. Analyze students’ overt behavior, such as their written work (Rittle-Johnson & Koedinger, 2005; Rittle-Johnson & Siegler, 1999) and journal writing (Vacc & Bright, 1999).

  3. Ask questions — there is a relationship between good teacher questioning skills and student thinking (Moyer & Milewicz, 2002). Good questioning is a skill. When the purpose of questioning is making informed instructional decisions, teachers should interpret students’ responses to determine what students know, rather than whether or not their answers match the expected responses (Vacc & Bright, 1999). Data show that pre-service teachers may not be using competent questioning techniques (Moyer & Milewicz, 2002; Ralph, 1999a, 1999b). Moreover, even in-service teachers may rely too heavily on recall-level questions (Morrison & Lederman, 2003).

  4. Interview students about the problems they solve to determine how they arrived at their answers (see Bright, 1996; Ginsburg, 1997; Ginsburg, Jacobs, & Lopez, 1998; Ginsburg & Pappas, 2004; Moyer & Milewicz, 2002).

Examples of good questioning skills

From Ralph (1999a, 1999b), as cited in Moyer & Milewicz (2002):

  • Prepare important questions ahead of time.

  • Deliver questions clearly and concisely.

  • Pose questions that stimulate thinking.

  • Provide children with enough time to think about and prepare their answers.

  • Avoid asking questions that require one-word answers.

From Vacc & Bright (1999):

  • Ask questions that require critical thinking (e.g., ask students to compare different solution strategies).

  • Ask probing questions that refer specifically to what a student says, does or thinks in order to gain further information about students’ solution strategies (e.g., “How did you know how many nickels and how many pennies to put down?”).

  • Ask followup questions for both incorrect answers and correct answers.

Interview protocol

You might begin an interview with open-ended questions that get students to examine and justify their answers:

How did you figure that out? What were you thinking when you got that answer? How would you explain to another student how you got your answer? Can you give me another example to explain what you mean?

Then move to more focused or specific questions:

What was the first thing you thought of? What did you do next? Did you have any pictures in your head as you worked through that part of the problem?

If students are having difficulty verbally expressing their ideas, you can also rely more on “recognition questions.” For these questions, the student does not have to produce the content; rather, the content is put into the question and the student need only respond to it in some way:

Examples of recognition questions

Did you use the procedure of __X__ or the procedure of __Y__ to solve the problem?

From Moyer and Milewicz (2002):

Use the probing with followup technique, where a variety of question types are used to further investigate the child’s answers. This includes questioning both correct and incorrect responses, and the use of specific questioning. If only incorrect responses are probed with followup questions, the child misses out on the opportunity to articulate and defend the methods used to arrive at accurate solutions. Moreover, probing only incorrect responses assumes that if the child provided the correct answer, he or she must understand the concept. This is not always true. Children can arrive at a correct answer through faulty reasoning or partial understanding of a concept. Specific questions are tailored to individual responses and are better able to get at student thinking than more general, open-ended questions.

Things to avoid in a diagnostic interview (from Moyer & Milewicz, 2002)
  1. Avoid “check listing” in the interview. With check listing the teacher proceeds from one question to the next on the interview protocol, relying on a script and paying little attention to the student’s responses. Check listing usually does not include followup questions, but does involve the use of verbal checkmarks (the specific and repetitive use of one- or two-word verbalizations, such as “OK,” “Right,” or “Good Job.”) These verbal checkmarks say to students that they no longer need to continue thinking about the question, because their answers were complete and “checked-off” on the list.

  2. Avoid instructing when you should be assessing. Instructing includes asking leading questions and providing hints about the answer. The use of leading questions can result in a guessing game in which the student concentrates more on figuring out what the interviewer is thinking than on explaining his/her own thinking. Instructing also means that the questioning itself is halted while the concept is re-taught. 

"Differential diagnosis" technique explanation

The "differential diagnosis" technique requires the making of at least two hypotheses about what students might be thinking and then choosing or creating problems that can separate between them. According to Brown and Burton (1978), this approach avoids the problem of focusing too quickly on one hypothesis and then generating problems that confirm the hypothesis.

Explanation and examples of DFA

Difficulty Factors Assessment is a way to identify which features of a problem are either causing students difficulty or facilitating their learning. In DFA, target factors are systematically varied and crossed with other problem features, leading to multiple versions of the problem.

DFA example (taken from Rittle-Johnson & Koedinger, 2005)

For example, in the case of math, to find out whether a story (word-problem) context makes it easier or harder for students to solve a particular kind of math problem, one version of the assessment might ask children to add 1/2 + 2/3 in the context of a story, and to add 3/4 + 1/7 without a story. The format of these two questions would vary only the presence or absence of a story. A second version of the assessment would do the opposite: add 3/4 + 1/7 in the context of a story and 1/2 + 2/3 without a story.

Examining student responses across the two versions of the problem provides estimates of student accuracy when a problem is or is not embedded in a story context, and these estimates are not confounded by potential differences in the difficulty of the numbers used.

You can also use DFA to analyze students’ errors and the strategies they use to see how each factor affects problem solving.
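
To make the crossing concrete, here is a minimal Python sketch (the variable names and version layout are our own illustration, not a published instrument) that builds the two assessment versions from the example above, so that each number set appears once in each format across the versions:

    from itertools import product

    # The two crossed factors from the example above:
    # which fraction pair is used, and whether it is wrapped in a story.
    number_sets = ["1/2 + 2/3", "3/4 + 1/7"]
    formats = ["story", "symbolic"]

    # The two versions pair the factors in opposite ways.
    version_1 = [("1/2 + 2/3", "story"), ("3/4 + 1/7", "symbolic")]
    version_2 = [("1/2 + 2/3", "symbolic"), ("3/4 + 1/7", "story")]

    # Sanity check: together, the versions cover the full factor crossing.
    assert set(version_1 + version_2) == set(product(number_sets, formats))

    for name, version in [("Version 1", version_1), ("Version 2", version_2)]:
        print(name)
        for numbers, fmt in version:
            print(f"  {numbers} presented in {fmt} format")

Because each fraction pair appears in both formats across the two versions, a difference in accuracy between story and symbolic items cannot be attributed to one pair of fractions simply being harder than the other.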

Reasons for and examples of problem variations

Systematically varying the features of problems will enable you to be more certain of what is really underlying student performance and to avoid locking onto only one type of behavior (Brown & Burton, 1978).

Brown and Burton (1978) gave a student teacher the following:

Problem A: 19 + 9 = 199

The student teacher then generated the following problems and concluded that the misconception (bug) was to “write the bottom digit after the top number”:

Problem B: 23 + 6 = 236
Problem C: 81 + 8 = 818

However, this hypothesis failed when the student was given the first test problem, which differed in format from all the others in that the bottom operand was not the shorter one:

8 + 12 =

Under the hypothesized bug, the predicted answer would be 812; the student instead answered 128.

The student teacher had the wrong idea as to the misconception (bug). The actual misconception is that “single-digit operands are linked onto the end of the other operand.” Hence, the misconception answer to this last problem is 128, which matches the student’s response. By presenting examples in which only the bottom operand was a single digit, the student teacher had obtained what seemed to be confirming evidence for his hypothesis. Hence, vary the problem when trying to uncover what students are really thinking.
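
The logic of this example can be captured in a short differential-diagnosis sketch. In the Python code below, the two bug functions are our own hypothetical encodings of the two rules above; the point is that the rules agree on every practice problem and disagree only on 8 + 12, which is exactly what makes that problem diagnostic:

    def hypothesized_bug(top, bottom):
        """Student teacher's hypothesis: write the bottom
        number after the top number (19 + 9 -> 199)."""
        return int(str(top) + str(bottom))

    def actual_bug(top, bottom):
        """Actual bug: a single-digit operand is linked onto
        the END of the other operand (8 + 12 -> 128)."""
        if top < 10:
            return int(str(bottom) + str(top))
        return int(str(top) + str(bottom))

    problems = [(19, 9), (23, 6), (81, 8), (8, 12)]
    for top, bottom in problems:
        h, a = hypothesized_bug(top, bottom), actual_bug(top, bottom)
        note = "  <-- discriminates between the hypotheses" if h != a else ""
        print(f"{top} + {bottom}: hypothesis={h}, actual bug={a}{note}")

Running the sketch shows identical predictions for the three practice problems and divergent predictions (812 versus 128) for 8 + 12; differential diagnosis consists of deliberately seeking out such discriminating problems.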

In quizzing, include misconceptions among wrong answer options

When making up a quiz, be sure to include misconceptions among the wrong answer choices. In this way, students who may not be able to articulate their misconceptions are likely to choose them as the “correct” answer.

The “Implicit-Confidence” test

Tests called Two-Dimensional Tests (TDTs) are designed to determine students’ uncertainty or confidence in their answers (see Klymkowsky et al., 2006). There are two versions of these confidence-based TDTs: implicit and self-assessed (also called explicit). See Figure 1 below (taken from Klymkowsky et al., 2006).

[Figure 1. Two-dimensional tests (Klymkowsky et al., 2006)]

With the self-assessed TDT, students indicate their level of confidence as a separate response.

With an implicit TDT, student confidence is embedded in a single response. Students are told that only one answer is correct. However, they are given the option to hedge their response. For example, if students are uncertain as to whether the correct answer is “A” or “B,” they can pick “A or B,” and so on. Students can also indicate that they have “no idea” what is the correct choice.

Implicit tests have an advantage over explicit (self-assessed) tests because students do not have to assess their own confidence level; the answer choice they select contains this information.

When designed correctly, a TDT test can provide teachers with very useful information. If incorrect answers (called distracters) state commonly held misconceptions, and a large percentage of students answer wrongly (but confidently), this indicates that students are still clinging to their misconceptions even after instruction. If this happens, or if many students answer “No Idea,” teachers may need to reevaluate their approach to presenting the material. But, if a majority of the students taking the test are confidently correct, then it is likely the teacher presented the concept effectively.
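
As a rough illustration of how responses to a single implicit-TDT item might be tallied, here is a Python sketch with made-up response data (this is our own bookkeeping, not the authors’ scoring scheme):

    from collections import Counter

    # Hypothetical responses to one item whose correct answer is "A"
    # and whose distracter "B" states a common misconception.
    responses = ["A", "A", "B", "A or B", "B", "A",
                 "no idea", "B", "A", "A or B"]
    counts = Counter(responses)
    n = len(responses)

    print(f"Confidently correct:     {counts['A'] / n:.0%}")
    print(f"Confident misconception: {counts['B'] / n:.0%}")
    print(f"Hedged (A or B):         {counts['A or B'] / n:.0%}")
    print(f"No idea:                 {counts['no idea'] / n:.0%}")

A large “confident misconception” or “no idea” share signals that the concept needs to be re-taught, while a large “confidently correct” share suggests the concept was presented effectively.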

This kind of test is especially effective when used as a pre-assessment to reveal the percentage of students that hold misconceptions common to the topic being introduced.

Assessing misconceptions about science through questionnaires

It is well-known that students have many misconceptions when it comes to the study of force and motion. A multiple-choice test, known as the Force Concept Inventory (FCI), has been developed to probe those misconceptions in high school and college students (see Savinainen & Scott, 2002).

You can download the FCI by visiting the Modeling Instruction Program page and clicking on 'Research and Evaluation'. Please note that you will need to contact the author, Dr. David Koch, in order to receive a password to access the questionnaire. Teachers, please use your school e-mail for the request, and also include the subject you teach, the name of your school, and its location (city and state). Dr. Koch can be reached by email.

In addition, an open-ended instrument, known as the “Views of Nature of Science (VNOS)” questionnaire, has been developed for K-12 students to assess learners’ conceptions of science (see Lederman, Abd-El-Khalick, Bell, & Schwartz, 2002). VNOS addresses conceptions and misconceptions that students can have about what science is and what scientists do.

Students also have misconceptions about the shape of the earth. Vosniadou and Brewer (1992) have developed a protocol for identifying students’ mental models of the earth. In addition, Hayes, Goodhew, Heit, and Gillan (2003) offer a protocol adapted from Vosniadou and Brewer (1992).

Method and examples of categorization/sorting tasks

In this activity, the teacher presents students with model objects (or pictures of objects) to sort into different categories. The instructions are “Put the things together that go together,” or “Put the things that are alike (or ‘the same kind of thing’) together.”

For example, you can present a collection of pictures consisting of a mix of 10-20 exemplars and non-exemplars of a particular concept and ask students to sort the pictures into categories (e.g., living thing; animal). (See Stovall & Nesbit, 2003). In this way, you can see what principle each student is using as they group the pictures. (This in turn reveals the students’ understanding of the concept).
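
One way to make that inference explicit is to compare a student’s group with each candidate category and see which overlaps most. The Python sketch below uses a hypothetical picture set and candidate principles of our own devising:

    # Hypothetical picture set and candidate sorting principles.
    pictures = {"dog", "fish", "tree", "moss", "rock", "car", "cloud"}
    principles = {
        "living things": {"dog", "fish", "tree", "moss"},
        "animals": {"dog", "fish"},
        "things that move": {"dog", "fish", "car", "cloud"},
    }

    student_group = {"dog", "fish", "car"}  # one pile the student formed
    assert student_group <= pictures

    def jaccard(a, b):
        """Overlap between two sets (1.0 = identical, 0.0 = disjoint)."""
        return len(a & b) / len(a | b)

    for name, members in principles.items():
        print(f"{name}: overlap = {jaccard(student_group, members):.2f}")
    # The highest-overlap principle is the best guess at the student's rule.

In this made-up case, the student’s pile overlaps most with “things that move,” suggesting the child is sorting by movement rather than by the taxonomic category the lesson targets.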

Examples of memory/recall problems

For example, McNeil & Alibali (2005) showed students the following equation for 5 seconds and told them to memorize it, but not solve it: 

7 + 4 + 5 = 7 + __

Later, when they were asked to recall the problem, many students wrote the following: 

7 + 4 + 5 + 7 = __

This revealed that many students were following the arithmetic pattern of “operations = answer” rather than understanding the concept of equivalence, the real meaning of the equal sign. The correct, “relational” view treats the equal sign as a symbol of equivalence, denoting a relationship between the quantities on its two sides.
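
If recall responses are collected in writing, classifying them can even be automated. The sketch below (Python; the classification rule is our own illustration, not McNeil and Alibali’s coding scheme) flags recalls that have been converted to the “operations = answer” pattern:

    def classify_recall(recalled, original="7 + 4 + 5 = 7 + __"):
        """Hypothetical coding: correct recall, the 'operations = answer'
        pattern (all operations on the left, a lone blank on the right),
        or some other error."""
        normalized = recalled.replace(" ", "")
        if normalized == original.replace(" ", ""):
            return "correct recall"
        left, _, right = normalized.partition("=")
        if right == "__" and "+" in left:
            return "'operations = answer' pattern"
        return "other error"

    print(classify_recall("7 + 4 + 5 = 7 + __"))  # correct recall
    print(classify_recall("7 + 4 + 5 + 7 = __"))  # misconception pattern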

Evidence and explanation: Why and how do these teaching strategies work?

Diagnosing students’ prior (pre-instructional) knowledge is critical to successful teaching for several reasons. First, teachers often think that their students know more than they actually do (Diakidoy & Iordanou, 2003; Eckert, Dunn, Codding, Begeny, & Kleinmann, 2006). Second, teaching has a greater chance of leading to learning when strategies are informed by students’ current understandings of the concepts and theories being taught (e.g., Carpenter, Fennema, Peterson, Chiang, & Loef, 1989; Fennema et al., 1996). Third, student learning does not occur in a vacuum. Students do not come to school as empty vessels; rather, they come with considerable pre-instructional knowledge related to the topics taught in school. Learning builds on, and is related to, this prior knowledge (Bransford, Brown, & Cocking, 2000; Gelman & Lucariello, 2002; Piaget, 1926, 1930, 1951, 1952, 1954; Piaget & Inhelder, 1969; Resnick, 1983). We all interpret new information in terms of our current knowledge.

Students’ pre-instructional knowledge is based on intuitions, everyday experiences, and/or what they have been taught in other contexts. Such pre-instructional knowledge is generally referred to as preconceptions. Since most knowledge is organized into domains such as mathematics, science, art, and history, it’s not surprising that preconceptions are organized the same way.

The selection of an appropriate teaching strategy depends on whether a preconception is an anchoring conception or a misconception (“alternative conception” or “intuitive theory”). When students’ preconceptions are consistent with the concepts being taught, these preconceptions are termed anchoring conceptions. In this case, the students’ prior knowledge does not interfere with learning, but instead becomes the base onto which the learner can build new knowledge. When an anchoring conception is present, learners link new information to what they already know, and learning becomes a matter of conceptual growth or assimilation (Piaget, 1926, 1930, 1951, 1952, 1954; Piaget & Inhelder, 1969; Resnick, 1983).

However, when students’ preconceptions are inconsistent with, or even contradictory to, concepts in the curriculum, they are termed alternative conceptions or misconceptions. These misconceptions interfere with learning for several reasons. First, students use their prior knowledge to interpret new experiences; if the prior knowledge is incorrect, the interpretation of a new but related concept is likely to be incorrect. Second, misconceptions can be entrenched and tend to be very resistant to instruction (Brewer & Chinn, 1991). This happens because learning, in such cases, is not simply a matter of adding to the students’ knowledge; rather, it is a matter of radically reorganizing or replacing that knowledge. Conceptual change or accommodation has to occur before learning can happen (Carey, 1985, 1986; Posner, Strike, Hewson, & Gertzog, 1982; Strike & Posner, 1985, 1992). It is the teacher’s job to bring about this conceptual change.

The fact that students have alternative conceptions (misconceptions) means that, in many cases, the errors they make are not random. Rather, student errors can reveal a world of systematic misconceptions that can be commonly held (Brown & Burton, 1978). To be able to make sense of student errors, and uncover their underlying causes, teachers must know what students are thinking.

Alternative conceptions (misconceptions) are quite common in both children and adults for a couple of reasons. As we look for meaning and try to understand our experiences, it is only natural that not all the ideas we develop are correct. Moreover, some concepts/theories in different content areas are counter-intuitive and can be very difficult to grasp. This makes our understanding of them flawed or incomplete.

Frequently asked questions
Do all students hold preconceptions?

Yes, all students come to school knowing a lot about many of the topics on which they will be instructed. Some of their knowledge is correct (“anchoring conceptions”), while some is erroneous (“misconceptions”). And, it’s not only students; adults (including teachers) bring preconceptions with them from many domains of knowledge.

Are misconceptions common?

Yes, they are. It is normal for both children and adults to have misconceptions. Therefore, before instruction on a new topic begins, it is important for teachers to find out what ideas their students have about the topic.

Are there typical misconceptions that occur for different subject areas?

Yes. Although some misconceptions are unique to individuals, most misconceptions that students experience in different subject matter areas are typical misconceptions, that is, ones that are subject-specific and also common to many students. It is especially important for teachers to become familiar with the misconceptions that students commonly hold in the subjects they teach. This knowledge is critical to providing efficient and effective instruction.

For whom, and under what conditions, do these recommendations work?
Age

Most of the strategies discussed here are general techniques that should work for students in grades K-12. However, the “implicit-confidence” test (see Dos and don'ts) is best reserved for high school students. Strategies such as the categorization/sorting task and having students model or draw their solutions (see Dos and don'ts) may be particularly suited to younger students.

Individual differences

For students with delayed language development, for those who are learning English as a second language, or for students who are not verbally adept, teachers should rely more heavily on sorting/categorization tasks and modeling tasks (see Dos and don'ts) for diagnosing preconceptions and identifying misconceptions.

Where can teachers get more information?

Bransford, J. D., Brown, A. L., & Cocking, R. R.  (Eds.).  (2000).  How people learn: Brain, mind, experience and school. Washington, DC: National Academy Press.

Brown, J. S., & Burton, R. R.  (1978).  Diagnostic models for procedural bugs in basic mathematical skills. Cognitive Science, 2, 155-192.

Ginsburg, H.P., Jacobs, S.F., & Lopez, L.S.  (1998).  The teacher's guide to flexible interviewing in the classroom: Learning what children know about math. Boston: Allyn and Bacon.

Morrison, J. A., & Lederman, N. G.  (2003).  Science teachers’ diagnosis and understanding of students’ preconceptions. Science Education, 87, 849-867.

References

Bransford, J. D., Brown, A. L., & Cocking, R. R.  (Eds.).  (2000).  How people learn: Brain, mind, experience and school. Washington, DC: National Academy Press.

Brewer, W. F., & Chinn, C. A.  (1991).  Entrenched beliefs, inconsistent information, and knowledge change.  In L. Birnbaum (Ed.), The International Conference of the Learning Sciences: Proceedings of the 1991 conference (pp. 67-73). Charlottesville, VA: Association for the Advancement of Computing in Education.

Bright, G. W.  (1996, September).  Implementing the assessment standards for school mathematics. Teaching Children Mathematics, 18-22.

Brown, J. S., & Burton, R. R.  (1978).  Diagnostic models for procedural bugs in basic mathematical skills. Cognitive Science, 2, 155-192.

Carey, S.  (1985).  Conceptual change in childhood. Cambridge, MA: MIT Press.

Carey, S.  (1986).  Cognitive science and science education. American Psychologist, 41, 1123-1130.

Carpenter, T. P., Fennema, E., Peterson, P. L., Chiang, C. P., & Loef, M.  (1989).  Using knowledge of children’s mathematics thinking in classroom teaching: An experimental study. American Educational Research Journal, 26(4), 499-531.

Diakidoy, I. N., & Iordanou, K.  (2003).  Pre-service teachers’ and teachers’ conceptions of energy and their ability to predict pupils’ level of understanding. European Journal of Psychology of Education, 18, 357-368.

Eckert, T. L., Dunn, E. K., Codding, R. S., Begeny, J. C., & Kleinmann, A. E.  (2006).  Assessment of mathematics and reading performance: An examination of the correspondence between direct assessment of student performance and teacher report. Psychology in the Schools, 43, 247-265.

Fennema, E., Carpenter, T. P., Franke, M. L., Levi, L., Jacobs, V. R., & Empson, S. B.  (1996).  A longitudinal study of learning to use children’s thinking in mathematics instruction. Journal for Research in Mathematics Education, 27(4), 403-434.

Gelman, R., & Lucariello, J.  (2002).  Role of learning in cognitive development.  In H. Pashler (Series Ed.) & C. R. Gallistel (Vol. Ed.), Stevens’ handbook of experimental psychology:  Learning, motivation, and emotion (vol. 3, 3rd ed., pp. 395-443). New York: Wiley.

Ginsburg, H. P.  (1997).  Entering the child's mind: The clinical interview in psychological research and practice. New York: Cambridge University Press.

Ginsburg, H. P., Jacobs, S. F., & Lopez, L. S.  (1998).  The teacher's guide to flexible interviewing in the classroom: Learning what children know about math. Boston: Allyn and Bacon.

Ginsburg, H. P., & Pappas, S.  (2004).  SES, ethnic, and gender differences in young children’s informal addition and subtraction: A clinical interview investigation. Journal of Applied Developmental Psychology, 25, 171-192.

Hayes, B. K., Goodhew, A., Heit, E., & Gillan, J.  (2003).  The role of diverse instruction in conceptual change. Journal of Experimental Child Psychology, 86, 253-276.

Klymkowsky, M. W., Taylor, L. B., Spindler, S. R., & Garvin-Doxas, R. K.  (2006).  Two-dimensional, implicit confidence tests as a tool for recognizing student misconceptions. Journal of College Science Teaching, Nov-Dec, 44-48.

Lederman, N. G., Abd-El-Khalick, F., Bell, R. L., & Schwartz, R. S.  (2002).  Views of nature of science questionnaire (VNOS): Toward valid and meaningful assessment of learners’ conceptions of nature of science. Journal of Research in Science Teaching, 39(6), 497-521.

Marks, R.  (1990).  Pedagogical content knowledge: From a mathematics case to a modified concept. Journal of Teacher Education, 41(3), 3-11.

McNeil, N. M., & Alibali, M. W.  (2005).  Why won’t you change your mind? Knowledge of operational patterns hinders learning and performance on equations. Child Development, 76, 883-899.

Morrison, J. A., & Lederman, N. G.  (2003).  Science teachers’ diagnosis and understanding of students’ preconceptions. Science Education, 87, 849-867.

Moyer, P. S., & Milewicz, E.  (2002).  Learning to question: Categories of questioning used by pre-service teachers during diagnostic mathematics interviews. Journal of Mathematics Teacher Education, 5, 293-315.

Piaget, J.  (1926).  The language and thought of the child. New York: Kegan Paul.

Piaget, J.  (1930).  The child’s conception of the world. New York: Harcourt, Brace, & World.

Piaget, J.  (1951).  Play, dreams, and imitation in childhood. New York: Norton.

Piaget, J.  (1952).  The origins of intelligence in children.  New York: International Universities Press.

Piaget, J.  (1954).  The construction of reality in the child. New York: Ballantine.

Piaget, J., & Inhelder, B.  (1969).  The psychology of the child. New York: Basic Books.

Posner, G. J., Strike, K. A., Hewson, P. W., & Gertzog, W. A.  (1982).  Accommodation of a scientific conception: Toward a theory of conceptual change. Science Education, 66(2), 211-227.

Ralph, E. G.  (1999a).  Developing novice teachers’ oral-questioning skills.  McGill Journal of Education, 34(1), 29-47.

Ralph, E. G.  (1999b).  Oral-questioning skills of novice teachers: Any questions? Journal of Instructional Psychology, 26(4), 286-296.

Resnick, L. B.  (1983).  Mathematics and science learning: A new conception. Science, 220, 477-478.

Rittle-Johnson, B., & Koedinger, K. R.  (2005).  Designing knowledge scaffolds to support mathematical problem solving. Cognition and Instruction, 23(3), 313-349.

Rittle-Johnson, B., & Siegler, R. S.  (1999).  Learning to spell: Variability, choice, and change in children’s strategy use. Child Development, 70, 332-348.

Savinainen, A., & Scott, P.  (2002).  The Force Concept Inventory: A tool for monitoring student learning. Physics Education, 37, 45-52.

Stovall, G., & Nesbit, C. R.  (2003).  Let’s try action research! Science and Children, 40(5), 44-48.

Strike, K. A., & Posner, G. J.  (1985).  A conceptual change view of learning and understanding. In L. West & L. Pines (Eds.), Cognitive structure and conceptual change (pp. 147-176).  San Diego, CA: Academic Press.

Strike, K. A., & Posner, G. J.  (1992).  A revisionist theory of conceptual change. In R. A. Duschl & R. J. Hamilton (Eds.), Philosophy of science, cognitive psychology, and educational theory and practice (pp. 211-231).  Albany, NY: State University of New York Press.

Vacc, N. N., & Bright, G. W.  (1999).  Elementary preservice teachers’ changing beliefs and instructional use of children’s mathematical thinking. Journal for Research in Mathematics Education, 30(1), 89-110.

Vosniadou, S., & Brewer, W. F.  (1992).  Mental models of the earth: A study of conceptual change in childhood. Cognitive Psychology, 24, 535-585.