Students do not come to school as blank slates to be filled with instruction. Rather, they come to school with considerable knowledge, some correct and some not. Either way, that knowledge is based on intuition, everyday experience, and what they have been taught in other settings.

Teachers and researchers generally refer to pre-instructional knowledge as preconceptions. Because a considerable amount of our knowledge is organized by subject matter (mathematics, science, etc.), our preconceptions are organized the same way. Before beginning instruction on any new topic, teachers need to know their students' preconceptions because learning, and therefore instruction itself, varies depending on whether preconceptions agree with the concepts being taught or contradict those concepts.

Understand your students' preconceptions

Student preconceptions, when consistent with concepts in the assigned curriculum, are called anchoring conceptions. Learning, in such cases, is much easier. It becomes a matter of conceptual growth, enrichment, or adding to student knowledge. More often, teachers find themselves teaching concepts that are difficult for their students to learn because students' preconceptions are inconsistent with the concepts being taught. In these cases, preconceptions are termed alternative conceptions or misconceptions.

There are several reasons that teachers need to figure out students' preconceptions before they begin instruction on a new topic. First, learning and teaching vary considerably, depending on whether the teacher is confronting alternative (misconceptions) or correct (anchoring) conceptions about the concepts or theories to be taught. Second, any pre-assessment of student academic skills and student knowledge may not accurately reflect actual pupil skill and knowledge. In addition, student learning and achievement can increase when teachers better understand their students' thinking about a concept. Though misconceptions generally pose obstacles to learning, they also offer some value to the learning experience. They are evidence of previous content coverage and interest and represent a starting point, though a challenging one, from which to teach new material. Furthermore, when teachers listen to and comprehend student thinking, it can expand their understanding of the subject matter, change their beliefs on how to teach, modify their practices and contribute to their professional growth.

The first phase of instruction should be "diagnostic." Teachers should employ diagnostic techniques not only to find out what students know or understand about a topic (preconceptions), but also to find out about other aspects of student thinking. These include student learning processes, those things that are either hard or easy for students to grasp, and the errors that students commonly make (see Marks, 1990, for further discussion). The problem is that even when teachers are interested in diagnosing student ideas and responses, they often use inappropriate methods to do so (Morrison & Lederman, 2003). Therefore, recommended diagnostic teaching practices — "Dos" — are provided in this module.

Dos and don'ts

Strategies teachers can use prior to instruction to determine what knowledge (correct or incorrect) students have about concepts or theories to be taught

  1. Conduct quizzes (pretests).

  2. Try to learn how students are going about solving problems or arriving at their answers. Knowing the ideas and/or strategies that students are using to arrive at an answer (whether the answer is right or wrong) can be very informative for the purposes of making instruction more effective.

  3. Encourage your students to ask questions and then examine the content of those questions.

  4. Ask students to define major concepts, then analyze their definitions for errors in logic or strategy (the rule) that are causing the students to make mistakes.

  5. Use the technique of "differential diagnosis": Make at least two hypotheses about what students might be thinking, and then choose or create problems that can discriminate between them. According to Brown and Burton (1978), this approach avoids the problem of focusing too quickly on one hypothesis and then generating problems that confirm the hypothesis.

  6. Use the technique of difficulty factors assessment (DFA) (see Rittle-Johnson & Koedinger, 2005) to identify which features of a problem are either causing student difficulty or facilitating student learning.

  7. Present problems to students in which you systematically vary important features of the problem (Brown & Burton, 1978).

  8. Consider how a student’s culture and dialect may shape how they answer questions, and evaluate whether they understand the content but are simply expressing it differently (Delpit, 2006).

Strategies 9-15 are designed to analyze student thinking in ways that do not depend on students' language and verbal skills. These approaches should be considered for evaluating students whose verbal skills are not strong:

  9. Provide common misconceptions among the answer choices on a quiz (i.e., multiple choice) or in response to classroom questions.

  10. Give “implicit-confidence” tests. These tests involve a simple modification of the traditional multiple-choice test (see Klymkowsky, Taylor, Spindler, & Garvin-Doxas, 2006).

  11. Administer questionnaires in specific subject matter areas in order to identify whether a misconception is unique to a particular context or is due to a problem with general reasoning or language.

  12. Administer a categorization/sorting task.

  13. Have students model (draw or use props) their solutions to a problem (Bright, 1996) and analyze their models for pattern or strategy errors.

  14. Use student memory/recall of problems and definitions as a window into their thinking.

  15. When using an alternative assessment, be sure that it is measuring what you hope to gauge from the student. For example, if you have a student build a boat out of foil to demonstrate an understanding of buoyancy, they may understand the concept of buoyancy but struggle with building model boats.

Strategies teachers should avoid when diagnosing student thinking

  1. Do not jump too quickly to one hypothesis about student thinking processes or reasoning. Prematurely focusing on one hypothesis about what a student may be thinking can cause you to be unaware of competing hypotheses that may provide a better explanation (Brown & Burton, 1978).

  2. Do not stick with or lock onto only one type of response behavior a student may display. Vary problem features to determine whether the same response behavior is displayed under different problem conditions.

  3. Do not make a diagnosis about students' logic, reasoning or thinking processes on the basis of their performance on one problem or one kind of problem.

  4. When probing students' understanding, try to avoid asking only general, open-ended questions. And don't stop probing after asking one question. Follow-up questions can reveal the source of a misconception, or an error in reasoning or strategy selection. Follow-up questions are also important when a student gives the right answer. Remember that faulty thinking can occasionally lead to correct answers.

  5. Do not offer unmerited praise to students who are underperforming due to low initial ability at a task. Though your intention may be noble, such praise can lower these students' expectations for themselves if they believe you think they are capable of only minimal work (Rattan, Good, & Dweck, 2012).

Pretests and assessments
How to use information gathered from pre-instruction quizzes

Analyzing responses to pretest questions can provide you with a good understanding of what your students are thinking: 

  • Analyze correct responses.
  • Analyze errors by looking for patterns. Patterns can reveal the thinking (or rules) that students use that lead them to their mistakes.
Assessing how students are arriving at their answers

Information on student strategies can be gleaned in several ways:

  • Use students’ own verbal self-reports.

  • Analyze students’ overt behavior, such as their written work (Rittle-Johnson & Koedinger, 2005; Rittle-Johnson & Siegler, 1999) and journal writing (Vacc & Bright, 1999).

  • Ask questions — there is a relationship between good teacher questioning skills and student thinking (Moyer & Milewicz, 2002). Good questioning is a skill. When the purpose of questioning is making informed instructional decisions, teachers should interpret students’ responses to determine what students know, rather than whether or not their answers match the expected responses (Vacc & Bright, 1999). Data show that pre-service teachers may not be using competent questioning techniques (Moyer & Milewicz, 2002; Ralph, 1999a, 1999b). Moreover, even in-service teachers may rely too heavily on recall-level questions (Morrison & Lederman, 2003).

  • Interview students about the problems they solve to determine how they arrived at their answers (see Bright, 1996; Ginsburg, 1997; Ginsburg, Jacobs, & Lopez, 1998; Ginsburg & Pappas, 2004; Moyer & Milewicz, 2002).

  • Evaluate the probability that students may have misconceptions through measures like the Scaling Individuals and Classifying Misconceptions (SICM) model and use that feedback as a basis for tailoring instruction to their needs (Bradshaw & Templin, 2014).

  • Facilitate class discussions designed to elicit student misconceptions, challenge inaccuracies and emphasize a correct conceptual framework throughout instruction (Andrews, Kalinowski, & Leonard, 2011).

Examples of good questioning skills
From Ralph (1999a, 1999b) and cited in Moyer & Milewicz (2002)
  • Prepare important questions ahead of time.

  • Deliver questions clearly and concisely.

  • Pose questions that stimulate thinking.

  • Provide children with enough time to think about and prepare their answers.

  • Avoid asking questions that require one-word answers.

From Vacc & Bright (1999)
  • Ask questions that require critical thinking (e.g., ask students to compare different solution strategies).

  • Ask probing questions that refer specifically to what a student says, does or thinks in order to gain further information about students’ solution strategies (e.g., “How did you know how many nickels and how many pennies to put down?”).

  • Ask follow-up questions for both incorrect answers and correct answers.

From Cohen, Steele, & Ross (1999)
  • Follow up answers with constructive feedback that communicates the high expectation that a student will be able to master the material being questioned, even if they do not yet understand it.
Interview protocol

You might begin an interview with open-ended questions that get students to examine and justify their answers: 

  • How did you figure that out?
  • What were you thinking when you got that answer?
  • How would you explain to another student how you got your answer?
  • Can you give me another example to explain what you mean?

Then move to more focused or specific questions:

  • What was the first thing you thought of?
  • What did you do next?
  • Did you have any pictures in your head as you worked through that part of the problem?

If students are having difficulty verbally expressing their ideas, you can also use more "recognition questions." For these questions, the student does not have to produce the content, but rather the content is put into the question and the student need only respond to it in some way.

Examples of recognition questions

Did you use the procedure of [X] or the procedure of [Y] to solve the problem?

From Moyer and Milewicz (2002):

Use the probing with follow-up technique, where a variety of question types are used to further investigate the child's answers. This includes questioning both correct and incorrect responses, and use of specific questioning.

If only incorrect responses are probed with follow-up questions, the child misses out on the opportunity to articulate and defend the methods used to arrive at accurate solutions. Moreover, probing only incorrect responses assumes that if the child provided the correct answer, he or she must understand the concept. This is not always true. Children can arrive at a correct answer through faulty reasoning or partial understanding of a concept. Specific questions are tailored to individual responses and can get at student thinking better than more general, open-ended questions.

Things to avoid in a diagnostic interview (from Moyer & Milewicz, 2002)
  • Avoid "check listing" in the interview. With check listing the teacher proceeds from one question to the next on the interview protocol, relying on a script and paying little attention to the student's responses. Check listing usually does not include follow-up questions, but does involve the use of verbal checkmarks (the specific and repetitive use of one- or two-word verbalizations, such as "OK," "Right," or "Good Job.") These verbal checkmarks say to students that they no longer need to continue thinking about the question, because their answers were complete and "checked-off" on the list.

  • Avoid instructing when you should be assessing. Instructing includes asking leading questions and providing hints about the answer. The use of leading questions can result in a guessing game in which the student concentrates more on figuring out what the interviewer is thinking than on explaining his/her own thinking. Instructing also means that the questioning itself is halted while the concept is retaught.

"Confirmation bias" pitfalls and potential benefits

In his 2012 book, "How Children Succeed," Paul Tough discusses the concept of "confirmation bias" as it relates to chess. It involves, in this context, a player prioritizing a particular move as being the path to ultimate victory and therefore ignoring potential pitfalls during a game. There are clear disadvantages to this tendency, as it can lead a player to disregard other more profitable and appropriate moves and perhaps impede their chances of success. However, there is some conceivable benefit to this bias. Having a preconception of some perceived advantageous move allows the player to navigate a match with more confidence than they might otherwise have possessed. That confidence can help bolster their performance. In this way, the confirmation bias is both a detriment and a benefit.

The same premise could apply in the classroom with students who come in with preconceptions that are perhaps inaccurate. Though it may take some work to intervene with these students to help them relearn information in a more accurate way, it is important to consider that this prior knowledge and experience may be strengthening their starting point, from a socioemotional perspective, as they approach your class.

Difficulty factors assessment (DFA)

Difficulty factors assessment is a way to identify which features of a problem are either causing students difficulty or facilitating their learning. In DFA, target factors are systematically varied and crossed with other problem features, leading to multiple versions of the problem.

DFA example (taken from Rittle-Johnson & Koedinger, 2005)

For example, in the case of math, to find out whether a story (word problem) context makes it easier or harder for students to solve a particular kind of math problem, one version of the assessment might ask children to add 1/2 + 2/3 in the context of a story, and to add 3/4 + 1/7 without a story. The format of these two questions would vary only the presence or absence of a story. A second version of the assessment would do the opposite: add 3/4 + 1/7 in the context of a story and 1/2 + 2/3 without a story.

Examining student responses across the two versions of the problem provides estimates of student accuracy when a problem is or is not embedded in a story context, and these estimates are not confounded by potential differences in the difficulty of the numbers used.

You can also use DFA to analyze students' errors and the strategies they use to see how each factor affects problem solving.
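The counterbalancing logic of a DFA can be sketched in code. This is a hypothetical illustration: only the two fraction items come from the Rittle-Johnson & Koedinger example above; the form names and accuracy numbers are invented.

```python
# Minimal sketch of a difficulty factors assessment (DFA): the target
# factor (story context) is crossed with two item sets, so each factor
# level is tested on each set across the two test forms.

# Two item sets, as in the Rittle-Johnson & Koedinger example.
item_sets = {"set1": "1/2 + 2/3", "set2": "3/4 + 1/7"}

# Counterbalanced forms: each set appears once with and once without a story.
forms = {
    "Form A": {"set1": "story", "set2": "no story"},
    "Form B": {"set1": "no story", "set2": "story"},
}

# Hypothetical accuracy data: proportion of students correct per (form, set).
accuracy = {
    ("Form A", "set1"): 0.40,  # story
    ("Form A", "set2"): 0.55,  # no story
    ("Form B", "set1"): 0.60,  # no story
    ("Form B", "set2"): 0.35,  # story
}

# Pool accuracy by factor level across both item sets, so the story
# effect is not confounded with the difficulty of the numbers used.
by_context = {"story": [], "no story": []}
for (form, item_set), acc in accuracy.items():
    by_context[forms[form][item_set]].append(acc)

for context, accs in by_context.items():
    print(f"{context}: mean accuracy {sum(accs) / len(accs):.2f}")
```

Because each factor level is averaged over both item sets, a difference between the "story" and "no story" means can be attributed to the story context rather than to the particular fractions used.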

Present variations of problems

Systematically varying problem features will enable you to be more certain of what is really underlying student performance and to avoid locking onto only one type of behavior (Brown & Burton, 1978).

Brown & Burton (1978) gave a student teacher the following: 

Problem A

19 +  9 = 199 

The student teacher then generated the following problems and concluded that the misconception (bug) was to “write the bottom digit after the top number.”

Problem B    Problem C
   23           81
  + 6          + 8
  236          818

However, this hypothesis failed when the student was given the first test problem, which differed in format from all the others in that the single-digit operand was on top rather than on the bottom:

8 + 12 =

The hypothesis “write the bottom digit after the top number” predicts an answer of 812. The student, however, answered 128, because the actual misconception (bug) is that “single-digit operands are linked onto the end of the other operand.” By presenting only examples with a single-digit bottom operand, the student teacher had obtained what seemed to be confirming evidence of his hypothesis. Hence, vary the problem when trying to uncover what students are really thinking.
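The logic of this differential diagnosis can be sketched in code. This is an illustration only: the function names and the encoding of the two bugs as rules are ours, not Brown & Burton's.

```python
# Differential diagnosis sketch: encode two competing bug hypotheses as
# functions, then look for a problem on which their predictions differ.

def bug_bottom_after_top(top, bottom):
    """Hypothesis A: write the bottom number's digits after the top number."""
    return int(str(top) + str(bottom))

def bug_append_single_digit(top, bottom):
    """Hypothesis B: a single-digit operand is linked onto the end of the
    other operand (the student's actual bug in the example above)."""
    if top < 10:
        return int(str(bottom) + str(top))
    return int(str(top) + str(bottom))

# Both hypotheses agree on the problems the student teacher chose,
# because every one of them has the single-digit operand on the bottom...
for top, bottom in [(19, 9), (23, 6), (81, 8)]:
    assert bug_bottom_after_top(top, bottom) == bug_append_single_digit(top, bottom)

# ...but a problem with the single-digit operand on top separates them.
print(bug_bottom_after_top(8, 12))     # 812 (what Hypothesis A predicts)
print(bug_append_single_digit(8, 12))  # 128 (what the student actually wrote)
```

The point of choosing 8 + 12 as a test problem is exactly that the two hypotheses predict different answers, so the student's response can rule one of them out.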

In quizzing, include misconceptions among wrong answer options. In this way, students who may not be able to articulate their misconceptions are likely to choose them as the “correct” answer.

The “implicit-confidence” test

Tests called two-dimensional tests (TDTs) are designed to determine students’ uncertainty or confidence in their answers (see Klymkowsky et al., 2006). There are two versions of these confidence-based TDTs: implicit and self-assessed (also called explicit). See Figure 1 (taken from Klymkowsky et al., 2006).

Two-dimensional tests

With the self-assessed TDT, students indicate their level of confidence as a separate response.

With an implicit TDT, student confidence is embedded in a single response. Students are told that only one answer is correct. However, they are given the option to hedge their response. For example, if students are uncertain as to whether the correct answer is “A” or “B,” they can pick “A or B,” and so on. Students can also indicate that they have “no idea” what is the correct choice.

Implicit tests have an advantage over explicit (self-assessed) tests because students do not have to assess their own confidence level; the answer choice they select contains this information.

When designed correctly, a TDT test can provide teachers with very useful information. If incorrect answers (called distracters) state commonly held misconceptions, and a large percentage of students answer wrongly (but confidently), this indicates that students are still clinging to their misconceptions even after instruction. If this happens, or if many students answer “No Idea,” teachers may need to reevaluate their approach to presenting the material. But, if a majority of the students taking the test are confidently correct, then it is likely the teacher presented the concept effectively.

This kind of test is especially effective when used as a pre-assessment to reveal the percentage of students that hold misconceptions common to the topic being introduced.
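Tallying implicit-confidence responses can be sketched as follows. The response data and category labels here are hypothetical, invented for illustration; only the hedging format ("A or B," "No idea") comes from the description above.

```python
# Sketch: classify implicit TDT responses to one question whose correct
# answer is "A". Hedged options like "A or B" count as uncertain.

from collections import Counter

CORRECT = "A"
responses = ["A", "A", "A or B", "B", "B", "A", "No idea", "B", "A or C", "B"]

def classify(resp):
    """Sort one response into a confidence/correctness category."""
    if resp == "No idea":
        return "no idea"
    options = resp.split(" or ")
    if len(options) > 1:
        return ("hedged (correct among picks)" if CORRECT in options
                else "hedged (incorrect)")
    return "confidently correct" if resp == CORRECT else "confidently incorrect"

tally = Counter(classify(r) for r in responses)
for category, n in tally.items():
    print(f"{category}: {n}")
```

A large "confidently incorrect" count on a distracter that states a common misconception is the signal, described above, that students are still clinging to that misconception.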

Questionnaires: assess misconceptions about science
Force and motion

It is well known that students have many misconceptions when it comes to the study of force and motion. A multiple choice test known as the “Force Concept Inventory (FCI)” has been developed to probe those misconceptions in high school and college students (see Savinainen & Scott, 2002). (Teachers may download the FCI online. Please note that you will need to contact the author, David Koch, in order to receive a password to access the questionnaire.)


In addition, an open-ended instrument, known as the “Views of Nature of Science (VNOS)” questionnaire, has been developed for K-12 students to assess learners’ conceptions of science (see Lederman, Abd-El-Khalick, Bell, & Schwartz, 2002). VNOS addresses conceptions and misconceptions that students can have about what science is and what scientists do.

Shape of the earth

Students also have misconceptions about the shape of the earth. Vosniadou and Brewer (1992) have developed a protocol for identifying students’ mental models of the earth. In addition, Hayes, Goodhew, Heit and Gillan (2003) have adapted such a protocol from Vosniadou and Brewer (1992).

Categorization/sorting tasks

In this activity, the teacher presents to students model objects (or pictures of objects) to sort into different categories. The instructions are: “Put the things together that go together,” or “Put the things that are alike (or “the same kind of thing”) together.”

For example, you can present a collection of pictures consisting of a mix of 10-20 exemplars and non-exemplars of a particular concept and ask students to sort the pictures into categories (e.g., living thing, animal). (See Stovall & Nesbit, 2003). In this way, you can see what principle each student is using as they group the pictures. (This in turn reveals the students’ understanding of the concept).
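A sorting result can reveal the principle a student used, as sketched below. The cards, their features, and the student's grouping are invented for illustration.

```python
# Sketch: infer which principle a student used to sort picture cards by
# comparing their groups to the groupings each candidate principle predicts.

cards = {
    "dog":      {"living": True,  "animal": True},
    "oak tree": {"living": True,  "animal": False},
    "rock":     {"living": False, "animal": False},
    "robot":    {"living": False, "animal": False},
}

def grouping(principle):
    """Partition the cards by the value of one feature."""
    groups = {}
    for name, feats in cards.items():
        groups.setdefault(feats[principle], set()).add(name)
    return set(frozenset(g) for g in groups.values())

# The (hypothetical) student put the dog alone and everything else together:
student = {frozenset({"dog"}), frozenset({"oak tree", "rock", "robot"})}

for principle in ("living", "animal"):
    match = "matches" if grouping(principle) == student else "does not match"
    print(f"sorting by '{principle}' {match} the student's grouping")
```

Here the grouping matches a sort by "animal" but not by "living thing," suggesting the student does not yet classify plants as living things.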

Memory/recall problems

For example, McNeil & Alibali (2005) showed students the following equation for five seconds and told them to memorize it, but not solve it: 

7 + 4 + 5 = 7 + __

Later, when they were asked to recall the problem, many students wrote the following: 

7 + 4 + 5 + 7 = __

This revealed that many students were following the arithmetic pattern of “operations = answer” and did not understand the concept of equivalence, the real meaning of the equal sign. The correct, “relational” view treats the equal sign as a symbol denoting a relationship between two quantities.
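The contrast between the two readings of the equal sign can be sketched in code (the function names are illustrative):

```python
# Contrast the "operational" misconception with the correct "relational"
# reading of an equation like 7 + 4 + 5 = 7 + __.

def relational_blank(left_terms, right_known):
    """Correct view: both sides must name the same quantity,
    so the blank is the difference between the two sides."""
    return sum(left_terms) - sum(right_known)

def operational_blank(left_terms, right_known):
    """Misconception: '=' means 'write the answer', so the student
    adds every number that appears in the problem."""
    return sum(left_terms) + sum(right_known)

print(relational_blank([7, 4, 5], [7]))   # 9  (correct)
print(operational_blank([7, 4, 5], [7]))  # 23 (operations = answer)
```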

Why and how do these teaching strategies work?

Diagnosing students' prior (pre-instructional) knowledge is critical to successful teaching for several reasons. 

  1. Teachers often think that their students know more than they really know (Diakidoy & Iordanou, 2003; Eckert, Dunn, Codding, Begeny, & Kleinmann, 2006).
  2. Teaching has a greater chance of leading to learning when strategies are informed by students' current understandings of the concepts and theories being taught (e.g., Carpenter, Fennema, Peterson, Chiang, & Loef, 1989; Fennema et al., 1996).
  3. Student learning does not occur in a vacuum. Students do not come to school as empty vessels, but rather come with considerable pre-instructional knowledge related to the topics taught in school. Learning builds on and is related to this prior knowledge (Bransford, Brown, & Cocking, 2000; Gelman & Lucariello, 2002; Piaget 1926, 1930, 1951, 1952, 1954; Piaget & Inhelder, 1969; Resnick, 1983). We all interpret new information in terms of our current knowledge.

Student pre-instructional knowledge is based on intuitions, everyday experiences, and/or what they have been taught in other contexts. Such pre-instructional knowledge is generally referred to as preconceptions. Since most knowledge is organized into domains such as mathematics, science, art, history, etc., it's not surprising that preconceptions are organized the same way.

The selection of an appropriate teaching strategy depends on whether a preconception is an anchoring conception or a misconception ("alternative conception" or "intuitive theory"). When students' preconceptions are consistent with the concepts being taught, these preconceptions are termed anchoring conceptions. In this case, the students' prior knowledge does not interfere with learning, but instead becomes the base onto which the learner can build new knowledge. When an anchoring concept is present, learners link new information to what they already know and learning becomes a matter of conceptual growth or assimilation. (Piaget 1926, 1930, 1951, 1952, 1954; Piaget and Inhelder, 1969; Resnick, 1983).

However, when students' preconceptions are inconsistent with, and even contradictory to concepts in the curriculum, they are termed alternative conceptions or misconceptions. These misconceptions interfere with learning for several reasons. 

  1. Students use their prior knowledge to interpret new experiences. If the prior knowledge is incorrect the interpretation of a new but related concept is likely to be incorrect.
  2. Misconceptions can be entrenched and tend to be very resistant to instruction (Brewer & Chinn, 1991). This happens because learning, in such cases, is not simply a matter of adding to the students' knowledge. Rather, it is a matter of radically reorganizing or replacing the students' knowledge. 
  3. If new information that students hear is incongruent with what they already believe to be true, they may disregard it entirely and learn nothing in class (Sewell, 2002). Conceptual change or accommodation has to occur before learning can happen (Carey, 1985, 1986; Posner, Strike, Hewson, & Gertzog, 1982; Strike & Posner, 1985, 1992). It is the teacher's job to bring about this conceptual change.

The fact that students have alternative conceptions (misconceptions) means that, in many cases, the errors they make are not random. Rather, student errors can reveal a world of systematic misconceptions that can be commonly held (Brown & Burton, 1978). To be able to make sense of student errors, and uncover their underlying causes, teachers must know what students are thinking.

Alternative conceptions (misconceptions) are quite common in both children and adults for a couple of reasons. As we look for meaning and try to understand our experiences, it is only natural that not all the ideas we develop are correct. Moreover, some concepts/theories in different content areas are counter-intuitive and can be very difficult to grasp. This makes our understanding of them flawed or incomplete.

Do all students hold preconceptions?

Yes, all students come to school knowing a lot about many of the topics on which they will be instructed. Some of their knowledge is correct ("anchoring conceptions"), while some is erroneous ("misconceptions"). And, it's not only students; adults (including teachers) bring preconceptions with them from many domains of knowledge.

Are misconceptions common?

Yes, they are. It is normal for both children and adults to have misconceptions. Therefore, before instruction on a new topic begins, it is important for teachers to find out what ideas their students have about the topic.

Are there typical misconceptions that occur for different subject areas?

Yes. Although some misconceptions are unique to individuals, most misconceptions that students experience in different subject matter areas are typical misconceptions, that is, ones that are subject specific and also common to many students. It is especially important for teachers to become familiar with the misconceptions that students commonly hold in the subjects they teach. This knowledge is critical to providing efficient and effective instruction.

Are some students more likely to adjust preconceptions than others?

Research suggests that students with "approach" goals are more likely to adjust their previous understanding of subject matter than those with "avoid" goals. This means that students who are motivated by goals focused on approaching a particular level of mastery or performance will perhaps be more open to altering preconceptions than those whose goals focus more on avoiding a particular result (like failing a course). 

For example, a student interested in getting a strong grade in a science course may be more likely to seek out new information and truly learn the material, while a student motivated by simply not failing the course may be satisfied with minimal understanding of the material and achieving a passing grade. In this way, students with "approach" goals may be more likely to recognize and work to overcome previous inaccurate conceptions due to higher ambitions for content mastery and grade achievement. By working to foster more "approach" goal orientations in the classroom, student misconceptions may become more malleable (Johnson & Sinatra, 2014).

When do these recommendations work?

Most of the strategies discussed here are general techniques that should work for students in grades K-12. However, use of the "implicit confidence test" is best reserved for high school students. Strategies such as the categorization/sorting task and modeling or drawing their solutions might be particularly suited for younger students.

Individual differences

For students with delayed language development, for those who are learning English as their second language, or for students who are not verbally adept, teachers should rely more heavily on sorting/categorization and modeling method tasks for diagnosing preconceptions and identifying misconceptions.

Where can teachers get more information?

Bransford, J. D., Brown, A. L., Cocking, R. R. (Eds.). (2000). How people learn: Brain, mind, experience and school. Washington, DC: National Academy Press.

Brown, J. S., & Burton, R.R. (1978). Diagnostic models for procedural bugs in basic mathematical skills. Cognitive Science, 2, 155-192.

Ginsburg, H.P., Jacobs, S.F., & Lopez, L.S. (1998). The teacher's guide to flexible interviewing in the classroom: Learning what children know about math. Boston: Allyn and Bacon.

Morrison, J. A., & Lederman, N.G. (2003). Science teachers' diagnosis and understanding of students' preconceptions. Science Education, 87, 849-867.

Tough, P. (2012). How children succeed: Grit, curiosity, and the hidden power of character. Boston: Houghton Mifflin Harcourt.


References

Andrews, T., Kalinowski, M., & Leonard, S. (2011). “Are humans evolving?” A classroom discussion to change student misconceptions regarding natural selection. Evolution: Education and Outreach, 4(3), 456-466.

Barthlow, M., & Watson, S. (2014). The effectiveness of process-oriented guided inquiry learning to reduce alternative conceptions in secondary chemistry. School Science and Mathematics, 114(5), 246-255.

Bradshaw, L., & Templin, J. (2014). Combining item response theory and diagnostic classification models: A psychometric model for scaling ability and diagnosing misconceptions. Psychometrika, 79(3), 403-425.

Bransford, J. D., Brown, A. L., Cocking, R. R. (Eds.). (2000) How people learn: Brain, mind, experience and school. Washington, DC: National Academies Press.

Brewer, W. F., & Chinn, C. A.  (1991).  Entrenched beliefs, inconsistent information, and knowledge change.  In L. Birnbaum (Ed.), The International Conference of the Learning Sciences: Proceedings of the 1991 conference (pp. 67-73). Charlottesville, VA: Association for the Advancement of Computing in Education.

Bright, G. W.  (1996, September).  Implementing the assessment standards for school mathematics. Teaching Children Mathematics, 18-22.

Brown, J. S., & Burton, R. R.  (1978).  Diagnostic models for procedural bugs in basic mathematical skills. Cognitive Science, 2, 155-192.

Carey, S.  (1985).  Conceptual change in childhood. Cambridge, MA: MIT Press.

Carey, S.  (1986).  Cognitive science and science education. American Psychologist, 41, 1123-1130.

Carpenter, T. P., Fennema, E., Peterson, P. L., Chiang, C. P., & Loef, M.  (1989).  Using knowledge of children’s mathematics thinking in classroom teaching: An experimental study. American Educational Research Journal, 26(4), 499-531.

Cohen, G. L., Steele, C. M., & Ross, L. D. (1999). The mentor’s dilemma: Providing critical feedback across the racial divide. Personality and Social Psychology Bulletin, 25(10), 1302-1318.

Delpit, L. (2006). Other people's children: Cultural conflict in the classroom. New York: New Press.

Diakidoy, I. N., & Iordanou, K.  (2003).  Pre-service teachers’ and teachers’ conceptions of energy and their ability to predict pupils’ level of understanding. European Journal of Psychology of Education, 18, 357-368.

Eckert, T. L., Dunn, E. K., Codding, R. S., Begeny, J. C., & Kleinmann, A. E.  (2006).  Assessment of mathematics and reading performance: An examination of the correspondence between direct assessment of student performance and teacher report. Psychology in the Schools, 43, 247-265.

Fennema, E., Carpenter, T. P., Franke, M. L., Levi, L., Jacobs, V. R., & Empson, S. B.  (1996).  A longitudinal study of learning to use children’s thinking in mathematics instruction. Journal for Research in Mathematics Education, 27(4), 403-434.

Gelman, R., & Lucariello, J.  (2002).  Role of learning in cognitive development.  In H. Pashler (Series Ed.) & C. R. Gallistel (Vol. Ed.), Stevens’ handbook of experimental psychology:  Learning, motivation, and emotion (vol. 3, 3rd ed., pp. 395-443). New York: Wiley.

Ginsburg, H. P.  (1997).  Entering the child's mind: The clinical interview in psychological research and practice. New York: Cambridge University Press.

Ginsburg, H. P.,  Jacobs, S. F., & Lopez, L. S.  (1998).  The teacher's guide to flexible interviewing in the classroom: Learning what children know about math. Boston: Allyn and Bacon.

Ginsburg, H. P., & Pappas, S.  (2004).  SES, ethnic, and gender differences in young children’s informal addition and subtraction: A clinical interview investigation. Journal of Applied Developmental Psychology, 25, 171-192.

Hayes, B. K., Goodhew, A., Heit, E., & Gillan, J.  (2003).  The role of diverse instruction in conceptual change. Journal of Experimental Child Psychology, 86, 253-276.

Johnson, M., & Sinatra, G. (2014). The influence of approach and avoidance goals on conceptual change. The Journal of Educational Research, 107(4), 312-325.

Klymkowsky, M. W., Taylor, L. B., Spindler, S. R., & Garvin-Doxas, R. K.  (2006).  Two-dimensional, implicit confidence tests as a tool for recognizing student misconceptions. Journal of College Science Teaching, Nov-Dec, 44-48.

Koretz, D. M. (2008). Measuring up: What educational testing really tells us. Cambridge, MA: Harvard University Press.

Lederman, N. G., Abd-El-Khalick, F., Bell, R. L., & Schwartz, R. S.  (2002).  Views of nature of science questionnaire (VNOS): Toward valid and meaningful assessment of learners’ conceptions of nature of science. Journal of Research in Science Teaching, 39(6), 497-521.

Marks, R.  (1990).  Pedagogical content knowledge: From a mathematics case to a modified concept. Journal of Teacher Education, 41(3), 3-11.

McNeil, N. M., & Alibali, M. W.  (2005).  Why won’t you change your mind? Knowledge of operational patterns hinders learning and performance on equations. Child Development, 76, 883-899.

Morrison, J. A., & Lederman, N.G.  (2003).  Science teachers’ diagnosis and understanding of students’ preconceptions. Science Education, 87, 849-867. 

Moyer, P. S., & Milewicz, E.  (2002).  Learning to question: Categories of questioning used by pre-service teachers during diagnostic mathematics interviews. Journal of Mathematics Teacher Education, 5, 293-315.

Piaget, J.  (1926).  The language and thought of the child. New York: Kegan Paul.

Piaget, J.  (1930).  The child’s conception of the world. New York: Harcourt, Brace, & World.

Piaget, J.  (1951).  Play, dreams, and imitation in childhood. New York: Norton.

Piaget, J.  (1952).  The origins of intelligence in children.  New York: International Universities Press.

Piaget, J.  (1954).  The construction of reality in the child. New York: Ballantine.

Piaget, J., & Inhelder, B.  (1969).  The psychology of the child. New York: Basic Books.

Posner, G. J., Strike, K. A., Hewson, P. W., & Gertzog, W. A.  (1982).  Accommodation of a scientific conception: Toward a theory of conceptual change. Science Education, 66(2), 211-227.

Ralph, E. G.  (1999a).  Developing novice teachers’ oral-questioning skills.  McGill Journal of Education, 34(1), 29-47.

Ralph, E. G.  (1999b).  Oral-questioning skills of novice teachers: Any questions? Journal of Instructional Psychology, 26(4), 286-296.

Rattan, A., Good, C., & Dweck, C. S. (2012). “It's ok—Not everyone can be good at math”: Instructors with an entity theory comfort (and demotivate) students. Journal of Experimental Social Psychology, 48(3), 731-737.

Resnick, L. B.  (1983).  Mathematics and science learning: A new conception. Science, 220, 477-478.

Rittle-Johnson, B., & Koedinger, K. R.  (2005).  Designing knowledge scaffolds to support mathematical problem solving. Cognition and Instruction, 23(3), 313-349.

Rittle-Johnson, B., & Siegler, R. S.  (1999).  Learning to spell: Variability, choice, and change in children’s strategy use. Child Development, 70, 332-348.

Savinainen, A., & Scott, P.  (2002).  The Force Concept Inventory: A tool for monitoring student learning. Physics Education, 37, 45-52.

Sewell, A. (2002). Constructivism and student misconceptions: Why every teacher needs to know about them. Australian Science Teachers' Journal, 48(4), 24-28.

Stovall, G., & Nesbit, C. R.  (2003).  Let’s try action research! Science and Children, 40(5), 44-48.

Strike, K. A., & Posner, G. J.  (1985).  A conceptual change view of learning and understanding. In L. West & L. Pines (Eds.), Cognitive structure and conceptual change (pp. 147-176).  San Diego, CA: Academic Press.

Strike, K. A., & Posner, G. J.  (1992).  A revisionist theory of conceptual change. In R. A. Duschl & R. J. Hamilton (Eds.), Philosophy of science, cognitive psychology, and educational theory and practice (pp. 211-231).  Albany, NY: State University of New York Press.

Tough, P. (2012). How children succeed: Grit, curiosity, and the hidden power of character. Boston: Houghton Mifflin Harcourt.

Vacc, N. N., & Bright, G. W.  (1999).  Elementary preservice teachers’ changing beliefs and instructional use of children’s mathematical thinking. Journal for Research in Mathematics Education, 30(1), 89-110.

Vosniadou, S., & Brewer, W. F.  (1992).  Mental models of the earth: A study of conceptual change in childhood. Cognitive Psychology, 24, 535-585.