Applying Assessment Strategies in Psychology
Critique of Assessment Strategies Applied to Goals and Outcomes

Goal 1: Knowledge Base of Psychology

Demonstrate familiarity with the major concepts, theoretical perspectives, empirical findings, and historical trends in psychology.

1.1:

Characterize the nature of psychology as a discipline.

1.2:

Demonstrate knowledge and understanding representing appropriate breadth and depth in selected content areas of psychology: theory and research representing general domains, the history of psychology, relevant levels of analysis, overarching themes, and relevant ethical issues.

1.3:

Use the concepts, language, and major theories of the discipline to account for psychological phenomena.

1.4:

Explain major perspectives of psychology (e.g., behavioral, biological, cognitive, evolutionary, humanistic, psychodynamic, and sociocultural).


ASSESSMENT CATEGORY

ESTIMATE OF METHOD'S POTENTIAL

CLASSROOM/COURSE DATA

Overall Strong potential. Classroom activities and course data naturally provide venues in which to assess content knowledge in psychology. However, current assessment trends suggest that concentrating solely on these indices may not yield sufficient information for meaningful feedback on program integrity.

OBJECTIVE TESTS

Strong potential. Despite the inherent difficulties in constructing valid and reliable objective tests, the method can effectively assess content learning. However, most objective tests tend to evaluate student knowledge at lower levels of thinking (e.g., rote recall, simple application).

ESSAY TESTS

Strong potential. Despite the labor intensiveness of providing feedback on essay tests, this approach facilitates greater access to measuring deeper levels of content learning. Faculty are increasingly turning to the use of rubrics and specified criteria to address problems of reliability in grading.

EMBEDDED QUESTIONS AND ASSIGNMENTS

Strong potential. Designating some courses in which program assessment activities occur can still facilitate specific assessment of content knowledge. For example, embedding a departmental assessment of the ability to demonstrate APA format in a methods class provides a reasonable vehicle for assessing content knowledge of APA format. Other emphases are possible in embedded assessments, including ethics, persistent themes, and historical detail.

CLASSROOM ASSESSMENT TECHNIQUES

Strong potential. Faculty engage in classroom assessment techniques to provide spot checks of how well students are learning specific concepts. Although the focus is understandably narrow (e.g., the content of a particular class), the method provides optimal feedback for the faculty member concerned with what students are learning and retaining.


INDIVIDUAL PROJECTS/PERFORMANCE ASSESSMENT

Overall Strong potential. The knowledge base of psychology is predictably the foundation for most individual projects. These generally offer the advantage of studying some corner of the discipline in depth; however, breadth of exposure to content may be a casualty given the time limitations most faculty and students face.

WRITTEN PRODUCTS

Strong potential. When assignments are well designed, written products should provide insight into what students do and do not know about content. Faculty have discovered that specifying expectations about the amount of content (e.g., the number of required references) can facilitate the depth of exploration the faculty member had in mind when designing the project instructions.

ORAL PRESENTATIONS

Strong potential. Oral presentations also provide insight into student learning of the content. In addition, the opportunity to engage students in questions allows faculty and classmates to probe the depth of student knowledge while building oral communication skills.

GRAPHIC TESTS AND DISPLAYS

Strong potential. Concept maps can reveal the nature of associations that students develop regarding specified content in the discipline.

POSTERS

Strong potential. Because of the brevity of the medium, posters provide a more global sense of what students understand. However, informal questioning can fill in the gaps about what students have not communicated in the poster.

STRUCTURAL/SITUATIONAL ASSESSMENTS

Strong potential. Although situational assessments tend to emphasize application of learning, applications are built on a disciplinary foundation. The success of assessment of content learning will depend on the expertise demonstrated in the design of the applied assessment.


SUMMATIVE PERFORMANCE ASSESSMENT

Overall Mixed potential. Assessments that occur at the end of a program vary in their effectiveness for assessing content. In some cases, the depth of knowledge required by a demonstration will not allow an estimate of broad knowledge in the discipline.

STANDARDIZED TESTS

Mixed potential. Currently there are two primary standardized exams that allow for comparison across institutions as well as tracking changes in program achievement over time: the Area Concentration Achievement Test (ACAT) and the ETS Major Field Test in Psychology. Each exam measures knowledge in the subdisciplines of psychology, but student course selection may adversely affect overall performance on either instrument. Care must be exercised in interpreting the results.

LOCALLY DEVELOPED TESTS

Strong potential. Developing a departmental examination is a time-consuming but effective way to track changes in student knowledge over time, although it does not provide normative comparisons with other programs. In addition, test security and changes in disciplinary content make this practice complex.

CAPSTONE EXPERIENCES

Mixed potential. The utility of capstone performance to assess content knowledge greatly depends on the scope of the course design. In most capstone designs, students are more likely to develop deep levels of expertise in more narrowly defined areas of psychology. To the extent that their performance represents what they can do within specific performance parameters, the capstone may be a satisfying method to assess the ability to deal with content in sophisticated ways. However, more broadly conceived capstone courses (e.g., history and systems of psychology) may provide broader assessment opportunities.

INTERNSHIP/PROFESSIONAL APPLICATIONS

Moderate potential. Internships and professional applications facilitate specific types of applications. For example, an industrial-organizational internship may be an optimal way for a student to demonstrate the knowledge base related to the subdiscipline, but it may not be satisfying as a broad assessment.

PORTFOLIOS

Strong potential. Selecting portfolio contents and justifying those selections against explicit departmental criteria will facilitate student reflection regarding the level of expertise they have developed in the content of psychology.

ASSESSMENT CENTER METHODS

Strong potential. Similar to structured assessments, the in-basket strategies of assessment center methods can provide insight into student abilities to apply principles from the content of psychology.

CASE AND LONGITUDINAL STUDIES

Limited potential. Because much of content learning is fragile, longitudinal studies of content retention are likely to be disappointing sources of evidence about student learning. In addition, merely reporting how sturdy content learning is over time, rather than directly assessing it, will be a less reliable measure.


SELF-ASSESSMENT

Overall Mixed potential. As can be seen from students' predictions of how well they performed on an exam, students' ability to judge their own expertise is variable.

STUDENT JOURNALS

Moderate potential. Journal instructions can specify the manner in which students should interact with the content of the discipline. For example, instructions might require that students demonstrate the appropriate application of five concepts or principles from the discipline. Students will vary in how accurately they judge whether they have addressed the concepts or principles in the manner the faculty member anticipated.

SELF-CRITIQUES

Mixed potential. Students have limited experience in making judgments about how well they have met the content criteria of a given assignment. Students often drift to the easier-to-judge aspects of performance, such as format concerns, interest generation, or comfort level, rather than exploring how well they have reflected content expertise.


COLLABORATION

Mixed potential. Some methods offer effective avenues for examining content and theory, while others are less promising.

RESEARCH TEAMS

Strong potential. Research teams give students an opportunity to demonstrate content expertise in two dimensions: research methods and the subject matter that the research is designed to explore. Students can receive separate evaluations on the extent to which they have collectively demonstrated research expertise as well as whether they have appropriately represented the target content.

GROUP PROJECTS

Strong potential. Group projects can also provide a content-based opportunity to develop group skills. Projects can reflect successful or unsuccessful strategies to master relevant content and principles. However, group projects suffer limitations similar to those of individual projects: committing to in-depth study of one arena may require sacrificing exposure to other content in the course.

ON-LINE ACTIVITIES

Limited potential. Unless students are given very constrained instructions regarding how to pursue content collaboration, using chat-room or email exchanges to monitor content expertise may be challenging.


INTERVIEWS & SURVEYS

Mixed potential. In general, surveys and interviews are not recommended because the assessment of content is not likely to be direct.

SATISFACTION SURVEYS

Poor potential. Assessing content expertise through satisfaction surveys is too indirect to be recommended.

PERFORMANCE REVIEWS

Moderate potential. Although this method is time consuming, observers at the next step (e.g., graduate school or employment) can directly observe how well students command the content of psychology.

EXIT INTERVIEWS

Limited potential. Exit interviews tend to focus on affective dimensions of learning as well as the collection of impressions that may facilitate program improvement. Content mastery is not routinely the focus of exit interviews.

EXTERNAL EXAMINER INTERVIEWS

Moderate potential. A rigorous external examiner protocol could focus on the depth of content mastery of individual students. However, the expense and time demands of this approach tend to shift the focus to other aspects of student performance.

FOCUS GROUPS

Not recommended. Focus groups typically convene to solve a specific problem rather than to provide a measure of content mastery. Content mastery may be inferred from such discussions, but other, more direct methods are available to assess it.

FOLLOW-UP ALUMNI INTERVIEWS

Not recommended. Engaging alumni over the specifics of the content they can recall is likely to be a discouraging assessment strategy, since the detail of the discipline dims with distance from graduation.


ARCHIVAL MEASURES

Mixed potential. Archival measures can provide some insight into the content bases to which students have been exposed but will do little to assess more formal learning of the content in the discipline.

TRANSCRIPT ANALYSIS

Limited potential. Individual transcript analysis provides not just a record of the various content bases to which the student has been exposed but also, through grades, a gross measure of achievement in those areas. However, assessment experts suggest that other, noncourse-based strategies will be more effective in providing legitimate measures of student and program achievement.

ANALYSIS OF TRANSFER PATTERNS

Limited potential. Examining the patterns of coursework that transfer students bring with them can help departments determine what to offer in the curriculum and when, but it will shed little light on the quality of learning.

SYLLABUS AUDIT

Limited potential. A syllabus audit can isolate the range of content exposure that students experience but will be a poor indicator of actual learning.

DEMOGRAPHIC DATA ANALYSIS

Not recommended. Understanding the characteristics of the student body will provide little insight into their content mastery.

ALUMNI DATABASE

Not recommended. The alumni database does not directly reveal student expertise in the content of psychology. However, many programs rely on the percentage of students who go on to graduate school in the area as an indirect measure of content expertise.

LIBRARY STATISTICS USAGE/WEB HITS

Not recommended. Content expertise is not apparent in this archival analysis.