More than 200 people gathered in Atlanta Sept. 27-29 for a conference APA's Div. 2 (Society for the Teaching of Psychology) organized to help psychology instructors handle the growing pressure to demonstrate what students have learned. Organizers believe that the conference, "Measuring up: best practices in assessment in psychology education," is the first of its kind for any discipline.

In workshop sessions and speaker presentations, conference participants learned strategies for preparing for departmental reviews, improving students' writing, compiling evaluative portfolios for students and administrators, and assessing student learning online. Speakers offered tips for building faculty consensus on student learning objectives and for crafting and using comprehensive and standardized tests. They also shared ways to develop exit and alumni surveys and to evaluate student research projects.

"At a time of increased accountability in higher education, this conference provided an invaluable opportunity for participants to analyze issues associated with outcomes assessment in the psychology curriculum and to learn about a variety of assessment methods," says Cynthia Belar, PhD, APA's executive director for education. "It was yet another example of how psychology's contribution to teaching and learning is relevant not only for our own discipline, but to educational systems as well."

The conference grew out of the APA Education Directorate's 1999 Psychology Partnerships Project (P3), which brought together educators from high school through graduate school to forge partnerships. Consistent with the partnership philosophy from P3, conference organizers recruited keynote speakers from across educational contexts:

  • Diane Halpern, PhD, of Claremont McKenna College opened the conference by describing how to cover your assessment needs and maintain the "godliness" required to advocate for ethically appropriate assessment.

  • Randy Ernst, a high school teacher from Lincoln High School in Nebraska, offered insights into the "high-stakes hysteria" that attends assessment decisions and policies.

  • Donna Duffy, PhD, a service-learning scholar from Middlesex Community College, explored how service-learning courses can achieve objectives consistent with the Undergraduate Psychology Learning Goals and Outcomes, endorsed by APA's Board of Educational Affairs (BEA) and available at www.apa.org/ed/pcue/taskforcereport.pdf.

  • Robert Sternberg, PhD, of Yale University, offered teaching and assessment insights from a triarchic approach to the classroom that fosters analytic, practical and creative gains in students.

In addition, six psychology departments described their distinctive approaches to the assessment challenge:

  • Frostburg State University. The department assesses students' ability to "think and behave like a psychologist" through measures of content knowledge (e.g., the ETS Major Field Achievement Test in Psychology), skills (e.g., differentiation between research methods and critical reading of literature reviews), and willingness to use those skills (e.g., research consumption attitudes and counselor self-efficacy).

  • Mansfield University. Psychology faculty there favor papers and group projects that students gather into evaluative portfolios. The portfolios and various tests assess students' ability to understand psychology's conceptual bases, gather and assess library and computerized information, think critically about key disciplinary issues, use the discipline's language, understand research methods and statistics, communicate clearly with others, understand psychologists' various roles and work settings, appreciate diversity and understand ethics in psychology.

  • Spelman College. Student learning is gauged through institutional assessment (e.g., the Myers-Briggs Type Indicator, student course evaluations and senior exit surveys), departmental assessment (e.g., the ETS field exam, a comprehensive exam given to sophomores and seniors, and alumnae surveys) and honors and experiential outcomes (e.g., participation in honors societies, engagement in research and practica, and acceptance into jobs and graduate schools).

  • James Madison University. The department approaches assessment in steps according to why it's done: to focus faculty on shared curriculum objectives, gather information on student competencies, guide program administration, demonstrate program strengths, identify program weaknesses, inform instructional delivery and evaluate program change.

  • Fordham University. Faculty emphasize building faculty buy-in to an overall assessment plan.

  • Kennesaw State University. The department assesses its students on 11 learning outcomes using alumni and senior exit surveys, student research proposals, senior capstone course assignments and a research test. Recently it has added the ETS field exam and an advisement survey as part of its efforts to offer strong academic and career advising.

In closing the meeting, conference program chair Jane Halonen, PhD, discussed how psychology departments can tap 10 BEA-endorsed achievement goals for undergraduate students. Halonen, who chaired the APA task force that developed the goals, explained that the group has produced an Assessment CyberGuide, which will soon be available at www.apa.org/ed, to guide departments' reading and use of the goals. The document is meant to help departments develop assessment plans based on the learning goals.

Conference attendees agreed to capture the best ideas from the conference in a text to be published by APA. "We anticipate that the book's chapters will characterize the conference--intellectually rich, educationally stimulating and eminently practical," says Dana Dunn, PhD, of Moravian College, who will serve as primary editor for the project. "The contributors and editors alike believe assessment can be a creative, constructive and instructive part of psychology education."

Conference host Bill Hill, PhD, of Kennesaw State University, adds that, given the positive feedback he has received from conference participants, Div. 2 has set up a task force to determine whether to hold an annual best-practices conference with rotating topics. In addition, Hill has been invited to report on the conference at next February's American Association for Higher Education Assessment Forum.

Jane S. Halonen, PhD, professor and director of the James Madison University School of Psychology, contributed to this report.