Commission on Accreditation (CoA) Update August 2011

Richard Seime, PhD
Chair

Joyce Illfelder-Kaye, PhD
Associate Chair, Program Review

Elizabeth Klonoff, PhD
Associate Chair, Quality Assurance

Summer Meeting Review

The Commission on Accreditation (CoA) held its second program review meeting of the year on July 14-17, 2011, at the APA building in Washington, DC. CoA reviewed 63 doctoral, internship and postdoctoral residency programs for initial and continued accreditation, as well as four requests for change in accredited status. The program actions taken at this meeting are available on the Office of Program Consultation and Accreditation (OPCA) website and program listings have also been updated.

USDE Review and Recognition

On June 8, 2011, after a lengthy review process, CoA was reviewed for continued recognition by the National Advisory Committee on Institutional Quality and Integrity (NACIQI), the body charged with making recommendations to the senior department official of the U.S. Department of Education (USDE) regarding the federal recognition of accrediting agencies. The USDE recognition process is similar to the APA accreditation process, in that it includes a comprehensive self-study, an on-site visit, and an opportunity for third-party comment. In CoA's case, due to the 2008 reauthorization of the Higher Education Opportunity Act and the subsequent reformation of NACIQI, the review originally scheduled for 2009 was drawn out over a period of 2 years. During this process, CoA completed two self-studies (the first, submitted in 2009, was under the old regulations, so a new one had to be submitted in January 2011) and had three site visits by USDE staff members. The final decision by the senior USDE official — reflecting consideration of both the USDE staff's analysis and NACIQI's recommendations — was that CoA's recognition be continued while requiring a compliance report on seven issues within 1 year (APA-accredited programs: this is essentially equivalent to a "deferral for information"). Five of the issues were minor and simply require additional documentation or clarifications to Office policies. Two of the issues, however, will require significant changes to how CoA reviews programs on probationary status, necessitating revisions to the Accreditation Operating Procedures. CoA has one year to propose, collect public comment on, finalize and implement those changes, and provide its response to USDE for a final review by NACIQI in late 2012 or early 2013.

Impending Changes to the Operating Procedures

At this time, CoA is seeking public comment on changes to the Accreditation Operating Procedures (AOP) (PDF, 253KB). As described above, the most significant changes involve the length of time that programs on “accredited, on probation” status have in order to remedy their deficiencies once they are identified by CoA. In its review, USDE determined that CoA's current process — which involves programs undergoing another full review (self-study, site visit and review by CoA) after being placed on probation — allows too much time to elapse and is inconsistent with the current interpretation of the timelines enforced by the USDE's Criteria for Recognition. CoA is committed to maintaining recognition by USDE. Thus, in order to be compliant with the Criteria, CoA is proposing changes to Sections 4.2 through 4.4 of the AOP. A number of other relatively minor changes are also being proposed.

To view the proposed changes, view comments received or register to provide comment, please visit the CoA Public Comment System. All comments must be received by October 15, 2011, so that CoA can finalize the proposed changes at its fall meeting. Changes to the AOP require final approval by both the APA Board of Educational Affairs and the APA Board of Directors.   

New and Revised Implementing Regulations

At the July meeting, CoA approved several new and revised implementing regulations (IRs). All IRs can be accessed from the homepage of the Office of Program Consultation and Accreditation website (under "Criteria and Procedures"). Based on feedback, the IR documents on the website have recently been updated to improve users’ ability to navigate through them.

Three revisions to existing IRs involved minor changes and clarifications:

  • C-9, Intern Funding (previously titled "Unfunded Internships & Stipend Equity"): A section on stipend sufficiency was added to provide additional clarification and guidance to internship programs.
  • D.8-2, Procedures for Notification of CoA Actions In Accordance with the Secretary of Education’s Standards for Recognition of Accrediting Agencies: Section 2 (“Publicly-Available Information”) of the IR was edited to reflect that the OPCA staff information is available to the public, consistent with the USDE criteria.
  • E.1-1, Procedures for Program Consultation and Accreditation Office Maintenance of Program Accreditation Records: The language for the USDE criterion referenced within the IR was updated, and the IR was edited to reflect the OPCA’s actual practices for retaining program records.

In addition, the following IRs were adopted after CoA’s review of the public comments received:

  • C-30, Outcome Data for Internships & Post-Doctoral Residency Programs: This new IR explains why outcome data are a critical component of a program's accreditation review, and provides guidance to internship and postdoctoral programs on the types and specificity of outcome data that CoA needs in order to make an accreditation decision. In response to questions raised in the public comments, the final version of the IR includes guidance regarding the appropriate interval for collecting distal data from program graduates.
  • C-16, Evaluating Program Adherence to the Principle of “Broad and General Preparation” for Doctoral Programs: Revisions to this existing IR include how CoA defines several content areas of Domain B.3(a) and (b); CoA’s interpretations of broad and general training both ACROSS and WITHIN the required content areas; expectations for graduate-level training; and expectations for faculty qualifications to deliver content in these areas. In response to questions raised in the public comments, the final version of the IR:
      • Places emphasis on the general curriculum (so as not to imply that the content areas must be covered only through coursework);
      • Clarifies that the definitions for each content area include examples of topics rather than checklists; and
      • Broadens the expectation for programs to focus on preparing students for practice and meeting local/state/national needs, as opposed to just licensure in a specific jurisdiction.
  • D.4-7(a), Use of Annual Reports for Reaffirmation of Accredited Status and Monitoring of Individual Programs: This new IR (previously D.4-8) provides the rationale and procedures for using annual reports for reaffirmation of programs’ accredited status. The procedures apply to all programs, but the section on key thresholds is specific to doctoral programs.
  • D.4-7(b), Thresholds for Student Achievement Outcomes in Doctoral Programs: This new IR (previously D.4-7) provides the definitions and thresholds of student achievement outcomes. It applies to doctoral programs only.
  • D.4-7(c), Use of Narrative Annual Reports for Reaffirmation of Accredited Status and Monitoring of Individual Programs: This new IR (previously partially included in D.4-8) provides the rationale and procedures for using any requested narrative written reports for reaffirmation of programs’ accredited status. It applies to all programs.

CoA’s responses to the public comments on the three D.4-7 IRs are provided below.

IRs on the Horizon

In an effort to better explain its program review and decision-making process, CoA is drafting several new IRs for doctoral programs in areas such as outcome data (the doctoral program equivalent of IR C-30, PDF, 3.8MB), student selection, attrition, internship placement and licensure rates. Once the language has been approved for public comment, CoA will seek review and input from training programs and the public. In addition, based on correspondence received from the APA Committee on Disability Issues in Psychology (CDIP), CoA has appointed a working group to review whether and how to better integrate disability issues as a form of diversity within CoA policies.

Discussing Provisional Accreditation

At the July meeting, CoA discussed correspondence received from the Council of Chairs of Training Councils (CCTC), the APA Board of Educational Affairs (BEA), and a BEA-appointed working group on education & training for health service providers asking CoA to consider some form of provisional accreditation status for applicant programs. CoA felt that it needed additional information and perspectives from relevant communities of interest as to why and how such a status would work. A series of questions developed by a CoA working group was sent to the heads of psychology training councils and groups just prior to the APA Convention. It is hoped that the responses will help CoA take a more informed look at whether to pursue this issue and what changes the process would involve.

Future Accreditation Fee Changes

The accreditation annual and application fees were last changed in 2008 — more than 10 years after the previous fee increase. At the time of the 2008 increase, CoA noted that it would review its fees every 3 years. As such, at the July 2011 meeting CoA reviewed its current cost structure, the data on program types and sizes, and cost projections based on the longer-term needs of its communities of interest (e.g., electronic self-study submission, upgrades to ARO, online training modules).

After careful consideration, CoA determined that it is unable to cut or maintain fees for any level of training, although it hopes to minimize any potential fee increases for internship and postdoctoral residency programs. At this time, CoA is seeking additional input from the APA Finance Office in establishing a revised fee structure for 2013 that will take into account the size of the student body in doctoral programs. CoA also anticipates increasing the cost of appeals to cover the expenses involved in that process. Once these changes are finalized, the new fees will be made public.

CoA 5-Year Summary Report

For a number of years, CoA published an annual report that summarized the data collected from accredited programs through the ARO. For a variety of resource reasons, that document has not been published for several years. OPCA staff are currently compiling a summary report that will illustrate the trends and changes in accredited programs over the past 5 years, including the size of programs at each level. The report is scheduled for production later this year. In addition to the summary report, the aggregate data from each of the past 5 years (and moving forward) will be made available to the public through the Accreditation website.

Reminders for Accredited Programs

The 2011 Annual Report Online (ARO) must be completed by September 15, 2011. All programs (except those that were granted initial accreditation during the 2010-2011 academic/training year) must complete this report as part of fulfilling their responsibilities as accredited programs. Since April, OPCA Research staff have sent out several reminders and tips for completing the ARO, and have provided consultation to programs in need of assistance. If you have any questions about the ARO, please contact the Research staff by email or by phone at (202) 336-6016. Please also contact us if you are a program director who is not receiving the emails, so that we can ensure that our database includes correct contact information for all current directors.

Pay Your Fees

Invoices for 2011 annual fees were mailed out in June (Internships/Postdoctoral programs) and August (Doctoral programs). Please contact the OPCA if you did not receive an invoice or have any questions about fee payment.

Report a Change in Program Leadership

The start of the new academic/training year might result in faculty or staff changes to your accredited program. CoA wants to reiterate the importance of contacting OPCA when there is a change in the director and providing the contact information (including the email address) for that person. If the new director was not previously involved with the program, his/her CV should also be provided for our records. This is consistent with Implementing Regulation C-19 (PDF, 3.8MB), which requires programs to notify CoA in advance of any substantive changes. This will also ensure that the correct individual(s) for your accredited program are listed in our database and receive all of CoA’s correspondence.

Review of Public Comments on Implementing Regulations D.4-7(a), D.4-7(b) and D.4-7(c)

Why is the internship threshold calculated on a one-year versus a three-year basis?

The new internship threshold in IR D.4-7(b) reflects placement in APA-accredited internships only. Prior to 2011, the ARO combined accredited and APPIC-member sites when asking programs for their placement rates. Because the IR requires that we use ARO data to calculate and evaluate thresholds, the thresholds can only be based upon the data that have been collected thus far. So, in 2012 we will use one year's worth of data; in 2013 we will use the average of 2011 and 2012; and in 2014 and beyond, we will use the full three-year average. We recognize that this may inconvenience a few small programs that, for whatever reason, had a bad internship match year. However, if programs explain that to us (preferably with data to support their accredited-internship placement rates from prior years), this should not be a major problem.
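The phase-in schedule described above amounts to a rolling average over however many years of post-2011 ARO data exist. A minimal sketch, using hypothetical placement rates (the function name and figures are illustrative, not CoA's actual calculation):

```python
def accredited_placement_rate(rates_by_year, review_year, first_aro_year=2011, window=3):
    """Average the accredited-internship placement rates over up to the
    last `window` ARO years, using only data collected since 2011."""
    years = [y for y in range(review_year - window, review_year)
             if y >= first_aro_year and y in rates_by_year]
    return sum(rates_by_year[y] for y in years) / len(years)

# Hypothetical placement rates reported in each ARO year
rates = {2011: 0.80, 2012: 0.90, 2013: 0.70}

r2012 = accredited_placement_rate(rates, 2012)  # uses 2011 data only
r2013 = accredited_placement_rate(rates, 2013)  # mean of 2011 and 2012
r2014 = accredited_placement_rate(rates, 2014)  # full three-year mean
```

As the ARO accumulates data, the window fills automatically and the same calculation applies in every subsequent year.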

How will combined programs where students identify with more than one substantive area be treated?

These thresholds were developed across all doctoral programs, regardless of substantive area or degree type; thus, all programs use identical thresholds. The exception is the internship placement threshold, which applies only to clinical-only and counseling-only programs.

Why use the same threshold regardless of program size?

Our goal is to have a way to quickly take a look at programs between full accreditation reviews. We want to identify programs that may be having difficulty before we do a full review so that, hopefully, they will be able to address any emerging problems beforehand. We decided to set the threshold to identify those programs that are significantly different from the majority of programs, i.e., the bottom 5 percent of programs on the relevant dimension. The idea is to ask a flagged program to explain its data and, if necessary, to develop a plan to deal with the issue if it looks like it is or will become a more chronic problem.
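The "bottom 5 percent" cutoff described above is essentially a nearest-rank percentile. A rough sketch with made-up data (the actual thresholds in IR D.4-7(b) are fixed values published in the IR, not recomputed this way):

```python
import math

def bottom_five_percent_cutoff(values):
    """Nearest-rank 5th percentile: the value at or below which
    roughly 5% of programs fall on the relevant dimension."""
    ordered = sorted(values)
    rank = max(1, math.ceil(0.05 * len(ordered)))  # 1-based rank
    return ordered[rank - 1]

# Hypothetical licensure rates for 40 programs: 0.025, 0.050, ..., 1.000
prog_rates = [i / 40 for i in range(1, 41)]
cutoff = bottom_five_percent_cutoff(prog_rates)
flagged = [r for r in prog_rates if r <= cutoff]  # programs asked to explain their data
```

Here 2 of the 40 hypothetical programs (5 percent) fall at or below the cutoff and would be asked for an explanation.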

Why limit the threshold to APA/CPA-accredited internship programs?

When thinking about how to conduct a quick review of internship placement, we realized that we can only quickly and easily judge the quality of internship programs that have been externally reviewed. While programs may place their students in other, high-quality internship sites, unless those sites have been reviewed against a set of standards through an accreditation process (either APA or CPA), it would be difficult for us to ascertain what metrics were used to determine quality. Thus, we decided to rely solely on placement in internships whose quality has been evaluated by an accrediting body such as CoA or CPA.

Programs should have better access to seeing their own aggregated data.

We agree, and we are in the process of working on that as well as other upgrades to the ARO system. Until these upgrades are made, we are happy to provide aggregated data to any program that requests them.

The ARO should include only data needed in monitoring the quality of programs between self-studies and site visits.

The ARO is designed not only to support annual monitoring of programs now, but also to support accreditation review in the near future. We are diligently working with our online vendor to develop the functionality to link the ARO to annual thresholds and to Self-Study (SS) tables. At some point, programs will be able to view their annual threshold data after completing the ARO, as well as preliminary SS tables for their data to date. CoA is in the process of reviewing and streamlining data requirements for both the ARO and the SS tables so that only data actually used in annual monitoring or in periodic review are collected.

Special Topics Spotlight: How CoA Does Program Review in Panels

Although all accreditation decisions are made by the Commission as a whole, program review panels are an integral part of CoA’s decision-making process. Given that there are 32 members of CoA and anywhere from 50 to 100 programs reviewed at each meeting, the use of review panels not only helps streamline the process but also ensures that each program receives the thoughtful and complete review it deserves.

Most CoA members have remarked that until they joined the Commission, they had no idea how the panel review process worked — so it’s likely that most program directors and site visitors are wondering the same thing. If you haven’t yet read Implementing Regulation E.2-1(a) (PDF, 970KB), Procedures for Panel Review of Programs at CoA Meetings, that’s a good place to start. In the interest of transparency, here are a few more details to help demystify the process:

  • Prior to each of the 3 CoA meetings per year, program review panels are constructed. There are typically five panels, each with six to seven commissioners. Each panel has a designated chair responsible for leading the discussion and keeping the panel on track. In constructing these panels, careful consideration is given to ensure that each panel is heterogeneous in terms of experience on CoA, representation across constituent groups and knowledge/experience with different types of training programs, substantive areas and models of training.
  • About 6 weeks before a meeting, the list of the programs scheduled for review is provided to all CoA members so that any potential conflicts of interest (PDF, 970KB) can be disclosed. Based on the disclosed conflicts, program assignments are made.
  • Each program is assigned two reviewers to review the program's materials in the 4 weeks prior to a meeting. Most programs have a 'primary' and 'secondary' reviewer, except for applicant doctoral programs and any program that was previously deferred for cause (which receive two 'primary' reviewers instead). The only difference between a 'primary' and 'secondary' is that the primary reviewer receives a copy of the program's self-study. Both reviewers, as well as the full CoA, receive copies of all other program materials in their meeting agendas: the preliminary review letter and program response, site visit report and program response, and any other written correspondence received during the course of review.
  • To the extent possible (given conflicts of interest), at least one of the reviewers for each program serves as a "match" based on knowledge or experience consistent with the program’s substantive area and model (doctoral programs) and level and type of training (doctoral, internship and postdoctoral programs). 
  • Prior to the meeting, each CoA member is responsible for completing detailed written analyses on each program they have been assigned to review (primary and secondary), for presentation to the review panel. Reflecting the importance of the panel review process, each CoA member is also expected to review the agenda materials (i.e., all materials except the self-study) for all other programs on their panel in preparation for the panel’s discussion.
  • Generally, the first two days (Thursday and Friday) of each CoA meeting are scheduled for work in program review panels. During this time, each of the five panels retreats to a separate conference room in the APA building to conduct its work. All materials (including the self-studies) for each program on the panel are available in the conference room.
  • As described in the IR, after the reviewers have presented their analyses on a program, the panel discusses the recommendation to provide to the full CoA. The program materials are frequently referenced by panel members during the course of this discussion, particularly when questions are raised by a panel member or there is a need for clarification.
  • Although two full days are allotted for panel review, more or less time may be taken as necessary. Scheduling program review as the first activity of each meeting reflects CoA’s commitment to ensuring that as much time as needed can be spent on this important process. Only after the program review vote is complete does CoA begin policy discussions and other work.
  • Following the panel review, CoA reconvenes as a group to discuss the panels' recommendations and conduct the official vote. The final decision on any program is ultimately made by a vote of the entire CoA which, after full discussion, may or may not support the panel's initial recommendation.

Current Program Counts*

Program Type                               Accredited            Applicants Under Review
                                           PhD       PsyD        PhD       PsyD

Doctoral Graduate Programs
  Clinical                                 173       62          1         3
  Counseling                                67        3          0         0
  School                                    55        6          2         0
  Combined                                   5        3          0         0

Internship Programs                        469                   16

Postdoctoral Residency Programs
  Traditional — Clinical                    34                    9
  Specialty — Clinical Neuropsychology      16                    3
  Specialty — Clinical Health Psychology     9                    0
  Specialty — Clinical Child Psychology      5                    4
  Specialty — Rehabilitation Psychology      1                    3
  Specialty — Forensic Psychology            0                    1

Total                                      908                   42

*Includes all programs scheduled to voluntarily withdraw from accreditation at the conclusion of the 2010-2011 academic/training year.
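The totals in the table can be cross-checked arithmetically from its rows; a quick sketch using the counts above:

```python
# Accredited counts from the table (PhD + PsyD where applicable)
doctoral = [173 + 62, 67 + 3, 55 + 6, 5 + 3]   # Clinical, Counseling, School, Combined
internship = 469
postdoc = [34, 16, 9, 5, 1, 0]                  # traditional and specialty residencies

accredited_total = sum(doctoral) + internship + sum(postdoc)

# Applicant programs under review: doctoral, internship, postdoctoral
applicant_total = (1 + 3 + 2 + 0 + 0) + 16 + (9 + 3 + 0 + 4 + 3 + 1)
```

The row sums reproduce the published totals of 908 accredited programs and 42 applicants under review.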