Two is the Magic Number: Program Evaluation-Related Reports in the 2016 Standards

Robert I. Urofsky
Vice President of Accreditation and Training

The number two is a pretty amazing number, with numerous meanings associated with it in history, mathematics, religion, astronomy, astrology, and numerology. The number two is important in the 2016 CACREP Standards as well: it is the number of program evaluation-related reports required in Section 4. Section 4 of the 2016 CACREP Standards includes requirements for two separate and different outcomes reports, both of which are completed on an annual basis and posted to the program’s website in an easily accessible location.

Program Evaluation Outcomes Report (PEOR) (Standard 4.D)

Standard 4.D specifies that programs develop and disseminate an annual report that includes, by program level (i.e., entry-level and doctoral-level if applicable): 1) a summary of the program evaluation results; 2) subsequent program modifications; and, 3) any other substantial program changes.

Let’s unpack these program evaluation outcomes report elements. The program evaluation outcomes report, which generally includes narrative information and data summaries such as tables and charts, presents the results of the full breadth of program evaluation activities outlined in Standard 4.B and tied to the program’s objectives. The data sources for the report include aggregate student assessment data addressing student knowledge, skills, and professional dispositions; demographic and other characteristics of applicants, students, and graduates; and systematic follow-up studies of graduates, site supervisors, and employers of program graduates. As indicated, the program evaluation outcomes report is an annual report, so each annual posting should include updated information reflecting the data from the current reporting year. Because not all program evaluation activities are conducted on an annual basis, the updates should include data from those activities specified in the assessment plan (Standard 4.A) as having occurred during the year, along with the implications of the findings.

Subsequent program modifications.

The program evaluation outcomes report also includes information on “subsequent program modifications” and “other substantial program changes.” So what is the difference between these two elements? The first element, “subsequent program modifications,” covers changes made to the program specifically based on the findings from one or more program evaluation activities. For example, a program revises the Appraisal class and adds assessment-related content to the Diagnosis and Treatment class based on findings from follow-up studies of graduates and site supervisors consistently indicating that both groups felt stronger content coverage was needed in this area. When a program makes modifications based on program evaluation data, it is recommended that the program evaluation outcomes report draw clear connections between the change(s) made and the data that prompted them. In years when no modifications are made based on program evaluation results, the posted report can simply indicate this.

Other substantial program changes.

The second element, “other substantial program changes,” includes changes made to the program based on institutional or other contextual considerations. For example, even though a program had been using a curricular infusion approach to ethical and legal issues in counseling, the program responded to recent changes in state counselor licensure regulations by replacing an elective option with a required Ethical and Legal Issues in Counseling course.

Distribution of the program evaluation outcomes report.

Standard 4.D indicates not only that the program evaluation outcomes report be posted on the program’s website, but also that students currently in the program, program faculty, institutional administrators, and personnel in cooperating agencies (e.g., employers, site supervisors) be actively notified of the availability of the PEOR and of its subsequent annual updates. The most direct method is to use email distribution lists for these groups, which both notifies them and provides ready documentation that the notification was made.

Vital Statistics Outcome Report

The second type of report is addressed in Standard 4.E, which specifies that programs annually update and post on their websites the following data for each entry-level specialization and for the doctoral program, if applicable: 1) the number of graduates for the past academic year; 2) pass rates on credentialing examinations; 3) completion rates; and, 4) job placement rates. These four data points are included in the Vital Statistics Survey that programs complete each year. While the program does not need to directly notify constituent groups when the vital statistics outcome report is posted and updated each year, it does need to notify CACREP by emailing the web link where the updated data have been posted. This link is then included in the program’s “Detail” section in the CACREP Directory of Accredited Programs.