Up until this point, you have learned many of the fundamentals of the ABET Accreditation Process. Now, you will apply what you have learned in a series of exercises, in checks for understanding, and, ultimately, in the simulated preparation for the site visit.
A. Criteria Application Basics
- ABET accredits educational programs leading to degrees. (ABET does NOT accredit institutions, departments, or degrees.)
- ABET defines an educational program as an integrated, organized experience that culminates in the awarding of a degree. The program will have program educational objectives, student outcomes, a curriculum, faculty, and facilities.
- ABET does not dictate program names to an institution.
You can find a thorough explanation of the above in the Accreditation Policy and Procedure Manual, Section II.E.
B. Understanding The Criteria
Using the Criteria to evaluate a program’s compliance begins with understanding the Criteria. ABET criteria are based on the principles of continuous quality improvement. General Criteria cover the following areas of an educational program:
- Students
- Program Educational Objectives
- Student Outcomes
- Continuous Improvement
- Curriculum
- Faculty
- Facilities
- Institutional Support
Your member society may also have additional criteria that cover minimum standards for the specific program discipline you will be evaluating. These are called “Program Criteria.”
You will find ABET Criteria for each Commission here.
C. Common Issues Associated With Each Criterion
The ABET Criteria are minimum standards that you apply with judgment. Over the years, ABET has identified common issues that may surface as you review a program’s Self-Study Report and make observations during a visit for evidence of compliance.
The issues listed below for each criterion area are not exhaustive. You may identify additional issues as you review the Self-Study Report. Additionally, some issues listed here may not by themselves represent a shortcoming relative to the criteria, but rather may indicate a need to seek additional information in order to determine whether there is a shortcoming. Remember each shortcoming must refer to specific requirements in the criteria that are not fully met or potentially may not be met in the future.
Students
Problems With Student Advising
- Ineffective or inconsistent advising.
- Lack of understanding of curricular requirements, especially if many options are available.
- No documentation of course substitutions or missing prerequisites.
Problems With Transfer Process
- No documentation on acceptability of transfer credits.
Program Educational Objectives
- Program educational objectives are not published or readily accessible to the public.
- Program educational objectives are not related to institutional mission or are inconsistent with the mission.
- Program educational objectives are not consistent with the needs of the program’s various constituencies.
- Program educational objectives do not describe what graduates are expected to attain within a few years after graduation.
- There is no indication as to who the program’s constituents are.
- There is no evidence the needs of the program’s constituents have been considered in the formulation of the program’s educational objectives.
- There is no process to periodically review and revise the program educational objectives.
- There is no evidence of constituency involvement in the periodic review and revision of program educational objectives.
Student Outcomes
- Student outcomes are stated such that attainment is not measurable. (Note: Having student outcomes whose attainment is not measurable is not by itself a violation of any criterion, but if attainment of an outcome is not measurable then the extent to which it is attained may not be appropriately evaluated, as required in Criterion 4.)
- There is missing or incomplete justification as to how the student outcomes prepare graduates to attain the program educational objectives.
- The student outcomes do not reflect what the students should know and be able to do at the time of graduation.
- There is no process to periodically review and revise the student outcomes. (ASAC, CAC, ETAC only.)
Continuous Improvement
- The assessment and evaluation processes are not documented.
- The program cannot demonstrate the processes do what they claim.
- The assessment, evaluation, and improvement cycle is not complete.
- Indicators of student performance have not been defined and/or no a priori level of student performance has been established. (Although there is no criteria requirement for performance indicators or a priori levels of performance, without these or something equivalent it may be difficult to appropriately evaluate the extent to which student outcomes are attained, and additional information may be needed to determine the appropriateness of the evaluation process for outcomes attainment.)
- The program uses only anecdotal results (versus measured results).
- The program relies only on course grades as assessment for one or more student outcomes. There are many factors, rarely all relating to a single student outcome for the program, that are used to determine a course grade. Thus the level of granularity of course grades relative to student outcomes is almost always too coarse for course grades to be used as reliable indicators for attainment of specific student outcomes.
- There is an over-reliance on student self-assessment (e.g., surveys) as opposed to assessment methods based on actual student performance. As a rule, student self-assessment of outcomes attainment is considered much less reliable than attainment data from actual student performance relative to each outcome.
- Assessment data are being collected for only some outcomes.
- The data collected are not analyzed and used as input to a program improvement process.
- The continuous improvement process appears to ignore evidence that students are not attaining the student outcomes at the expected level of student performance.
- The evaluation of data does not provide the information needed to make program improvements.
- Program improvement plans are developed but not implemented.
- There is no documentation of how the results of assessment and evaluation processes are used to determine needed program improvements.
- Results of the evaluation of student outcomes are not used to make needed improvements to the student outcomes.
- There is no evidence improvement efforts are being assessed and evaluated.
Curriculum
A program’s curriculum provides the foundation for entry into the profession. The curriculum criterion varies among the commissions, so the following issues related to this criterion may not all be applicable to your commission.
- Curriculum fails to meet credit hour requirements (if specified by criterion).
- Quality of the culminating or integrating experience, comprehensive project, capstone or major design experience (if required by the criterion).
- No culminating experience.
- Several courses with elements of a comprehensive project but not identified as the culminating experience.
- Multiple culminating courses or courses taught by different instructors that do not all satisfy the requirements of the criteria.
- Culminating design experience not addressing multiple constraints and appropriate standards. (EAC only.)
Faculty
- Insufficient number of faculty to support concentrations, electives, etc., and to maintain continuity and stability.
- Poor faculty morale affecting the program.
- Lack of professional development.
- Excessive workloads.
- Retention/turnover rate.
- Heavy reliance on temporary faculty appointments or adjuncts, potentially jeopardizing program stability.
- Insufficient responsibility and authority to improve the program.
Facilities
- Insufficient space.
- Overcrowded laboratories and classrooms.
- Unsafe conditions.
- Some essential equipment inoperable.
- Lack of modern instrumentation.
- Lack of software/hardware needed to support the curriculum.
Institutional Support
- Unstable leadership affecting programs:
  - Dean and/or program head positions open or filled by interim appointments for an extended period.
  - Frequent turnover of university administration and unit leadership.
- Inadequate operating budget affecting:
  - Acquisition and maintenance of laboratories and appropriate equipment.
  - Faculty salaries, promotions, and professional development.
  - Hiring and retention of faculty and staff.
- Insufficient support staff:
  - Teaching assistants.
  - Technicians for instructional laboratories, machine shops, and laboratory services.
Keep in Mind: You do not have to be an expert on assessment. The program must provide evidence it has a working and effective system in place. Note that student outcomes and continuous improvement are closely linked.
Please Note: Your draft visit plan should identify whom you will meet with on-site to resolve any issues with program compliance with the criteria that are not explained to your satisfaction in the Self-Study Report.
D. The Decision-Making Process
Using the Program Evaluator Worksheet and Program Evaluator Visit Report specific to your commission, you should be able to make a preliminary evaluation of the program based on your review of its Self-Study Report. Make a list of the issues that will require further investigation on site and discuss them with your team chair. Review the Final Statement from the last visit, paying attention to the shortcomings cited, to be sure the program has not allowed compliance with a particular criterion to degrade.
Once on-site, you may revise your evaluation after conducting interviews with faculty members, students, and administrators; reviewing documentation; and visiting facilities. You will share your findings with your team members at team meetings on Sunday and Monday nights. This will assist you in refining your recommended action. At the conclusion of the visit, you will provide your team chair with the recommended action for your program and an Exit Statement to support that action. You will simulate the on-site activities during the face-to-face training component.
It is essential that all team members make decisions on findings in a consistent manner. All team members should listen carefully to the proposed findings of other team members to identify potentially inconsistent findings across programs. For reaccreditation visits, team members should be cognizant of findings that may appear inconsistent with those from previous evaluations and, where possible, should make clear the reasons for any such apparent inconsistency.
The team chair develops the Draft Statement to the institution by combining and editing the program exit statement material from the program evaluators and adding material that applies to the institution as a whole. Two editors and ABET Headquarters staff review the Draft Statement for adherence to standards and consistency with other statements. It is then sent to the institution, which has 30 days to respond. The team chair uses the response from the institution to prepare the Final Statement, which is edited again and then provided to the full commission for action. In preparing the Final Statement, the team chair may consult with the program evaluators as needed to determine whether there are any changes to the recommended accreditation action because of the institution’s actions since the visit. Final accreditation decisions are made at the Summer Commission Meeting in July of each year.
E. Evaluating A Program’s Compliance With The Criteria
To decide if a program complies with each criterion and to recommend an accreditation action, follow these steps:
- Identify issues by criterion. Remember you may find issues not listed in the Common Issues Associated with Each Criterion section above.
- Determine the appropriate finding.
- Select the key term that applies for the finding. Base your decisions on the criteria, NOT on your opinion. Consider the resulting recommended action. Is it consistent with the nature of the shortcoming?
- Explain each concern, weakness, and deficiency in relation to the specific criterion using wording consistent with the definition of the shortcoming.
- Recommend the accreditation action. Prior to the site visit, your team chair will ask you where the program stands in overall compliance with the ABET Criteria. Based on your preliminary review, you will select one of the following potential actions as described in the Accreditation Policy and Procedure Manual, Section II.G.12. This preliminary judgment may be revisited after you gather more information during the campus site visit. Your available accreditation actions include:
- Next General Review (NGR)
- Interim Report (IR)
- Interim Visit (IV)
- Report Extended (RE)
- Visit Extended (VE)
- Show Cause Report (SCR)
- Show Cause Visit (SCV)
- Show Cause Extended (SE)
- Not to Accredit (NA)
In Module 2, The Accreditation Process, you read about Levels of Compliance (statements of compliance, concern, weakness, and deficiency, as well as observations) with regard to your findings when evaluating a program. You can also find these in the Accreditation Policy and Procedure Manual, Section II.G.9.a.(2).
General Review Terminology Vs. Action
If the evidence supports a program Weakness for a given criterion, you must recommend either an Interim Report or an Interim Visit action (if there is no Deficiency). Note there is no difference in severity for the IR and IV actions. The only difference is whether the adequacy of the corrective action(s) can be determined based on a written report (with appropriate supporting documentation), or whether a visit is required in order to assess the adequacy of the action(s).
If the evidence supports a program Deficiency for a given criterion, you must recommend a Show Cause action if this is a re-review or a Not-to-Accredit action if this is an initial review. Also, note a Not-to-Accredit action can only result from an evaluation of a new program or from a show cause visit.
Please refer to the chart below for reference:
|Finding|Recommended Action|
|Weakness (no Deficiency)|Interim Report (IR) or Interim Visit (IV)|
|Deficiency, re-review|Show Cause|
|Deficiency, initial review|Not to Accredit (only for new programs)|
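The decision rules above can be summarized in a short, illustrative sketch. This is not an official ABET tool: the function name, inputs, and return strings are hypothetical, and the actual recommendation always rests on evaluator and commission judgment.

```python
# Illustrative sketch (hypothetical, for training discussion only) of the
# mapping from a program's most serious finding to the recommended action,
# as described in the General Review Terminology Vs. Action section above.

def recommended_action(worst_finding, initial_review, report_sufficient=True):
    """Return the recommended accreditation action as a string.

    worst_finding: "deficiency", "weakness", or "concern"
    initial_review: True if this is the program's initial review
    report_sufficient: for a Weakness, whether the adequacy of corrective
        action can be judged from a written report (IR) or needs a visit (IV)
    """
    if worst_finding == "deficiency":
        # A Deficiency on an initial review leads to Not to Accredit;
        # on a re-review it leads to a Show Cause action.
        return "Not to Accredit (NA)" if initial_review else "Show Cause"
    if worst_finding == "weakness":
        # IR and IV are equal in severity; only the follow-up mode differs.
        return "Interim Report (IR)" if report_sufficient else "Interim Visit (IV)"
    # With no Weakness or Deficiency, the program proceeds to the
    # Next General Review.
    return "Next General Review (NGR)"
```

For example, a Weakness whose correction can only be verified on campus maps to an Interim Visit, while the same Weakness with adequate written documentation maps to an Interim Report.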
F. Ensuring Consistency
Accreditation actions must be consistent across all programs and institutions. Accreditation actions must be consistent with actions given for other programs with similar shortcomings (Concern, Weakness, Deficiency). Throughout the ABET Accreditation Process, there are multiple checkpoints to ensure consistency. (See Consistency Checks in Module 3.)
G. Sample Situations
What would you decide? Try out these sample situations. Determine the best response given the information provided.
You are part of a team evaluating five programs at Bay State University. Four of the five programs do not have their educational objectives published anywhere. As a result, the team determines these four programs do not comply with the program educational objectives criterion, and the team agrees those programs be given a deficiency with respect to this criterion. However, the program you are evaluating has its program educational objectives published on its website, but not in the university catalog or in any of the departmental promotional materials handed out to both enrolled and prospective students.
Select the best program evaluator and team response:
- The program you are evaluating is not in compliance since the objectives are not published in each location where prospective students and the public might look for them, and there needs to be consistency among the actions taken for different programs at the same university. Therefore, the team should agree the program be given a deficiency.
- The team agrees you will discuss this with the program head to try to resolve the issues before the team leaves the site. If the response of the program is satisfactory for resolving the problem, you will not report it in the Exit Statement or exit meeting.
- The program is in partial compliance with the criterion but lacks the strength of full compliance and the team agrees the program should be given a weakness.
- Since the objectives are published someplace and the criterion is silent on where the program educational objectives need to be published, the program evaluator should consider this as a concern at most and perhaps not mention it at all to the team.
Feedback on the responses above, in order:
- This would not be appropriate because the criterion does not require the objectives be published in all documents related to the program. Further, consistency among the program actions is not an issue; different actions for different programs are appropriate if the characteristics that are the basis for a decision are different.
- This is not appropriate because all the relevant facts are known, and it is not appropriate to leave without making decisions on all issues relative to the criteria.
- This may be the best alternative, although if there is evidence the program’s website is the most-referenced source of information about the program, then a Concern could be appropriate.
- This could be appropriate if the website is clearly the primary source of information about the program for prospective and current students. However, it should be mentioned to the team in any case, especially given the other programs do not publish their objectives anywhere.
Note: It is not necessary the program educational objectives be published in every document about the program, but they should be included in all documents readily used by the public and by current and prospective students to obtain information about the program.
You are evaluating a program and its Self-Study Report indicates a significant portion of the student outcomes assessment data will be made available during the site visit. Once on site, you find these materials are neatly compiled statistical results of several assessment instruments – recent teaching evaluations, an alumni survey, and employer interviews. These are claimed to support the Continuous Improvement Criterion as evidence of an assessment process with documented results.
Select the best program evaluator response:
- Review the materials and identify trends in the responses that would suggest appropriate ways to improve the program you are visiting; report these to the program chair.
- Request an executive summary of the results be provided before you depart from the site.
- Request additional documentation of how the results of the surveys are applied in program improvement. If such evidence is not available, discuss with the team chair and other team members the option of citing the program as being deficient with respect to the Continuous Improvement Criterion: assessment results are not utilized for the continuous improvement of the program. Suggest to the program chair that direct assessment of student work may provide the most objective form of student outcomes assessment.
- Tell the team chair the program will be cited for a Deficiency with respect to the criterion and they face a probable Show Cause recommended action.
Feedback on the responses above, in order:
- The program evaluator is not expected to engage in data analysis. It is up to the program faculty members to make the case for their claims.
- This approach may seem appropriate on the surface, but the most important issue in this case is how the data are used. (See next option.)
- This is the best first step. The program evaluator may find the data has been used in some way but not documented.
- This would be an appropriate action if the evaluator determines this is all that was done, there was no serious attempt to use the data, and the faculty and program chair did not seem to be committed to using the data in any manner. However, the recommended action should not be mentioned explicitly.
The program you are visiting includes in its Program Educational Objectives preparing its undergraduates for successful careers both in the traditional chemical process industry and in the novel field of microelectronics. The faculty CVs contained in the Self-Study Report indicate no faculty member has expertise in the microelectronics area. After the faculty interviews, you confirm this is the case and that the faculty member in charge of teaching the advanced senior-level course in microelectronics processing has been learning the course material on the fly from a new textbook in the area. Interviews with the students provide evidence that the faculty member teaching the microelectronics processing course is a favorite instructor. Students like the course and find the material to be very easy. Consultation with the program head suggests the course content has very little rigor for a senior-level course. You conclude the evidence indicates coverage of microelectronics is weak at best.
Select the best Program Evaluator response:
- As the Program Evaluator, you encourage the chairperson either to remove the objective from the published information or to provide the faculty with development opportunities to enhance their expertise in the microelectronics field. In the meantime, the program is cited with a weakness because of the lack of strong compliance with the Faculty Criterion: The faculty . . . must have the competencies to cover all of the curricular areas of the program. The faculty . . . must ensure the proper guidance of the program and its evaluation and development.
- In your Exit Statement, cite a deficiency for the Faculty Criterion and suggest the administration provide funds for hiring additional faculty members to provide expertise in the microelectronics field.
- You encourage the chairperson for the program’s attempts at providing innovation in its Educational Objectives. No weaknesses, concerns, or deficiencies need be cited.
- Report a concern about the microelectronics curriculum in regard to the Curriculum Criterion.
Feedback on the responses above, in order:
- This may be the best response if the program evaluator believes the faculty member who is teaching the course is committed and capable of developing the needed expertise in a short period with the proper support. However, care should be taken to avoid appearing to prescribe any solution to the problem.
- This is an inappropriate response unless the program evaluator determines there are other factors, such as local industry and/or the school administration, demanding the microelectronics option be provided.
- This is an unacceptable option in view of the findings of the program evaluator.
- This is an unacceptable response in two ways. First, there is no explanation of the basis for this concern. Second, it ignores the issue of faculty expertise as required by the Faculty Criterion and the need for this emphasis in the program (constituency support indicated through Program Educational Objectives Criterion).
H. Analyzing Student Transcripts
Student transcripts provide direct evidence the institution’s program requirements are met. In addition, transcripts provide evidence the curricular requirements are met. As part of your review of the Self-Study Report, you will need to analyze transcripts. Procedures for doing this include the following:
- Be sure the transcripts identify the name (title) of the degree received in a way that clearly identifies the program as an accredited program according to the institution catalog and other documents and in a way that distinguishes it from any non-accredited programs with which it could be confused by a potential employer. Identify any problems in this regard to your team chair.
- Make sure the courses counted toward the degree are consistent with the published requirements of the program. In cases where the transcript is for a graduate of an earlier curriculum, the institution must provide a copy of the appropriate curriculum. The institution also should provide justification for any variances, such as transfer credits or substitutions not clearly documented on the transcripts.
- Check to be sure prerequisites are taken before each course that requires them and the course sequence on the transcript does not vary unreasonably from the recommended sequence. If courses are taken out of sequence, check to see if there is an indication of difficulty for the students in terms of the course grades. (If there are difficulties for students, then there could be a problem with the mechanisms for advising and the enforcement of prerequisites. If there are no problems, it could indicate prerequisite requirements that are not needed.)
- Request clarification for any apparent problems in the transcripts. Do your transcript analysis and request clarifications soon enough to allow reasonable time for the institution to respond.
- Review transfer course and course substitution decisions for reasonableness related to course content and credit allocation. Review documentation of the decisions. If there are questionable substitutions, request clarification from the program.
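The prerequisite and sequencing check described above can be sketched programmatically when reviewing many transcripts. This is an illustrative, hypothetical sketch (the course codes, prerequisite map, and data layout are invented for this example); actual transcript analysis relies on the program's published requirements and your judgment as an evaluator.

```python
# Illustrative sketch of the prerequisite check described above.
# The course codes and prerequisite map are hypothetical examples.

# Hypothetical prerequisite map: course -> set of required prior courses.
PREREQS = {
    "CHE301": {"CHE201"},
    "CHE401": {"CHE301", "MTH202"},
}

def prerequisite_violations(transcript):
    """Given a transcript as an ordered list of (term, course) tuples,
    return (course, prerequisite) pairs where the prerequisite was
    missing or not completed in a strictly earlier term."""
    violations = []
    taken_by_term = {}  # course -> term in which it was taken
    for term, course in transcript:
        for prereq in PREREQS.get(course, set()):
            if prereq not in taken_by_term or taken_by_term[prereq] >= term:
                violations.append((course, prereq))
        taken_by_term[course] = term
    return violations
```

For a transcript in which CHE301 is taken without CHE201 ever appearing, the sketch flags the pair ("CHE301", "CHE201"); a flagged pair is a prompt to request clarification from the institution, not by itself a shortcoming.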
Please click here for a guide for analyzing student transcripts based on commission.
I. Analyzing The Simulated Self-Study Report
Are you ready to apply your newly gained knowledge to simulating PEV pre-visit activities? Please use the following sample data and forms to:
1. Review a simulated Self-Study Report and sample program student transcripts for your commission concerning a fictitious program at a fictitious institution, Upper State University. Important Note: These Self-Study Reports have been created for training purposes only. They should not be distributed. While based on responses typically seen in Self-Study Reports, these documents intentionally contain problems for training purposes.
Applied Science Self-Study Report
Computing Self-Study Report
Engineering Self-Study Report
Engineering Technology Self-Study Report
2. Complete the PEV Visit Report and/or PEV Worksheet for your commission:
Applied Science Evaluation Forms
- Applied Science Program Evaluator Visit Report 2016 [Word document] (contains Program Evaluator Worksheet). Enter your name on the first page of the Report; complete the Bachelor Degree Transcripts of Graduates from the Evaluated Baccalaureate Program section of the Report; and fill in the Pre-Visit Estimate column of the Program Evaluator Worksheet in the Report.
Computing Evaluation Forms
Engineering Evaluation Forms
Engineering Technology Evaluation Forms
To help with this task, the example completed Program Evaluator Visit Report Forms and/or Program Evaluator Worksheets below illustrate the types of responses used to complete these forms.
3. Make a preliminary recommendation based on the findings.
Using the appropriate pre-visit forms for your commission, evaluate your program’s Upper State University Self-Study Report and example student transcripts for compliance with the ABET General Criteria.
Once you have completed the required sections of your commission’s evaluation forms, post the PEV Visit Report and/or PEV Worksheet (with completed sections as noted above) on the ABET Training Secure Website, http://main.abet.org/evaluator, by the required due date. For instructions on posting your completed work on the ABET Training Website, click here.
Please note the following:
- You are reviewing the Upper State University program against General Criteria ONLY.
- This review may take up to eight (8) hours to complete.
REMINDER: The following MUST be completed no later than four (4) weeks before the PEVC Face-to-Face Training:
- The three (3) Proficiency Assessments
- The Program Evaluator Visit Report and/or the Program Evaluator Worksheet (as noted above)
The results of the three (3) Proficiency Assessments will be automatically recorded. The Program Evaluator Visit Report and/or the Program Evaluator Worksheet must be posted on the ABET Secure Training Website. Please post these documents as Microsoft Word files.