ABET

Pre-Visit Preparation Module 3: ETAC

A. ETAC Leadership

 Adjunct Accreditation Director

  • Frank Hart

Executive Committee

  • Chair – Scott Dunning
  • Past Chair – Kirk Lindstrom
  • Chair Elect – James Lookadoo
  • Vice-Chair, Operations – Tom Hall

Members-at-Large

  • Ciro Capano
  • April Cheung
  • James Lookadoo
  • Tom Hall
  • Frank Young

Public Member

  • Christine Olson

ABET Board Liaison Representative 

  • Scott Danielson

 

B. Criteria Review and Changes

A complete summary of changes in criteria and in policies and procedures can be found on the Accreditation Alert page. Those who have not recently participated in a program review should review these changes.

Important: Be sure to download and use the forms in the current PEV Workbook for your visit this fall! Pertinent changes needed for your visit will be found in the current workbook. The Workbook file will have a date as part of the file name. Before your visit, check the date for the online version of the Workbook to see if any changes have been made since you downloaded your copy.

Please review the definitions in the first section of the Criteria for Accrediting Engineering Technology Programs. As a PEV, it is imperative that you understand these definitions and can explain them to program personnel if you find shortcomings related to those criteria.

Please have a copy of the current Criteria available as you review the Self-Study in preparation for your visit. It can also be useful to have a copy of the Criteria available during the visit so that you can quote directly from the Criteria when responding to questions about accreditation.

After the visit, please help us improve by completing the online evaluations of Team Chair and other PEVs.

C. Issues Arising During Recent Accreditation Reviews

Introduction Section:

As a PEV, you will be asked by your Team Chair to provide the introduction section for the program. This section, based on the information in the Self-Study, will be used in the draft statement for the program before the visit. The introduction you write should consist of a one-paragraph description of the program, including the target employers of its graduates, any unique aspects of the program content or delivery, and the program's educational objectives.

Strength Findings:

A program Strength is an exceptionally strong and effective practice or condition that stands above the norm (seldom seen in other programs) and that has a positive effect on the program. When writing a Strength statement, state: a) what was observed, b) what makes it stand above the norm, and c) the positive effect it has on the program. Strength findings can be included in the T301, Program Audit Form. Do not identify a specific person or a job position held by only one person. A statement such as "faculty members care about students" does not by itself stand above the norm; however, describing the actions of the faculty members, how those actions differ from the norm, and what positive effects they have on the program can make it a Strength statement.

T351 PEV Report:

The Comment column in this form must be filled out to briefly explain the final quality rating agreed upon by the team. The Comment column on the Summary page should also be filled out to reflect the reason(s) behind the team's degree-of-compliance decisions. The Summary page is also used during consistency evaluations; therefore, it is very important to state the reason(s) for the findings concisely.

The following advice is based on issues found in recent draft statements, and addresses criteria or policies where inconsistencies or misinterpretations have most often occurred. 

Multiple Issues with a Criterion:

The ETAC criteria document contains eight General Criteria (Criterion 1 through Criterion 8) and 24 discipline-specific Program Criteria. Where possible, combine all issues for a given Criterion into a single finding (i.e., avoid multiple findings based on different sections of the same criterion). The degree of compliance (expressed as a Deficiency, Weakness, or Concern) should be based on how well the overall Criterion, not just a small piece of it, has been satisfied.

Criterion 1 – Students:

There continue to be findings related to students taking courses without the stated prerequisites. This shortcoming is typically identified by the PEV's transcript analysis, so it is critical that transcripts be provided to the PEV well ahead of the visit so a thorough review can be accomplished. It is equally critical that you, the PEV, review the transcripts well ahead of the visit and provide your Team Chair with any shortcomings found before the visit. Ideally, the program is asked to explain or provide documentation for each transcript shortcoming before the visit. This may lead to discovery of a shortcoming related to missing or incomplete documentation of how or why a student was allowed to take a course without its prerequisites. Please note that there is no requirement in the criteria that students must have taken prerequisites before taking a course. However, there is an implied requirement that institutions follow their own rules. This means that if an exception is made with respect to a prerequisite, it must be documented (usually this includes a written notice to the Registrar). [Note that a lack of prerequisites or other issues related to the curricular structure would be addressed in Criterion 5.]
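
Conceptually, the transcript review reduces to checking, for each completed course, that its catalog prerequisites were completed in an earlier term. Below is a minimal sketch of that logic; the course names, the catalog, and the data structures are hypothetical and for illustration only.

```python
# Hypothetical transcript check: flag courses taken before (or without)
# their stated prerequisites. All names and data are invented.

# Catalog prerequisites: course -> list of prerequisite courses (hypothetical).
PREREQS = {
    "EET 210": ["EET 110"],
    "EET 310": ["EET 210", "MAT 152"],
}

# One entry per completed course on the transcript: (term_index, course).
transcript = [(1, "EET 110"), (2, "EET 210"), (3, "EET 310")]

def prereq_violations(transcript, prereqs):
    """Return (course, missing_prerequisite) pairs needing a documented exception."""
    completed_term = {course: term for term, course in transcript}
    violations = []
    for term, course in transcript:
        for pre in prereqs.get(course, []):
            # The prerequisite must have been completed in an earlier term.
            if completed_term.get(pre, float("inf")) >= term:
                violations.append((course, pre))
    return violations

print(prereq_violations(transcript, PREREQS))
# [('EET 310', 'MAT 152')] -> ask the program for documented justification
```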

Criterion 2 – Program Educational Objectives:

The primary findings stemming from this criterion come from the items below. Please note that discussions related to Criterion 2 are ongoing within ABET, so consult with your Team Chair in case new information has become available.

  1. Programs not having a documented process for periodic review of the PEOs.
  2. Lack of documented involvement of the program's stated constituencies (per the program's Self-Study or other venues) in the periodic review of the program educational objectives. In many cases, programs list constituencies but do not have documentation of those constituencies being involved in the periodic review of the objectives. Allow programs to obtain input from the (key) constituencies they feel are most influential in program development.
  3. The specific wording or nature of the educational objectives should not be the focus of a PEV. If the program educational objectives have been created via a documented process, a presumption of appropriateness of the objectives is recommended. For instance, if program educational objectives seem very similar to student outcomes, or if PEOs are very similar across several programs, ETAC's position has been to write an Observation recommending that the program educational objectives be rewritten to better align with ABET's definition.
  4. If a good process has not been followed and documented AND the wording of the educational objectives does not match the criterion's definition (e.g., an objective reads like an outcome), the finding should address both issues. Documented means that written evidence (meeting minutes, etc.) is available showing the involvement of the program's key constituencies in the review. Systematic and periodic means that the review has occurred on a scheduled basis. Lacking any of these elements, a finding should be written.

 

Criterion 3 – Student Outcomes:

Clarification of terminology:

  1. Student Outcomes: An ABET-defined term: what students are expected to know and be able to do by the time of graduation. Student outcomes relate to the skills, knowledge, and behaviors that students attain as they progress through the program.
  2. Program Outcomes: NOT an ABET-defined term. It is usually used by programs as they establish their own requirements for students' capabilities upon graduation.
  3. Learned capabilities: The a through k (or a through i) elements of Criterion 3. To provide clarity to statement readers, please refer to these elements as "learned capabilities," NOT as student outcomes, in your statements.

 

Ask yourself the following questions:

  1. Does the program have documented (which means published on its website) student outcomes?
  2. Do those student outcomes address or satisfy all the elements of the required Criterion 3 learned capabilities?
  3. Is there a documented and effective process for the periodic review of those student outcomes? For instance, if the stated student outcomes do not address all required Criterion 3 learned capabilities, the shortcoming is twofold: the outcomes do not address the required learned capabilities, and the program does not have an effective review process.
  4. Note that ETAC does not require that a program use the literal wording of the Criterion 3 learned capabilities. However, regardless of how the program expresses its student outcomes, the program must demonstrate that its student outcomes address all the learned capabilities of Criterion 3 [a-i or a-k]. Such demonstration may be done via a matrix or other illustrative device that shows the correlation between the program's student outcomes and the Criterion 3 learned capabilities (see the sketch after this list).
  5. The degree of compliance for this finding usually rests on the team's judgment of how well the overall Criterion, not just a small piece of it, has been satisfied.
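
To make the matrix idea in item 4 concrete, the check amounts to verifying that every Criterion 3 learned capability is addressed by at least one stated student outcome. Below is a minimal sketch; the outcomes and the mapping are invented for illustration only.

```python
# Hypothetical coverage check behind the "matrix" in item 4 above.
LEARNED_CAPABILITIES = [chr(c) for c in range(ord("a"), ord("k") + 1)]  # a-k

# Program's stated student outcomes -> learned capabilities each addresses
# (invented mapping).
outcome_map = {
    "SO1: Apply mathematics and science to technical problems": ["a", "b", "f"],
    "SO2: Conduct, analyze, and interpret experiments": ["c", "d"],
    "SO3: Function effectively on technical teams": ["e", "g"],
    "SO4: Communicate effectively and pursue lifelong learning": ["h", "j", "k"],
}

covered = {cap for caps in outcome_map.values() for cap in caps}
missing = sorted(set(LEARNED_CAPABILITIES) - covered)
print("Learned capabilities not addressed by any outcome:", missing)
# ['i'] -> a Criterion 3 shortcoming (and, per item 3, a process question)
```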

 

Criterion 4 – Continuous Improvement:

This criterion is often a source of findings. While the statement of the criterion is concise, it has complexity that deserves careful thought and attention both before and during a visit. The points below should help you navigate these issues.

  • The intent of this criterion is to ensure that the program (1) has processes in place to assess and evaluate how well it is achieving its student outcomes, and (2) has processes in place to use the results of that evaluation to improve the program. Both the processes and their results must be appropriate and documented.
  • While there is no explicit mention of the manner in which assessment must be carried out, the definition of assessment (in the preface of the criteria document) indicates that “effective assessment uses relevant direct, indirect, quantitative and qualitative measures as appropriate to the outcome being measured.” So, if a program only uses a few surveys, one examination, or one class to accomplish all its assessment activities, it would be reasonable and supportable to write a finding focused on the lack of appropriate and effective assessment.
  • If the program has been making changes for improvement but these changes are not related to student outcome assessment and evaluation process results, there is a Criterion 4 shortcoming.
  • Criterion 4 does not require that a program achieve a particular level of attainment for its student outcomes. It is the program's responsibility to define acceptable performance (although a ridiculously low level of "acceptable" performance could lead to a finding under the criterion's requirement that the process be appropriate). The program (not the PEV, not the ABET criteria) defines the "satisfactory" level of performance for each of the learned capabilities [a-i or a-k]. "Satisfactory" should be based on what is needed to meet the needs of the organizations served by the program's graduates (see the sketch after this list).
  • Student outcomes and Criterion 3 learned capabilities refer to the capabilities of students as a group, so assessment of attainment should be reported as such. The program does not have to show that every individual graduate has achieved an acceptable level of performance on student outcomes or Criterion 3 learned capabilities. However, the number of students involved in the assessments should be reported to allow judgment of the appropriateness of the process.
  • A PEV should not try to judge whether student performance is satisfactory in meeting student outcomes or Criterion 3 learned capabilities. The PEV’s responsibility is to determine how well the program is assessing the achievement of its stated outcomes according to the program’s own benchmarks / performance targets. A PEV should never attempt to use displays of student work or displays of raw assessment data to determine whether student outcomes have met the program’s performance targets. It is the program’s responsibility to develop assessment data (using primary evidence as much as possible) and to then evaluate those data to draw its own conclusions about student achievement. If the program has not done this, then a finding should be written. It is the PEV’s responsibility to determine whether the program’s process for demonstrating achievement is credible and reliable, meeting the criterion’s requirements that the process is appropriate and documented.
  • A Criterion 4 finding should not say students have—or have not—attained an outcome; rather, a Criterion 4 finding should say the program has—or has not—used an appropriate and documented process to assess and evaluate the extent to which student outcomes are being attained.
  • An appropriate assessment process should involve the use of “primary evidence” of student attainment of outcomes. Primary evidence is associated with direct measures of student performance and should be found in the display materials of student work. As examples, survey data are indirect and secondary evidence while assessments via rubrics or other data collection mechanisms based on student project work, exams, homework or laboratory work would be primary or direct evidence.
  • A PEV should allocate sufficient time on Day "0" (typically Sunday) for a thorough review of display materials to determine whether a program's process for demonstrating student achievement of outcomes is credible and reliable. The PEV should have the program representative lead them through the display materials with the specific guidance "show me how the program assesses and evaluates student attainment of outcomes." It is recommended that at least two hours of focused time be allocated to this process. It is important to gain a thorough understanding of the program's processes and documentation on Day 0, as this provides the PEV with follow-up questions for program faculty on Day "1" and material for deliberations with the team and the Team Chair.
  • If a program has to satisfy specific Program Criteria, it may include the Program Criteria requirements as additional student outcomes or incorporate them as components of existing student outcomes that are already mapped to Criterion 3 learned capabilities. If Program Criteria requirements are written as separate student outcomes, they must be included in the continuous improvement plan; the attainment of such student outcomes should be assessed as part of that plan, and any shortcomings cited in a Criterion 4 finding. Discussions regarding this topic are ongoing, so please watch for updated guidelines from the commission.
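
Because the program sets its own performance targets, the PEV's check is whether documented assessment results are actually compared against those targets, not whether the targets match some external standard. Below is a minimal sketch of that comparison; the target and rubric data are invented for illustration only.

```python
# The program (not the PEV) defines "satisfactory" performance; the PEV
# verifies that a documented process compares results to that target.
TARGET = 0.70  # hypothetical program-defined target: 70% score >= 3 of 4

rubric_scores = [4, 3, 2, 3, 4, 1, 3, 3]  # invented rubric data, one capability
fraction_meeting = sum(s >= 3 for s in rubric_scores) / len(rubric_scores)
print(f"{fraction_meeting:.0%} met the program-defined target of {TARGET:.0%}")
# 75% met the program-defined target of 70%
```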

Criterion 5 – Curriculum:

The primary findings from this criterion come from either the mathematics portion or the advisory committee portion of the criterion. Baccalaureate degree programs are not specifically required to include courses in calculus or other mathematics above algebra and trigonometry, but the program must include applications of integral and differential calculus in the solution of technical problems. PEVs should look for, or have the program faculty provide, evidence of the students' use of the appropriate mathematics.
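
As a purely hypothetical illustration of the kind of evidence that would satisfy the calculus requirement, student work in a circuits course might show a first-order differential equation solved to obtain the discharge behavior of an RC circuit:

```latex
% Hypothetical example of differential calculus applied to a technical
% problem (RC circuit discharge): the kind of student work a PEV might
% accept as evidence of "applications of integral and differential calculus".
\[
  RC\,\frac{dv}{dt} + v = 0
  \qquad\Longrightarrow\qquad
  v(t) = V_0\, e^{-t/RC}
\]
```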

The advisory committee language within the criterion deserves attention. It is more comprehensive regarding the role of the committee and its interaction with the program than the corresponding criteria of the other ABET commissions. Look for clear documentation of the advisory committee's fulfillment of these tasks on a periodic basis, which means a reasonable and repeated basis. If the only evidence provided is for a single review held right before the ABET visit, a shortcoming exists.

If a program cannot demonstrate that it has effectively addressed the requirements of the applicable Program Criteria in its curriculum or in its student outcomes (Criterion 3), then any resulting finding would be written as a Program Criteria finding.

Criterion 6 – Faculty:

This criterion has some requirements for individual faculty members and some for the faculty as a whole, so be careful in distinguishing between the two in a finding that may be written against the criterion. Common areas for recent findings are adequate resources for, or evidence of, continuing professional development and sufficient numbers of faculty. As always, make sure to provide clear evidence related to impact on the program in such findings.

Criterion 7 – Facilities:

This criterion no longer refers to "equipment characteristic of that being used in the industry and practice being served by the program." However, it does require that "modern tools, equipment, computing resources, and laboratories appropriate to the program" be available. Although the explicit mention of safety has been removed from the criteria, the language in this criterion that "students must be provided appropriate guidance regarding the use of the tools, equipment, computing resources, and laboratories" could provide the basis for a finding if a safety issue is related to instruction within the laboratory. In more general cases, safety-related shortcomings should be cited under APPM II.G.6. Note also that the criterion now requires that library services and the computing/information infrastructure be adequate to support the scholarly and professional activities of the students and faculty.

Criterion 8 – Institutional Support:

This criterion is often cited inappropriately in findings. Focus findings on the issue's impact on the program. For example, do not write findings that require a program to hire additional personnel (e.g., program support staff); write the finding based on what is not being accomplished, and let the institution decide whether to resolve it with additional personnel or by other means.

Program Criteria:

The applicable Program Criteria for the program being evaluated should have been determined by ABET HQ based on the program name. If there is any question about this, consult your Team Chair. PEVs must verify that the program has satisfied the Program Criteria requirements in addition to the requirements of Criteria 1 through 8. Carefully check whether all of the requirements of the Program Criteria are addressed by the Self-Study. If the Program Criteria requirements are being satisfied through Criterion 3 student outcomes, check whether appropriate assessment, evaluation, and continuous improvement processes have been accomplished. Failure by the program to address aspects of the Program Criteria should be discovered and communicated to your Team Chair well before the visit so that the program can be given an opportunity to remedy the oversight.

If the program cannot demonstrate that it has effectively addressed the requirements of the applicable Program Criteria in its curriculum or in its student outcomes, then any resulting finding should be written as a Program Criteria finding. Discussions between ETAC and the professional societies regarding Program Criteria format and requirements are ongoing, so be watchful for changes.

 

D. ABET Policy and Procedures (APPM) Issues

Name of the Program:  The official name of the program is the name precisely as it appears on the Request for Evaluation (RFE); this exact name should be used in all documents throughout the accreditation review. It is important that this program name also be shown on the student transcripts exactly as on the RFE. Any additions to the official RFE program name appearing on student transcripts should trigger an investigation related to the applicable Program Criteria and the appropriate number or type of program evaluators. If a PEV believes there may be an issue of this sort, contact the Team Chair immediately so that it can be investigated.

Modes and Locations of Instruction:  Be aware of any online, off-campus, or remote offerings of the program. If it is possible for a student to take a significant number of technical courses in a modality or at a site other than face-to-face at the home campus (e.g., face-to-face at another location, or online via the Internet), then:

  • An online/hybrid/multiple-site program may require a greater time commitment in preparation and evaluation than is normal for a single site program delivered face-to-face.
  • If a program, or portions of a program, is offered at multiple sites, the program must be able to demonstrate that the program is equivalent at all sites.
  • The program should be prepared for the team to visit any site at which the program is offered and these sites must be reviewed as part of the accreditation review.
  • The “weakest link” concept applies to the program evaluation and if an issue is found within one delivery modality or at a specific site, the finding and any resulting accreditation action will apply to the program in its entirety, regardless of delivery method or location.
  • The program must demonstrate how it assures that the development of student outcomes at each remote site is equivalent to that at the primary site/home campus being reviewed.
  • The program should provide separate course/assessment materials for each delivery method and location, including graded student work ranging from excellent through poor for each delivery method. Assessment materials from remote sites should be disaggregated from other sites to allow effective assessment and evaluation of all routes to the degree. Otherwise, a lack of student attainment of outcomes at a remote site may be masked by assessment data from the home campus (see the sketch below).
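
To make the masking effect concrete, here is a minimal sketch with invented numbers showing how pooled assessment data can hide a site-level shortfall:

```python
# Invented numbers illustrating why remote-site assessment data should be
# disaggregated from home-campus data.
sites = {
    "home campus": {"students": 100, "met_target": 90},
    "remote site": {"students": 10, "met_target": 4},
}

for name, d in sites.items():
    print(f"{name}: {d['met_target'] / d['students']:.0%} met the target")

total_students = sum(d["students"] for d in sites.values())
total_met = sum(d["met_target"] for d in sites.values())
print(f"pooled: {total_met / total_students:.0%} met the target")
# home campus: 90% met the target
# remote site: 40% met the target
# pooled: 85% met the target  <- the remote-site shortfall disappears
```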

Here is a list of specific APPM requirements that may lead to findings:

  • I.A.4. An institution may not use the same program name to identify both an accredited program and a non-accredited program.
  • I.A.6. Each accredited program must be identified as “accredited by the Engineering Technology Accreditation Commission of ABET, http://www.abet.org”
  • I.A.6.a. Accredited programs must publicly state their educational objectives and student outcomes. A shortcoming can be written if the information is extremely difficult for the public to locate.
  • I.A.6.b. Accredited programs must publicly post annual enrollment and graduation data for each program. A shortcoming can be written if the information is extremely difficult for the public to locate.
  • I.C.4.b. Program name must be shown consistently on transcripts, all publications, and the RFE.
  • I.C.4.c.(2). All program criteria for any implied program specialization must be satisfied.
  • I.C.5.a. [To be eligible for an initial accreditation review,] a program must have at least one graduate within the academic year prior to the academic year of the on-site review.
  • I.E.1. All paths to completion of the program must satisfy the appropriate criteria.
  • I.E.5.b.(1) [Examine] Facilities – to assure the instructional and learning environments are adequate and are safe for intended purposes.
  • I.E.5.b.(2). [Examine] Materials – Evaluators will review materials sufficient to document: a) the extent of attainment of each student outcome, and b) the program's compliance with Criterion 3 Student Outcomes and Criterion 5 Curriculum, as well as any applicable Program Criteria. These materials are provided either as part of the Self-Study Report or as displays during the on-site visit, or they are accessed by evaluators within a suitable online storage location utilized by programs delivered fully or partially online. Materials provided during the on-site visit are typically textbooks, assignments, exams, and examples of student work in a range of quality. Provision for access to online materials used by the program must be made available during an on-site visit.