Resources for Evaluating Distance Learning Programs

Download the full report, “Feedback on Evaluation of Distance Learning/Online Programs.”

The impetus for ABET to begin accrediting alternative delivery educational programs stemmed from a strategic issue raised by the 2006-2007 Board of Directors. The resulting actions included the work of two task groups charged between 2007 and 2009, as well as activity by the Accreditation Council and the four ABET Commissions.

The first Board task group, chaired by Susan Schall (ABET IIE Director), refined the key issues concerning alternative delivery, identified guiding principles, outcomes, and measures, and recommended that a second Board task group (chaired by Moshe Kam, ABET IEEE Director) be charged with reviewing ABET’s Criteria and accreditation policies and procedures.

This second task group:

  • reviewed the criteria and policies for barriers that would prevent alternative delivery programs from meeting them; and
  • identified revisions or additions to the accreditation criteria and policies needed to provide appropriate standards for alternative delivery program accreditation.

Recommended Revisions

The report of the Kam Task Group concluded that the Criteria needed no revisions at that time.

The Kam Task Group did identify policies and procedures that would need modification and recommended that the Accreditation Council and the four Commissions be tasked with providing Accreditation Policy and Procedure Manual (APPM) revisions for Board approval.

Computing Accreditation Commission Task Group

The ABET Board authorized the Computing Accreditation Commission (CAC) and the Engineering Technology Accreditation Commission (ETAC) to conduct pilot accreditation reviews to identify any potential criteria or policy changes.

The CAC Executive Committee formed a Task Force on Alternative Education (chaired by Barbara Price, CAC Executive Committee At-Large Member) as a first step toward the CAC’s pilot accreditation review of a multi-site, online program. This Task Force discussed issues such as qualifications for program evaluators (PEVs) and learning objectives for laboratories.

The CAC Fact Finding Task Group sought input from two candidate institutions/programs about how an on-site visit might be conducted: what would be similar to a traditional on-site review and what would be different. In addition, the Fact Finding Task Group visited one of the “candidate” programs (March 2009) to get a sense of how a review might proceed once a team arrived on site. This visit informed subsequent recommendations regarding the process for evaluating distance learning-based programs.

The CAC Executive Committee provided a report of activities to the ABET Board at its March 2009 meeting.

The CAC Task Group recognized that ABET reviews had always included sampling as part of the review protocol. ABET teams sample transcripts and student work collected by the program as evidence of compliance with the criteria. In addition, an ABET review typically allows only enough time to interview selected faculty members, students, and administrators.

However, the review of alternative delivery / multi-site, hybrid programs would require sampling at a much higher level. The Task Group petitioned ABET Headquarters for resource support, and Headquarters engaged an independent, professional statistician to assist the Task Group in developing a sampling methodology that could be applied both to multiple sites and to large numbers of faculty members and students, and that could be adapted to a variety of program configurations.
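
For illustration only, the core idea behind such a methodology can be sketched as stratified random sampling: treat each site as a stratum, allocate the sample to strata in proportion to their size, and guarantee a minimum draw from every site so that small locations are still represented. The Python sketch below is a minimal illustration under those assumptions; the site names, population sizes, total sample size, and per-site floor are all hypothetical, and this report does not describe the statistician’s actual method.

    # Minimal sketch: stratified random sampling across sites with
    # proportional allocation and a per-site floor. All names and
    # numbers are hypothetical illustrations, not ABET's methodology.
    import math
    import random

    def stratified_sample(populations, total_n, min_per_site=2, seed=42):
        """Draw roughly total_n units across sites, proportionally to
        site size, with at least min_per_site drawn from each site."""
        rng = random.Random(seed)
        grand_total = sum(len(units) for units in populations.values())
        plan = {}
        for site, units in populations.items():
            # Proportional share, rounded up so small sites are not lost.
            share = math.ceil(total_n * len(units) / grand_total)
            n = min(len(units), max(min_per_site, share))
            plan[site] = rng.sample(units, n)
        return plan

    # Hypothetical example: student transcripts at three delivery sites.
    transcripts = {
        "main_campus": ["MC-%03d" % i for i in range(300)],
        "online":      ["ON-%03d" % i for i in range(150)],
        "satellite":   ["SA-%03d" % i for i in range(30)],
    }
    for site, picked in stratified_sample(transcripts, total_n=40).items():
        print(site, "-> review", len(picked), "transcripts, e.g.", picked[:3])

The same allocation logic could be applied to faculty interviews or samples of student work by swapping in the appropriate population lists, which is what makes a stratified design adaptable to different program configurations.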

The sampling methodology was delivered in late summer 2009 and applied by the CAC to its first online/hybrid delivery program review in the 2009-2010 review cycle. Both ETAC and ABET staff observed this pilot visit.

Pilot Reviews

The CAC conducted its pilot alternative delivery program review during the 2009-2010 review cycle. The ETAC conducted its pilot review in the 2010-2011 review cycle.

Each pilot involved a different program configuration and allowed the respective Commission to experiment with processes and procedures. As a result, ABET now has two process options for approaching a request to review an alternative delivery / multi-site program. In addition, the pilots gave ABET Headquarters the opportunity to identify other areas that would need adjustment, for example, the invoicing of visit and maintenance fees and the support the ABET database provides for the accreditation process.

Lessons learned from the pilot reviews include but are not limited to:

  • The criteria are delivery-neutral: they apply regardless of how a program is delivered and require no modification for alternative delivery.
  • The commissions need to select experienced team chairs and program evaluators who are open-minded and flexible.
  • The program’s Self-Study Report must adequately address all delivery paths through the program just as it must adequately address all curricular paths through the program.
  • Team pre-visit preparation and engagement may require more time and focus until greater experience with these types of reviews is gained.
  • Programs may need to provide tutorials and additional assistance to ABET reviewers as they navigate the online materials and course displays.
  • ABET needs to invest in consistent, reliable technology solutions to support teams reviewing alternative delivery programs and staff tracking those reviews from team chair assignment through accreditation action.