Program Effectiveness Review 2018

Overview

The Program Effectiveness Review Committee formed in spring 2018 and began work on its initial charge: developing a one-page survey for colleges, divisions, and their underlying programs to identify strengths, opportunities, contributions to the mission, and measures of success. The survey responses were also used by participants during the strategic planning summit in March and will continue to inform the strategic planning process.

The committee's next phase was the development and distribution of a set of key performance indicators (KPIs) for annual reporting and analysis. This continuous improvement process will heighten understanding of how programs meet the needs of those they serve and how well their work aligns with university priorities, and it will identify opportunities to develop or enhance programs.

Academic Programs

Over the fall 2018 semester, academic programs were provided a common set of metrics going back to the 2012–13 academic year as part of several dashboards that will support evaluation at the department, college, and university levels. Department chairs and deans were asked to characterize the trends they see in the data and to identify strategies that will have a positive impact on those trends for their units in the upcoming academic year. In anticipation of HLC re-accreditation, chairs were also asked to include a paragraph describing the learning outcome assessment program for each of their programs.

Chairs were asked to provide responses to their deans by October 15.  Deans, in turn, were asked to submit college-level reports by November 12. A university-level report will be shared with the campus community during the spring 2019 semester.

KPIs for academic programs will include the following (some KPIs will not be available for the initial reporting cycle):

  1. Fall-to-fall retention:  The fraction of students enrolled in a given fall semester who returned the following fall semester, by department, college, and university overall, excluding students who completed degrees within the fall-to-fall timeframe.
  2. Student credit hours and fraction of credit hours attempted (at day 14) for which a passing grade was earned:  Of the students who were assigned a final grade for a given term, what fraction successfully completed the course?
  3. Fraction of credit hours awarded for online, hybrid, and evening classes:  Considered as the total number of credit hours awarded by department, college, campus, and university overall that were (a) 100% online, (b) hybrid, and (c) evening for a given semester and academic year.
  4. Fraction of credit hours awarded by categories of instructors:  Considered as the fraction of credit hours awarded by campus/college/department for which the instructor of record was (a) a full professor, (b) an associate professor, (c) an assistant professor, (d) a senior lecturer, (e) a lecturer, (f) an instructor, or (g) an adjunct for a given semester.
  5. Student-to-faculty ratio:  The ratio of student full-time equivalents (FTEs) to faculty FTEs by campus/college/department for a given semester and academic year.  For undergraduate courses, student FTEs are determined by multiplying the number of students registered at day 14 by the number of credit hours for the course and then dividing by 15 (see the illustrative calculation after this list).
  6. Credit hours at time of degree completion:  The average total number of credit hours earned at the time of degree completion per completer per department, college and university overall for a given academic year.  Data will be limited to those completers who had not previously earned a degree.  Data will be available as “credits earned at Wright State,” “credits accepted as transfer,” and “total credit hours.”
  7. Degrees awarded:  The number of degrees awarded (completions) by department, college, campus and the university overall for a given academic year.
  8. Enrollment:  The number of students enrolled in each of a department or college’s majors for a given academic year.
  9. Net tuition revenue:  Considered as (a) the gross tuition revenue generated by course offerings by campus, college, and department, less (b) associated tuition discounts, to derive net tuition revenue.
  10. Budget, non-tuition revenue, and expenses:  Considered as (a) the adopted and accounted (adjusted) budget, (b) the non-tuition revenue, and (c) expenses for a given college/department for a given fiscal year.
  11. Research expenditures:  As an indication of scholarly activity; units for which research expenditures are not a good metric will need to provide an alternative, more appropriate metric.
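
To make the student FTE arithmetic in item 5 concrete, the brief sketch below works through the calculation with purely hypothetical course enrollments, credit hours, and faculty FTE values; it illustrates the stated formula and is not the university's official computation.

    # Illustrative only: hypothetical numbers, not actual Wright State data.
    # Student FTE for undergraduate courses (per item 5): day-14 enrollment
    # multiplied by the course's credit hours, divided by 15.

    # (students registered at day 14, credit hours) for a hypothetical department
    courses = [
        (32, 3),
        (18, 4),
        (25, 3),
    ]

    student_fte = sum(enrolled * credits / 15 for enrolled, credits in courses)
    faculty_fte = 6.0  # hypothetical full-time-equivalent faculty count

    print(f"Student FTE: {student_fte:.2f}")                             # (96 + 72 + 75) / 15 = 16.20
    print(f"Student-to-faculty ratio: {student_fte / faculty_fte:.2f}")  # 16.20 / 6.0 = 2.70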

Non-Academic Programs

Given the variability of purpose and function across non-academic programs, non-academic units were asked to provide information on three standard data points and to propose unit-specific KPIs from a selection of broad categories. Unit leaders met with the Program Effectiveness Review Committee over the summer to finalize an agreed-upon set of metrics.

Those units that have existing data aligning with the agreed-upon metrics will submit their program effectiveness self-study reports at the end of the summer. The remaining units will begin data collection in the fall and will submit their first reports by the end of the fall semester.

Common data points reported by all non-academic units include:

  • Number of full-time staff: FY16, FY17, FY18, FY19
  • Total allocated budget: FY16, FY17, FY18, FY19
  • Customer base (student, faculty, staff, alumni, other)

Indicators specific to the core mission and operations of each unit were developed from the following data categories:

Volume metrics: Measures of total volume experienced in key business processes or activities. Examples may include event attendance, service utilization, calls received, forms processed, or agreements produced.

Yield or ROI ratios: Ratios that express actual output or revenue generated relative to inputs or the cost of investment. Examples might include the percentage of applicants enrolled, click-through rates, or dollars raised as a percentage of unit operating costs.

Cost ratios: Ratios that express the cost to produce a unit of output. Some examples are cost per touch, cost per application, or cost per active customer.
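
As a rough illustration of the yield and cost ratio categories above, using made-up figures rather than any unit's actual data, the calculations reduce to simple division:

    # Hypothetical figures for illustration only; not drawn from any unit's report.
    applicants = 2000          # applications received in the period
    enrolled = 400             # applicants who ultimately enrolled
    operating_cost = 50_000.0  # unit operating cost for the same period, in dollars

    yield_ratio = enrolled / applicants                  # 400 / 2000 = 20% of applicants enrolled
    cost_per_application = operating_cost / applicants   # $50,000 / 2,000 = $25.00 per application

    print(f"Yield: {yield_ratio:.0%}")
    print(f"Cost per application: ${cost_per_application:,.2f}")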

Industry benchmarks: Standards set by industry regulations or best practices that serve as benchmarks for processes or activities. Some examples are staffing ratios, fundraising levels, or efficiency levels.

Time metrics: Amount of time spent on a particular process or process segment. Some examples are average call length, average processing time, or average wait time.

Student outcomes: Metrics indicating the effect of service or process on student outcomes such as average GPA, retention rates, and completion rates.

Compliance: Metrics that indicate compliance with regulatory, safety, budgetary, or other standards and requirements. Examples include percentage of time in compliance or number of incidents.

Customer satisfaction: Feedback or survey results that indicate how well the unit is meeting the needs of its customers.

Submitted non-academic unit program effectiveness reports are available on the "2018 Administrative Reports" page for Wright State University faculty and staff to review and provide feedback.

These initiatives are the first steps toward creating a broad continuous improvement process for both academic and non-academic areas. The data derived from this effort will be used to improve processes and services and to help inform the alignment of resources with our strategic planning initiatives.

Committee

Program Effectiveness Review Leadership

Dan E. Krane, Ph.D.

Department: Biological Sciences
Title: Professor
Address: Biological Sciences Bldg 225B, 3640 Colonel Glenn Hwy., Dayton, OH 45435-0001

Faculty, Staff, and Student Members

Eric S. Corbitt

Department: Student Union
Title: Director, Student Union and Campus Recreation
Address: Student Union 211, 3640 Colonel Glenn Hwy, Dayton, OH 45435-0001

December Green, Ph.D.

Department: School of Social Sciences & International Studies
Title: Professor; Associate Dean of Students and Curriculum
Address: Allyn Hall 106M, 3640 Colonel Glenn Hwy, Dayton, OH 45435-0001

Nova M. Lasky, B.S., M.Ed.

Department: Chief Operating Officer
Title: Director, Organizational Planning & Project Management
Address: University Hall 250, 3640 Colonel Glenn Hwy, Dayton, OH 45435-0001

Daniel Palmer, Jr., J.D.

Department: President Administration
Title: Government Affairs Specialist
Address: University Hall 286, 3640 Colonel Glenn Hwy, Dayton, OH 45435-0001

Mateen M. Rizki, Ph.D.

Department: College of Engineering & Computer Science
Title: Adjunct Faculty and Professor Emeritus
Address: Russ Engineering Center 303, 3640 Colonel Glenn Hwy., Dayton, OH 45435-0001

Gregory P. Sample, M.P.A.

Department: President Administration
Title: Executive Vice President, Chief Operating Officer
Address: University Hall 282, 3640 Colonel Glenn Hwy, Dayton, OH 45435-0001

Shu Schiller, Ph.D.

Department: Graduate Programs and Honors Studies
Title: Professor; Interim Dean, College of Graduate Programs & Honors Studies
Address: University Hall 168, 3640 Colonel Glenn Hwy, Dayton, OH 45435-0001

Amanda Graham Spencer, M.S.

Department: Student Success Services
Title: Director, University Academic Advising
Address: Student Success Center 101, 3640 Colonel Glenn Hwy, Dayton, OH 45435-0001

Reports

  • Academic Affairs
  • Advancement
  • Business and Finance
  • Chief Business Officer
  • Chief Diversity Officer
  • Chief Information Officer
  • Enrollment Management
  • Facilities Management and Campus Operations
  • Human Resources
  • President
  • Student Affairs
  • Student Health Services
  • Student Success
  • VP for Research
