Using Instructional Program Review
We are in an ongoing cycle of trying to improve how we "close the loop" on instructional program review. In recent years we have improved IPC's Feedback form and added a Dean's Perspective and an Overall Program Effectiveness rating. This year we are tackling the following questions: How do we use the results of program review, and who follows up?
This survey explores the usefulness of extracting reports from the various PR prompts and providing them to appropriate people and groups on campus for analysis and action. Sample reports and explanations of how each report might be used are provided below. Be advised that everything below is in draft form and open to revision; it is just a starting point. The draft reports are "quick and dirty" (e.g., the program names are not displayed next to the responses) and are intended solely as approximate examples.
Please scan the report(s) relevant to your role on campus and then provide feedback via https://goo.gl/forms/JAHGxcll8tJSEweT2. Feel free to comment only on the reports that are relevant to you.
Prompt | Recipient | Potential Use | Draft Report |
---|---|---|---|
0. Executive Summary | Academic Senate, Board of Trustees | Documenting completion of Phase 1 of Program Review. Explanation: Currently there is no quick way for a person or group to know how the college's programs are doing. Executive Summaries are kept in separate files for each program, apart from IPC's Overall Rating. Compiling both summative metrics into a single report would facilitate easy reading and could result in a product that can be shared with the BOT or other constituents. | report |
7. Enrollment Trends | IPC and VPI | Improving enrollment management. Explanation: Enrollment management has been a significant concern on campus. This report would allow IPC and the VPI to view enrollment from the faculty point of view and may serve to direct future dialogue between faculty and administration. [Note: this is a new practice that is not currently being done by IPC] | |
8A. Access and Completion | ACES | Documenting evidence of equity efforts and informing equity planning. Explanation: Currently ACES reports on its progress toward meeting equity goals through its annual report to the state. However, ACES doesn't have a way to capture the efforts of faculty at the grass-roots level. This report might provide that evidence. By reading these responses, ACES may be able to discern where the college is in its understanding of equity and its tools for addressing equity gaps. This may lead ACES to revise the PR prompts, identify which programs need assistance, or identify needs for professional development. Finally, this report and ACES's follow-up provide important accreditation documentation. | report |
8B. Completion – Success Online | Online Instruction (DE) Coordinator and DEAC | Documenting evidence of efforts to reduce performance gaps and informing planning. Explanation: Our accreditation requires the college to examine performance gaps between online and face-to-face instruction. The Online Instruction (DE) Coordinator and DEAC can use this report to discern where the faculty are and what needs they have for addressing these gaps. | report |
9AB. SLO Compliance and Impact; 10. PLO Impact | Assessment Coordinator | Documenting evidence of "completing the assessment cycle" and informing assessment planning. Explanation: Our accreditation requires the college to document evidence of the impact of SLO and PLO assessment. This report will collate those responses into a single document that can be provided to accreditors if requested. By reading the responses, the Assessment Coordinator can better determine how the faculty are doing with assessment, the impact assessment is having, and possible needs for professional development. | report |
5B. Progress Report on Prior Action Plans | IPC | Documenting evidence of attempts to improve programs and identifying continuing needs that may require college resources or other internal support. Explanation: The purpose of this report would be for IPC to identify the planning objectives that have not been accomplished, especially those stalled for lack of resources. IPC could then discuss the plans and make recommendations to PBC for resource allocation. [Note: this is a new practice that is not currently being done by IPC] | report |
Planning Unit Objectives and Tasks | IPC | Identifying program plans that require additional college resources: IPC adds relevant "Units Impacted" to action plans that need internal support and submits recommendations to PBC for resource allocation. Explanation: Using data, analysis, and reflection to create plans for improvement is the primary purpose of program review. Some plans are straightforward resource requests (instructional equipment) that are processed by the CBO. Other plans can be accomplished by faculty without any additional resources. But some plans require an allocation of resources. For example, a program may need assistance from the Workforce Development Director or Marketing Director; a program may propose plans that would require an investment in Retention Specialists or Counselors; or a program might propose an initiative that would be appropriate as an Innovation Fund proposal. IPC would be a logical entity to identify these resource-intensive planning objectives, discuss them, and make recommendations to PBC for funding. [Note: this is a new practice that is not currently being done by IPC] | report |
6A. Impact of Resources; 6B. Impact of Staffing Changes | PBC | Documenting evidence of effective resource allocation for accreditation and providing advance notice of potential new position proposals. Explanation: Documenting the impact of resource allocation is an accreditation requirement. Compiling these narratives puts all the evidence for the year in one place. PBC needs to be aware of what can happen when resources become unavailable (vacant positions not filled) or are no longer sufficient to accommodate present needs. It is also a way to measure the collateral costs and benefits of faculty reassignment for non-instructional work. | report |