Program Review: From Mandate to Benefit

November 2015
ASCCC Treasurer
LaunchBoard Project Manager

Project Origins

Program review is a required and potentially beneficial element of college planning, yet it is largely undefined both in terms of the activities involved and the objectives and outcomes it should produce. Consequently, program review processes vary widely across colleges, and that variation may ultimately affect the results of accreditation self-study and review processes.

Resolution 07.05, adopted in Spring 2014, called for the ASCCC to “work with the California Community College Chancellor’s Office and other appropriate agencies to further develop research tools that offer quantitative, qualitative and meaningful data for local program review processes.” This partnership led to a joint one-year research effort with the CCCCO LaunchBoard project to examine the feasibility of using state-level data to inform local program review for CTE programs.

The Design Process

To identify the additional information CTE practitioners wanted to supplement the data already available through local program review processes, the ASCCC and the LaunchBoard team held a series of meetings at conferences and two statewide CCC Confer calls attended by approximately 100 faculty, researchers, and CTE deans. This process led to a concept paper that outlined key criteria and desired data points, which was circulated to the field for comment via a survey. The survey yielded the following specifications:

  • Use a graphical, question-driven data display: Visually represent information to address key questions about supply and demand, as well as program completion and employment outcomes. Whenever possible, information should include comparison data that colleges can use to benchmark their performance. Visuals should be backed up by data charts.
  • Tailor the data displayed: Create a “wizard” feature that allows users to see only the data most relevant to their programs’ goals. For example, a program that provides training for incumbent workers might want to see job retention and wage increases, whereas a program that is aligned with a CSU degree might want to see transfer outcomes. Practitioners also wanted the option to see outcomes for both completers and skills-builders, workers who engage in short-term course-taking to maintain and add to the skill sets required for ongoing employment and career advancement.
  • Provide professional development: Offer guides that provide suggestions on how to use the tool in program review processes and in discussions within departments or across colleges. Examine programs that show the strongest outcomes to document effective practices.

During 2014-15, the LaunchBoard team developed a pilot program review tool and worked with 10 colleges that volunteered to review data on a total of 25 programs. Teams of faculty and researchers discussed the information in the program review tool as part of departmental meetings and then filled out a survey on the usefulness of the data and the structure of the tool.

What We Learned

Local program review processes are strengthened by having additional data that may not be widely available at the college level. Pilot colleges reported that having labor market and employment outcomes gave them a stronger understanding of whether students met their goals. They also valued access to historical trends and regional context.

Program review processes may be best enhanced by combining traditional local program review data, additional locally-calculated data points, and regional/statewide information.  Some practitioners were eager for return-on-investment metrics that were not possible to calculate in the LaunchBoard because financial data are not sufficiently granular in statewide data sets. This type of additional information, combined with regional totals, labor market information, and benchmarking data, would augment and strengthen local program review conversations.

Practitioners would benefit from a common set of data and opportunities to look at the information together. Because program review data may be cut differently by individual colleges, comparing results across institutions can be difficult. Statewide tools allow decision-makers to use consistently defined metrics so they can move immediately to the meatier conversations, such as how a program’s design or implementation might influence outcomes.

Additional statewide data are needed about post-college outcomes. Many practitioners focused on data points that are not available in statewide data sets, such as whether students become employed in their field of study, earn a third-party credential, start their own business, or are satisfied with their program. Some of these questions are addressed in the CTE Outcomes Survey, a survey of former CTE students that colleges can either administer on their own or pay Santa Rosa Junior College to implement on their behalf, leveraging economies of scale with other colleges. However, colleges must pay out of pocket each year to participate, which may disadvantage colleges with smaller CTE programs and lower budgets.

Next Steps  

LaunchBoard 2.0: Rather than build out a separate program review tool, the LaunchBoard team elected to redesign the main LaunchBoard interface. The Program Snapshot tab is currently being rebuilt so that information is accessed through questions such as “Are we training the right number of students for available jobs?” and “How much money are students making?” Answers are displayed visually, with opportunities to drill deeper into related data, such as more detailed labor market information or disaggregated results. The LaunchBoard team will be sharing a demo version across the state this fall and rolling out a full release in February 2016. This past spring, the team also released a tool that allows colleges to examine program-level data from the CTE Outcomes Survey, making information on post-college outcomes more readily available for program review conversations.

Inquiry/data templates: While information on key topics for program improvement such as budgets and scheduling cannot be generated from statewide data, research or inquiry templates could be designed that would facilitate a more systematic review of these issues.  For example, colleges could use suggested formulas to calculate students’ return on investment or gather information in a consistent fashion to support the review of scheduling across multiple programs or colleges. It would be beneficial to bring together faculty, researchers, and college leadership to identify high-priority lines of inquiry and research specifically tailored to program review and improvement that could be built into templates and shared across the state. These efforts could be integrated into other statewide efforts such as the Institutional Effectiveness Partnership Initiative, regional and sector research activities, and resource planning, to name a few.
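As a purely illustrative sketch of what such a suggested formula might look like (the specific inputs and formula below are assumptions for discussion, not measures defined by the LaunchBoard project), a return-on-investment template could ask colleges to supply a few locally known figures and compute:

\[
\mathrm{ROI} = \frac{(E_{\text{after}} - E_{\text{before}}) \times Y - C}{C}
\]

where \(E_{\text{after}}\) and \(E_{\text{before}}\) are students’ median annual earnings after and before the program, \(Y\) is the number of years over which the earnings gain is counted, and \(C\) is the total cost of attending the program. Agreeing on inputs like these in advance is what would allow results to be compared consistently across programs and colleges.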

Professional development: Access to better data will move colleges a significant step forward in making program review more meaningful. However, data alone are not sufficient. Professional development will be needed to help practitioners understand how to combine local data, regional and statewide metrics, and labor market information when considering ways to strengthen CTE portfolios.

The Academic Senate and the LaunchBoard team look forward to pursuing these next steps in the near future. Faculty throughout the state should take note as these opportunities unfold, as faculty input will be critical to making these efforts beneficial to programs and students.