SLO Regional Meetings Address Collaborating for High Standards in Program Review

May
2009
Janet Fulks, Academic Senate
Marc Beam, Research and Planning Group

The Academic Senate and Research and Planning Group (RP) collaborative group on Student Learning Outcome (SLO) assessment sponsored four regional meetings in March where researchers and faculty discussed their working relationships and how they can address common issues: improving student success, supporting evidence-based decision making, and moving toward a culture of inquiry. Meetings were held at Mt. San Antonio College, Merritt College, Sierra College, and Mesa College. Research facilitators were Keith Wurtz (Chaffey), Linda Umbdenstock (Long Beach, ret.), Bob Pacheco (Barstow), and Rob Johnstone (Skyline). Faculty facilitators were Gary Williams (Crafton Hills), Maggie Davis (Fresno), Janet Fulks (Bakersfield), and Lesley Kawaguchi (Santa Monica).

These regional meetings have addressed SLO assessment issues from the field for the last three years, growing from about 75 to over 175 participants this year. The regional meetings are the product of collaborative work between researchers and faculty committed to improving student success. This year the workshops focused on clarifying faculty and researcher roles in SLO assessment and program review. Workshop registration included participant input on their most pressing issues, and the responses made it evident that the key concerns had to do with program review. This formed the focus of the activities. Participants looked at faculty and researcher roles, discussed the types of data needed for program review, and examined program review case studies from basic skills, career technical education, and student services. Discussions also focused on using program-level student learning outcomes to drive program review and on linking program review data to planning and budgeting. As participants dug into case studies, they learned each other's language, better understood each other's roles, and shared observations on what works and what doesn't.

One function of the meetings was to clarify the expertise and contributions of researchers and SLO coordinators in their own domains, as described in the National Research Council's (NRC) book Knowing What Students Know: The Science and Design of Educational Assessment.1 The NRC reported that the educational assessment design process must be collaborative, cross-disciplinary, and iterative. Classroom educators, subject matter experts, cognitive scientists, and researchers all play an important role in designing effective assessments, which take time to develop. The table on the following page indicates some of the roles and strengths identified by SLO coordinators and researchers.

Further discussions identified concern over whom the research function reports to, since the demand for accountability and operational data often supersedes the need for research on basic skills, course success, or student learning outcomes. Prioritization of research should include college-wide discussions based on mission and goals. Research should enable better teaching and learning, not just be used to count beans.

Participants and event planners called this a "marriage made in heaven," indicating the synergy and advantage of collaboration between research and the classroom.

Other outcomes of the RP/ASCCC collaboration on SLO assessment are an ongoing process to create an SLO assessment glossary (see the article on page 2 of this Rostrum for more information) and an online course in SLO assessment that faculty and researchers can take for certification and continuing education.


1 Pellegrino, J., Chudowsky, N., & Glaser, R., eds. 2001. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, DC: National Academy Press.

SLO Coordinators' Roles & Strengths
As self-identified by SLO coordinators
Simplify the assessment process for faculty
Work with adjuncts
Get buy-in
Make the link between outcomes assessment, program review, and planning
Build a student-centered culture on campus
Train faculty
Understand formative and summative assessment: assess to assist and assess to advance
Facilitate dialog
Help faculty translate what they do and how they assess into SLO language and formalized structures
Make explicit what faculty are already doing implicitly
Work with senates to advocate at the state, WASC, and federal levels to ensure that we avoid an NCLB-like approach

As identified by researchers about SLO coordinators
Bring student connection to the entire process
Motivate
Bridge faculty, research, and administration
Create ways to bring people along with the process
Respond to resistance
Be aware of different views and bring those forward
Convey the value of the process in providing feedback to faculty
Help remove the technical barriers to facilitate understanding
Parse language: proper terminology for faculty
Lead and facilitate conversations
Respond to skepticism
Understand and articulate faculty and student needs and issues
Describe the assessment cycle to skeptics in non-technical language
Develop timelines for assessment
Navigate the intricacies of different disciplines and the political intricacies of divisions
Know who the key campus faculty are for success / failure of a project
Interact with faculty unions, understand shared governance

Researchers' Roles & Strengths
As self-identified by researchers
Locate data and make it available
Identify and help establish relevant benchmarks
Present data to non-technical audiences
Create data explanations and visuals that make sense to non-researchers
Interpret results in layman's terms
Assist with evidence-based decision making
Analyze and use data
Explain and determine sampling
Clarify research questions
Validate assessment instrument
Design research methods and surveys
Facilitate discussions about data and evidence
Facilitate discussions to define goals so they are measurable and objective
Provide statistical analysis, selecting and implementing appropriate techniques

As identified by SLO coordinators about researchers
Share the overall picture and open up different ways of looking at things
Work with non-technical people
Bring WASC requirements to Student Services and Instruction
Collaborate
Describe persistence and evidence-based change backed by data
Bring forth compelling questions
Stimulate questions about the context and meaning of the data
Ask important questions to see if we are actually measuring what we want to measure
Facilitate creative problem solving
Help others who aren't as comfortable with data
Work with details, staying on task, documenting results
Provide an institutional level view
Identify and gather data about what faculty think is interesting to know
Help faculty develop surveys
Help with ensuring validity
Defend the proper use of data and resist the improper use of data
Always ask: will this improve teaching and learning?
Direct studies and surveys that answer good questions and provide relevant and useful data

The articles published in the Rostrum do not necessarily represent the adopted positions of the Academic Senate. For adopted positions and recommendations, please browse this website.