SLO Assessment Handbook

Introduction

This handbook provides an overview of the structure and purpose of the Outcomes Assessment Committee (OAC) and offers Cerro Coso faculty and staff guidance in the development and meaningful assessment of outcomes.

Outcomes Assessment Committee

The Outcomes Assessment Committee is an associated committee of College Council, under the Institutional Effectiveness Committee (IEC). The committee has a participatory governance structure with representation from all employee groups and is charged with providing oversight for the College's outcome assessment processes and documents in order to improve student learning and achievement. Faculty, as the largest representative group, give the committee its faculty emphasis, and a faculty member serves as both the Student Learning Outcome (SLO) Coordinator and committee chair. The faculty members also constitute the standing Outcomes Assessment Committee of the Academic Senate. Student learning outcomes and assessment are an integral part of curriculum and educational program development, which are Title 5 "10 + 1" areas.

The committee maintains a website where SLO resources and data are housed. Formal and informal resources are available for faculty, staff, students, and the public. These resources highlight best practices and effective strategies in learning outcome assessment; they provide guidance for faculty and staff and a context through which students and the public can interpret the information.

Charge

To provide oversight for the College's outcome assessment processes and documents in order to improve student learning and achievement.

Purpose

  • Ensure that outcome assessment is ongoing, systematic, and used to assess and improve student learning and achievement.
  • Promote ongoing, pervasive, and robust dialogue about student learning.
  • Ensure the ongoing evaluation and fine-tuning of organizational structures to support student learning.
  • Maintain student learning improvement as a visible priority in all practices and structures across the College.
  • Ensure all practices, services, and structures provide students with an optimal learning environment.
  • Ensure that student learning and administrative unit outcomes are specifically linked to program reviews.

Composition

Every effort shall be made to ensure equity and diverse composition among committee members across administration, staff, and faculty, as well as across campuses. Representatives from the various college campus sites may come from administration, classified staff, and/or faculty. Every effort shall also be made to ensure a variety of disciplines within the faculty. See the Roles and Responsibilities section below for specific duties of members.

  • Student Learning Coordinator (faculty) - Chair
  • Vice President, Instruction/Administrative Representative
  • Program Review Chair/Faculty Representative
  • One (1) additional administrative representative
  • Five (5) additional faculty representatives
  • Two (2) classified representatives - one of which is from the Office of Institutional Research, if possible
  • Student representative

Evaluation and Assessment

The outcomes assessment process itself is evaluated both formally and informally. Formally, the committee is evaluated through an annual self-evaluation of the committee's work and goals, scored with a rubric by the Institutional Effectiveness Committee. The following standards are analyzed:

  • Student learning outcomes and assessment are ongoing, systematic, and used for continuous quality improvement.
  • Dialogue about student learning is ongoing, pervasive, and robust.
  • There is evaluation of student learning outcomes processes.
  • Evaluation and fine-tuning of organizational structures to support student learning are ongoing.
  • Learning outcomes are specifically linked to program reviews.

In addition, once every two years the committee is assessed through the strategic planning survey, which asks employees about their awareness of and satisfaction with outcomes assessment. Informally, the SLO Coordinator, by virtue of frequently meeting with faculty and staff to discuss best practices and serving on the curriculum committee and the IEC, continuously monitors outcome assessment practices and is often the first to hear of gaps and suggested revisions to the process for improvement.

Assessment at Cerro Coso

Cerro Coso Community College is committed to the ongoing assessment of student learning in academic programs and student services through a systematic, college-wide assessment plan. The results of assessment provide clear evidence of student learning and student experiences and are used to make further improvements to instruction and services.

The College embraces the idea that learning assessment is a natural extension of instruction and student services and that all departments and units have a responsibility to regularly evaluate the knowledge and skills that comprise student learning and student achievement and make adjustments in operations or teaching methodology when outcomes are not met.

Philosophy

Self-assessment is a natural extension of instruction and student services, and all members of the College share in this responsibility. Student populations are becoming more diverse, and a rapidly changing employment economy makes it challenging to meet all students' needs effectively. Consequently, the teaching methods of today may not work as well for tomorrow's learners, and we need to continually assess what is working and what requires improvement. Another trend that makes self-assessment a natural academic activity is that the culture of teaching and learning is shifting from independence and autonomy to interdependence and collaboration; intra-departmental, collaborative assessment is a natural extension of this culture. We also want to ensure that students are learning, so we should be interested in verifying this. Finally, we are accountable to external organizations and to students, as consumers, for our learning effectiveness. Assessment certifies the quality of the education we offer.

  • We value assessment as a process for continuous quality improvement and the evolution of programs and services. We believe that, when done intentionally and with meaningful analysis, assessment processes lead to improvements in student achievement and services.
  • We value a process that is simple, but not simplistic. Outcome assessment should be simple enough to be manageable and sustainable, but thorough enough to assess and improve instructional programs and services.
  • We value quality over quantity. Learning outcomes are intentional, measurable, and succinct. They represent the major skills, knowledge, and abilities a student will acquire upon successful completion of a course or program.
  • We value assessment of support programs and services as an effective means to ensure that student learning occurs in an environment that values the student experience.
  • We value assessment of courses and programs as a faculty-driven process to ensure that it is constructive and non-punitive. The process supports full faculty participation and the successful completion of an assessment cycle, including the definition of outcomes and assessments, assessment design and collection of data, analysis of the data, and implementation of improvements based on the data.

Definition of Student Learning Outcome Assessment

Student learning outcome assessment is an activity in which institutional and instructional effectiveness is certified by evidence of student learning, or experience with programs and services. Specific measurable learning behaviors are identified and assessed, and the results of the assessment are used to improve programs, courses, and services. Assessment, in this context, is not an evaluation of individual students or faculty.

There are several other concepts implicit in assessment:

  • Its primary purpose is to improve student learning and services at Cerro Coso.
  • It is a process that is on-going and cyclical.
  • It does not encroach upon academic freedom.
  • The results are used constructively, not punitively.
  • Related to academic courses and programs, it is faculty driven.
  • It is a collaborative process.

It is a process by which individual learning outcomes are defined at the administrative service unit, institutional, program, and course levels. For a particular outcome, expected student achievement/experience is compared with actual outcomes using predetermined benchmarks. If the results are lower than what has been determined to be acceptable, a plan to improve student learning or services is developed and implemented.

Assessment, in this context, is not related to grades or faculty evaluation. Although students provide evidence of learning, this is not an assessment of individuals, but an assessment of curriculum design and institutional best practices to the end that students are successfully learning.

Outcomes Defined

Outcomes are the end result: changes in the learners' knowledge, skills, attitudes, and habits of mind that develop as a result of being involved in a course of study, program of study, activity, or service.

  • An outcome must be measurable and meaningful.

Cerro Coso has the following outcome categories:

  • CSLO - Course Student Learning Outcome: CSLOs are defined at the individual course level and identify the knowledge, skills, and abilities a student will achieve upon successful completion of the course. Courses generally have 3-6 CSLOs.
    • Course Student Learning Outcomes are different from course objectives. Course objectives are incorporated throughout the COR in order to align with C-ID; objectives nest under CSLOs.
  • PSLO - Program Student Learning Outcome: PSLOs describe what learners will know and be able to do when they complete a program of study. They are closely linked with the CSLOs in the courses that make up the program, and programs map CSLOs to PSLOs. Programs include certificates and degrees, as well as sequences or groups of courses that allow students to achieve an academic objective, such as the general education pattern, employment-related skills, and the honors program.
  • ISLO - Institutional Student Learning Outcome: ISLOs represent competencies learners will achieve while completing a program and represent broad learning categories. Programs map CSLOs to ISLOs.
  • GELO - General Education Learning Outcome: General Education courses are mapped to broad learning outcomes.
  • AUO - Administrative Unit Outcome: AUOs represent the key functions and services of student services, learning support, and administrative units.

Continuous Quality Improvement Practices

  • All SLOs/AUOs are assessed at least once in the first three years of each Program Review cycle; PSLOs are assessed within the fourth year of each cycle. Many courses within programs are assessed more often, but the minimum requirement is at least once during each program review cycle. As a best practice, the committee recommends that courses belonging to multiple programs be assessed every two years to ensure each program has meaningful assessment data. The program review template requires departments and units to complete a chart listing the due dates of each CSLO, PSLO, and/or AUO during the five-year cycle.
  • Assessment data will be entered into eLumen.
  • New courses are assessed the first semester taught, providing valuable reflection on the course structure, teaching strategies, and assessment methods. This assessment may be entered as a formative assessment, meaning it will not be reflected in the program's data, or it may be entered as a summative assessment, and reflected in the program's data.
  • All courses in a program are mapped to program learning outcomes and also to applicable institutional outcomes (ISLOs); see Appendix D.
  • Formative assessment is used to inform teaching/services during the course/event by checking understanding and learning at various points in instruction/service. These assessments may be captured in eLumen if the faculty member, department, or unit wish.
  • Summative assessment is used to evaluate learning at the end of instruction/service. These assessments are captured in eLumen and used for reporting.
  • The development, assessment, and analysis of outcomes are the result of college-wide collaborations and dialogue.
    • CSLOs and PSLOs are defined in official curriculum documents and are the result of departmental dialogue guided by advice and feedback from the SLO Coordinator and the Outcomes Assessment Committee. The SLO Coordinator, who is always a faculty member, sits on the curriculum committee and serves in a technical review role to provide feedback on CSLO/PSLO design during the curriculum approval process.
    • AUOs of an operational unit are defined by the supervising administrator based on the core functions of the department. The SLO Coordinator, supervisor, institutional researcher, department staff, and/or other mentoring staff members provide input and assistance to supervising administrators in defining AUOs and determining assessment instruments.
  • For all CSLOs, PSLOs, and AUOs, the specific assessment instrument is determined when the outcome is defined and can be located in the course outlines of record and program review documents. For many instructional programs, PSLOs are mapped to class assignments or entire CSLOs from key or capstone classes. For example, the second PSLO of the Web Fundamentals certificate (demonstrate technical and creative mastery of the creation of web media, such as graphics, motion graphics, and interactive media) is assessed by a project scored with a rubric in Digital Media Arts C102.
  • When outcomes are assessed, the expectation for all CSLOs and PSLOs in instructional programs is that the collected data reflect all offerings at all campus locations, including online and Rising Scholars Program (RSP) courses, and both full-time and part-time faculty.
  • AUO assessment will reflect all campus locations where the unit provides services.
  • Once assessed, outcomes can be aggregated, disaggregated, and analyzed to drive improvements in instruction and services (see the disaggregation sketch after this list). For all CSLOs, PSLOs, and AUOs, if gaps are detected, appropriate remediation is expected to be designed and implemented and the outcomes reassessed at the next cycle or earliest available opportunity.
    • For instructional programs, faculty will be prompted to complete a reflection template after entering assessment results. This gives the faculty member an opportunity to reflect on teaching strategies, content, unique challenges, etc., and to discuss improvements at the time the results are posted. In addition, outcome assessment gaps and improvements are discussed in both annual unit plans and program review documents; see Appendix C.
    • For student services and administrative units, a reflection template will be provided after entering assessment results (to be developed in eLumen).
  • When presenting a course/program revision to CIC, faculty will discuss when the course/program was last assessed and how the results influenced the proposal.
  • As of Spring 2019, programs with fewer than 90% of their courses assessed are not eligible to complete Program Review.
  • The AUP and Program Review templates require programs and units to identify all outcomes that were assessed, to identify gaps/themes, and to link CSLO and PSLO data to budget requests. Outcome assessment information and results directly impact student behavior and achievement as faculty and staff identify best practices and collaboration opportunities, both internally and externally with colleagues. Divisions, units, programs, and departments must directly correlate CSLO assessment and student success with requests for resources.
  • The OAC reviews Annual Unit Plans for identified learning outcome gaps and themes, and uses this information to identify trends, training, and professional development opportunities.
  • The OAC recommends the following schedule of regular assessment within the 5-year Program Review cycle:
    • Years 1-3: All CSLOs assessed at least once; re-assess any gaps identified.
    • Year 4: Re-assess any gaps identified in Year 3; assess PSLOs.
    • Year 5: Write and submit the Program Review.
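
The aggregation and disaggregation described above can be performed with any spreadsheet or scripting tool once results are exported from eLumen. Below is a minimal sketch in Python (assuming the pandas library is available); the column names and data are hypothetical and do not represent eLumen's actual export format.

    # Minimal sketch (hypothetical data and column names): disaggregating
    # CSLO results by campus location after export from eLumen.
    import pandas as pd

    # Each row is one student's result on one CSLO assessment.
    results = pd.DataFrame({
        "course":   ["ENGL C101"] * 6,
        "cslo":     ["CSLO 1"] * 6,
        "location": ["Ridgecrest", "Ridgecrest", "Online",
                     "Online", "Online", "Ridgecrest"],
        "met":      [True, True, True, False, True, False],
    })

    # Aggregate: overall percentage of students meeting the outcome.
    print(f"Overall met rate: {results['met'].mean() * 100:.0f}%")

    # Disaggregate: met rate by location, to surface gaps between offerings.
    print((results.groupby("location")["met"].mean() * 100).round(0))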

Roles and Responsibilities

Outcomes Assessment Faculty Coordinator

The Student Learning Outcome Coordinator provides college-wide leadership in the implementation of student learning outcome assessment. Under the supervision of the Vice President of Instruction, the faculty member receives reassigned time to provide leadership in maintaining and improving the outcomes assessment process.

Examples of Duties and Responsibilities:

  • Serves as Chair of the Outcomes Assessment Committee, with primary responsibility for the committee's effectiveness in carrying out its charge: promoting student learning by providing leadership in continuous and sustainable outcomes assessment and fostering a culture of inquiry.
  • Participates on the CIC committee as the technical reviewer of CSLOs and PSLOs in course and program proposals and as a resource to the committee on best practices in defining and assessing outcomes.
  • Participates on the Institutional Effectiveness Committee as the primary spokesperson for student learning and administrative unit outcomes, the outcomes assessment process, the integration of outcomes assessment results in the college's integrated planning effort, and continuous review and improvement of the outcomes assessment process.
  • Participates in the annual evaluation of the Outcomes Assessment Committee, identifies gaps in quality performance, and collaborates with the committee to design improvements and implement changes.
  • Collaborates with the Vice President of Academic Affairs, the Program Review Coordinator, and the Accreditation Liaison Officer to maintain linkage among program review, outcomes assessment, and institutional planning, including resource allocation.
  • Attends Program Review meetings when needed to disseminate OAC recommendations related to Program Review documents.
  • Works with college department, unit, section, and division leadership to ensure college-wide participation in the outcomes assessment process.
  • Acts as the primary point of contact for outcomes assessment related inquiries.
  • Assists in training college faculty, managers, and staff in outcomes assessment procedures and best practices.
  • Provides critical feedback to college constituents regarding outcomes assessment documents, plans, and integration of assessment results for continuous program improvement.
  • Monitors outcomes assessment timelines.
  • Maintains current knowledge of external developments in outcomes assessment, such as through literature review, participation on listservs, and attendance at conferences and workshops.
  • Supervises the collection and archiving of outcomes assessment data and processes for both internal and external use.
  • Provides administrative management of the eLumen Assessment Module (or any other learning management system, program, or platform adopted by the college for assessment planning and integration) for college-wide use and oversees the module's assessment process. Collaborates with Distance Education on eLumen-Canvas assessment integration.

The SLO Coordinator should have a strong understanding of curriculum, program review, and accreditation standards, and is a member of the Curriculum and Instruction Council. The following is a list of other skills identified as necessary for SLO Coordinators, based on input from SLO Coordinators, curriculum chairs, and administrators throughout California:

  • An understanding of student learning outcomes and assessment
  • Classroom teaching experience
  • Educational research
  • Sensitivity to diverse backgrounds
  • Faculty leadership
  • Strong interpersonal and motivational skills
  • Organization and ability to keep current records
  • Knowledge of institutional processes

The SLO Coordinator candidate is approved by both the IEC and the Academic Senate.

Outcome Assessment Committee Members

  • Attend and actively participate in OAC meetings and related work
  • Must attend at least 70% of meetings (6 out of 8 meetings per year); members who cannot attend should provide feedback on materials when possible.
  • Attendance may be excused and not counted as an absence when attending other college obligations, such as faculty development. However, this should not be used as a regular excuse for missing meetings, and a member may be asked to vacate the seat if other obligations repeatedly prevent attendance.
  • Report on assessment issues and request faculty input on assessment in their own area/unit
  • Assist colleagues with assessment guidance
  • Assist committee in the review and evaluation of assessment reporting in Annual Unit Plans and Program Review documents
  • Participate in the assessment of the committee and establish goals annually for improvement.
  • Failure to actively participate will result in contact from the Chair/Coordinator regarding participation; if participation does not improve, the member will be removed from the Committee.

Faculty Chairs

Faculty Chairs assume primary responsibility for all aspects of student learning outcome or administrative outcome assessment, although the process should be collaborative within departments and/or programs, and it may be necessary to rely more heavily on particular faculty members with more expertise in a course's subject matter. Evidence of collaboration and dialogue should be included in department meeting minutes, Annual Unit Plans, and Program Review documents. Individual faculty are responsible for gathering CSLO assessment information in their courses when an assessment is planned.

Institutional Researcher

Cerro Coso has access to the Office of Institutional Research (OIR) for support with unit plan, program review, and student learning outcome assessment data. The OIR is available to provide guidance in crafting assessments that are valid and reliable and to assist in collecting data that is not easily attainable through classroom-embedded assessments, Cognos, or other currently applicable programs. It is important that we have a researcher who is a member of our college culture and understands the complexities of serving students across multiple sites over a large geographic area. One member of the OIR will serve on the OAC to the extent possible; this position fills one of the two classified staff positions.

Assessment of CSLOs in particular should be viewed as a reflective practice in intentional teaching. The primary focus is not formal research but rather reflection on course content in relation to CSLOs and on the learner's demonstration of mastery at the end of the course.

Outcomes and Assessment Overview

What are Student Learning Outcomes?

Student Learning Outcomes identify what students can DO to demonstrate that they are learning. There should be clear linkages between student behavior, the production of a learning artifact, and assessment of that artifact. Outcomes MUST be measurable.

Other characteristics of student learning outcomes include:

  • They are NOT instructional objectives or goals.
  • They are observable, behavioral outcomes - what the learner will be able to do.
  • They focus on the end result, not on the learning process.
  • They are learner-centered, rather than instructor-centered.
  • They may or may not be content specific.
  • They should take a diverse student population into account.
  • They should be independent of the delivery mode (classroom, ITV, online).
  • As much as possible and appropriate, they should require higher-level cognitive, affective, and/or psychomotor domains.
  • Bloom's Taxonomy is recommended as a resource for the selection of outcome verbs.
  • SLO assessment is independent of individual student grade.

Programs of study are designed around what the student will know and be able to do at the end of the program. Program learning outcomes should be defined first, drawing on input from advisory committees or the discipline's academic organizations. Course student learning outcomes should then emerge from the program student learning outcomes. A curriculum map is useful for presenting how courses align, or map, to program learning outcomes.

Writing CSLO, PSLO, and GELO Outcomes

We refer to Bloom's Taxonomy of Educational Objectives (see Appendix A) for suggestions about appropriate observable outcomes (although Bloom's is not an exhaustive list). Bloom organized outcomes into three domains: cognitive, psychomotor, and affective. The cognitive domain relates to knowledge, the psychomotor domain relates to skills, and the affective domain relates to attitudes and values. If possible, we favor a set of outcomes that draw from each domain, although the psychomotor domain may not be appropriate for all programs or courses. Each of those domains has outcomes further organized according to depth of processing. We favor higher level outcomes that demonstrate critical thinking, a high degree of skill mastery, or personal integration of attitudes and values. Such higher-level outcomes are listed in the right columns of the outcome tables. Refer to Appendix A for outcome examples, and Appendix B for examples of verbs that are difficult to measure.

Acceptable Results / Target

It is also useful to determine what the acceptable benchmark of student achievement will be. This has nothing to do with students passing courses or obtaining credit. Although we are measuring student learning in assessment, the objective is to determine how well we are doing with respect to instruction or student services. The question to be considered is: at what level would we determine that there is nothing more we can do to improve student learning? Some student success factors are outside of our control, so 100% student success is not realistic. Something less than 100% will therefore be appropriate: perhaps 90%, 85%, or 80%.

  • The OAC has set 70% as the default; no assessment target shall be set below this success rate.

The determination of what will be acceptable is dependent upon many factors and, at first, may have to be a best guess among departments and program areas. That benchmark may differ from department to department, and it may differ between courses within a department. It may even differ between outcomes within a single course. An illustration of why this may differ is the following:

In some programs, entry level courses may have greater attrition than advanced courses because some students likely discover sooner rather than later that the program is not a good fit for their interests or aptitudes. This is a factor over which we have no control. Defining 75% as an acceptable result for assessment may be appropriate for an entry level course, during which many students are determining whether they are really interested in that program of study, whereas 95% might be appropriate for the capstone course of the same program because presumably by that time, students are confident about their academic goals. We would expect greater success, given the same quality of instruction.

Again, you are determining the point at which you believe institutional enhancements will no longer improve the results. This benchmark tells you what to do with the assessment data: make improvements or congratulate yourselves. This is not an exact science; determining appropriate levels is best achieved through continuous dialogue within your department and through reassessment of the criteria after each assessment cycle.

The College's Default Levels of Mastery:
  • Meets expectations: The student demonstrated mastery. (Met)
  • Somewhat meets / Almost meets / Barely meets expectations: The student is making progress toward, but has not yet achieved, mastery. (Optional use by each department; may be used to identify students within ±5% of the target.)
  • Does not meet expectations: The student did not demonstrate mastery. (Not Met)
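
A department's cutoffs can be made concrete with a short script. The following is a minimal sketch under stated assumptions: a hypothetical percent-scored assessment, a department-set mastery target of 70 points, and the optional band for scores within 5 points of the target; only the 70% default success-rate benchmark is taken from this handbook.

    # Minimal sketch (hypothetical cutoffs): classifying individual scores
    # into the College's default levels of mastery, using the optional
    # "somewhat meets" band for students within +/-5 points of the target.
    TARGET = 70.0  # department-set mastery score (hypothetical)
    BAND = 5.0     # optional near-target band

    def mastery_level(score: float) -> str:
        if score >= TARGET + BAND:
            return "Met"
        if abs(score - TARGET) <= BAND:
            return "Somewhat Met"  # optional use by each department
        return "Not Met"

    scores = [92, 73, 68, 55, 81]
    levels = [mastery_level(s) for s in scores]
    print(dict(zip(scores, levels)))

    # Compare the share of students at mastery against the OAC's 70%
    # default benchmark (no target may be set below this success rate).
    met_rate = levels.count("Met") / len(levels) * 100
    print(f"Met rate: {met_rate:.0f}% vs. the 70% benchmark")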

See Appendix A below for examples of SLO language and related performance descriptors for assessing the SLO.

Assessment Artifacts

Finally, the student learning outcome assessment definition needs to specify how the outcome will be measured. This includes an artifact and a method for scoring the quality of that artifact. Examples of common assessment artifacts include:

  • Projects
  • Portfolios
  • Essays
  • Speeches
  • Performances
  • Skill Demonstrations
  • Athletic performances
  • Exit Interviews
  • Multiple Choice Exams
  • Essay Exams
  • Surveys
  • Critiques

The artifact(s) chosen for the assessment should be appropriate for the outcome verb. For example, a learning outcome of "describe" is better measured by an essay than by a multiple-choice exam. Another consideration in selecting an artifact is the relative ease or difficulty of conducting the assessment: exams and surveys are easier to administer than portfolio assessments scored with a rubric. Departments should give careful thought to choosing an assessment that effectively measures the learning outcome but is also reasonable to administer.

Principles of Assessment Tools

When considering which artifact/tool to use, consider the following:

  • Isolate knowledge and skills. Sometimes a CSLO has more than one variable to measure. Separate out the variables so that each can be assessed, and consider whether a student who does not meet the requirement for one variable has failed to master the entire CSLO. This is especially true for math or other multi-step processes.
  • Separate out levels of student mastery. Met/Not Met is acceptable, but would it be valuable to know how many students who met the outcome just barely met the target versus clearly met it, or how many just barely missed the target versus clearly missed it?
  • The tool should lend itself to repeated use, semester after semester. Consider how to guard against plagiarism/cheating. If using test-bank-generated questions, will these need to be revised if a new textbook is chosen?
  • Develop a rubric that can guide the determination of whether the outcome was met.

Assessment Scoring

Some of the artifacts above can simply be scored for correctness, as is the case with multiple-choice exams.

  • Rubrics are appropriate for scoring projects, portfolios, essays, speeches, performances, skill demonstrations, critiques, or essay exams.
  • Response scales, such as Likert (respondents choose Strongly Agree, Somewhat Agree, Neutral, Somewhat Disagree, Strongly Disagree) may be useful in scoring surveys, interviews, or critiques. A scale might also be used to score an artifact holistically.
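
As one illustration, a Likert-scale survey can be tallied in a few lines of code. This is a minimal sketch with hypothetical responses and a hypothetical scoring rule that treats the top two ratings as meeting the outcome; departments would set their own rule.

    # Minimal sketch (hypothetical survey data): scoring a Likert-scale
    # item, treating "Strongly Agree" and "Somewhat Agree" as meeting
    # the outcome.
    LIKERT = {
        "Strongly Agree": 5, "Somewhat Agree": 4, "Neutral": 3,
        "Somewhat Disagree": 2, "Strongly Disagree": 1,
    }

    responses = ["Strongly Agree", "Somewhat Agree", "Neutral",
                 "Strongly Agree", "Somewhat Disagree"]

    values = [LIKERT[r] for r in responses]
    print(f"Mean rating: {sum(values) / len(values):.1f} of 5")
    print(f"Agreeing: {sum(v >= 4 for v in values) / len(values) * 100:.0f}%")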

Assignment or course grades are not a valid means of assessing student learning outcomes, for the following reasons:

  • Course grades and many assignments reflect multiple skills and outcomes. We need to tease out a specific outcome for measurement.
  • Course grades also may reflect criteria that have nothing to do with course learning outcomes but are imposed within a course to motivate participation and the development of a learning community. For example, an essay may also include grading points for formatting, spelling, grammar, and other content not directly related to the SLO/PLO. The SLO material will need to be assessed separately from the grade for the essay as a whole.
  • Grades are an individual evaluation, whereas outcome assessment is collaborative and its results are generalized.

However, certain types of course assignments can be leveraged for student assessment AND course assessment.

  • Ensure that the same assignments and measuring tools are used in every section of a course, across multiple semesters and among all faculty. There must be a way to tease out a specific outcome and assess only that outcome.
  • Collaborate with faculty to "norm" the assessment tool and rubric and to determine measurements for each level of mastery.

See Appendix A below for more specific examples of rubric language and performance descriptors for assessing SLOs. See also the separate Bloom's Taxonomy and rubric information about designing and creating rubrics on the OAC employee webpage.

Administrative Outcomes

Administrative Unit Outcomes (AUOs) identify what students (or clients) will experience or receive as a result of a given service. AUOs may also be business-related, identifying particular goals related to efficiency or achievement.

Below are ideas and concepts to consider when planning your AUO and its language. Remember that an AUO is a broad, sweeping, and integral part of the office itself. It is not a goal but a description of the services the office provides on a continuing basis; think about the overall functions of the office. Goals within an Annual Unit Plan need to connect to the AUO and can help measure it, but they are not AUOs. It may also be a good idea to work with the OAC or OIR to determine whether certain AUOs are measurable and how they could be measured/assessed.

AUO - Administrative Unit Outcome

  • Represent the key functions and services of an administrative unit.
  • Enable administrative units to maintain focus on their role in student experience and success.
  • AUOs should be actionable by the unit.
  • These are separate from goals, initiatives, and strategies and should be something that is identified and continues throughout the program review cycle.

Consider / assess

  • Service
  • Efficiency
  • Compliance
  • Student/staff satisfaction - experience or understanding as a result

Selecting Measures for AUOs

  • Data should not be difficult to collect or access
  • Multiple methods of assessing each AUO:
    • Direct and indirect measures
    • Qualitative and quantitative
  • Data should represent all service points
  • Connect to existing goals such as accreditation, strategic goals, etc.
  • Use / connect to institutional data and resources when possible

Implementation, Planning, Budgeting

If a plan to improve student learning was developed, it should be implemented and reassessed in a new Assessment Study to verify that student learning has, indeed, improved. As has been previously mentioned, assessment is an on-going and cyclical activity. If the results of the previous study were acceptable, the next Assessment Study should focus on a different outcome.

Assessment results and plans for improvement must be integrated into our other institutional plans and processes. Because Cerro Coso Community College exists so that students may learn, there must be a link between the results of Assessment Studies and everything else that we do at Cerro Coso.

Outcome assessment results are thoroughly integrated into program reviews in the section titled "Achievement of Outcomes," where the appropriate CSLOs, PSLOs, and AUOs are reported on, analyzed, and used as the basis of dialogue about change. They are also integrated into the annual planning template, where departments and units report on actions taken and on gaps identified in assessment activities undertaken in the immediately preceding academic year. The Unit Plan is included in the Educational Master Plan, which drives the College's Technology Plan, the Staffing Plan, the Facilities Master Plan, and the College's budget. Some improvements to student learning can be made through instructional practices, but sometimes institutional support is needed, and this process provides it.

Publishing

Results of outcome assessments are published on the college website.

  • PSLO assessments for every instructional program are posted both on the OAC page and in the outcomes section of each program.
  • PSLO results are posted as program reviews are completed.
  • AUO assessments for student services, learning support services, and administrative services are published on the OAC page.
  • Outcome assessment results are also reported out in annual unit plans and in program reviews.
  • ISLOs are published on the college website and in the catalog.

Specialized Assessment Practices

The Assessment Study

Programs wishing to delve deeper into assessment may plan an assessment study. The Assessment Study is the process by which a specific outcome is chosen (not necessarily a CSLO, PSLO, or GELO) and the results analyzed. It is important to understand that only one (1) outcome is assessed in a particular study.

For detailed information on designing an assessment study, refer to Appendix E. It is recommended that the department/unit work with both the Outcomes Assessment Coordinator and the Office of Institutional Research (OIR) if an assessment study is desired.

Appendix A: Instructional Student Learning Outcome Examples

The following are examples of several complete outcome statements, meant to illustrate what could be used in your courses. Remember to use active verbs in the SLO statement.

Academic Programs

CSLO Examples

Upon successful completion of the course, students will be able to:

  1. Apply developmentally appropriate teaching strategies and theories to curriculum and environment design.
  2. Evaluate out-of-school programs and how they meet the developmental needs of children in middle-childhood and adolescence.
  3. Identify the elements of a contract and whether it is governed by the common law or Uniform Commercial Code.
  4. Apply a variety of rhetorical strategies in writing unified, well-organized academic essays with arguable theses and persuasive support, using complex ideas presented in university-level sources.
  5. Apply appropriate inferential analyses to real situations in order to draw conclusions about a population or several populations.
  6. Explain the key theories and concepts related to the forces of evolution, including mutation, natural selection, genetic drift, and gene flow.
Performance Descriptor examples for assessing the SLOs

Below are a few examples that could be used to assess some of the above SLOs. This language needs to be included in the current eLumen system as performance descriptors within the assessment data and within the "Need an assessment planned" document. The performance descriptors are numbered sequentially to match the SLOs above.

  1. Met: The response is thorough and includes all elements. Developmentally appropriate strategies are applied to curriculum and environment design. The response was good or above on all components: Introduction, Curriculum web, and Activities.

    Somewhat Met: The response includes developmentally appropriate strategies applied to curriculum and environment design. The response may be mixed, but at least average on all components: Introduction, Curriculum web, and Activities.

    Not Met: The response was incomplete and did not adequately include developmentally appropriate strategies applied to curriculum and environment design. The response was not at least average on all components: Introduction, Curriculum web, and Activities.

  2. Met: Response clearly evaluates data and connects it to how the program is meeting the developmental needs of the children. The roles of the environment, adults, and activities are thoroughly discussed.

    Somewhat met: Response provides a reasonable connection to how the program is meeting the developmental needs of the children. The roles of the environment, adults, and activities are discussed.

    Not Met: Response does not discuss how the program is or is not meeting the developmental needs of children. There is little or no discussion addressing the roles of the environment, adults, and activities.

  3. Met: Students correctly identified and addressed at least 4 out of 6 elements in the definition of a contract. Students discussed whether the contract was governed by the UCC or common law.

    Somewhat Met: Not used.

    Not Met: Students correctly identified 3 or fewer of the elements in the definition of a contract. Students may or may not have addressed whether the contract was governed by the UCC or common law.

  4. Met: Essay demonstrates use of varied rhetorical strategies in developing and supporting an arguable thesis using complex ideas presented in university-level sources.

    Somewhat Met: Not used.

    Not Met: Essay fails to demonstrate use of varied rhetorical strategies in developing and supporting an arguable thesis using complex ideas presented in university-level sources.

  5. Met: Scored 70% or higher on related exam questions.

    Somewhat Met: Scored between 65% and 69% on related exam questions.

    Not Met: Scored 64% or below on related exam questions.

  6. Met: The response is complete, correctly identifies and defines key terms, includes examples, and links the concepts to evolutionary change.

    Somewhat Met: Not used.

    Not Met: The response may be incomplete, fails to define terms, does not include correct examples, or does not adequately explain how the concepts affect evolution.

PSLO Examples

These examples relate specifically to Program Learning Outcomes. Examples from different programs are provided as samples to help guide you in developing your own PLOs.

Upon successful completion of the program, students will be able to:

  • Develop and display a portfolio of visual art works from a variety of visual art disciplines that reflects a personal direction and individual creativity. The portfolio will be assessed by a rubric.
  • Critique selected aspects of human social and cultural life from an anthropological perspective.
  • Conduct ethical legal research and use other investigative functions to gather relevant information.
  • Define, identify, and analyze literary and dramatic techniques in a variety of works.
  • Safely perform hands-on laboratory and/or field experiments in all science classes.

Each course within a program of study should connect to at least one PLO for assessment, and a PLO should directly relate and connect to at least one SLO in the course. It is not a requirement for all SLOs to connect to a PLO. If an SLO does not connect, consider what needs to be revised and how.

Administrative and Student Services

Student Services SLO Examples:

  • Individuals who participate in activities and events will have an increased sense of engagement. This will be measured by the student satisfaction survey and the percentage of students participating.
  • Optimize technology to provide resources and academic support services across multiple modalities.
  • Improve student success outcomes by providing core services (orientation, placement, counseling, education planning).

Administrative Unit Outcome Examples

  • Provide materials and services that support the college's programs and the interests of students, staff, and faculty.
  • Increase student access to Financial Aid Office technicians for assistance with financial aid and scholarships.
  • Provide materials and programs that support academic programs and the research interests of students, staff, and faculty.
  • Provide useful, clear information to all students.
  • Provide a developmentally appropriate, play-based program for young children that is a model program for students and professionals related to the early education field.
  • Implement strategies to support first-time students completing an education plan prior to or within their first semester.

Appendix B: Examples of Difficult to Measure Verbs

The following are examples of verbs that may be difficult to measure in an area of study. Try to avoid verbs that are abstract; move toward verbs that are active and measurable. When developing your SLO/PLO/AUO language, consider how it will be measured as you write it. What assessment tools will you be using? Also remember that some verbs may work well in one area of study but not as well in another.

  • “Demonstrate competency”, or “Demonstrate an understanding…”
    • These are both abstract and do not describe the measurable action the student will be able to perform at the end of the course/program. Using a measurable action verb gives the student, and potential employers, clear information about the skills and knowledge the student is prepared to use.
  • Appreciate
  • Access
  • Develop
  • Have more confidence
  • Value
  • Recognize
  • Understand

Appendix C: Academic Reflection Template

Student Learning Outcome Reflection Template

Your responses in this reflection go to the Department Chair for discussion of the assessment tool, gaps, strategies, etc. within the department. Reflections can help create a foundation for improving and evolving best practices in teaching and learning. The reflection template covers the entire assessment, so if multiple CSLOs were assessed, consider each CSLO when reflecting.

  1. Describe any challenges that you had with the course or materials this term.
  2. How did the results of student outcomes in this course section compare to previous sections of this course? Did you enact any changes or improvements to your instruction, or to the course material that could account for this change? Could any other changes explain a change in scores? If this is your first term teaching this course, put N/A.
  3. Please discuss the strengths of the assessment (method (tool/assignment), artifact, rubric, timing of assessment in course, etc).
  4. Please discuss the weaknesses of the assessment (method (tool/assignment), artifact, rubric, timing of assessment in course, etc).
  5. Explain what changes, if any, you would make to the course for future offerings.
    1. What actions will be taken to improve student performance? For example, are there particular skills or concepts that might need to be taught with additional attention or using a different strategy? Considering the curriculum, was enough time given to relevant learning goals?
    2. What training, professional development or resources do you feel you need as a result of this reflection (i.e., teaching strategies, equity resources, etc.)?
  6. How many students did not submit the assessment assignment?
  7. Please use this space to share any additional thoughts you feel relate to this particular course, and the assessment process or results.

Appendix D: Institutional Learning Outcomes

The ILOs are available at this link.

Appendix E: The Assessment Study

Programs wishing to delve deeper into assessment may plan an assessment study. The Assessment Study is NOT part of the regular cycle of assessment; rather, it is the process by which a specific outcome is chosen (not necessarily a CSLO, PSLO, or GELO) and the results analyzed. It is important to understand that only one (1) outcome is assessed in a particular study.

This phase occurs over an appropriate period of time to allow data to be collected from a sufficient sample. For the assessment of course student learning outcomes, this is usually 2-3 semesters; for program learning outcomes, it could be 2-3 years. There are three steps to the Assessment Study phase:

  • Design the study
  • Collect the data
  • Analyze the results

Design the Study

In the previous phase, the assessment artifact, scoring method, and possibly the criteria for success will have already been defined. At this point, however, departments or program areas will need to work out the details of how the assessment will be conducted. The following issues/questions should be considered:

  • What outcome will be assessed? Some suggestions...
    • A program's highest impact learning outcome
    • An outcome from a high impact course
    • An outcome that faculty are extremely passionate about
  • Who is on the assessment team?
    • Subjective assessments must have 2 or more assessors to reduce bias.
    • Objective assessments must have someone to tabulate the results.
  • What assessment artifacts or scoring devices need to be developed? For example:
    • Exam questions
    • Surveys
    • Interview questions
    • Likert scales
    • Rubrics
  • What criteria will be used to determine whether the outcome of the study is successful or not (if not previously defined)?
  • How will assessment artifacts be collected from students and archived until a review of those artifacts occurs? For example:
    • Videotaping performances
    • Photocopying written works
    • Photographing visual works
    • Electronically storing digital works
    • Generating a mailing list for surveys, and budgeting for postage and self-addressed, stamped return envelopes
  • What will constitute a sufficient sample?
    • Data should ideally be collected from multiple course sections across terms, sites, delivery modes, and instructors. Depending upon the number of sections offered each semester, a sufficient sample may require data collection over 2-3 semesters. A few courses, however, may provide a sufficient sample over a single semester, due to the number of sections offered and variation of delivery locations, modes, and instructors. ENGL C101 may be an example.
    • Similarly, program learning outcomes should be assessed with a sufficient sample, which may mean several groups of graduating classes over 2-3 years.
  • How will the results be recorded?
    • Both objective and subjective data need to be tabulated and compiled.
  • Who will write the analysis of the findings?

Collect the Data

With thorough planning, the data collection process is straightforward. There are a few points of note, however:

  • The same artifact and scoring method must be used throughout the study. In other words, exams cannot be given to some students and not others. If the assessment artifact is an embedded class assignment, all course sections, regardless of instructor or delivery mode, must include the identical assignment. If multiple assessment artifacts are used, then all must similarly be used consistently.
  • Given that only one outcome is assessed in a particular study, and course-embedded exams usually cover multiple course outcomes, the exam questions that pertain to the particular outcome need to be identified and their results teased out and tabulated separately. Unless the entire exam pertains only to the single outcome, the general exam results cannot be used.
  • Subjective evaluations must be conducted with a team of assessors to reduce bias. All assessors must evaluate all artifacts; in other words, the team cannot divide up the work to get through the process faster (a scoring sketch follows this list).
  • It might be useful to document other data in association with learning outcome results, such as semester term, delivery mode, instructor, and/or campus site. This will be useful for analyzing the results and planning for improvement.
  • Departments might consider incorporating retention/attrition data into the results. We should not merely be interested in the students who persisted to the end of the course, but also in those who dropped out along the way.
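
Because every assessor scores every artifact, it can help to tabulate the combined scores and flag large disagreements for a norming discussion. The following is a minimal sketch; the artifact IDs, rubric scale, and disagreement threshold are all hypothetical.

    # Minimal sketch (hypothetical scores): combining two assessors'
    # 1-4 rubric scores for the same artifacts and flagging large
    # disagreements for a norming discussion.
    rubric_scores = {
        # artifact id: (assessor A score, assessor B score)
        "essay-01": (4, 3),
        "essay-02": (2, 2),
        "essay-03": (1, 3),
    }

    for artifact, (a, b) in rubric_scores.items():
        combined = (a + b) / 2
        flag = "  <- norming needed" if abs(a - b) > 1 else ""
        print(f"{artifact}: combined score {combined:.1f}{flag}")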

Analyze the Results

  • After tabulating the results against the predetermined benchmark of success, it will be clear whether students are achieving the outcome above, at, or below the expected level. If the result is at or above the expected level, congratulations are in order! This implies that there is nothing more department faculty can do to improve the result. However, it may be worthwhile to discuss whether the criterion was set too low; this may be obvious if department faculty can identify practices that could improve the result further.
  • If the result is lower than expected, there should be discussion about why that is the case and what can be done to improve the result. This is where recording other data in association with the outcome data is useful. If on-site courses have a better result than online courses, what can be done to improve student learning in online sections? If results are better for 16-week semester courses than for 8-week summer courses, is there a way to improve the outcome for summer courses? Perhaps the course simply should not be offered during the summer because there is not enough time on task. If one instructor produced better results than others, what is that instructor doing that should be replicated throughout the department?

Please note that this data should not be used to penalize faculty or to point out failures. It should only be used to identify best practices and to implement what works well more consistently. This is a constructive process, and faculty should approach it in that spirit. (This is also a good time to point out that while faculty are asked to discuss student learning outcome assessment as part of the Faculty Evaluation process, this should simply be a discussion of the instructor's involvement in the process. The results of assessment are not included in faculty evaluation.)

Based on a collaborative departmental process, the results should be analyzed and a plan for improvement developed. Be sure to take detailed minutes of all meetings in order to provide evidence of collegial dialogue.

Revised Spring 2021.