Course Assessment
Purpose: Just as the course is a foundational unit of the College, the assessment of courses, whether general education, program core, or workforce (non-credit), is at the center of the Mission of Community College of Philadelphia. Course assessment is one of the primary ways that we can know and show that we do what we say we do, which is to “provide a coherent foundation for college transfer, employment and lifelong learning” (Mission Statement).
- Course assessment is primarily focused on the assessment of course learning outcomes, although it may overlap with equity assessment and project assessment.
- Much of course assessment has a close relationship with program assessment, a relationship demonstrated by the process of curriculum mapping.
- The curriculum map typically includes all the program learning outcomes (PLOs) on one axis and the courses in the program on the other to show how the sequence of courses helps students achieve the program learning outcomes.
- The letter “A” is used to designate the courses in which each PLO is assessed (a hypothetical sketch follows this list).
- Typically, only courses specifically designed for the program (e.g., FYE courses, introduction to the discipline courses, 200-level courses) are part of program assessment; however, some programs choose to include general education courses as they contribute to students’ achieving the program learning outcomes.
- Most general education courses (e.g., ENGL 101, FNMT 118, CIS 103, BIOL 106) are assessed at the course level and as part of the general education assessment process.
- Developmental education courses (e.g., ENGL 097, FNMT 017) and English as a Second Language courses (e.g., ENGL 071, ENGL 081, ENGL 091) do not count towards program completion (graduation) and are also assessed at the course level. They may be tied into programmatic goals such as persistence and pass rates in credit-bearing courses.
- Course assessment is mostly accomplished by 1) defining specific course learning outcomes, 2) defining the methods of assessing each outcome, 3) aligning student artifacts with those methods, 4) collecting specific data related to those artifacts (e.g., writing assignments, presentations, clinical evaluations, embedded test questions), 5) analyzing and documenting the results of the data, 6) driving the data into different forms of continuous improvement (e.g., scaffolding, hands-on projects, curriculum change), and 7) re-assessing and documenting the results of the data-driven improvement strategies.
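As a minimal, hypothetical illustration of a curriculum map, the sketch below uses invented courses (XYZ 101, XYZ 150, XYZ 201) and invented PLO wording, and marks with “A” where each PLO is assessed; real maps are typically maintained as tables in a program’s assessment documents rather than in code.

```python
# Hypothetical curriculum map: program learning outcomes (PLOs) on one axis,
# courses in the program on the other. "A" marks the course(s) in which each
# PLO is assessed; a blank cell means the PLO is not assessed in that course.
# All course numbers and PLO wording below are invented.
curriculum_map = {
    "PLO 1: Communicate effectively in the discipline": {"XYZ 101": "", "XYZ 150": "", "XYZ 201": "A"},
    "PLO 2: Apply core methods to solve problems":      {"XYZ 101": "", "XYZ 150": "A", "XYZ 201": ""},
}

# Report where each PLO is assessed.
for plo, courses in curriculum_map.items():
    assessed_in = [course for course, marker in courses.items() if marker == "A"]
    print(f"{plo} -> assessed in: {', '.join(assessed_in)}")
```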
Course Assessment Planning and Curriculum Development
- Assessment is central to the curriculum development process.
- The curriculum development process includes the creation/revision of CLOs, the identification of methods of assessment for specific CLOs, and the creation of sample assignments and rubrics that demonstrate how students will achieve the CLOs.
- CLOs and methods of assessment appear in every course document.
- Other key sections of the course document (sequence of topics, course activities, sample assignments) must be in alignment with the CLOs.
- Best Practices in Course Learning Outcomes Development:
- CLOs should be clear, descriptive, and student-friendly. Students will see these outcomes in their syllabi and should be able to discern from them what to expect from the learning experience.
- While each department determines the number of outcomes for each course or program, a good rule of thumb is to have no more than five or six course learning outcomes to facilitate assessment scheduling.
- CLOs should use observable action verbs like “describe” or “explain” rather than broad, hard-to-measure phrases like “demonstrate knowledge” or “understand.” Review Bloom’s Taxonomy action verbs for ideas.
- CLOs should reflect the level of learning that they describe. If the course reflects foundational learning, then verbs such as “define” or “identify” may be more appropriate. Courses that reflect a higher level of learning (e.g., build upon skills and knowledge that students have already acquired from previous courses) should use verbs that demonstrate more integrated higher-order thinking, such as “analyze” or “examine.”
- CLOs should focus on what we can assess, measure, or track. “Prepare to take the XYZ certification exam” is fine, but “pass the XYZ certification exam and transfer to a four-year degree program” is more difficult to assess.
- CLOs should be distinct from one another. It is difficult to assess outcomes that are layered or too similar.
- CLOs should be streamlined wherever possible. More is not better. The number should be what the program/department can reasonably expect to assess every semester or year.
- CLOs should reflect our world today. They should use respectful, inclusive, accessible language.
Equity in Course Assessment
- Equity assessment at the course level (usually of multiple sections over time) should be in alignment with already-defined divisional or departmental DEI goals.
- Equity assessment at the course level may involve several approaches:
- Using data disaggregated by race and gender to discover and address equity gaps
- Making a commitment to anti-bias training for all faculty and staff
- Having an unflinching focus on racial equity (e.g., NOT asking “shouldn’t we be looking at poverty instead?”)
- Listening to and acting on student feedback
- Not having a deficit-minded approach that blames students for inequitable outcomes
- Acting upon the conviction that success is a combination of student engagement and creating an inclusive student learning experience
- Things to look for when assessing with an equity lens:
- When you looked at your disaggregated equity measures, what gaps did you see?
- Did students achieve the outcomes (e.g., graduation, course pass rates, licensure exam pass rates, transfer) at similar rates?
- For example, 45% of the students enrolled in your program identify as Black females, but only 15% of the students who transfer within one year of completing the degree identify as Black females. (A minimal sketch of this kind of disaggregated comparison follows this list.)
- Is there a particular student population that is more likely to leave your program after a certain number of credits? What factors might account for this?
- When you looked at your disaggregated equity measures, what successes did you see?
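The following is a minimal sketch of the disaggregated comparison described above, assuming completion and transfer records are available with a demographic label attached; the group labels, field names, and records are invented for illustration.

```python
from collections import defaultdict

# Hypothetical records: each entry is one program completer, with an invented
# demographic label and whether the student transferred within one year.
completers = [
    {"group": "Black female", "transferred": False},
    {"group": "Black female", "transferred": True},
    {"group": "White male",   "transferred": True},
    {"group": "White male",   "transferred": True},
]

totals = defaultdict(int)
transfers = defaultdict(int)
for record in completers:
    totals[record["group"]] += 1
    if record["transferred"]:
        transfers[record["group"]] += 1

# Compare each group's transfer rate; large differences between groups point to equity gaps.
for group in totals:
    rate = transfers[group] / totals[group]
    print(f"{group}: {rate:.0%} transferred within one year (n={totals[group]})")
```

In practice, the full disaggregated data set would come from institutional data sources rather than being entered by hand.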
Course Assessment by Academic Division
- Roles involved: Assistant Deans, Department Heads, Program Coordinators, DCAF, OAE
Workforce Course Assessment
- The assessment of non-credit courses largely follows the same process as the assessment of courses that are taught for credit.
- The primary differences are:
- Non-credit courses will have different considerations in the development and articulation of course learning outcomes, especially when planning for eventual workforce-to-credit pipelines.
- Non-credit courses may or may not fit together into a program or overall curriculum.
- Non-credit course learning outcomes, like the CLOs developed for credit-bearing courses, should focus on measuring what students learn within the course, rather than external or post-hoc metrics like exam scores or employment outcomes.
- As in credit-bearing course assessment, equity should be taken into consideration. Assessment data should be disaggregated and critically examined for signs of inequitable student outcomes, and continuous improvement plans should include specifically articulated strategies to create a diverse, equitable, and inclusive classroom climate for all students enrolled in non-credit bearing courses.
Course Assessment Timeline
The following steps of course assessment can be used in conjunction with a formal documentation and review process (e.g., 335 submissions; program-, department- or division-coordinated regular reporting), or more informally in the ongoing process of personal and professional development. While formal documentation may not be required in every instance, it will likely benefit both you and your colleagues in the future if you keep notes on all three parts of the timeline in your own records.
DATA COLLECTION:
First steps: CLOs and Measures
- Before data collection occurs, course learning outcomes must be defined and measures for assessing those CLOs selected. For an existing course, the CLOs are already set in the course document, so the main planning task is choosing or aligning the measures.
- Your department may have agreed to use specific measures in common (e.g., a common final exam) or you may need to align your assignments (measures) to the CLOs using a rubric or other tool.
- TIP: Most course documents include the methods of assessment (measures) in the document next to the CLOs.
- In designing and aligning assignments (measures), consider the planned analysis and use of the resulting information, i.e.:
- Whether the data collected will fit well with data collected in past instances of the course (for trend analysis), or
- Whether you want to do complex statistical analysis, in which case you will need an instrument that collects data as numbers (i.e., values you can add, subtract, multiply, and divide, not categories or qualitative descriptions). A hypothetical example of a numerically scored rubric follows this list.
- Consider the time and resources available for assessment activities when planning.
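For example, a rubric that scores an assignment against each CLO on a numeric scale yields data that can be summarized now and compared across semesters. The sketch below is hypothetical only; the CLO wording, the 0-4 scale, and the scores are invented.

```python
from statistics import mean

# Hypothetical rubric results for one assignment: each row holds one student's
# scores (0-4 scale) on criteria aligned to two invented CLOs.
rubric_scores = [
    {"CLO 1: describe key concepts": 3, "CLO 2: analyze a case study": 2},
    {"CLO 1: describe key concepts": 4, "CLO 2: analyze a case study": 3},
    {"CLO 1: describe key concepts": 2, "CLO 2: analyze a case study": 2},
]

# Because the scores are numeric, they can be averaged and tracked over time.
for clo in rubric_scores[0]:
    scores = [row[clo] for row in rubric_scores]
    print(f"{clo}: mean score {mean(scores):.2f} (n={len(scores)})")
```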
Data Collection Frequency:
The frequency of data collection depends on how often the course runs and on program-, department-, and division-specific assessment schedules.
- Some amount of course assessment data is being collected any time courses are being taught, but the wide range of ways that data are recorded, documented, analyzed, summarized and reported on means that there is variation in the frequency of data collection as well.
- There are many ways to collect course assessment data; the following is a non-exhaustive description of how data collection may vary based on one’s role and the scope of the course assessment being undertaken. For more details, consult with your division’s DCAF.
- For program or course coordinators assessing large multi-section courses:
- Data collection can happen via many different instruments, some of which lend themselves to relatively simple collection and analysis, others of which are more complex.
- If it is simple: take advantage of automatic data collection and technology to get a complete set of data (e.g., using assessment software or other aggregators)
- If it is more complex: you may need to get into sampling:
- Sampling reduces the number of data points to analyze, which can simplify analysis; however, drawing a representative sample takes time and reduces the accuracy of the results, so the need to save time must be balanced against the need for accuracy. (A minimal sampling sketch follows this list.)
- For more information about sampling, see “Data Sampling for Assessment” in the Appendix or contact assessment@ccp.edu
- For any faculty members assessing only their own section(s) of large multi-section courses or assessing small- to medium-sized courses (wherein all sections can be taught by one or two faculty members):
- You can use the full set of data that you have available or, if it’s too large, a sample as discussed above
- In general, if you are assessing a sufficiently large group of students, it is preferable to keep data associated with student J-numbers so that the data can later be disaggregated by demographic measures for assessing equity.
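Below is a minimal sketch of simple random sampling, assuming artifacts are keyed by student J-number so the scored sample can later be disaggregated for equity analysis; the identifiers and sample size are invented, and the “Data Sampling for Assessment” appendix remains the authoritative guidance on choosing a representative sample.

```python
import random

# Hypothetical pool of submitted artifacts keyed by (invented) student J-numbers.
artifact_ids = [f"J{n:07d}" for n in range(1, 201)]  # 200 artifacts in total

# Draw a simple random sample; the sample size is a judgment call that balances
# time saved against the precision lost by not scoring every artifact.
random.seed(2024)  # fixed seed so the draw can be reproduced and documented
sample = random.sample(artifact_ids, k=50)

print(f"Scoring {len(sample)} of {len(artifact_ids)} artifacts, starting with {sample[:3]}")
```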
DATA ANALYSIS:
Once you’ve collected the data, you may analyze it by semester, year, multiple semesters, or some other unit that makes sense, depending on your goals.
- One method is analysis of rolling trends over a multi-semester period (e.g., analyze results from Fall 2022, Spring 2023, and Fall 2023 during Spring 2024)
- Another method is analysis of a specific semester’s data to track the success of data-driven continuous improvement strategies or the impact of external factors
- Department meetings, Professional Development Week, etc. are opportunities for reviewing data and making meaning
- Descriptive statistics: used to describe a whole data set with just a few numbers, essentially a summary of a big group of numbers:
- N: the number of data points you have
- Min: the smallest number in the data set
- Max: the largest number in the data set
- Median: if all numbers in a data set are arranged in numerical order, the number that falls exactly in the middle
- Mean: the sum of all numbers in the data set divided by the number of data points. Often this is what is meant by “average.”
- Mode: the number or numbers that occur(s) most often in a data set
- Standard deviation: a way to describe how far data points in a set tend to be from each other; a larger standard deviation indicates a very spread-out data set; a smaller standard deviation indicates a more tightly clustered set.
- Equity analysis: re-summarize the data by splitting it into groups along the lines of one or more demographic measures and comparing the descriptive statistics of the groups with each other
- Gap analysis: compare assessment results with previously established benchmarks to focus the later discussion of data driving (below). Any areas where the data fall below the benchmark should be addressed, and areas that fall only slightly above the benchmark should also be noted and potentially included in interventions so that downward trends can be reversed early. (A minimal sketch combining these analyses follows this list.)
- Best practices and options, sensitive to scale and role:
- For program or course coordinators assessing large multi-section courses:
- Your method depends heavily on what you’re analyzing for
- Trends: look at rolling averages over several different instances
- Specific interventions: do pre/post comparisons; compare results from after implementation to the results that inspired the intervention and look for improvement
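The sketch below combines descriptive statistics, equity analysis, and gap analysis in a minimal, hypothetical form, assuming scores are already recorded as numbers and tagged with a demographic label; the scores, group labels, and benchmark are invented for illustration.

```python
from collections import defaultdict
from statistics import mean, median, stdev

# Hypothetical assessment scores (percent correct) tagged with an invented group label.
records = [
    ("Group A", 82), ("Group A", 74), ("Group A", 91),
    ("Group B", 65), ("Group B", 70), ("Group B", 88),
]
benchmark = 75  # previously established benchmark used for gap analysis (invented)

# Descriptive statistics for the whole data set.
scores = [score for _, score in records]
print(f"N={len(scores)}, min={min(scores)}, max={max(scores)}, "
      f"median={median(scores)}, mean={mean(scores):.1f}, stdev={stdev(scores):.1f}")

# Equity analysis: re-summarize by group, then compare groups with each other
# and against the benchmark (gap analysis).
by_group = defaultdict(list)
for group, score in records:
    by_group[group].append(score)
for group, group_scores in sorted(by_group.items()):
    gap = mean(group_scores) - benchmark
    status = "above" if gap >= 0 else "below"
    print(f"{group}: mean={mean(group_scores):.1f} ({status} benchmark by {abs(gap):.1f} points)")
```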
DATA DRIVING:
Every semester, year, or other unit of time (as needed)
- Decide on and implement recommendations, then reassess, and document results from the previous semester, grouping of semesters, or year. Examples of action items that can come from assessment data include:
- New teaching strategies (e.g., updating course topics, more scaffolding)
- Curriculum changes (e.g., revising CLOs, developing a capstone course)
- Professional development (e.g., FCTL workshops, anti-racist training)
- Collaboration with other areas of the College (e.g., working with another department or unit that helps students with foundational learning and general education skills related to this CLO, such as Foundational Mathematics, English, Social Sciences, or Chemistry)
- Changes to the textbook or classroom resources being used
- Discussions about data driving should take place in close consultation with your deans and department heads, as implementing the changes may require additional resources.
- The timing of reassessment is dependent upon whether and when courses are offered and run.
- Spotlight: Improving the Student Experience
- The primary setting that shapes the student experience at the College is the classroom, and the primary point of contact that most students have with the College is with the faculty members whose courses they take.
- Using assessment data as a tool to regularly examine the outcomes of the experiences that you design for students within your classroom, and subsequently improving those classroom experiences, will have a direct bearing on the overall Student Experience at Community College of Philadelphia.
- Designing and analyzing assessments that are sensitive enough to reveal areas for improvement is critical to understanding the effects of classroom interventions.
- If you get stuck in trying to determine what interventions will help address specific challenges, try involving students in your brainstorming; they can often bring a new perspective and help to draw your attention to factors that you may otherwise overlook.
- Resources: Small Teaching (2016) by James Lang; Small Teaching Online (2019) by Flower Darby with James Lang