On Transposing Assessment Scales and Mapping to WSU 6

Re: mapping to WSU 6
An inquiring program point of contact wants to know:

A score of 4 is entry-level competency, and we will transpose that scale for all programs. It is absolute, not weighted by us. (Entry-level written communication for your program may not mean the same thing for another, so our work, not yours, may get a bit messy.) We will adjust scores according to the scales programs use: your four-point scale with an entry-level score of 2(?) becomes a four on the institutional, aggregate report.

Performance doesn’t equate to year in school, either, and we need to be cognizant of ceiling effects. In other words, there is no reason a first-year student might not score at a 5 or 6, and, as we have experienced in one program, graduate students assessed anonymously did not perform better than upper-division students; in fact, on average they did not perform at entry-level competency. It’s that absolute scale again. We have learned, too, that nothing constrains student performance more clearly than low expectations. (As George Kuh noted, we sometimes put the bar so low that students trip over it.)

The goal for graduating seniors (ideally assessed in capstone courses or their equivalent) is anchored at four, and other levels of performance radiate from there. We want to be able to report that “All WSU graduates are held accountable to levels of performance in their programs on standards that are affirmed by professionals and WSU faculty working in collaboration.” That’s our first line.

“All WSU programs are responsive to those standards and make curricular and pedagogical changes in that context” is our second. We want to hold the actual percentages of students performing at competence in abeyance as long as we can. As a qualification, there are innumerable studies in multiple modes confirming that graduates from institutions of higher education are NOT being adequately prepared, so the scores in the histogram below are not unusual or, all things considered, even remotely disappointing. In fact, the program’s work that yielded those scores is exemplary and, not incidentally, sustainable. In other words, the gold standard for now and for the foreseeable future, for WSU and for NWCC&U and for the HEC Board, is not about showing off results, but about showing off the commitment to responsible assessment.

Figure 2:  Disciplinary Scores Reallocated to the WSU 6 Goals of the Baccalaureate*

*The blue arrow indicates anchored entry-level competency.
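The note above describes transposing each program’s rubric scale so that its entry-level score reports as a 4 on the institutional six-point scale. Here is a minimal sketch of that transposition, assuming a simple additive shift anchored at entry-level competency; the function name and the adjustment shown are illustrative, and the reallocation OAI actually performs may be more involved.

def transpose_to_wsu6(program_score, program_entry_level,
                      institutional_entry_level=4, institutional_max=6):
    """Shift a program rubric score onto the institutional six-point scale so
    that the program's entry-level score reports as the WSU anchor of 4.

    Illustrative only: a simple additive shift is assumed here; the adjustment
    OAI actually applies when reallocating program scores may differ."""
    shifted = program_score + (institutional_entry_level - program_entry_level)
    if shifted > institutional_max:
        # Flag ceiling effects rather than silently clipping high performers.
        raise ValueError(f"{program_score} would exceed the institutional ceiling of {institutional_max}")
    return shifted

# A program using a four-point scale with entry-level competency at 2:
print(transpose_to_wsu6(2, program_entry_level=2))  # entry level reports as 4
print(transpose_to_wsu6(4, program_entry_level=2))  # top of the program scale reports as 6

On this reading, a student at the top of a program’s own scale lands at the institutional ceiling, which is one reason the note cautions against ceiling effects.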


Assessment design is messy: challenges to coordinating flexible and useful assessment for a college and its programs

Planning, designing, and implementing assessment is a messy process, as are most authentic, responsive endeavors that involve and engage many people in a variety of roles and from different vantage points. Different people and programs have different concerns and genuine questions.

What’s the relationship between college and program assessment? Student learning goals identified by a college and by its programs are at the heart of assessing student learning and planning useful, coordinated assessment activities. There are many challenges to designing a coordinated yet flexible assessment plan that balances multiple considerations.

OAI is addressing this question now with one WSU college, Spring 2010:

From an OAI contact to a program head:

I’ll briefly summarize a) assessment efforts at WSU, b) what I understand of assessment efforts in your college, c) what I understand is on the table now for next steps, and d) my role as your OAI contact, as well as OAI’s role.

Assessment efforts at WSU: college-level and program-level

The current efforts for systematic assessment at WSU include two different approaches.

In some colleges, efforts are starting primarily at the program level, including developing an assessment plan, student learning goals, rubrics/tools, measures, and processes. This approach can create assessment that’s immediately meaningful and useful to a program, but it also brings significant challenges in terms of “rolling up” results into something coherent at the college level. With piecemeal efforts and dozens of unaligned rubrics and measures, a great deal of work is needed to make the results useful at the college (and institutional) level.

In other colleges, assessment efforts have been coordinated at the school or college level. This approach identifies common student learning goals across programs and seeks to develop common rubrics/tools, measures, and processes for assessment. It can provide useful assessment results at the college level, with aligned measures (such as course evaluations, senior exit surveys, alumni surveys, etc.) providing complementary data. It also brings the opposite challenge: how to build in flexibility and give adequate agency at the program level, so that programs can identify and gather assessment data that interests and informs them and guides ongoing improvement in learning and teaching.

Summary of your college’s assessment efforts to date (as I understand them, primarily those involving OAI/CTLT)

For several years … the college has invested time and effort in developing, piloting, and refining a college-wide rubric, with input from faculty in the various areas and in collaboration with OAI/CTLT. The assessment team, in consultation with OAI/CTLT, adapted WSU’s Critical and Integrated Thinking Rubric and revised it with input from various faculty. Over two semesters, the rubric has been piloted and refined by assessing student work from five different courses in the college.

The assessment team has also developed and piloted a senior exit survey which aligns with the rubric and has started drafting an alumni survey that is similarly aligned.

Last semester, the Dean asked two faculty members to draft student learning goals for the college, in alignment with WSU’s Six Learning Goals of the Baccalaureate. This was done independently of the assessment team’s work.

In February 2010, OAI helped map the college rubric to these new learning goals and WSU’s Big 6, including adding another dimension for specialty (to be developed by each program).

Next steps

In March 2010, faculty in one program brought up questions about the purpose of assessment in their area, the rubric, and the college’s new student learning goals and performance criteria created last semester. Their questions raise important issues about how to balance program agency and assessment needs with college-level needs, including:

  • What is the relationship between the new student learning goals and performance criteria, on the one hand, and the rubric developed over the past two years and its complementary measures and data, on the other?
  • In the future, how does the college want to balance college level assessment and program level assessment for a coherent and useful overall assessment effort?

These questions and their answers represent an important conversation among key players in the college: faculty and leadership.

From an assessment standpoint, there is no fixed path ahead.  Rather, those involved should identify goals, needs, constraints, and priorities. The college should probably outline a coordinated, yet flexible assessment plan that balances those considerations; it’s possible to try different approaches in different programs, to decide to focus on shared goals, or to do something in between.

As a note, some programs at WSU are in the process of developing staggered assessment plans, in which they assess only some goals each year.
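As one way to picture such a staggered plan, here is a small sketch that rotates through a set of goals, assessing only a few each year; the goal labels and the two-goals-per-year pace are placeholders, not a prescribed WSU schedule.

from itertools import cycle

# Placeholder goal labels; the two-goals-per-year pace is one possible
# staggering, not a prescribed WSU schedule.
GOALS = [f"Goal {i}" for i in range(1, 7)]

def staggered_plan(goals, per_year, start_year):
    """Rotate through all goals, assessing only `per_year` of them each year."""
    pool = cycle(goals)
    years_needed = -(-len(goals) // per_year)  # ceiling division
    return {start_year + offset: [next(pool) for _ in range(per_year)]
            for offset in range(years_needed)}

for year, assessed in staggered_plan(GOALS, per_year=2, start_year=2010).items():
    print(year, assessed)
# 2010 ['Goal 1', 'Goal 2']
# 2011 ['Goal 3', 'Goal 4']
# 2012 ['Goal 5', 'Goal 6']

The point of the staggering is simply that every goal gets assessed over the cycle without any single year carrying the full assessment load.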

OAI’s role

OAI has been working with this college’s assessment team for two years to develop, pilot, and refine a college-wide rubric and other complementary measures, including a senior exit survey and alumni survey.  OAI is also available to work with programs on their assessment plan, tools, and measures.

However, in the interest of making the best use of our collective time and efforts, before beginning to develop any program’s assessment plan, I suggest that the college consider the questions raised above.

If invited, OAI is available to consult with the college about different paths you might choose to take for useful, quality assessment.