Examining the quality of our assessment system

With most of the 59 programs rated for this round, we are beginning an analysis of our system of assessment.

To recap: we drafted a rubric about a year ago and tested it with the Honors College self-study and a made-up Dept. of Rocket Science self-study. We revised the rubric through discussions among staff and some external input. In December we used the rubric to rate reports on 3 of the 4 dimensions (leaving off the action plan in the first round). Based on observations in the December round, the rubric was revised in mid-spring 2010.

We tested the new rubric at a state-wide assessment conference workshop in late April, using a program’s report from December. The group’s ratings agreed pretty well with our staff’s (data previously blogged).

The May-August versions of the rubric are nearly identical, with only some nuance changes based on the May experiences.

The figure below is a study of the ratings of OAI staff on each of the 4 rubric dimensions. It reports the absolute value of the difference between the ratings for each pair of raters — a measure of inter-rater agreement. We conclude that our ratings are in high agreement: 54% of rater pairs agree within 0.5 point (85/156), and 83% agree within 1.0 point. We also observe that the character of the distribution of agreement is similar across all four rubric dimensions.
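The pairwise-difference measure behind the figure can be sketched in a few lines. This is an illustrative reconstruction, not OAI's actual analysis script: the program names, dimension names, rater labels, and scores below are all hypothetical.

```python
# Sketch of the inter-rater agreement computation described above.
# All data here is made up for illustration.
from itertools import combinations

# Each (program, dimension) cell was scored by several raters on the rubric scale.
ratings = {
    ("Program A", "Goals"):    {"rater1": 3.0, "rater2": 3.5, "rater3": 4.0},
    ("Program A", "Measures"): {"rater1": 2.5, "rater2": 2.5},
    ("Program B", "Goals"):    {"rater1": 4.0, "rater2": 3.0},
}

diffs = []
for scores in ratings.values():
    for a, b in combinations(scores.values(), 2):
        diffs.append(abs(a - b))  # absolute difference for each rater pair

# Fraction of rater pairs agreeing within 0.5 and within 1.0 point
within_half = sum(d <= 0.5 for d in diffs) / len(diffs)
within_one = sum(d <= 1.0 for d in diffs) / len(diffs)
print(f"{within_half:.0%} within 0.5 point; {within_one:.0%} within 1.0 point")
```

With real ratings, grouping `diffs` by dimension would reproduce the four per-dimension distributions the figure compares.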

Draft timeline for Assessment Self-study process

This diagram shows how the self-study data feeds in from programs to the template and flows back out from the template to various reporting needs.

This diagram was needed to explain the system to NWCCU, and also to share with Liaisons and Program Points to help them understand that the goal is focus and labor saving.

[We never finished this diagram or communicated it out. Ed.]

Bottlenecks to improving program outcomes assessment

Today we had a discussion about the bottlenecks (and misperceptions) programs seem to have about assessment and an assessment system. The conversation was based on what we've observed in our review of the Dec. 18 self-studies.
It serves as input to the review of the near-final draft of the rubric being prepared for May 17.

This image is a record of the discussion.

As promised, reflection on the retreat, finally

Re: draft note to staff
I was thinking last night that the persistent concern expressed in our still-recent retreat was less about job security and more about qualifications in our protean institution.  If I should no longer be doing what I've done in the past, what skills do I need to develop to do what I am now expected to do?  And how long, one anonymous note asked, before we change yet again?

In my own thinking, I am coming to understand that it is more about the affect than the skill set, though they are not fully separable.  As the CTLT, our work was like teaching an elective course in the major.  Most of our interactions were with faculty who wanted to work with us.  Now we fear our new charge requires us not only to work with those who share fewer of our concerns and values, but to drift increasingly toward those who clearly do not.  We are now like those who teach required courses.  And the required course we teach is not one that many find rewarding.

Our challenge as the CTLT was encouraging participation in our efforts commensurate with the institution's investment in us.  As the OAI, our challenge is, with some irony, helping the institution achieve measurable outcomes.  In the CTLT we worked with colleagues who mostly were, or became, friends.  As the OAI, we must strive to work collegially.

The challenge is ultimately less about our skills.  We are amply qualified.  It is a challenge to our dispositions.  To a person this is not the work we signed on for.  It has a “feel” about it like a garment not cut quite right.  It is frequently somehow oddly irritating.

“Do we still work with the individual faculty we have worked with?”   No, is the answer.  If the work cannot be parlayed to the program level, then the job requires us to extricate ourselves from that work …and, at least in the work context, away from that friendly working relationship.

“Will we change our organization again?” Probably.  The garment may not fit quite right, but the unit will grow into it.

As the CTLT, however, we were unique and recognized internationally for our transformative focus.  Some aspects of that focus were occasionally a source of contention, but it reflected a shared understanding of our mission.  We were guided by the belief that we not only can but must do a better job in our institutions of learning at improving the student learning experience and student learning…outcomes.

So here we are.  Be careful what you wish for, the saying goes. Our new vision, rubric, and the harvesting of community input are being followed nationally. We have a tremendous opportunity to promote real, deep, and meaningful change at WSU.

If we can change.

Gary
Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/

Wiki Page for the Self-Study Review Process

From: Joshua Yeidel
Date: Thu, 25 Feb 2010 11:40:52 -0800
To: OAI messages
Subject: Self-Study Review Process Wiki Page

OAI folks,
In the course of working through the December Self-Study cycle, we have
developed a lot of informal knowledge about the process of handling
Self-Study Reviews.  Now it is time to capture and preserve that knowledge,
so that we can apply it to the May cycle.

I have started a wiki page with an outline of the process (the steps
identified in the Process Actions list “Next Actions” field).  Please add
what you know about the process steps to this page.

http://wiki.wsu.edu/ctltwiki/Assessment_of_Assessment_Self-Study_Review_Process


— Joshua

Planning program highlights for sharing

This was a discussion of creating short highlights/summaries (aka "Baseball Cards") for programs with well-developed features in their assessment plans, based on the Dec 18 self-study.

Re-design of the workflow and web store following Dec 09 cycle

As the work of assessment and feedback on the Dec 09 self-studies comes to a close, the team met to review the (admittedly) ad hoc tools assembled in January. These boards reflect that discussion. Implementation is intended before the May 17 deadline.

Implementing a workflow for Dec 18 Program Review

Below are Joshua’s notes for folks interacting with his new SharePoint mechanisms for implementing a workflow for our Dec 18 review process.

The attached image is from Thurs the 14th (last week), when we diagrammed out the steps. Joshua implemented a trial mechanism to capture the remaining Dec 18 work as a test run for the May 17 work we are anticipating. Ashley has started populating that structure on Assessment.wsu.edu.

In the figure, there is an agenda associated with the meeting; to the right of it, in blue and red in Nils' hand, is a set of steps flowing down and to the right, with an associated numbered list of steps in red.

—— Forwarded Message
From: Joshua Yeidel
Date: Tue, 19 Jan 2010 14:17:04 -0800

—————–DRAFT—————————–
Here are some tips for making successful use of the internal workspace for the 2009 December Planning document reviews:

There are two components:  a “Process Documents” library for non-public documents related to program reviews, and a “Process Actions” list which identifies what the next action is for each program’s review, and to whom the action is assigned.  Both components can be accessed via links in the left-hand navigation bar.

When a new review comes in, create a "New Folder" with the program's name in the "Process Documents" library if it doesn't already have one.  This folder will be used for any internal (non-public) documents related to the review (e.g., draft responses).  Public documents (such as the original submission) go in the "2009-2010" site on UniversityPortfolio.

ALSO create a new tracking item with the program name in the "Process Actions" list.  Set the "Next Action" and "Assigned To" fields appropriately; this will cause an email to be sent to the new assignee.  To identify the "Assigned To" person, enter the person's WSU network ID or "lastname,firstname", then click the checkmark icon at the right end of the text box to "Check Names".  The server will replace the ID or name with an underlined name, meaning a match for that entry has been found.  If no match is found (usually due to a typo) and you are using Internet Explorer, you can right-click on the unmatched entry for further search options.

Each time an action is completed, update the Process Actions item via the “Edit Item” button or drop-down menu item.  Change the “Next Action”, “Assigned To” and “Instructions” fields as needed.   This will cause an email to the assignee in which those fields are listed.

If you don’t know what the next action for a review should be, you can select “Figure out next action” at the bottom of the Next Actions pick-list, and assign it to a director.

You may assign a review to more than one person (for example, in cases where raters need to reconcile divergent ratings), but "Next Action" can have only one value at a time.

Each "Process Actions" item includes a "Log Notes" field, which is cumulative.  When you make an entry, it is displayed at the top of the list of previous "Log Notes" entries for that item.  Use this for any information that is important to track (e.g., "rating cannot be completed until Joshua returns on 1/25/2010").  There is no need for log notes of routine information.
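The tracking item described above can be modeled in a few lines. SharePoint stores and notifies for us, so the class below is purely a hypothetical sketch of the fields and their behavior: the program name, action names, and rater labels are all made up.

```python
# Illustrative model of a "Process Actions" tracking item as described
# in the email. Field names mirror the SharePoint list; all data is made up.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ProcessAction:
    program: str
    next_action: str                 # exactly one value at a time
    assigned_to: List[str]           # may list more than one person
    log_notes: List[str] = field(default_factory=list)

    def update(self, next_action: str, assigned_to: List[str]) -> None:
        # In SharePoint, changing these fields sends an email to the assignee.
        self.next_action = next_action
        self.assigned_to = assigned_to

    def add_log_note(self, note: str) -> None:
        # "Log Notes" is cumulative; the newest entry displays at the top.
        self.log_notes.insert(0, note)


item = ProcessAction("Rocket Science", "Rate report", ["rater1"])
item.add_log_note("received 12/18")
# Divergent ratings: assign both raters, but keep a single Next Action.
item.update("Reconcile ratings", ["rater1", "rater2"])
item.add_log_note("ratings diverge; meeting scheduled")
print(item.log_notes[0])  # newest note first
```

The single-valued `next_action` alongside a multi-valued `assigned_to` captures the constraint in the email: several people can hold a review, but the review is always in exactly one state.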

Please let me know if you have successes or if you experience any difficulty.  This is a pilot of a tracking method we may use for the May cycle, so please surface issues so they can (hopefully) be addressed.

— Joshua

—— End of Forwarded Message

Notes From Design Circle Meeting

FW: Notes From Design Circle Meeting

Today we did some post-review reflection. Ashley led an activity that had the leadership and Designers doing a jigsaw read of a selection of the responses we have drafted to the Dec 18 planning reports.  The goal of the activity was to surface the kinds of resources that would be helpful for programs — things that stand alone, or work with mediation, to help programs understand and respond to the feedback.  The discussion produced a whiteboard of possible materials: things written by others with a long shelf life, things that OAI might develop (and evolve), WSU-centric annotated samples, and other thinking, captured in these documents.

Also worth noting is this thinking about how an online community might be created that would expand the interest in, and ability to create, the documents outlined today.

Attached document: Notes from Design Circle Whiteboard

Storing drafts of December Planning documents

We are starting to collect documents for review:
https://universityportfolio.wsu.edu/2009-2010/December Planning/Forms/AllItems.aspx

It has me thinking about how we track the portfolio of our own work.

I just read a draft for a program. I uploaded the document as submitted to me, then used Word's commenting features to make a new version that I sent back to the program and also uploaded for posterity.

I noticed that Joshua's work with the folders and columns in this document library lets us set the 'Review Status' of individual documents. This is great because I can see how we might end up with several documents in one folder of which only one should be considered for final review. That document would be individually marked 'Ready' for review.
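The "Review Status" convention above amounts to a simple filter over a folder's documents. A minimal sketch, with hypothetical file names and statuses (SharePoint applies this via column values; the code only illustrates the idea):

```python
# Sketch of the "Review Status" convention: several versions sit in one
# program folder, and only the one marked "Ready" goes to final review.
# File names and statuses below are made up.
folder = [
    {"name": "rocket_science_submitted.docx", "review_status": "Archived"},
    {"name": "rocket_science_commented.docx", "review_status": "Draft"},
    {"name": "rocket_science_final.docx",     "review_status": "Ready"},
]

ready = [doc for doc in folder if doc["review_status"] == "Ready"]
assert len(ready) == 1, "exactly one document per folder should be marked Ready"
print(ready[0]["name"])
```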