Suggested Script for Cold Calls to Program Points

Cold Calls to WSU Programs

Who you are
·        This begins the process of actively contacting all programs to solicit their participation in the Dec 18 self-study reporting. Prior to this point, contact was indirect, via the College-level program liaisons.

My name is _______________ with the Office of Assessment and Innovation.  As your College Assessment Liaison probably told you, accreditation and accountability expectations for higher education have been changing, and the accreditation process for us will now require updates every two years with a report on learning outcomes progress due next fall, 2010.

What your role is
·       I am the designated OAI contact selected to work with you and your program.  As you may know, the Office of Assessment and Innovation was established to meet new assessment requirements, and to do so in ways that keep the focus on improving the student learning experience.  Our charge is to help WSU programs develop assessment strategies, where they are not already in place, that meet the new requirements while leveraging faculty expertise and remaining meaningful and useful for WSU faculty.

What you understand their role to be
·       As you may know, in order to meet the fall report deadline, some level of assessment needs to be conducted next spring so that the results can be used to guide improvements.  Using evidence to ‘close the loop’ is the common denominator of the assessment requirements coming from all of our stakeholders, including the NWCC&U, professional accreditors, OFM, the legislature, and the HEC Board, all of which are now pressing us for accountability.  I’m calling because I understand you are the point person for your program, and I want to make sure you are aware that there is a December 18th deadline for sharing your plans.

The Process Overview
·       After you have shared your planning document in the template provided to you by your college assessment liaison, available from me, or accessible online at ( ), your plan will be reviewed and feedback provided. That review and feedback will be based on criteria available in the ‘Guide to Assessing Assessments [A of A].’  The Guide is being developed to help clarify the principles of assessment that our accreditors expect, and of course it might be a useful resource as you prepare your plan.
·       The same systematic process (our charge) will continue as WSU’s and your program’s assessment work proceeds, and ideally you and your team will join us in refining the criteria and process, to help make the activity as useful as we can for improving the WSU student experience.
·       To be clear, this process at this point is intended to be formative assessment that we will use so that we can do more than comply with accreditation: we want to help shape it while the opportunity is still open to us and to our accreditors.  We hope you may also help us identify colleagues in your field who might find this exercise useful, and help ensure that the review of your program is conducted by experts you know and trust.
What we can do to help
·       It may be that what this activity entails is overwhelming or confusing right now.  The OAI was established to help you navigate this challenge and to do it in a way that is effectively integrated into your everyday teaching practice.  We have developed a number of strategies over the last several years that you may find useful, and we have developed and identified a number of resources that can help.  Let me know if you want to meet and talk about this, and of course I welcome the opportunity to meet with any of your assessment team who are available as well.

Comments added to original post

questions for an initial mtg, F2F or phone or whatever

Here’s what I try to run through to start the discussion of prog assessment:

Initial conversation about program assessment (phone or F2F)

Tell me about your program.
I’m not very familiar with XXXXXXXXX; could you tell me a bit about your program, students, faculty, and capstone to help us plan an appropriate pilot assessment.

What do you think is working well in your program?  What’s an issue that you all struggle with?  (See if they can identify a question.)

Does the program already have student learning goals?

What direct measures to target

•Does your program have a capstone project?
(Basic info: mode, size, how many students/projects each semester or year. All seniors? Mostly seniors? Timing?)
•Which student work / class is appropriate for capstone assessment?  Individual or group work?
(If it’s a huge project, could students write a short piece about it, 3-4 pages providing an overview of their thinking about the project plus a reflection, like something they might include in a portfolio or prepare for a job interview, with the project itself as an “appendix” for our assessment?)
•Logistical issues to address?  Format of the project or other constraints.  Can we collect digital copies?  We need clean, unmarked copies.
•Assignment prompt and course syllabus
•Logistics of collecting student work this semester.

Lower division core class
•Does your program have a lower division core class or classes?
(Basic info: mode, size, how many students/projects each semester or year. All first-year students? Mostly? Timing?)
•Which student work / class is appropriate for assessment?  Individual or group work?
•Logistical issues to address?  Format of the project or other constraints.  Can we collect digital copies?  We need clean, unmarked copies.
•Assignment prompt and course syllabus
•Logistics of collecting student work this semester.

Other stuff
•Do a program inventory.
•Who will participate in assessment?  Faculty, instructors, TAs, others
•Look at the timeline for getting started. General timeline this semester and next (attach our sample timeline)
•Student learning goals – into a rubric
•How do your faculty share ideas about teaching, or learn new teaching practices?  What kind of teaching resources (articles, workshops, conferences, brownbags, etc) are used by instructors?  How are they shared, informally or formally?

Green, Kimberly at 12/18/2009 12:33 PM

December Agenda Shared with Liaisons

Suggested text for you to send to Liaisons
In particular, this note was shared:

The new OAI website is up, and we are assembling resources there that will be helpful to you and program contacts in creating the plans that are due Dec 18. Especially note the three links at the right end of the banner. These are the template for writing a plan, the guide for assessing a plan, and a folder of additional resources.

I want to remind you that OAI staff are available to talk with your program contacts about these materials and to consult in the development of plans. Please call 335-1355 to arrange an appointment.

agenda for liaisons, december

A of A Rubric Draft December (revised)

Document attached to an email with no body: a of a rubric expanded revised 12_1

Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)

Assessment Liaison Updates and Tasks


Please share with your Program Assessment Point people:

1.      OAI contacts have been assigned for each program.
You can see the list:
If you click on the names, you will find more contact information, but if that fails, all OAI contacts can be reached at 5-1355.

No doubt this list will be changing.  Please make changes or send them to me or Judy ( ) as they occur on your end.

2.      Attached is the 3rd version of the rubric for assessing assessment and the corresponding template.  The rubric has been rendered in three versions (not perfectly aligned in this draft just yet).  There is the overview, the ‘digest’ version (by popular demand), and the expanded version, which we have found tends to produce better results, though it perhaps requires a greater initial investment.  We anticipate one more round of deep revision of the rubric and template following the activities associated with this release, but the principles will remain the same.  We are also close to releasing a timeline and checklist, but anticipate that document merits discussion at our next meeting.

3.      We strongly recommend you forward the blue text below and encourage each of your teams to review the assessment process by using the rubric to evaluate the mock report (Rocket Science) before submitting their spring assessment plans.  We also recommend they consider their previous self-study reports in light of these criteria.

We also suggest that the assessment of assessment process be conducted synchronously in collaboration between your program assessment teams and their OAI contacts.

And we always welcome feedback but particularly at this juncture as we gear up for spring assessment.

Here is the link to the online Mock/Model Template-based report–Rocket Science.  The process is the same as the one we did with Honors.  If you log into this site, you will find the directions to guide you through the Rocket Science assessment process: [since retired]

Again, the process:

1.      Read over the Assessment Criteria (you can download the revised rubric at the link above or read it online).

2.      Read the Rocket Science self-study.

3.      Go through the online rubric and assign Rocket Science scores on each of the four dimensions of the rubric.

Meanwhile, we are also sharing this process statewide and with selected professional groups, and we have received very positive feedback.  More on that when next we meet, but the upshot is transparency.

4.      Finally, don’t forget the deadline for spring assessment plans.  We need plans for spring assessment activity from all programs before December 18th, 2009.  We will review (with the appropriate criteria of the rubric) and provide feedback for each plan as soon as possible but no later than early January. (The sooner we receive them, the sooner we can provide feedback.)

Don’t hesitate to contact me with any questions or concerns. And don’t forget the next Liaison Council meeting:  December 4th at 1:00 in Lighty 403.


PS, some program points have asked for more models.  A few from 2008 we can point to are here:
(follow ‘Assessment Highlights’ and the ‘case reports’ below the highlights)

Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)

attachment with original email  a of a set beta (3)

The email also references a mythical program, “Rocket Science,” used as a vehicle for testing the rubric. Those files are included here for completeness:

A Of A Template and Rubric post-NWCC&U Meeting in Seattle (11.6)

From: Brown, Gary
Sent: Sunday, November 08, 2009 1:14 PM
To: Peterson, Nils; Ater-Kranov, Ashley
Cc: Green, Kimberly; Desrosier, Theron; Jacobson, Jayme K; ‘Jane Sherman’
Subject: a of a beta 2


Attached is a working version of A of A Beta 2.  I’m trying to build into our Assessment of Assessment rubric and template some of Friday’s language from NWCC&U, focused so far mostly in the template (Nils, for the Rocket Science mock model report update).  I’m working mostly on the digest form right now, so more alignment with the expanded criteria will be necessary in the ongoing iterations.  I’ll be working on squeezing in more language from NWCC&U, though they are deliberate in the level of abstraction they use: a very clear position was taken that we operationalize the standards ourselves, as one size will not fit all universities.  Our authority, such as it is, will come from the stance the WSU Executive Council takes and our ability to convey the principles of assessment as useful.

The meeting was literally a reading and Ron Baker’s interpretation of the new standards, which are somewhat different from what was recently posted.  What is key will be clarifying for ourselves the language of:

Core Themes

This term is used everywhere and in different ways, so we will want our language to align as much as possible with the way NWCC&U is using it. Ron Baker appears to have been the point person in revising and explaining the standards, so if we have questions we can go to him.  Jane will be here in December and we hope to confirm our approach to the language at that time, but the working clarification I have so far come to understand (awaiting confirmation) is that Core Themes equate to WSU’s four strategic “goals” (yes, we have lots to operationalize in the four-goal language in ways that programs and faculty can really make use of in practice).  Our institutional goals for meeting the core themes in the learning realm are still the six Gs of the B.  Programs will have objectives, which are generally measurable (though they function more like discrete program goals in this scheme), and outcomes, which are what is actually measured.  We may have many objectives, for instance, but focus at any given time on actually doing the measurement to determine if we have achieved our outcomes.

(Jane, please weigh in if this corresponds to your understanding.  Needless to say, terminology can be a real bottleneck, is debated among experts, and we have little time to count dancing pin-headed angels, or something like that…..)

Anybody want to join me in CLA point meeting Tuesday 10:30-12:00?   CLA has a mix of chairs and good folks identified to make assessment happen in their programs.

I also want to get the revised A of A and template out to all Liaisons this week, sooner rather than later, and start setting up meetings to walk them through the Rocket Science mock report.


Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)
attached draft reporting template and rubric  a of a set beta 2

Recap of the Planning Request

The Plan We Need

The goal of the plan is to establish the team and strategy to be implemented in the spring.

We are working to be sure ALL programs and ALL teaching faculty, adjuncts, and TAs can report some level of involvement in the program outcomes work: minimally, identifying program outcomes on their syllabi, participating in the analysis of results, and being aware of the action plan (how the program will be closing the loop).

It is reasonable to expect that the spring assessment will be a pilot, and that therefore the action plan might very well be something focused on ways to make the assessment more robust, valid, and designed to encourage greater participation and use of embedded activities faculty have already been doing in their courses.

So specifically, we hope the plan will include the development of a team and system (rubric criterion #1) and the beginnings of the outcomes and measurement strategy (rubric criterion #2):

1.     Identification of WHO in the program will be involved (#1).
2.     Identification of WHICH core courses and WHAT representative activities students will be doing that will be assessed. (#2)
3.     How that approach to direct measurement will be aligned with other existing indirect measures, such as student evaluations, or, if it is not yet aligned, how that alignment will be addressed in the future (the plan to plan). (#2)
The goal of this plan to plan is to make sure logistics are in place so that we can increase the likelihood that assessment will take place next spring. The spring deadline is necessary so that programs can also do the analysis and complete a report with action items for fall, 2010, when the updated report to the Northwest Commission is due.

Other Items
·        The rubric revisions are underway, thanks to excellent feedback from this group and staff in OAI.
·        We will have the rubric revisions ready along with two mock model self studies. The models will be aligned with the rubric and should help ongoing deliberations about the process and the template.
·        You can glimpse the rating of the Honors self study along with associated comments at this link.

October 27 Agenda, Planning Guide, Template Draft

Handouts for the October 27 meeting.  We decided to revisit the template.  More summary will follow. (Agenda 1 is Larry’s; agenda 2 includes revisions following discussions with OAI leadership.)

From: James, Larry G
Sent: Tuesday, October 27, 2009 10:59 AM
To: Brown, Gary
Subject: Agenda

Curriculum Mapping from Learning Outcomes Goals

In our First Presentation of Harvesting Feedback as an Accreditation System, we struggled with how to represent the curricular map. We used this chart in the Prezi.

Each program goal is a column, courses are rows, and the aspiration for how well the program goal is met in a course is a number and color in the grid. Theron came back to argue that this was hard to read and that the radar presentations we had been using for reporting feedback were a better representation.

One outcome of the Liaisons’ meetings to pilot the rubric and the Assessment of Assessment process was the discovery of some interesting data in the Honors self-study used for the pilot.

Honors had surveyed each faculty member about each course’s aspirations for the level of proficiency on the WSU Big 6 Goals, which had been adopted as the Honors College learning outcome goals. The survey provided a dataset for a curricular map of the Honors College, below:

—— Forwarded Message
From: Nils Peterson
Date: Wed, 21 Oct 2009 14:46:29 -0700
To: Carol Anelli , “Walker, Libby”
Cc: Gary Brown
Conversation: Honors Learning Outcomes Survey data
Subject: Honors Learning Outcomes Survey data


I have made progress with the data summaries you sent me. They tell an interesting story, which I am mocking up as part of a prototype Honors Learning Outcomes report. I think it has the potential to make a very interesting worked example, and it seems more compelling as a next step than my proposal on Monday to re-organize the Honors NWCCU document to align it with Gary’s draft template.

Here is the graph I made with the aggregated data. We have been using these radar graphs to help visualize multi-dimensional rubric data. You will see the 6 goals around the rim, from CT (Critical Thinking) through Disc (Discipline). The scale ranges from 0 at the center to 6 at the rim. I set these values in your data: 2=Emerging; 4=Developing; 6=Mastering. N/A in your survey is effectively a zero on this scale.

Here is what I see, and why I call it a curriculum map. Each Honors course is supposed to have the Big 6 as its goals, and the approximately concentric circles indicate that the aspirations of faculty for 100-level courses are lower than for 400-level courses. HOWEVER, there are several places where the higher-level courses are not expecting more than the lower-level courses. In Disc, the 400-level course is expecting less than the 100-level course. I think this may be a data transcription error, which I will check against the raw data.

I see in the original survey that collected the data above that faculty were asked to share assignments, sample rubrics, and/or sample student work with Libby. Did that happen? Can they be found? We could consider comparing a review of those artifacts with the faculty’s self-reported intentions.

—— End of Forwarded Message
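The encoding and the consistency check described in the message above (N/A=0, 2=Emerging, 4=Developing, 6=Mastering, and flagging goals where a higher-level course aspires to less than a lower-level one, as in the Disc example) can be sketched in Python. The course levels and ratings below are hypothetical stand-ins; only the scale and the inversion check come from the message.

```python
# Scale used for the radar graphs: N/A is effectively zero.
SCALE = {"N/A": 0, "Emerging": 2, "Developing": 4, "Mastering": 6}

# Hypothetical survey rows: course level -> text rating on each goal
# (CT = Critical Thinking ... Disc = Discipline).
survey = {
    100: {"CT": "Emerging", "Disc": "Developing"},
    400: {"CT": "Mastering", "Disc": "Emerging"},
}

def encode(responses):
    """Map a course's text ratings onto the 0-6 numeric scale."""
    return {goal: SCALE[rating] for goal, rating in responses.items()}

def inversions(survey):
    """Flag goals where a higher-level course aspires to LESS than a
    lower-level course -- candidate data-transcription errors."""
    levels = sorted(survey)
    coded = {lvl: encode(survey[lvl]) for lvl in levels}
    flags = []
    for lo, hi in zip(levels, levels[1:]):
        for goal in coded[lo]:
            if coded[hi].get(goal, 0) < coded[lo][goal]:
                flags.append((goal, lo, hi))
    return flags

print(inversions(survey))  # flags Disc: the 400-level rating is below the 100-level
```

With these sample values the check reproduces the pattern noted in the email: the Disc rating for the 400-level course falls below the 100-level course, so it is flagged for review against the raw data.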

Rainking Futures Begin to Emerge from the Mist

Notes by Joshua Yeidel

Today’s “University Portfolio Design” meeting began with some meeting-existential questions (like, “What are we meeting for? Do we need this meeting?”). Despite such an unpromising beginning, and the eventual decision that a standing meeting is NOT needed, some useful work took place in clarifying our timeline for the next two years and the locus of our technical work on Rainking.
What follows is my understanding of what emerged today.  Others should feel free to weigh in, whether by comments below or in other ways.

“University Portfolio Design” meeting

Our standing meeting Thursday at 11am had been migrating toward a Rainking status review meeting.  It was determined that this purpose is better served by including Rainking in the Friday morning All-CTLT meeting.  To keep Rainking from squeezing out other topics, the Friday morning meeting will need to focus on sharing awareness of issues; issues which cannot be resolved rapidly should be handled off-line from that meeting.

“Rainking” SharePoint Site

The site should be understood as a testbed, sandbox, and project management site (a “dev” or development site).  We anticipate that it will be our production site for the coming academic year (or at least for Fall semester; see below).

Assessment Timeline

NWCCU will require a report in October 2011 on WSU’s progress in learning outcomes assessment.  Working back from that date, we developed the following rough timeline:

Fall 2009: use NWCCU2009 self-study reports to test harvesting responses to assessment (“assessment of assessment” rubric)

November 2009: Pilot departments turn in self-study report on Fall 2009 assessment

December 2009: Harvest responses to pilot self-studies

Spring 2010: Pilot departments implement changes based on Fall assessment and feedback

April 2010: Self-study (description, data, action plan) from all departments

May 2010: harvest feedback on self-studies

August 2010: Potential launch of long-term University Portfolio system

Fall 2010: All departments implement assessment-based changes

Spring 2011: All departments turn in self-study

Spring 2011: University self-study for NWCCU

At least through Fall 2009 we plan to use SharePoint as the hub for self-studies and responses.  We are also looking at alternatives, which could be adopted for Spring 2010 but no later than Fall 2010.

Portfolio Architecture

Some recent work by Theron, Kimberly, Rich, Judy, and Joshua has implications for the architecture of the University Portfolio 2009-10 site.  Interested parties will meet to discuss this.

The Rain King Chronicles Blog

The idea for this blog was to have a place where we could chronicle the transformation and invention of the university-wide learning outcomes assessment process.

This space is something more than a content repository, and something less than a full-fledged blog.

Rain King Chronicles
A collection of artifacts from the OAI


  • Have a collection
  • Document internal learning
  • Model new way of working/learning
  • Increase our Prestige
  • (Possibly) get help from outsiders