On Transposing Assessment Scales and Mapping to WSU 6

Re: mapping to WSU 6
An inquiring program point wants to know:

A score of 4 is entry-level competency, and we will transpose that scale for all programs. It is absolute, not weighted by us. (Entry-level written communication for your program may not mean the same thing for another, so our work, not yours, may get a bit messy.) We will adjust scores according to the scales programs use, so your four-point scale with an entry-level score of 2(?) becomes a four on the institutional, aggregate report.

Performance doesn’t equate to year in school, either, and we need to be cognizant of ceiling effects. In other words, there is no reason a first-year student might not score at a 5 or 6, and, as we have experienced in one program, graduate students assessed anonymously did not perform better than upper-division students; in fact, on average they did not perform at entry-level competency. It’s that absolute scale again. We have learned, too, that nothing constrains student performance more clearly than low expectations. (As George Kuh noted, we sometimes put the bar so low that students trip over it.)

The goal for graduating seniors (ideally assessed in capstone courses or their equivalent) is anchored at four, and other levels of performance radiate from there. We want to be able to report that “All WSU graduates are held accountable to levels of performance in their programs on standards that are affirmed by professionals and WSU faculty working in collaboration.” That’s our first line.

“All WSU programs are responsive to those standards and make curricular and pedagogical changes in that context” is our second. We want to hold the actual percentages of students performing at competence in abeyance as long as we can. As a qualification, there are innumerable studies in multiple modes confirming that graduates from institutions of higher education are NOT being adequately prepared, so the scores in the histogram below are not unusual or, all things considered, even remotely disappointing. In fact, the program’s work that evinced those scores is exemplary and, not incidentally, sustainable. In other words, the gold standard for now and for the foreseeable future, for WSU, for the NWCCU, and for the HEC Board, is not about showing off results, but about showing off the commitment to responsible assessment.

Figure 2:  Disciplinary Scores Reallocated to the WSU 6 Goals of the Baccalaureate*

*The Blue Arrow Indicates Anchored Entry Level Competency
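
For concreteness, here is a minimal sketch of one way such a transposition could work: a piecewise-linear rescaling that pins a program’s entry-level score to the institutional 4. The function, the linearity, and the example values are illustrative assumptions, not the office’s official formula.

```python
def transpose_to_wsu6(score, entry_level, scale_max, scale_min=1,
                      anchor=4, wsu_max=6, wsu_min=1):
    """Map a program rubric score onto the institutional 6-point scale,
    anchoring the program's entry-level competency score at 4.

    Hypothetical piecewise-linear mapping: scores at or above entry level
    stretch over [4, 6]; scores below it stretch over [1, 4].
    """
    if score >= entry_level:
        span = scale_max - entry_level
        if span == 0:
            return float(anchor)
        return anchor + (score - entry_level) * (wsu_max - anchor) / span
    span = entry_level - scale_min
    return anchor - (entry_level - score) * (anchor - wsu_min) / span

# The four-point scale mentioned above, with entry-level competency at 2:
print(transpose_to_wsu6(2, entry_level=2, scale_max=4))  # 4.0 (the anchor)
print(transpose_to_wsu6(4, entry_level=2, scale_max=4))  # 6.0 (top of scale)
print(transpose_to_wsu6(1, entry_level=2, scale_max=4))  # 1.0
```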

Flexing the May Report deadline

Hi [Program Point],
We discussed the update from your recent assessment meeting, which was a great start. Your OAI contact also shared that you may be feeling a bit stressed by the May timeline, so I want to say we’d rather keep working together to get a pithy, focused, and clear document that reflects the real* work you are doing. So if you need more time, and if you are open to the ongoing collaboration, we’re happy, when the results start coming in, to help pull together a report-ready outline, find some time to sit down with you, and help get the report into shape. If that goes into June or so, that’s fine on this end.
Let me know.

Gary

*“real” is the operative word that makes flexibility possible.
Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/

A proposal to get external review/ test of rubric

I sent this to a program contact in early April to help with the problem that the program was unwilling/unable to devote resources to testing their rubric against a small collection of student theses.

No reply in 2 weeks.

Hi,

I just came across this resource. It looks like they will do a free demo. Are you interested in seeing if they can apply your rubric effectively? I’d suggest we send one paper and the rubric as an experiment.

http://www.virtual-ta.com/

Awards Ceremony 3/13- we will be recognizing exceptional leadership in college and program level assessment

From: Ater-Kranov, Ashley
Sent: Thursday, April 08, 2010 1:55 PM
To: OAI.Personnel
Subject: Awards Ceremony 3/13- we will be recognizing exceptional leadership in college and program level assessment

Please see the attached invitation. The awards list is below.

All awards were presented by the Office of Assessment & Innovation:

Berat Hakan Gurocak (hgurocak@wsu.edu): Assessment Leadership for Mechanical Engineering, WSU Vancouver
Douglas Jasmer (djasmer@wsu.edu): Assessment Leadership for the Veterinary Medicine Program
Roberta Kelly (rkelly@wsu.edu): Assessment Leadership for the Murrow College of Communication
Kimberlee Kidwell (kidwell@wsu.edu): Assessment Leadership for the Agricultural and Food Sciences Program
Kimberlee Kidwell (kidwell@wsu.edu): Assessment Leadership for the College of Agricultural, Human, and Natural Resource Sciences
Kimberlee Kidwell (kidwell@wsu.edu): Assessment Leadership for the Integrated Plant Sciences Program
Lisa McIntyre (ljmcint@wsu.edu): Assessment Leadership for the Department of Sociology
John Turpin (jturpin@wsu.edu): Assessment Leadership for the Department of Interior Design at WSU Spokane
Phillip Waite (pswaite@wsu.edu): Assessment Leadership for the Department of Landscape Architecture
Libby Walker (walkerl@wsu.edu): Assessment Leadership for the Honors College

An Invitation to the University College Awards Ceremony
Further information about these awards appeared in these blog posts:

Re-thinking Course Evals for Prog Eval purposes / Turmoil in Colleges

A conversation on the bus home from work with a College Liaison led to this exchange:

On 4/6/10 9:18 AM, “Nils Peterson” wrote:
Further to our conversation on the bus: collecting actionable data, and acting on it, are the things we believe are likely to engage students and therefore to increase response rates on course evals. How about we get together and talk about some ways to re-think the course eval process? I’m looking for a partner who is in a position to take a bit of an adventurous view of the topic.

(I want to see if this Liaison would venture into something like paired assessment where faculty declare their goals for the course and students say what they perceived happened. We’d get curricular maps, which could then be used by the program to help students navigate course selection.)
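
A minimal sketch of how such paired responses might roll up into a curricular map; the course names, outcomes, and record format are hypothetical:

```python
from collections import defaultdict

# Hypothetical paired-assessment records: for each course, the outcomes the
# instructor declared and the outcomes students reported experiencing.
responses = [
    {"course": "XYZ 301", "declared": {"critical thinking", "writing"},
     "perceived": {"writing"}},
    {"course": "XYZ 401", "declared": {"critical thinking"},
     "perceived": {"critical thinking", "information literacy"}},
]

# Roll the pairs up into a curricular map: outcome -> (course, status),
# flagging agreement between faculty intent and student perception.
curricular_map = defaultdict(list)
for r in responses:
    for outcome in r["declared"] | r["perceived"]:
        if outcome in r["declared"] and outcome in r["perceived"]:
            status = "match"
        elif outcome in r["declared"]:
            status = "declared only"
        else:
            status = "perceived only"
        curricular_map[outcome].append((r["course"], status))

for outcome, courses in sorted(curricular_map.items()):
    print(f"{outcome}: {courses}")
```

The “declared only” and “perceived only” mismatches are the actionable part: they show, course by course, where faculty intent and student experience diverge.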

Liaison replies:
Nils,
I would be interested in doing this. However, let’s put off the conversation for several weeks. Amongst other things, I still do not know where I am going to land beginning July 1; that is to say, I do not know what my position will be or in what college.

Why Public Reporting

One faculty member responds to OAI assessment plan feedback:

“Oh -kiss my #$$, I have been here long enough to know that THAT thing
will be flushed down the toilet.”

Program starts assessment activities for May 17

Program Point writes:

> Hello Nils - just in case you thought we had fallen off the edge of the
> earth, I thought I would touch base with you to update you on the progress
> we are making vis-a-vis undergraduate assessment. First, I apologize for
> not being more proactive, but you have to realize we are a very small
> department/school, under VERY much stress trying to juggle all the extra
> requests this semester without our Director [on sabbatical leave]: from
> the grad school, from this convoluted process of the merger with [another
> program], and trying to keep abreast of all the course justifications,
> etc., while keeping up with research/teaching! Having said that, we of
> course recognize the importance of getting this process in place for
> undergrad assessment.
>   Since to date we have no historical database, we are actively putting
> several measures in place, e.g., exit interviews and staying in touch with
> our graduates as they join the workforce. We have started a database on
> their acceptance to graduate-level programs [we consider this one of the
> best ‘assessments’ of the success of our training]. We are asking all of
> our faculty [i.e., 8-9 in my program, 1-2 in the sister program] to submit
> within their course syllabus several assessment questions, specifically
> pertaining to university missions and goals, to be monitored throughout
> the course. We have no official assessment teams in place, but will
> specifically monitor all these data as a faculty in our regular meetings.
> We are looking to enhance the role of our Board of Visitors as the most
> obvious ‘stakeholder’.
>   If you have any comments or suggestions, I would very much appreciate
> any feedback. [We will try, incidentally, to put a like system in place in
> the sister program, as its sole tenured faculty member has a tremendous
> load.]
>        best   ~~

On 3/29/10 10:37 AM, Nils Peterson wrote:

Thanks for the reply.

I appreciate that your program is under stress and yet you are attempting to
work on these important issues.

I am available to assist.

For example, I’d like to encourage you to be clear about program goals and
their associated student learning outcomes. You may find the definitions on
the first page of the guide to assessment useful (see the attachment in my
previous email).

I think I can get you a couple of sample exit surveys. You would want to
adapt them, among other reasons because the samples are aligned with
another program’s learning goals.

An alumni survey is another good indirect measure. I have sample surveys
that you could align with your program’s goals.

I’ve talked with your assistant dean about her program, where she is
exploring gathering assessments from admissions officers (taking it a step
beyond just acceptance rates, which can be influenced by all kinds of
factors having nothing to do with quality). You might want to chat with her,
or I could help.

Looking at syllabi is great, especially asking to what extent they
communicate program learning outcomes. We’ve also done work with programs
wanting to assess a key assignment in a course: to what extent is the
assignment prompting students to demonstrate the program’s learning
outcomes?

The above are great indicators that can triangulate if you coordinate them,
but none is a direct measure of a specific skill. Is there a capstone course
with a project where student work could be assessed with a program rubric?

This is a long list, and you don’t need to undertake it all at once. Part of
your plan may be to develop a schedule of when you collect what kinds of
data: e.g., course evals each semester; exit interviews in May; an alumni
survey every other fall; a syllabus review as part of your program’s
five-year professional review; and so on. We know you need to keep the
workload manageable. I would encourage that part of what you submit be a
plan and schedule, even if that means your data is more of a pilot nature.

I’m happy to come over and talk.
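
To make the scheduling idea concrete, here is a minimal sketch of such a collection calendar as code; the measures and cadences mirror the suggestions above and are illustrative, not prescribed:

```python
from typing import List

def measures_due(year: int, term: str) -> List[str]:
    """Return which assessment measures a program collects in a given term,
    under the hypothetical cadences suggested above."""
    due = ["course evals"]                    # every semester
    if term == "spring":
        due.append("exit interviews")         # each May
    if term == "fall" and year % 2 == 0:
        due.append("alumni survey")           # every other fall
    if year % 5 == 0:
        due.append("syllabus review")         # five-year professional review
    return due

print(measures_due(2011, "spring"))  # ['course evals', 'exit interviews']
print(measures_due(2012, "fall"))    # ['course evals', 'alumni survey']
```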