December Agenda Shared with Liaisons

A suggestion for you to send to liaisons
In particular, this note was shared:

The new OAI website is up at http://oai.wsu.edu, and we are assembling resources there that will be helpful to you and program contacts in creating the plans that are due Dec 18. Especially note the three links at the right end of the banner: the template for writing a plan, the guide for assessing a plan, and a folder of additional resources.

I want to remind you that OAI staff are available to talk with your program contacts about these materials and to consult in the development of plans. Please call 335-1355 to arrange an appointment.

Agenda for Liaisons, December

Assessment Liaison Updates and Tasks

Folks,

Please share with your program assessment point people:

1.      OAI contacts have been assigned for each program.
You can see the list:  https://universityportfolio.wsu.edu/20082009/Lists/Liaisons/Liaison%20Council.aspx
If you click on the names, you will find more contact information, but if that fails, all OAI contacts can be reached at 5-1355.

No doubt this list will be changing.  Please make changes or send them to me or Judy (judyrumph@wsu.edu) as they occur on your end.

2.      Attached is the third version of the rubric for assessing assessment and the corresponding template.  The rubric has been rendered in three versions (not perfectly aligned in this draft just yet): the overview, the ‘digest’ version (by popular demand), and the expanded version, which we have found tends to produce better results, though it perhaps requires a greater initial investment.  We anticipate one more round of deep revision of the rubric and template following the activities associated with this release, but the principles will remain the same.  We are also close to releasing a timeline and checklist, but anticipate that document merits discussion at our next meeting.

3.      We strongly recommend you forward the blue text below and encourage each of your teams to review the assessment process by using the rubric to evaluate the mock report (Rocket Science) before submitting their spring assessment plans.  We also recommend they consider their previous self-study reports in light of these criteria.

We also suggest that the assessment-of-assessment process be conducted synchronously, as a collaboration between your program assessment teams and their OAI contacts.

And we always welcome feedback, particularly at this juncture as we gear up for spring assessment.

Here is the link to the online Mock/Model Template-based report, Rocket Science.  The process is the same as the one we did with Honors.  If you log into this site, you will find the directions to guide you through the Rocket Science assessment process:

https://universityportfolio.wsu.edu/2009-2010/Pages/default.aspx [since retired]

Again, the process:

1.      Read over the Assessment Criteria (you can download the revised rubric at the link above or read it online).

2.      Read the Rocket Science self-study.

3.      Go through the online rubric and assign Rocket Science a score on each of the four dimensions of the rubric.

Meanwhile, we are also sharing this process statewide and with select professional groups and have received very positive feedback.  More on that when next we meet, but the upshot is transparency.

4.      Finally, don’t forget the deadline for spring assessment plans.  We need plans for spring assessment activity from all programs before December 18th, 2009.  We will review each plan (against the appropriate criteria of the rubric) and provide feedback as soon as possible, but no later than early January.  (The sooner we receive them, the sooner we can provide feedback.)

Don’t hesitate to contact me with any questions or concerns. And don’t forget the next Liaison Council meeting:  December 4th at 1:00 in Lighty 403.

Gary

PS, some program point people have asked for more models.  A few from 2008 we can point to are here: https://teamsite.oue.wsu.edu/progeval/default.aspx
(follow ‘Assessment Highlights’ and the ‘case reports’ below the highlights)

Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/

Attachment with original email: A of A Set Beta (3)

The email also references a mythical program, “Rocket Science,” as a vehicle for testing the rubric. Those files are included here for completeness:

Academic Effectiveness Liaison Council Meeting, Tuesday, October 27

Academic Effectiveness Liaison Council
Date: October 27, 2009
Start Time:  2:00 PM
End Time:  3:00 PM
Dialing Instructions: 5707955
Origin: Pullman (French 442)
Location: Spokane (SHSB 260), Tri-Cities (TEST 228), Vancouver (VCLS 308J)
Details: Academic Effectiveness Liaison Council
Special Request: Room reserved from 12:00 – 12:30 for setup.
Event Contact:

Donna Cofield     5-4854

Sneak Preview

1.      Liaison identification update
·        About half of our liaisons have helped identify the point people for EACH of the programs in their college.  These point people will be critical in our overall communication strategy.  Remember, even if you plan on serving as the point person for each of the programs in your college, we need to be sure we know what and how many programs need to be identified.
2.      Forum feedback on assessment process and rubric
·        We have already learned a great deal from the good feedback received on the rubrics.  We will have a short version very shortly to complement the longer version, and there are other notable refinements gained from liaisons’ critical vantage point.  Keep the input coming!
3.      Conversations and support for the process.
·        We have already received positive support for this effort from Old Dominion, Mount Royal, the University of Kentucky, and Penn State, who have heard of this work through the mock-up we presented with the TLT Group and are very interested in partnering with us (an external review exchange).
·        A conversation with NWCC&U leadership has been scheduled for later this week.
·        The Western Cooperative of Educational Technologies has already requested a presentation on this work for next year in La Jolla.  (Who wants to go?)
4.    We still need your participation in identifying program contacts and in doing the pilot Honors Self-Study assessment.

The task again:
1.      Go to https://universityportfolio.wsu.edu/20082009/Pages/default.aspx
2.      Scroll down to the ‘Honors Review Rubric’ link near the bottom, which opens an online survey.
3.      The first page of that link is instructions, at the bottom of which is a ‘begin’ button.
Remember, when we have worked through refinements, the goal is for this work to provide a new template for reporting that streamlines the rating.  By writing reports in template ‘chunks,’ we will be able to concatenate each of them into various formats to address the different reporting requests we get from professional accreditors, the HEC Board, OFM, and anybody else who might appreciate WSU’s commitment to improving student learning.
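
To make that ‘chunking’ idea concrete, here is a minimal sketch, in Python, of how sections written once against the template might be concatenated into different report formats for different audiences. Nothing here comes from the OAI template itself; the section names, audience labels, and the assemble_report function are illustrative assumptions only.

# Hypothetical sketch: recombining template "chunks" into audience-specific
# reports. All section and audience names below are invented for illustration.

# Chunks keyed by template section; in practice these would be the sections
# a program drafts once against the reporting template.
chunks = {
    "learning_outcomes": "Program learning outcomes ...",
    "assessment_methods": "Direct and indirect measures ...",
    "findings": "Summary of findings ...",
    "use_of_results": "Changes made in response to findings ...",
}

# Each reporting request pulls a different subset (and order) of the same chunks.
report_formats = {
    "professional_accreditor": ["learning_outcomes", "assessment_methods", "findings", "use_of_results"],
    "hec_board": ["learning_outcomes", "findings"],
    "ofm": ["findings", "use_of_results"],
}

def assemble_report(audience: str) -> str:
    """Concatenate the chunks requested by a given audience into one report."""
    return "\n\n".join(f"{name.upper()}\n{chunks[name]}" for name in report_formats[audience])

print(assemble_report("hec_board"))

The point of the design is that each section is written once and reassembled on demand, so the same program-level work can answer an accreditor, the HEC Board, and OFM without anyone rewriting the report.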

Recommended approach:

  • Print out a hard copy of the rubric (already being revised thanks to feedback at our forums).
  • Read through it to get the flavor.
  • Read the Honors Self-Study.
  • Rate the study using the online survey/rubric.  (You can cut and paste language from the rubric into the comment box on the rating form online, and that will help Honors understand the criteria you selected as important to your review of their self-study, and it will help us refine the rubric.)

In the news today:
October 26, 2009, 02:53 PM ET
Most Colleges Try to Assess Student Learning, Survey Finds
A large majority of American colleges make at least some formal effort to assess their students’ learning, but most have few or no staff members dedicated to doing so. Those are among the findings of a survey report released Monday by the National Institute for Learning Outcomes Assessment, a year-old project based at Indiana University and the University of Illinois. Of more than 1,500 provosts’ offices that responded to the survey, nearly two-thirds said their institutions had two or fewer employees assigned to student assessment. Among large research universities, almost 80 percent cited a lack of faculty engagement as the most serious barrier to student-assessment projects.

Assessment of Assessment Forums (Invitation)

Greetings Again

I am hoping you have had a chance to at least poke through the assessment site.

https://universityportfolio.wsu.edu/20082009/ (link since retired from service)

For those of you who are interested, we will be holding two review sessions next week where we will work through the assessment site and apply the criteria to the Honors 2009 self-study.  You are welcome to join us for all or part of the gathering and perhaps make the task a bit easier and more enjoyable (refreshments will be served).

Technology, assessment criteria, and the full range of implications of this work will be open for review and discussion.

Where:   CUE 502 (The Egg)
When:    Monday @ 3:30-5:00 & Tuesday @ 9:00-10:30

For those of you at remote sites, if you are interested we can web conference you in (and fax you a bagel).

Gary

Two early concerns: time and purpose

(a liaison writes…)

Subject: RE: Follow up (1 of 2)

Hi Gary,
Thanks for the information and update from the meeting yesterday – sorry that I wasn’t able to attend.  I am looking forward to our meeting on 10/26 to catch up!  I was able to get into the SharePoint site, and the information looks correct.  Were there specific things you wanted us to add to our profile?

And just to clarify the goals/outcomes (and process) for this group: it sounds like the participating faculty/programs will be piloting an assessment tool and evaluating its effectiveness in generating valuable information for University accreditation.  Once the tool is refined, it will be rolled out to all academic units/programs at WSU for them to complete?  Am I on the right track?

I was also wondering if you have an estimate of the time commitment for this project, as well as a timeline?  (I’m trying to keep my department chair in the loop.)

Thanks so much!

The Pilot Assessment of the Honors Self-study

From: Brown, Gary
Sent: Wednesday, October 14, 2009 1:00 PM
To: Anelli, Carol A; Baker, Danial; Bray, Brenda; Brown, Gary; Chenoweth, Candace; Cote, Jane M; Helmstetter, Edwin; Jasmer, Doug; Kelly, Roberta; Kidwell, Kim; Lanier, Mary Sanchez; Olsen, Robert G; ‘Pratt, Dick’; Probst, Tahira; Roll, John; Whidbee, David; Ivory, Carol S
Cc: Brown, Gary; Rodriguez-Vivaldi, Ana Maria; Byington, Tori C; James, Larry G
Subject: 2 of 2–the pilot assessment of the Honors self-study
Hello again, assessment pioneers.

Here is the task.  If you log into this site, you will find the directions to guide you through the pilot assessment process:

https://universityportfolio.wsu.edu/20082009/

As an overview, you will be asked to:
1.      Read over the Assessment Criteria (more about this in a minute).
2.      Read the Honors self-study.
3.      Go through the online rubric and assign the Honors self-study a score on each of the four dimensions of the rubric.

A note about this process (also reiterated on the assessment site, the home of the WSU Assessment Portfolio).

The goal of this work is to put into place an institution-wide system for reflecting on our assessment work.  That system is intended to be formative, focused on principles of good assessment, and to make visible WSU’s commitment to engage our community in a collaborative process of continuous improvement.  It is our belief that a good process will yield good outcomes and demonstrate, as stated in WSU’s strategic plan, that ‘We are committed to being ethical and responsible stewards of University resources and to being accountable for upholding the full scope of these values.’

Clearly, your feedback on all aspects of the process is essential.  The rubric used in this context is perhaps the first of its kind, and though it is derived from similar work we’ve adapted or done with other professional organizations, every new context is unique.  The criteria are framed to reflect the principles of good assessment, and we are hoping the process will both promote rigor and afford flexibility – a key principle endorsed by the Council for Higher Education Accreditation (CHEA) and more recently by the US Department of Education.  Your feedback on the Honors assessment will help us underscore the goal of attaining flexibility in particular, and your comments in that regard will probably be even more valuable than the scores you assign, as that language, along with your direct feedback on the rubric (also invited on the site), will be useful for the next iteration of the criteria and the language we use to renew our culture of evidence.

You will note the assessment rubric has two parts.  The first page is a short form to introduce readers to the four dimensions of the rubric and provide a holistic overview.  This page is followed by a longer form of the same rubric; long and dense, this version provides specific criteria describing each dimension at three different levels.  This length is not an accident.  We have done assessment of rubrics and learned that more sophisticated criteria are associated with better outcomes.  Use of a sophisticated or dense rubric is perhaps slow and challenging at first, but raters internalize the criteria quickly, and it ultimately can result in more efficient and more accurate assessment.  But the rubric is also long because an additional goal of this rubric and this effort, and of formative assessment in general, is to be educative.  The detailed descriptive criteria are our first attempt to address many of the key bottlenecks we have encountered in our work with about 45 WSU programs here and several elsewhere that we have reviewed.  Not the least of those bottlenecks is the language of assessment, and so we certainly anticipate deepening our own understanding of the way the language we use is variously perceived.

As part of the practice assessment you will do using this rubric, you’ll also be asked to give feedback on the rubric and your experience using it.

You will note, too, that nowhere in the DRAFT criteria is there a place to assess the level of student performance or gains resulting from the assessment.  It is our view that, at this stage of implementing the process, that kind of focus is distracting.

We will also want to consider our levels of agreement on the Honors assessment, as we hope to move, over time, toward greater agreement, or inter-rater reliability.  If we are to reasonably claim WSU has a system, then an indicator of that system will need to be, eventually, consensus.  Some reasonable measure of inter-rater reliability (roughly 80% agreement), not incidentally, is essential when establishing the validity of an assessment that values, as we advocate, the expertise of the faculty in our own programs.  It is that last point, ‘the expertise of our own community,’ that we hope this process will help us promote.  It is my view that establishing that kind of expertise is essential if we are to maintain the autonomy, as an institution and as a university, that our work requires.
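
Since inter-rater reliability comes up above, here is a minimal sketch, in Python, of the kind of simple percent-agreement calculation the rough 80% figure refers to. The rater names and scores are invented for illustration; actual agreement would be computed from the scores submitted through the online rubric.

# Hypothetical example: percent agreement among raters scoring a self-study
# on the rubric's four dimensions. All names and scores below are invented.
from itertools import combinations

ratings = {
    "rater_a": [2, 3, 2, 1],  # one score per rubric dimension
    "rater_b": [2, 3, 1, 1],
    "rater_c": [2, 2, 2, 1],
}

def percent_agreement(scores):
    """Share of (rater pair, dimension) comparisons with identical scores."""
    matches = total = 0
    for a, b in combinations(scores.values(), 2):
        for score_a, score_b in zip(a, b):
            matches += (score_a == score_b)
            total += 1
    return matches / total

print(f"Overall agreement: {percent_agreement(ratings):.0%}")

A rough threshold check (is agreement at or above 80%?) would be one signal that the group is converging on a shared reading of the criteria; more formal statistics such as Cohen’s or Fleiss’ kappa could replace simple percent agreement later.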

Obviously there is much to talk about and work through, but for now, a couple more notes on logistics:

We will close the online rubric rating form a couple of days before our next meeting.  At that point, when you log on you will be able to see the results.

Meanwhile, we will be opening up a couple of open forums for you to visit us and work through this task with help from OAI staff.  Those forums are not limited to technical concerns or procuring technical support.  We welcome additional feedback or discussion on any of the points touched on above, or others that occur to you.

Gary

Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/

The referenced rubric: A of A Rubric Beta (Oct 15, 2009)

First Presentation of Harvesting Feedback as Accreditation System

Jayme developed a visualization of the complete harvesting feedback scheme (a proposal for a WSU learning outcomes process), entitled “From Student Feedback to University Accreditation,” as part of a presentation for the Teaching, Learning, and Technology Group (TLTgroup.org) Friday Live!

This post links to all the resources:
https://communitylearning.wordpress.com/2009/09/21/from-student-feedback-to-university-accreditation/

The term “student feedback” in the title is an extension of the thinking in the harvesting gradebook: that feedback in both numeric and qualitative forms is an important goal of “grading.”

A previous TLT Group session had laid out in greater detail the harvesting of feedback for student grading and assessment of the assignment.

https://communitylearning.wordpress.com/2009/07/17/harvesting-feedback-on-a-course-assignment/