Recognition from VP Quality, Curriculum, and Assessment at AAC&U

From a 12/15/2009 webcast with Terry Rhodes, Vice President for Quality, Curriculum, and Assessment at AAC&U:
https://admin.na6.acrobat.com/_a738382050/p17163215/

Questioner: “How is VALUE & Power of Rubrics to assess learning playing in the VSA [Voluntary System of Accountability] sphere?”

Rhodes: “[VSA is] very concerned about comparability among institutions. They have indicated they would love campuses to use rubrics and to report on them, but they want some way to provide comparability. I think again the work that is going on at Washington State begins to provide a way to do that. It’s not necessarily a score, but it is a wonderful, rich way to present the multiplicity and multiple dimensions of learning in a graphic way that is easily represented and easily communicated.”

Questioner: “Are there any accreditor responses to the use of rubrics (vs. VSA test scores) to share?”

Rhodes: “All of the accrediting workshops at SACS and Middle States are very heavy into that. Northwest is one area that has lagged a little behind on this, but I think with Washington State pushing them they are going to get more enthusiastic. All of the accreditors have viewed rubrics, the use of them, and the reporting of learning using rubrics as much more useful for campuses than a single test score.”

Webinar on using A of A rubric

We did a webinar for the TLT Group as part of their series “The Power of Rubrics.” This post links to the relevant materials. Terry Rhodes also presented in this session and made some interesting comments in recognition of our work.

Academic Effectiveness Liaison Council Meeting, Tuesday, October 27

Academic Effectiveness Liaison Council
Date: October 27, 2009
Start Time:  2:00 PM
End Time:  3:00 PM
Dialing Instructions: 5707955
Origin: Pullman (French 442)
Location: Spokane (SHSB 260), Tri-Cities (TEST 228), Vancouver (VCLS 308J)
Details: Academic Effectiveness Liaison Council
Special Request: Room reserved from 12:00 – 12:30 for set up.
Event Contact:

Donna Cofield     5-4854

Sneak Preview

1.      Liaison identification update
·        About half of our liaisons have helped identify the point people for each of the programs in their college.  These point people will be critical in our overall communication strategy.  Remember, even if you plan on serving as the point person for each of the programs in your college, we need to know what and how many programs need to be identified.
2.      Forum feedback on assessment process and rubric
·        We have already learned a great deal and received good feedback on the rubrics.  We will have a short version very shortly to complement the longer version, and there are other notable refinements gained from liaisons’ critical vantage point.  Keep the input coming!
3.      Conversations and support for the process.
·        We have already received positive support for this effort from Old Dominion, Mount Royal, the University of Kentucky, and Penn State, all of whom heard of this work through the mock-up we presented with the TLT Group and are very interested in partnering with us (external review exchange).
·        A conversation with leadership at NWCC&U has been scheduled for later this week.
·        The Western Cooperative for Educational Technologies has already requested a presentation on this work for next year in La Jolla.  (Who wants to go?)
4.    We still need your participation in program contacts and doing the pilot Honors Self-Study assessment.

The task again:
1.      Go to https://universityportfolio.wsu.edu/20082009/Pages/default.aspx
2.      Scroll down to the ‘Honors Review Rubric’ link near the bottom, which opens an online survey.
3.      The first page of that survey is instructions, at the bottom of which is a ‘begin’ button.
Remember, once we have worked through refinements, the goal is to provide a new template for reporting that streamlines the rating.  By writing reports in template ‘chunks,’ we will be able to concatenate them into various formats to address the different reporting requests we get from professional accreditors, the HEC Board, OFM, and anybody else who might appreciate WSU’s commitment to improving student learning.
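The chunked-report idea above can be sketched in code.  This is only an illustrative mockup: the chunk names, audience lists, and assembly function below are hypothetical stand-ins, not the actual WSU template.

```python
# Illustrative sketch of the "template chunks" reporting idea.
# Chunk names and audiences here are hypothetical examples.

chunks = {
    "outcomes": "Program learning outcomes and how they map to coursework.",
    "methods": "How evidence of student learning is gathered and rated.",
    "findings": "Summary of findings from the rubric ratings.",
    "improvements": "Changes made in response to the findings.",
}

# Different reporting requests ask for different subsets of the chunks.
requests = {
    "accreditor": ["outcomes", "methods", "findings", "improvements"],
    "state_board": ["findings", "improvements"],
}

def assemble_report(audience: str) -> str:
    """Concatenate the chunks a given audience asked for into one report."""
    return "\n\n".join(chunks[name] for name in requests[audience])

print(assemble_report("state_board"))
```

Writing each section once and assembling on demand is what would let one set of program reports serve accreditors, the HEC Board, and OFM without separate rewrites.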

Recommended approach:

  • Print out a hard copy of the rubric (already being revised thanks to feedback at our forums).
  • Read through it to get the flavor.
  • Read the Honors Self-Study.
  • Rate the study using the online survey/rubric.  (You can cut and paste language from the rubric into the comment box on the rating form online, and that will help Honors understand the criteria you selected as important to your review of their self-study, and it will help us refine the rubric.)

In the news today:
October 26, 2009, 02:53 PM ET
Most Colleges Try to Assess Student Learning, Survey Finds
A large majority of American colleges make at least some formal effort to assess their students’ learning, but most have few or no staff members dedicated to doing so. Those are among the findings of a survey report released Monday by the National Institute for Learning Outcomes Assessment, a year-old project based at Indiana University and the University of Illinois. Of more than 1,500 provosts’ offices that responded to the survey, nearly two-thirds said their institutions had two or fewer employees assigned to student assessment. Among large research universities, almost 80 percent cited a lack of faculty engagement as the most serious barrier to student-assessment projects.

External Interest in Rain King from TLT Group

Program review rubric

From: Stephen C. Ehrmann [mailto:ehrmann@tltgroup.org]
Sent: Monday, October 26, 2009 12:06 PM
To: Larry Ragan; Abdous, M’Hammed; Jim Zimmer
Cc: Gary Brown
Subject: Program review rubric

Hi,
I mentioned to each of you that Gary Brown and his colleagues were in the early stages of using Flashlight Online [the TLT Group’s re-branding of the WSU online survey tool Skylight] to deploy an interesting set of rubrics for program review/evaluation.  Programs would get the rubrics in advance and use those ideas to document their performance. Their reports and a Flashlight form with the rubrics could then be sent to reviewers; the reviewers’ responses to the rubric could then be easily summarized and displayed. I’ve seen the rough draft of their rubric and it seems quite promising to me. It’s designed for the review of academic departments, but I think the idea could be adapted for use with faculty support/development units.
When the material is ready for a wider look in a few weeks, Gary will send me a URL and I can pass that along. Or you could contact Gary directly if you like. His email address is BrownG@wsu.edu
Steve
**********
Stephen C. Ehrmann, Ph.D.
Director of the Flashlight Program for the Study and Improvement of Educational Uses of Technology;
Vice President, The Teaching, Learning, and Technology Group, a not-for-profit organization
Mobile: +1 240-606-7102
Skype: steveehrmann

The TLT Group: http://www.tltgroup.org
The Flashlight Program: http://www.tltgroup.org/flashlightP.htm
Blog: http://tlt-swg.blogspot.com/

Old Dominion and Penn State are both thinking about how to design comprehensive evaluations of faculty support.  Your rubric for program review seems like it could be adapted to their purposes.  I was talking with folks from Mount Royal (Calgary) at ISSOTL;