CHEA 2011 Award Submitted

CHEA has an annual awards competition (http://chea.org/2011_CHEA_Award.html) for innovative assessment efforts. Attached is the WSU 2011 application, submitted last Friday, describing our pilot year of institutional assessment.

WSU CHEA 2011 Award Application

[SACS] Student Learning Outcomes – Business

On the question of whether every major has to report outcomes:

From: SACS Commission on Colleges Discussion Forum [mailto:SACS-L@LISTSERV.UHD.EDU]
Sent: Wednesday, March 31, 2010 2:57 PM
To: SACS-L@LISTSERV.UHD.EDU
Subject: Re: [SACS] Student Learning Outcomes – Business

I can guarantee it is every major within a degree. Also, you must show that every program has been assessed and that improvement strategies have been implemented, not plans to improve. Also, at your reaffirmation every program must be finished with this process, not just a portion.

Comment on the original post:

SACS is not NWCCU

I believe they are somewhat farther down the assessment trail.

Can we ask this same question directly of our accreditation agency?

Yeidel, Joshua at 3/31/2010 5:39 PM

Coordinating on Glossary Terms

Folks,
Just an update on the meeting yesterday with Larry. The Goal Groups are meeting and defining terms for WSU’s Strategic Goals (Core Themes). The implication, as I read it, is that we need to hold off on these terms:

Goals
Outcomes
Objectives

So in the context of the assignment Ashley shared, the language you find that elaborates on these concepts (or translates them effectively, as suggested) may have to be reworked to align with the efforts of the four WSU Goal Groups. Meanwhile, I am shipping AEA and NWCC&U definitions to the Goal Groups, as Larry confirmed and suggested.

There remain a number of terms and conceptual bottlenecks related to the language of assessment that will no doubt keep us busy.

FYI

Howard Grimes
Mary Wack
Muriel Oaks
Melynda Huskey

Each chairs one of the four groups, in order.

Our model, an “adult dose,” and the note that silenced the SACS discussion list

From: SACS Commission on Colleges Discussion Forum [mailto:SACS-L@LISTSERV.UHD.EDU] On Behalf Of Brown, Gary
Sent: Tuesday, January 26, 2010 1:00 PM
To: SACS-L@LISTSERV.UHD.EDU
Subject: Re: [SACS] assessment reflection reporting

Hi Rebecca (and  SACS folks),

We provide a template for reporting that includes 5 dimensions (available at http://oai.wsu.edu/):

1. Program Description
2. Assessment Team and System
3. Goals, Outcomes, and Measures
4. Analysis and Action Plan
5. Leadership

We understand that leadership should be integral to the team and system, but since the leadership dimension has proved so pivotal, we have broken it out.

In addition to the template, we have a rich rubric and levels that guide the assessment of every program’s assessment efforts.

Each program’s report on each of the preceding dimensions is rated as absent, minimal, emerging, established, effective, or outstanding. Scores are assigned independently by assessment experts, and external reviewers are encouraged to take part in this formative process.

Each program’s rating is dynamic to encourage continuous improvement, and program scores roll up into college and institutional reports. (The process is new and action plans are being developed and rated this term.)
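To make the roll-up concrete, here is a minimal sketch in Python of how ordinal ratings like these might aggregate from program reports into a college summary. The level names and five dimensions come from the description above; the numeric mapping, rater counts, and program names are illustrative assumptions, not the actual OAI implementation.

```python
# A minimal sketch (not OAI's actual system) of rolling ordinal rubric
# ratings up from programs to a college. Level names come from the post;
# the numeric mapping and data shapes are illustrative assumptions.
from statistics import mean

LEVELS = ["absent", "minimal", "emerging", "established", "effective", "outstanding"]
SCORE = {name: i for i, name in enumerate(LEVELS)}  # absent=0 ... outstanding=5

def dimension_score(ratings):
    """Average the independent raters' ordinal ratings for one dimension."""
    return mean(SCORE[r] for r in ratings)

def program_report(ratings_by_dimension):
    """Score each of the five dimensions, then average them for the program."""
    scores = {dim: dimension_score(r) for dim, r in ratings_by_dimension.items()}
    scores["overall"] = mean(scores.values())
    return scores

# Two hypothetical programs rolling up into one college report.
programs = {
    "Program A": {
        "Program Description": ["established", "effective"],
        "Assessment Team and System": ["emerging", "established"],
        "Goals, Outcomes, and Measures": ["established", "established"],
        "Analysis and Action Plan": ["minimal", "emerging"],
        "Leadership": ["effective", "effective"],
    },
    "Program B": {
        "Program Description": ["emerging", "emerging"],
        "Assessment Team and System": ["minimal", "emerging"],
        "Goals, Outcomes, and Measures": ["emerging", "established"],
        "Analysis and Action Plan": ["emerging", "emerging"],
        "Leadership": ["established", "effective"],
    },
}

college_report = {name: program_report(r) for name, r in programs.items()}
college_overall = mean(p["overall"] for p in college_report.values())
print(f"College overall: {college_overall:.2f} on a 0-5 scale")
```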

We have survived the first round of ratings with mostly appreciation for the guidance and recognition of the growing importance of this kind of work.  Obviously there are (vocal) outliers.

Gary

Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu

From: SACS Commission on Colleges Discussion Forum [mailto:SACS-L@LISTSERV.UHD.EDU] On Behalf Of Lewis, Rebecca J
Sent: Tuesday, January 26, 2010 11:23 AM
To: SACS-L@LISTSERV.UHD.EDU
Subject: [SACS] assessment reflection reporting

Hello All,
UT Arlington has a mature assessment process with pretty standard documentation – an assessment plan; a results report, which documents any proposals for improvement; and a report that documents any improvements that were based on assessment.  However, we are looking for a way to contextualize and to some extent summarize assessment findings.

We are considering asking Deans and VPs to create a report each assessment cycle reflecting on the most meaningful outcomes and findings, as well as on faculty/staff engagement in the assessment process. We are also hoping this will increase administrative engagement and awareness about assessment across campus. Below my signature are the proposed contents for this report.

Is anyone out there doing something like this already?  If so, what are you asking to have included in the report?  How is it working on your campus?
I would also like feedback about the proposed contents.

Thanks for your assistance with this!

Rebecca Lewis
Assistant Director of Outcomes Assessment
Institutional Research, Planning and Effectiveness
817-272-5133
UEP Home Page

Dean/VP Assessment Reflection Report
Reflection on important and meaningful outcomes and findings:
A.     How did the units in your school/college or division choose the outcomes to be assessed?
B.     Please describe a few of the most significant findings from the assessments conducted by your units during this cycle.
C.     For colleges/schools, are there any instances where improved student learning can be demonstrated as a result of assessment and subsequent improvements to programs and services?
D.     For divisions, are there any instances where improved productivity and/or efficiency can be demonstrated as a result of assessment and subsequent improvements to programs and services?
Reflection on faculty/staff engagement in the assessment process:
E.      Describe the level of faculty/staff engagement in the assessment process.
F.      Describe any particular examples where faculty/staff have shown initiative and excitement about assessment and the opportunities that assessment can create.
G.    What can the college/school/division do to improve communication about assessment within the unit(s)?
H.     What can the college/school/division do to increase faculty/staff engagement with regard to assessment?

Assessment, Accountability, and Improvement: Revisiting the Tension

A key piece from Peter Ewell and NCHEMS, and a good resource for those who wonder why we do what we do the way we do. This assessment of assessment is both a rationale and a blueprint.

Ewell says:

“Institutional accrediting organizations remain membership associations, however, so they cannot stray too far toward establishing common standards and applying them through aggressive review.”

“They also remain extremely limited in their ability to influence the majority of institutions not at risk of losing accreditation.”

“The future effectiveness of institutional accreditation in both promoting good practice and in reinforcing the academy’s assumption of consistent and transparent standards of student academic achievement lies entirely in the hands of the academy and its leadership.”

Peter Ewell, 2009

Recognition from VP Quality, Curriculum, and Assessment at AAC&U

From a 12/15/2009 webcast, Terry Rhodes, Vice President for Quality, Curriculum, and Assessment at AAC&U:
https://admin.na6.acrobat.com/_a738382050/p17163215/

Questioner: “How is VALUE & Power of Rubrics to assess learning playing in the VSA [Volunteer System of Accountability] Sphere?”

Rhodes: “[VSA is] very concerned about comparability among institutions, but they have indicated they would love campuses to use rubrics and to report on them, but they want to have some way that they can provide comparability. I think again the work that is going on at Washington State begins to provide a way to do that. It’s not necessarily a score, but it is a wonderful, rich way to convey the multiplicity and multiple dimensions of learning in a graphic way that is easily represented and easily communicated.”

Questioner: “Are there any accreditor responses to the use of rubrics (vs. VSA test scores) to share?”

Rhodes: “All of the accrediting workshops, at SACS and at Middle States, are very heavy into that. Northwest is one area that has lagged a little behind on this, but I think with Washington State pushing them they are going to get more enthusiastic. All of the accreditors have actually viewed rubrics, and the use of them, and the reporting of learning using rubrics as much more useful for campuses than a single test score.”

Professional and Institutional Accreditation (ASPB View)

From the Executive Director
American Society of Plant Biologists

Hi again Gary.

I always enjoy learning more about topics that are unfamiliar to me, and accreditation is definitely one such topic! Clearly we could spend considerably more time in dialog (and I hope that we will!), but to directly answer your question, “What accountability bodies (or just pressures) are plant scientists responding to, if any?” I think the answer currently is ‘none.’

That said, you may be familiar with the (NSF-sponsored) Vision and Change exercise (see http://www.visionandchange.org/), which is one of the approaches to re-envisioning undergraduate biology education with which ASPB has been closely involved. Although it’s still somewhat fuzzy, it does seem to me to be coming into focus, and SoTL is definitely a major emphasis.

Although I am not aware that the NSF is actively pursuing accreditation metrics as (sort of) one end of an educational/research continuum, it is unusual among science research agencies in that it does have programs that focus on SoTL (in the Education and Human Resources Directorate), as well as the (perhaps better known) research programs (in the Biology, Geosciences, Math, etc. directorates). It is also clear to me that the NSF is making large strides, where appropriate, in interdigitating these programs. Which is to say that program officers are actively encouraged to work together across the directorates.

It is also pertinent, I think, that the NSF instigated a requirement four or five years ago that has had a profound impact on the way in which funded researchers approach the dissemination of their science. Known as “Criterion 2” or “broader impacts”, it obliges grantees to (in a nutshell) demonstrate to the NSF the ways in which they have engaged the public and/or educators and students around the objectives of the funded research project. This (of course) is not directly related to accreditation; my point, though, is that should the NSF so choose, it might be able to find ways to, er, induce more effective teaching among its grantees. (There’s a disconnect here, as I’m sure you appreciate. Organizationally, the role of a grantee as a teacher at his or her institution is largely distinct from their role as an NSF-funded researcher and governed by different structures. But just because it’s a tough nut doesn’t mean there won’t be people or organizations willing to have a go at cracking it.)

Getting back to ASPB, and recognizing that we are currently operating in an ill-defined ‘space’ and with (one) immediate goal of improving our members’ understanding and application of SoTL in their teaching, there is a raft of resources to which I could point you. I’ll start with just a couple, though, and copy in my colleague Katie Engen. Katie is a) more immediately familiar with ASPB’s efforts in this area, and b) in closer touch with our Education Committee (http://www.aspb.org/committees/education.cfm), which tends to pay closer attention to formal K-16 education, and with members of our Education Foundation board (http://www.aspb.org/education/foundation/board.cfm), which focuses more on informal, public education. I’m sure that she will be able to offer additional thoughts and links to resources, and she’ll be a good conduit, should such be needed, to members and leaders who are directly engaged in these efforts.

Speaking of which, we are encouraging members to become both more literate about SoTL and more willing to properly study the efficacy of their own teaching (see, e.g., http://www.aspb.org/newsletter/julaug09/09publish.cfm; please let Katie know if you can’t access this page and she’ll send you a pdf). We’re encouraging direct engagement by the society’s members in K-12 education (not necessarily your immediate interest, but the caliber of primary and secondary education has an obvious and direct impact on tertiary education); see http://www.plantcell.org/cgi/content/full/19/8/2311 for an article on this topic published recently in our top-notch research journal and
http://www.aspb.org/education/importance_statement.cfm for a statement on this topic that was ratified by the Society’s executive committee a couple of months ago.

We have also articulated some fundamental principles relating to knowledge of plants (see http://www.aspb.org/education/foundation/principles.cfm), and a project funded by ASPB’s Education Foundation is developing SoTL-informed hands-on modules around each of these principles.

I’ll stop there and invite both you and Katie to weigh in with any additional thoughts and comments.

Cheers,
Crispin

From: Brown, Gary [mailto:browng@wsu.edu]
Sent: Wednesday, September 02, 2009 5:30 PM
To: Crispin Taylor
Subject: RE: accreditation

Crispin,

Thanks for the quick response!

You have very acutely inferred the heart of my question (though I agree it was a bit muddled).

I’m looking at the way CHEA and almost every other regional and professional accrediting agency is in the process of revising standards, essentially raising the bar and requiring assessment to focus on outcomes (rather than coverage) and encouraging educators to establish systematic assessment (rather than the fire-drill events we are so adept at). The goal of this across the USA has been to put a renewed focus on making changes in teaching and curricula based upon evidence.

I know that sciences are often without specific accreditors, though not without influencing agencies like NSF, NIH, and, presumably, ASPB.  At the same time, professional accreditation organizations like ABET (Engineering), AACSB (Business), NCATE (Education) etc. are also revising their standards to better align with regional accreditors.

So the question was what accountability bodies (or just pressures) are plant scientists responding to, if any. I appreciate your answer. Your response also raises the follow-up question: when you say you are ‘actively engaged with,’ I wonder how you (or I, in my role in the Office of Academic Effectiveness at WSU) can do more to engage and leverage the important influence of professional peers to encourage attention to the scholarship of teaching and learning. As you can imagine, the challenge I face in my role is to keep the discussion focused on enriching the student learning experience rather than on perfunctory compliance with an annoying bureaucracy.

I am currently embarking upon a very exciting project with a group of plant scientists here at WSU, so any leads you might provide will be more than welcomed by our team as we endeavor to expand and deepen our effort. And, needless to say, as I anticipate a potentially terrific model of integrated research and assessment, done transparently online with what will be available tools, you and ASPB will certainly be welcome to join us.

Gary

From: Crispin Taylor [mailto:ctaylor@aspb.org]
Sent: Wednesday, September 02, 2009 12:18 PM
To: Brown, Gary
Subject: RE: accreditation

Hi Gary:

Apologies for being dense, but I’m not quite sure what your question is driving at. ASPB is well aware of — indeed, is actively engaged with — various efforts to re-envision the undergraduate biology curriculum, and we assuredly recognize the value of applying what is being learned through research on teaching and learning to improve pedagogy and instructional outcomes. We’re also investing in various online mechanisms and tools aimed at teaching content and/or process. I presume that many of these threads will come together in more formal accreditation programs/efforts, but at this point I do not believe that ASPB is promoting or participating in any such programs.

Having said all that, I am still concerned that I may be missing the point of your question. I think it’d help me do a better job answering that question (or referring you to someone who can) if you could provide me with some examples of the kinds of things you are referring to (e.g., examples from other disciplines), as well as some additional information regarding the context in which you are working.

Thanks for contacting me; I hope I will be able to help out, either directly or indirectly.
Cheers,
Crispin

Crispin Taylor, Ph.D.
Executive Director
American Society of Plant Biologists
15501 Monona Drive
Rockville, MD 20855-2768
Direct: 301-296-0900
Main: 301-251-0560
Fax: 301-251-6740
ctaylor@aspb.org
http://www.aspb.org/

It’s not too soon to save the dates…
PLANT BIOLOGY 2010
Montréal, Canada
Jul. 31 — Aug. 4, 2010

From: Brown, Gary [mailto:browng@wsu.edu]
Sent: Wednesday, September 02, 2009 12:08 PM
To: Crispin Taylor
Subject: accreditation

Hi Crispin,
I’m working with our plant biology programs here at Washington State, and I’m interested in learning more about various educational accreditation influences that may be looming relative to ASPB. Do you know, or do you know somebody I might contact to learn, where the profession may be heading?

Gary

Dr. Gary R. Brown, Director
The Office of Academic Innovation & Effectiveness
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/

RE: Recap of the Planning Request

Good questions, liaison council member.

I think we’ve talked a bit about this, but it remains an interesting issue. We have suggested the plan based upon our experience, which seems to be echoed at least once a week these days.

For instance, last week you probably saw it in the Chronicle of Higher Education (October 27, 2009) in the comments of George Kuh, director of the National Institute for Learning Outcomes Assessment.  He makes several observations that align with our experience, reading, and thinking.  He notes, for instance, that “what we want is for assessment to become a public, shared responsibility, so there should be departmental leadership” (paragraph 14).

But to your question, he also notes that while lots of places have developed outcomes:

“What’s still disconcerting is that I don’t see a lot of evidence of closing the loop. There’s a lot of data around, there’s some evidence it’s being used in a variety of ways, but we still don’t know if that information is being transferred in such a way as to change practices for the better. That’s still the place where we’re falling short” (paragraph 6).

http://chronicle.com/article/An-Expert-Surveys-the/48945/?key=HWsgcl03ZSJNZHs2K3UTKScFaHt6JkJ4Y34WMHYabFlW

Part of the reason closing the loop is so difficult is that outcomes assessment remains removed from what faculty do in their classrooms. (There’s a nice piece in Inside Higher Ed today on this, but this email is already too long.) So what we’ve learned that tends to work better, and is generally most practical, is to put the focus on what faculty are already doing. Peter Ewell, the VP of The National Center for Higher Education Management Systems, came to a similar conclusion that resonates with our experience and suggests a strategy for making outcomes assessment truly ‘practical’ or ‘functional’ for closing the loop. Lamenting the failure of more than 10 years of assessment reform in helping institutions and faculty close the loop, Ewell says:
“I have learned two additional lessons about the slippery matter of articulating abilities.  First, it’s more useful to start with the actual practice of the ability than with the stated outcome.  Phrases like ‘intellectual agility’ have great charm, but mean little in the absence of an actual student performance that might demonstrate them.  To construct assessment techniques, formal assessment design, as described in the textbooks, demands ever more detailed verbal specifications of the outcomes or competencies to be developed.  But it is often more helpful to go the other way. Given a broad general descriptor like ‘intellectual agility,’ can you imagine a very concrete situation in which somebody might display this ability, and how they might actually behave?  Better still, can you quickly specify the parameters of an assignment or problem that might demand a particular level of ability for access?  The performance that the student exhibits on the assessment is the definition of the ability itself; the ability has no independent existence” (p. 6, 2004, General Education and the Assessment Reform Agenda).

We’ve worked with a couple of dozen programs here at WSU (and more than a few elsewhere) and found that starting with the real and embedded assignments faculty use is an effective way to approach outcomes assessment.  It helps programs refine and make concrete their understanding of outcomes in the context of their own teaching.  It helps them close the loop as reflected in their own assignment design.  So that’s the recommended plan.

Given the frighteningly short timeline we all face, and in the interest of building outcomes assessment systems into the day-to-day work in which faculty are already engaged, the proposed approach is what we are thinking will give us the best, well, outcome ;-)

From: Liaison Council Member
Subject: RE: Recap of the Planning Request

Hi Gary,
I tend to look at things from a ‘practical’ standpoint (and I realize that I operate from my own professional education world, which may or may not be reality for the rest of the university), just so you know where my comments are coming from…

An important step for My College was the development of the curriculum outcomes (they are far from perfect, and in some cases we have discovered they are not too functional, but they have at least provided a foundation to start from). I don’t have an informed understanding of how many university programs/colleges/units are operating with curricular outcomes now; I sense that some of them may not have accomplished that step yet.

So, if there are programs without outcomes developed, and outcomes are not integrated into their current courses, how are steps 2 and 3 outlined below possible to achieve in a short time frame? Would it be better for this process to include a selection of alternative approaches to meet the programs where they are? For example, implement the plan below for the appropriate units, and then provide assistance for the others as they develop outcomes for their programs, with support/guidance along the way.

Thanks,

Liaison Council Member

From: Brown, Gary
Subject: Recap of the Planning Request

The Plan We Need

The goal of the plan is to establish the team and strategy to be implemented in the spring.

We are working to be sure ALL programs and ALL teaching faculty, adjuncts, and TAs can report some level of involvement in the program outcomes work, including, minimally, identification of program outcomes on their syllabi, participation in the analysis of results, and awareness of the action plan (how the program will be closing the loop).

It is reasonable to expect that the spring assessment will be a pilot, and that therefore the action plan might very well be something focused on ways to make the assessment more robust, valid, and designed to encourage greater participation and use of embedded activities faculty have already been doing in their courses.

So specifically, we hope the plan will include the development of a team and system (rubric criterion #1) and the beginnings of the outcomes and measurement strategy (rubric criterion #2):

1.     Identification of WHO in the program will be involved. (#1)
2.     Identification of WHICH core courses and WHAT representative activities students will be doing that will be assessed. (#2)
3.     How that approach to direct measurement will align with other existing indirect measures, such as student evaluations, or, if it does not yet, how that alignment will be addressed in the future (the plan to plan). (#2)
The goal of this plan to plan is to make sure logistics are in place to increase the likelihood that assessment will take place next spring. The spring deadline is necessary so that programs can also do the analysis and complete a report with action items for fall 2010, when the updated report to the Northwest Commission is due.

Other Items
·        The rubric revisions are underway, thanks to excellent feedback from this group and staff in OAI.
·        We will have the rubric revisions ready along with two mock model self studies. The models will be aligned with the rubric and should help ongoing deliberations about the process and the template.
·        You can glimpse the rating of the Honors self study along with associated comments at this link.

Academic Effectiveness Liaison Council Meeting, Tuesday, October 27

Academic Effectiveness Liaison Council
Date: October 27, 2009
Start Time:  2:00:00 PM
End Time:  3:00:00 PM
Dialing Instructions: 5707955
Origin: Pullman (French 442)
Location: Spokane (SHSB 260), Tri-Cities (TEST 228), Vancouver (VCLS 308J)
Details: Academic Effectiveness Liaison Council
Special Request: Room reserved from 12:00 – 12:30 for set up.
Event Contact:

Donna Cofield     5-4854

Sneak Preview

1.      Liaison identification update
·        About half of our liaisons have helped identify the point people for EACH of the programs in their college. These point people will be critical in our overall communication strategy. Remember, even if you plan on serving as the point person for each of the programs in your college, we need to be sure we know what and how many programs need to be identified.
2.      Forum feedback on assessment process and rubric
·        We have already learned a great deal and received good feedback on the rubrics. We will have a short version very shortly to complement the longer version, and there are other notable refinements gained from liaisons’ critical vantage. Keep the input coming!
3.      Conversations and support for process.
·        We have already received positive support for this effort from Old Dominion, Mount Royal, the University of Kentucky, and Penn State, who have heard of this work through the mock-up we presented with the TLT Group and are very interested in partnering with us (external review exchange).
·        A conversation with NWCC&U leadership has been scheduled for later this week.
·        The Western Cooperative of Educational Technologies has already requested a presentation on this work for next year in La Jolla. (Who wants to go?)
4.    We still need your participation in identifying program contacts and in doing the pilot Honors Self-Study assessment.

The task again:
1.      Go to https://universityportfolio.wsu.edu/20082009/Pages/default.aspx
2.      Scroll down to the link near the bottom, ‘Honors Review Rubric,’ which opens an online survey.
3.      The first page of that link is instructions; at the bottom is a ‘begin’ button.
Remember, when we have worked through refinements, the goal is that the work will provide us with a new template for reporting that streamlines the rating. By writing reports in template ‘chunks,’ we will be able to concatenate them into various formats to address the different reporting requests we get from professional accreditors, the HEC Board, OFM, and anybody else who might appreciate WSU’s commitment to improving student learning.
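To illustrate the chunking idea, here is a minimal sketch in Python: each program writes its report once, as named chunks, and different audiences get different concatenations of those chunks. The chunk names echo the five reporting dimensions described earlier; the audience mappings and placeholder text are illustrative assumptions, not the actual WSU template.

```python
# A minimal sketch (illustrative, not the actual WSU template) of "chunked"
# reporting: write each section once, then assemble per-audience reports by
# concatenating the chunks each audience asked for.
CHUNKS = {
    "program_description": "The program prepares students to ...",
    "team_and_system": "The assessment team includes ...",
    "goals_outcomes_measures": "Program outcomes are measured by ...",
    "analysis_action_plan": "This year's results suggest ...",
    "leadership": "Department leadership supports assessment by ...",
}

# Each external request maps to the subset (and order) of chunks it wants.
# These audience keys are hypothetical examples based on the post above.
AUDIENCE_FORMATS = {
    "professional_accreditor": ["goals_outcomes_measures", "analysis_action_plan"],
    "hec_board": ["program_description", "goals_outcomes_measures"],
    "internal_full_report": list(CHUNKS),
}

def assemble(audience):
    """Concatenate the chunks an audience asked for into one report."""
    return "\n\n".join(CHUNKS[name] for name in AUDIENCE_FORMATS[audience])

print(assemble("professional_accreditor"))
```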

Recommended approach:

  • Print out a hard copy of the rubric (already being revised thanks to feedback at our forums).
  • Read through it to get the flavor.
  • Read the Honors Self-Study.
  • Rate the study using the online survey/rubric.  (You can cut and paste language from the rubric into the comment box on the rating form online, and that will help Honors understand the criteria you selected as important to your review of their self-study, and it will help us refine the rubric.)

In the news today:
October 26, 2009, 02:53 PM ET
Most Colleges Try to Assess Student Learning, Survey Finds
A large majority of American colleges make at least some formal effort to assess their students’ learning, but most have few or no staff members dedicated to doing so. Those are among the findings of a survey report released Monday by the National Institute for Learning Outcomes Assessment, a year-old project based at Indiana University and the University of Illinois. Of more than 1,500 provosts’ offices that responded to the survey, nearly two-thirds said their institutions had two or fewer employees assigned to student assessment. Among large research universities, almost 80 percent cited a lack of faculty engagement as the most serious barrier to student-assessment projects.

External Interest in Rain King from TLT Group

Program review rubric

From: Stephen C. Ehrmann [mailto:ehrmann@tltgroup.org]
Sent: Monday, October 26, 2009 12:06 PM
To: Larry Ragan; Abdous, M’Hammed; Jim Zimmer
Cc: Gary Brown
Subject: Program review rubric

Hi,
I mentioned to each of you that Gary Brown and his colleagues were in the early stages of using Flashlight Online [the TLT Group’s re-branding of the WSU online survey tool Skylight] to deploy an interesting set of rubrics for program review/evaluation. Programs would get the rubrics in advance and use those ideas to document their performance. Their reports and a Flashlight form with the rubrics could then be sent to reviewers; their responses to the rubric could then be easily summarized and displayed. I’ve seen the rough draft of their rubric and it seems quite promising to me. It’s designed for the review of academic departments, but I think the idea could be adapted for use with faculty support/development units.
When the material is ready for a wider look in a few weeks, Gary will send me a URL and I can pass that along. Or you could contact Gary directly if you like. His email address is BrownG@wsu.edu
Steve
**********
Stephen C. Ehrmann, Ph.D.
Director of the Flashlight Program for the Study and Improvement of Educational Uses of Technology;
Vice President, The Teaching, Learning, and Technology Group, a not-for-profit organization
Mobile: +1 240-606-7102
Skype: steveehrmann

The TLT Group: http://www.tltgroup.org
The Flashlight Program: http://www.tltgroup.org/flashlightP.htm
Blog: http://tlt-swg.blogspot.com/

Old Dominion and Penn State are both thinking about how to design comprehensive evaluations of faculty support. Your rubric for program review seems like it could be adapted to their purposes. I was talking with folks from Mount Royal (Calgary) at ISSOTL;