Analysis of Inter-rater agreement 2009-10

Lee,

Thanks for telling me that you completed rating Honors also.

Our average ratings for that program were 5, 5.5, 4.5, and 5, so we are a little lower than you, but in the same category, “Integrating,” in all but one.

You can see all our results here: https://universityportfolio.wsu.edu/2009-2010/Pages/default.aspx

We are exploring two measures of inter-rater reliability, within 1 point and within the same category.

In terms of scores, see the graph, which we think is good: 83% of our scores are within 1 point of each other.

Regarding being in the same category, we are not doing as well; it seems that we often come close but straddle the lines.
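
If it helps to see the two agreement measures concretely, here is a minimal sketch of how percent-within-1-point and percent-same-category agreement can be computed. The paired ratings and the category cut points below are hypothetical illustrations (only “Integrating” is an actual category name from our rubric), not our analysis script.

```python
# Minimal sketch: two raters' scores for the same self-studies (hypothetical data),
# plus illustrative cut points mapping a numeric rating onto rubric categories.
pairs = [(5.0, 5.5), (5.5, 6.0), (4.5, 5.0), (5.0, 6.0)]  # (our rating, external rating)

def category(score):
    """Map a numeric rating onto an illustrative set of rubric categories (assumed cut points)."""
    if score < 3:
        return "Emerging"      # hypothetical category name
    elif score < 5:
        return "Developing"    # hypothetical category name
    return "Integrating"

# Measure 1: fraction of rating pairs that differ by no more than 1 point.
within_one = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)

# Measure 2: fraction of rating pairs that fall in the same rubric category.
same_category = sum(category(a) == category(b) for a, b in pairs) / len(pairs)

print(f"Within 1 point: {within_one:.0%}")
print(f"Same category:  {same_category:.0%}")
```

With the example data above, every pair is within 1 point but one pair straddles a category boundary, which is exactly the pattern described here: close scores, different categories.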

What is valuable about your rating two programs (one high and one low) is that we can begin to get a sense that you see our measure the same way that we do.  Another kind of test we need to do is to see whether outsiders agree with us in the messy middle.

We have more work like this to do with external stakeholders to see how well our tool plays in wider arenas.

Nils

On 10/13/10 4:40 PM, “Lee” wrote:

> Hi Nils,
>
> I sent in my review of Honors.  I gave them all top marks.  Was I right?  They
> struck me as being the Bar we’re all trying to reach!  It’s almost like you
> wrote the review rubric to FIT what they’ve done!?
>
> Lee
> ________________________________
> From: Nils Peterson [nils_peterson@wsu.edu]
> Sent: Tuesday, September 28, 2010 3:47 PM
> To: Lee
> Subject: Another WSU program to review
>
> Lee,
>
> Rather than Business, I’m giving you our Honors program. This is a program
> that has worked with our unit for several years and we know them.
>
> I think you will find it contrasts with Biology’s report in ways that may help
> you exercise more of the rubric’s scale.
>
> Thanks for your interest and help


Engaging Employers and Other Community Stakeholders

Do you have ideas or examples of good practice of working with employers to promote workforce development? UK universities and colleges are under pressure to do “employer engagement” and some are finding it really difficult. This is sometimes due to the university administrative systems not welcoming non-traditional students, and sometimes because we use “university speak” rather than “employer speak”.
— a UK Colleague

Washington State University’s Office of Assessment and Innovation has been working on this question for several years. We presented this spectrum diagram as a way to think about how more traditional institution-centric learning differs from community-based learning. It may point to some of the places your programs get stuck when thinking about this question.

We have also been exploring methods to gather assessments from stakeholders (employers as well as others) about aspects of academic programs. This example shows the twinned assessment of student work using a program rubric and assessment of the faculty’s assignment that prompted the work. We invite stakeholders to engage in both assessments. In other implementations of this process, we have asked stakeholders about the utility of the rubric itself.

We are also finding differences in the language used by faculty, students, and employers. When asked about the most important things to learn in a business program, we got this feedback.

Another example of different groups using different language is this one, where industry and faculty used different language, with different foci, to give feedback to students. In particular, we saw industry use “problem” as in “problem statement,” while faculty used “problems” as a synonym for “confused” and “incorrect.”

Our method for learning about both language and values is to use simple surveys of stakeholders as they are engaged with us in assessment activities. For example, here (In Class Norming Survey) we asked people who had just assessed student work using a program rubric about the importance of the rubric itself.

In this survey (AMDT Stakeholder Survey) a fashion design and marketing program is asking industry partners about language and criteria, as a precursor to building a program-wide assessment rubric. All these activities help programs understand the wider context in which they operate.

More on this work can be found in this article: Brown, G., DesRosier, T., Peterson, N., Chida, M., & Lagier, R. (2009). Engaging Employers in Assessment. About Campus, 14(5), Nov-Dec 2009. (NUTN award for best essay, 2009.)

It may help to understand that we define stakeholders broadly to account for the variation among academic programs: employers, alumni, students themselves, professional and graduate school admissions officers, audiences (as in performance arts), etc.

We have now developed a rubric to guide the assessment of the self-studies that our academic programs are doing as part of our university-wide system of assessment, a component of our institution’s regional accreditation activities. You can see a snapshot of how our Colleges are doing here.

Updating and Advancing Accreditation Self Studies

Program Assessment Points and Liaisons:

Thank you for the work you and your colleagues have done and are doing on assessment so far, and thanks for the feedback.  We continue to need your help addressing the many entities that have an interest in WSU’s commitment to student learning outcomes. There are audiences at the state, regional, and federal levels: the Northwest Commission on Colleges and Universities (NWCC&U), the Office of Financial Management (OFM), the Board of Regents, and the Higher Education Coordinating Board, which in turn report to the US Department of Education.

Communicating to these various audiences and implementing assessment that is meaningful and useful to you at the program level, for WSU, and for our accreditors remains our goal.

So again we need your help putting WSU’s INSTITUTIONAL System of Assessment in place for the NWCC&U report due October 1, 2010.  For that report, and to help the NWCC&U respond to the federal pressures it is engaging on our behalf, we need to demonstrate that:

1.      WSU has an institutional system of assessment.

2.      The assessment efforts are verifiable and credible.

3.      EVERY WSU program contributes to this system.

4.      All programs support WSU’s four Strategic Goals and six Learning Goals.

5.      We are using solid evidence to make improvements and guide change.

The next program self-study is due MAY 17, 2010.

·        The self-study should include revisions to address OAI feedback for all sections PLUS section 3 of the template (analysis of data gathered in 2009-2010 and/or methods or tools used, and your program’s assessment action plan for the 2010-2011 academic year).

To prepare for the May self-study, please find the attached template or download a new template from the OAI website at:

http://oai.wsu.edu

·        This site also provides a number of resources that you may find helpful during the revision and drafting process.

·         The OAI has collected feedback from programs, faculty, reviewers, and external assessment experts about the Guide to Effective Program Assessment rubric and done an analysis of the December results. The resulting revised rubric is attached to this note and is also available at:
http://oai.wsu.edu

·        The May self-study and OAI’s feedback are designed to help programs meet the fall semester 2010 NWCC&U accreditation reporting deadline currently scheduled for October 1st, 2010.

We welcome the opportunity to join your assessment team and to answer questions and concerns related to the feedback we have provided.  We hope you will invite us to join you so we can learn more about the approach you are using and share the resources and strategies we have seen that help make this effort useful for WSU faculty and students (the real focus that makes this work worthwhile).

We recognize, too, that many folks have questions or concerns about this process.  To address those concerns, we invite you to join us at one of two open forums this month to discuss this context, clarify upcoming reporting needs, and together identify possible approaches and strategies:

Office of Assessment and Innovation Open Forums

April 15th at 3:00 in CUE 502

April 16th at 2:00 in CUE 502

If you cannot make one of these two forums, you’re welcome to call us up, drop by, or invite us to meet with you and your colleagues.

Thanks again for your hard work in this important effort,

Gary Brown

Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/

Program Reactions to Self-Study Feedback


I met with a program that expressed feeling frustrated and demoralized by the OAI feedback on their self-study.  They wanted to know what the penalties were for programs that didn’t provide assessment self-studies.  They asked that the results be transparent; that there be consequences for non-compliant programs as well as acknowledgement for programs that made the effort.

They pointed out that nothing in the NWCCU standards explicitly requires including stakeholders in the assessment process, and that while they recognized the value of such inclusion, it was not necessarily practicable, especially when they already felt extended beyond their time and budget to fulfill accreditation requirements. We discussed that “stakeholders” is a term that needs to be unpacked: it tends to bring up images of industry or future employers, but stakeholders vary widely from discipline to discipline. One form of stakeholder that does not change, however, is students, and they agreed that students could be given greater agency in the assessment process.

They also felt pushed to come up with “data,” which they saw as being essentially number-driven. We agreed that the term data should be scrapped in favor of the term “evidence,” and that the kinds of evidence they chose to include should be in whatever form was most useful for informing them about the gaps in their learning outcomes and providing insight about what changes to make.

In the end the program came to understand that a simple narrative account of their assessment practices, bolstered by relevant evidence placed in the evidence folder and referenced in the narrative, would be a much more effective and less onerous approach to fulfilling the self-study requirements.  Much of what the program is currently doing was not being captured in the report, due to the misconception that the institution is looking for numerical data as hard evidence.

They agreed to look over the next draft of the Guide to Assessment and provide feedback about its clarity and usefulness.

Comment on the original post

Very Productive

This sounds like a very productive meeting, with a lot of learning for them and for us.  Congratulations!
Yeidel, Joshua, 2/23/2010 1:23 PM

OAI Assessment Request

A Liaison requests the following:

Goal:

Meet the fall 2010 NWCC&U deadline requiring that ALL programs be doing program-level learning outcomes assessment and using the evidence to implement improvements in the fall of 2010.  The program-level assessment needs to be aligned with the WSU university assessment system.

To meet this goal, all programs will need to be doing assessment in the spring of 2010 (or summer) in order to be implementing improvements in the fall of 2010.  To meet the level of assessment quality required, we are asking all programs to share their plans by December 18 so that OAI can provide feedback on them.

NWCC&U recognizes that each institution and program has unique characteristics, so no single assessment approach is expected or required, but the principles of good assessment can be generalized and will be used to guide WSU’s process as explained below.

Suggested Approach:

1.      Identify your OAI contact and ask for a meeting so that the assessment plan you develop will be useful to you for improving your program.
2.      To prepare for the meeting (or if you choose to work independently), go to http://oai.wsu.edu/  The new OAI website offers a variety of resources. Along the bottom edge of the banner, in the gray bar, are three links:
·        Self-Study Template (.doc)
·        Guide to Assessment
·        Resource Packet
For Dec 18, each program is expected to complete a Self-Study Template. Only sections 0, 1, 2, and 4 need to be completed. Programs that are prepared to write section 3 are welcome to do so.

“Program Point Contacts” and program collaborators are encouraged to use the Guide to Assessment to self-evaluate their self-study. This same tool will be used by OAI to give formative feedback once the self-studies are received.

The Resource Packet contains additional documents that you may find useful, including a step-by-step process that we recommend but do not require.

Again, program points are encouraged to contact their OAI Point person for assistance.

When complete, the document should be transmitted from the Program Point to the OAI Contact.  A list of these people can be found on the OAI website, left column, red link “Accreditation Liaison Contact List.”  If there are errors in this list, College Liaisons can log in and make changes.

——-
Gary,
I cannot find the link on the OAI website to the assessment materials you talked about in the meeting today.  This creates a great opportunity for me to request from you a stepwise access protocol that I can send to our departmental contact points.  It would be most helpful to have two things from you to initiate this process:

1.     A clear, concise description of what you expect to be delivered to the contact person from each program on Dec. 18.  After reviewing my notes from today’s meeting I am certain that I cannot convey this request clearly to people.
2.     The access protocol mentioned above.

Are you willing to provide me with this information?

I will request that the point people talk to your contact person before initiating this process.  I expect to get pushback as a result of this deadline being right before final grades are due.  We can only ask so much of very busy people at a very hectic time of year.

Thank you.

Professional and Institutional Accreditation (ASPB View)

From the Executive Director
American Society of Plant Biologists

Hi again Gary.

I always enjoy learning more about topics that are unfamiliar to me, and accreditation is definitely one such topic! Clearly we could spend considerably more time in dialog (and I hope that we will!), but to directly answer your question, “What accountability bodies (or just pressures) are plant scientists responding to, if any?” I think the answer currently is ‘none.’

That said, you may be familiar with the (NSF-sponsored) “Vision and Change” exercise (see http://www.visionandchange.org/), which is one of the approaches to re-envisioning undergraduate biology education with which ASPB has been closely involved. Although it’s still somewhat fuzzy, it does seem to me to be coming into focus, and SoTL is definitely a major emphasis.

Although I am not aware that the NSF is actively pursuing accreditation metrics as (sort of) one end of an educational/research continuum, it is unusual among science research agencies in that it does have programs that focus on SoTL (in the Education and Human Resources Directorate), as well as the (perhaps better known) research programs (in the Biology, Geosciences, Math, etc. directorates). It is also clear to me that the NSF is making large strides, where appropriate, in interdigitating these programs. Which is to say that program officers are actively encouraged to work together across the directorates.

It is also pertinent, I think, that the NSF instigated a requirement four or five years ago that has had a profound impact on the way in which funded researchers approach the dissemination of their science. Known as “Criterion 2” or “broader impacts,” it obliges grantees to (in a nutshell) demonstrate to the NSF the ways in which they have engaged the public and/or educators and students around the objectives of the funded research project. This (of course) is not directly related to accreditation; my point, though, is that should the NSF so choose, it might be able to find ways to — er — induce more effective teaching among its grantees. (There’s a disconnect here, as I’m sure you appreciate. Organizationally, the role of a grantee as a teacher at his or her institution is largely distinct from their role as an NSF-funded researcher and is governed by different structures. But just because it’s a tough nut doesn’t mean there won’t be people or organizations willing to have a go at cracking it.)

Getting back to ASPB — and recognizing that we are currently operating in an ill-defined ‘space’ and with (one) immediate goal of improving our members’ understanding and application of SoTL in their teaching — there is a raft of resources to which I could point you. I’ll start with just a couple, though, and copy in my colleague Katie Engen. Katie is a) more immediately familiar with ASPB’s efforts in this area, and b) in closer touch with our Education Committee (http://www.aspb.org/committees/education.cfm), which tends to pay closer attention to formal K-16 education, and with members of our Education Foundation board (http://www.aspb.org/education/foundation/board.cfm), which focuses more on informal, public education. I’m sure that she will be able to offer additional thoughts and links to resources, and she’ll be a good conduit –should such be needed– to members and leaders who are directly engaged in these efforts.

Speaking of which, we are encouraging members to become both more literate about SoTL and more willing to properly study the efficacy of their own teaching (see, e.g., http://www.aspb.org/newsletter/julaug09/09publish.cfm; please let Katie know if you can’t access this page and she’ll send you a pdf). We’re encouraging direct engagement by the society’s members in K-12 education (not necessarily your immediate interest, but the caliber of primary and secondary education has an obvious and direct impact on tertiary education); see http://www.plantcell.org/cgi/content/full/19/8/2311 for an article on this topic published recently in our top-notch research journal and
http://www.aspb.org/education/importance_statement.cfm for a statement on this topic that was ratified by the Society’s executive committee a couple of months ago.

We have also articulated some fundamental principles relating to knowledge of plants (see http://www.aspb.org/education/foundation/principles.cfm), and a project funded by ASPB’s Education Foundation is developing SoTL-informed hands-on modules around each of these principles.

I’ll stop there and invite both you and Katie to weigh in with any additional thoughts and comments.

Cheers,
Crispin

From: Brown, Gary [mailto:browng@wsu.edu]
Sent: Wednesday, September 02, 2009 5:30 PM
To: Crispin Taylor
Subject: RE: accreditation

Crispin,

Thanks for the quick response!

You have very acutely inferred the heart of my question (though I agree it was a bit muddled).

I’m looking at the way CHEA and almost every other regional and professional accrediting agency is in the process of revising standards, essentially raising the bar and requiring assessment to focus on outcomes (rather than coverage) and encouraging educators to establish systematic assessment (rather than the fire drill events we are so adept at).  The goal of this across the USA has been to put a renewed focus on making changes in teaching and curricula based upon evidence.

I know that sciences are often without specific accreditors, though not without influencing agencies like NSF, NIH, and, presumably, ASPB.  At the same time, professional accreditation organizations like ABET (Engineering), AACSB (Business), NCATE (Education) etc. are also revising their standards to better align with regional accreditors.

So the question was what accountability bodies (or just pressures) plant scientists are responding to, if any.  I appreciate your answer.  Your response also raises a follow-up question: when you say you are ‘actively engaged with’ these efforts, I wonder how you (or I, in my role in the Office of Academic Effectiveness at WSU) can do more to engage and leverage the important influence of professional peers to encourage attention to the scholarship of teaching and learning.  As you can imagine, the challenge I face in my role is to keep the discussion focused on enriching the student learning experience rather than on perfunctory compliance with an annoying bureaucracy.

I am currently embarking upon a very exciting project with a group of plant scientists here at WSU, so any leads you might provide will be most welcome to our team as we endeavor to expand and deepen our effort.  And, needless to say, as I anticipate a potentially terrific model of integrated research and assessment, done transparently online with what will be available tools, you and ASPB will certainly be welcome to join us.

Gary

From: Crispin Taylor [mailto:ctaylor@aspb.org]
Sent: Wednesday, September 02, 2009 12:18 PM
To: Brown, Gary
Subject: RE: accreditation

Hi Gary:

Apologies for being dense, but I’m not quite sure what your question is driving at. ASPB is well aware of — indeed, is actively engaged with — various efforts to re-envision the undergraduate biology curriculum, and we assuredly recognize the value of applying what is being learned through research on teaching and learning to improve pedagogy and instructional outcomes. We’re also investing in various online mechanisms and tools aimed at teaching content and/or process. I presume that many of these threads will come together in more formal accreditation programs/efforts, but at this point I do not believe that ASPB is promoting or participating in any such programs.

Having said all that, I am still concerned that I may be missing the point of your question. I think it’d help me do a better job answering that question (or referring you to someone who can) if you could provide me with some examples of the kinds of things you are referring to (e.g., examples from other disciplines), as well as some additional information regarding the context in which you are working.

Thanks for contacting me; I hope I will be able to help out, either directly or indirectly.
Cheers,
Crispin

Crispin Taylor, Ph.D.
Executive Director
American Society of Plant Biologists
15501 Monona Drive
Rockville, MD 20855-2768
Direct: 301-296-0900
Main: 301-251-0560
Fax: 301-251-6740
ctaylor@aspb.org
http://www.aspb.org/

It’s not too soon to save the dates…
PLANT BIOLOGY 2010
Montréal, Canada
Jul. 31 — Aug. 4, 2010

From: Brown, Gary [mailto:browng@wsu.edu]
Sent: Wednesday, September 02, 2009 12:08 PM
To: Crispin Taylor
Subject: accreditation

Hi Crispin,
I’m working with our plant biology programs here at Washington State, and I’m interested in learning more about various educational accreditation influences that may be looming relative to ASPB.  Do you know, or do you know somebody I might contact to learn, where the profession may be heading?

Gary

Dr. Gary R. Brown, Director
The Office of Academic Innovation & Effectiveness
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/

Academic Effectiveness Liaison Council Meeting, Tuesday, October 27

Academic Effectiveness Liaison Council
Date: October 27, 2009
Start Time:  2:00:00 PM
End Time:  3:00:00 PM
Dialing Instructions: 5707955
Origin: Pullman (French 442)
Location: Spokane (SHSB 260), Tri-Cities (TEST 228), Vancouver (VCLS 308J)
Details: Academic Effectiveness Liaison Council
Special Request: Room reserved from 12:00 – 12:30 for set up.
Event Contact:

Donna Cofield     5-4854

Sneak Preview

1.      Liaison identification update
·        About half of our liaisons have helped identify the point people for EACH of the programs in their college.  These point people will be critical in our overall communication strategy.  Remember, even if you plan on serving as the point person for each of the programs in your college, we need to be sure we know what and how many programs need to be identified.
2.      Forum feedback on assessment process and rubric
·        We have already learned a great deal and received good feedback on the rubrics.  We will have a short version very shortly to complement the longer version, and there are other notable refinements gained from the liaisons’ critical vantage point.  Keep the input coming!
3.      Conversations and support for process.
·        We have already received positive support for this effort from Old Dominion, Mount Royal, the University of Kentucky, and Penn State, who have heard of this work through the mock-up we presented with the TLT Group and are very interested in partnering with us (an external review exchange).
·        A conversation with NWCC&U leadership has been scheduled for later this week.
·        The Western Cooperative of Educational Technologies has already requested a presentation on this work for next year in La Jolla.  (Who wants to go?)
4.    We still need your participation in identifying program contacts and in doing the pilot Honors Self-Study assessment.

The task again:
1.      Go to https://universityportfolio.wsu.edu/20082009/Pages/default.aspx
2.      Scroll down to the link near the bottom ‘Honors Review Rubric’ which opens an online survey.
3.      The first page of that link is instructions, at the bottom of which is a ‘begin’ button.
Remember, when we have worked through refinements, this work should provide a new template for reporting that streamlines the rating.  By writing reports in template ‘chunks,’ we will be able to concatenate them into various formats to address the different reporting requests we get from professional accreditors, the HEC Board, OFM, and anybody else who might appreciate WSU’s commitment to improving student learning.

Recommended approach:

  • Print out a hard copy of the rubric (already being revised thanks to feedback at our forums).
  • Read through it to get the flavor.
  • Read the Honors Self-Study.
  • Rate the study using the online survey/rubric.  (You can cut and paste language from the rubric into the comment box on the rating form online, and that will help Honors understand the criteria you selected as important to your review of their self-study, and it will help us refine the rubric.)

In the news today:
October 26, 2009, 02:53 PM ET
Most Colleges Try to Assess Student Learning, Survey Finds
A large majority of American colleges make at least some formal effort to assess their students’ learning, but most have few or no staff members dedicated to doing so. Those are among the findings of a survey report released Monday by the National Institute for Learning Outcomes Assessment, a year-old project based at Indiana University and the University of Illinois. Of more than 1,500 provosts’ offices that responded to the survey, nearly two-thirds said their institutions had two or fewer employees assigned to student assessment. Among large research universities, almost 80 percent cited a lack of faculty engagement as the most serious barrier to student-assessment projects.