Re-thinking Course Evals for Prog Eval purposes / Turmoil in Colleges

A conversation on the bus home from work with a College Liaison led to this exchange:

On 4/6/10 9:18 AM, “Nils Peterson” wrote:
Further to our conversation on the bus: collecting actionable data, and acting on it, are the things we believe are likely to engage students and therefore to increase response rates on course evals. How about we get together and talk about some ways to re-think the course eval process? I'm looking for a partner who is in a position to take a bit of an adventurous view of the topic.

(I want to see if this Liaison would venture into something like paired assessment where faculty declare their goals for the course and students say what they perceived happened. We’d get curricular maps, which could then be used by the program to help students navigate course selection.)

Liaison replies:
Nils,
I will be interested in doing this.  However, let’s put off the conversation for several weeks.  Amongst other things, I still do not know where I am going to land beginning July 1.  That is to say, I do not know what my position will be and in what college.


Why Public Reporting

One faculty member responds to OAI assessment plan feedback:

“Oh -kiss my #$$, I have been here long enough to know that THAT thing
will be flushed down the toilet.”

Program starts assessment activities for May 17

Program Point writes:

>
> Hello Nils - just in case you thought we have fallen off the edge of the
> earth, I thought I would touch base with you to update you on the progress we
> are making vis-a-vis undergraduate assessment. First, I apologize for not
> being more proactive, but you have to realize - we are a very small
> department/school, and under VERY much stress trying to juggle all the
> extra requests this semester, without our Director [on sabbatical leave] -
> from the grad school, from this convoluted process of the merger with
> [another program], and trying to keep abreast of all the course
> justifications etc., etc. - and keeping up with research/teaching! Having
> said that, we of course recognize the importance of getting this process
> in place for the undergrad assessment.
>   Since to date we have no historical database, we are actively emplacing
> several - e.g., exit interviews and staying in touch with our graduates as
> they join the work force. We have started a database on their
> acceptance to graduate-level programs [we consider this one of the best
> 'assessments' of the success of our training]. We are asking all of our
> faculty [i.e., 8-9 in my program, 1-2 in the sister program] to submit within
> their course syllabus several assessment questions, specifically pertaining
> to university missions and goals, to be monitored throughout the course.
> We have no official assessment teams in place, but will specifically
> monitor all these data as a faculty in our regular meetings. We are
> looking to enhance the role of our Board of Visitors as the most obvious
> 'stakeholder'.
>  If you have any comments or suggestions, I would very much appreciate any
> feedback. [We will try, incidentally, to emplace a like system in the
> sister program, as the sole tenured faculty member there has a
> tremendous load.]
>        best   ~~

On 3/29/10 10:37 AM, I wrote:

Thanks for the reply.

I appreciate that your program is under stress and yet you are attempting to
work on these important issues.

I am available to assist.

For example, I'd like to encourage you to be clear about program goals and
their associated student learning outcomes. You may find the definitions on
the first page of the guide to assessment useful (see the attachment in my
previous email).

I think I can get you a couple of sample exit surveys. You would want to adapt
them, among other reasons because the samples are aligned with another
program's learning goals.

An alumni survey is another good indirect measure. I have sample surveys
that you could align with your program’s goals.

I've talked with your assistant Dean about her program, where she is
exploring gathering assessments from admissions officers (taking it a step
further than acceptance rate alone, which could be influenced by all kinds of
factors having nothing to do with quality). You might want to chat with her,
or I could help.

Looking at syllabi is great, especially asking to what extent they communicate program learning outcomes. We've also done work with programs wanting to assess a key assignment in a course: to what extent is the assignment prompting students to demonstrate the program's learning outcomes?

The above are great indicators that can triangulate if you coordinate them,
but none is a direct measure of a specific skill. Is there a capstone course
that has a project where student work could be assessed with a program
rubric?

This is a long list, and you don't need to undertake it all at once. Part of
your plan may be to develop a schedule of when you collect which kinds of
data: e.g., course evals each semester; exit interviews in May; an alumni
survey every other fall; a syllabus review as part of your program's 5-year
professional review; etc. We know you need to keep the workload manageable.
I would encourage that part of what you submit be a plan and schedule, even
if that means your data is more of a pilot nature.

I'm happy to come over and talk.

A Program Point asks about Updating Accreditation Self Studies for May 17

A Point asks:

Nils,

I am waiting for responses from the rest of the faculty.  I have requested that they submit their questions or comments by April 9th.

Just to clarify, the current action plan states that the department is undergoing a change of chairs.  Are you wanting something more?

Thanks,

On 3/26/10, I wrote:

My track-changes edits of Feb 10 on your self-study document were based on what I heard from you when, on 1/28, you canceled the help I was offering with assessing the 2009 collection of theses:

You wrote: “After our meeting on Tuesday, I began recruiting reviewers for the norming session. Everyone is busy and no one volunteered.  So, I brought this up with the chair who is unwilling to commit any resources, including faculty time, to reviewing the theses.   Therefore, the rating of the theses is not an option at this time.”

I just skimmed the document I returned to you Feb 10 and the new rubric I sent to you earlier this week. Section 3 now includes evidence as well as analysis and action. The thesis review could have constituted evidence. Shall we meet to brainstorm what evidence you do have available? As I review my proposed draft against the new rubric, it does not rate highly.

As an alternative or precursor to meeting, perhaps you would like to meet with your chair, examine your original draft or my Feb 10 draft against the rubric, and come to some decision about what the program is willing to (and can) do next.

I’m happy to sit in on that meeting or to try to assist in any other way you can propose.

Nils

Assessment design is messy: challenges to coordinating flexible and useful assessment for a college and its programs

Planning, designing, and implementing assessment is a messy process, as are most authentic, responsive endeavors that involve and engage many people from a variety of roles and vantage points. Different people and programs have different concerns and genuine questions.

What's the relationship between college and program assessment? Student learning goals identified by a college and by its programs are at the heart of assessing student learning and of planning useful, coordinated assessment activities. There are many challenges to designing a coordinated yet flexible assessment plan that balances multiple considerations.

OAI is addressing this question now with one WSU college, Spring 2010:

From OAI contact to a program head:

I'll briefly summarize a) assessment efforts at WSU, b) what I understand of assessment efforts in one college, c) what I understand is on the table now for next steps, and d) my role as your OAI contact, as well as OAI's role.

Assessment efforts at WSU: college-level and program-level

The current efforts for systematic assessment at WSU include two different approaches.

In some colleges, efforts are starting primarily at the program level, including developing an assessment plan, student learning goals, rubrics/tools, measures, and processes. This approach can create assessment that's immediately meaningful and useful to a program, but it also brings significant challenges in terms of “rolling up” results into something coherent at the college level. With piecemeal efforts and dozens of unaligned rubrics and measures, a great deal of work is needed to make the results useful at the college (and institutional) level.

In other colleges, assessment efforts have been coordinated at the school or college level.  This approach identifies common student learning goals across programs, and seeks to develop common rubrics/tools, measures and processes for assessment.  This approach can provide useful assessment results at the college level, with aligned measures (such as course evaluations, senior exit surveys, alumni surveys, etc.) providing complementary data.  It also brings the opposite challenge — how to build in flexibility and give adequate agency at the program level so that programs can identify and focus on gathering useful assessment data that interests and informs them to guide ongoing improvement in learning and teaching.

Summary of your college assessment efforts to date (as I understand them, primarily involving OAI/CTLT)

For several years … the college has invested time and effort in developing, piloting, and refining a college-wide rubric, with input from faculty in the various areas, in collaboration with OAI/CTLT. The assessment team, in consultation with OAI/CTLT, adapted WSU's Critical and Integrated Thinking Rubric and revised it with input from various faculty. Over two semesters, the rubric has been piloted and refined by assessing student work from five different courses in the college.

The assessment team has also developed and piloted a senior exit survey which aligns with the rubric and has started drafting an alumni survey that is similarly aligned.

Last semester, the Dean asked two faculty members to draft student learning goals for the college, in alignment with WSU's Six Learning Goals of the Baccalaureate. This was done independently of the assessment team's work.

In February 2010, OAI helped map the college rubric to these new learning goals and WSU’s Big 6, including adding another dimension for specialty (to be developed by each program).

Next steps

In March 2010, faculty in one program raised questions about the purpose of assessment in their area, the rubric, and the college's new student learning goals and performance criteria created last semester. These concerns raise important questions about how to balance program agency and assessment needs with college-level needs, including:

  • What is the relationship between the new student learning goals and performance criteria and the rubric, complementary measures, and data developed over the past two years?
  • In the future, how does the college want to balance college level assessment and program level assessment for a coherent and useful overall assessment effort?

These questions and their answers represent an important conversation among key players in the college: faculty and leadership.

From an assessment standpoint, there is no fixed path ahead.  Rather, those involved should identify goals, needs, constraints, and priorities. The college should probably outline a coordinated, yet flexible assessment plan that balances those considerations; it’s possible to try different approaches in different programs, to decide to focus on shared goals, or to do something in between.

As a note, some programs at WSU are in the process of developing staggered assessment plans, in which they assess only some goals each year.

OAI’s role

OAI has been working with this college’s assessment team for two years to develop, pilot, and refine a college-wide rubric and other complementary measures, including a senior exit survey and alumni survey.  OAI is also available to work with programs on their assessment plan, tools, and measures.

However, in the interest of making the best use of our collective time and efforts, before beginning to develop any program’s assessment plan, I suggest that the college consider the questions raised above.

If invited, OAI is available to consult with the college about different paths you might choose to take for useful, quality assessment.