[SACS] Student Learning Outcomes – Business

On the question of whether every major has to report outcomes:

From: SACS Commission on Colleges Discussion Forum [mailto:SACS-L@LISTSERV.UHD.EDU]
Sent: Wednesday, March 31, 2010 2:57 PM
To: SACS-L@LISTSERV.UHD.EDU
Subject: Re: [SACS] Student Learning Outcomes – Business

I can guarantee it is every major within a degree. Also, you must show that every program has been assessed and that improvement strategies have been implemented, not just plans to improve. And at your reaffirmation, every program must be finished with this process, not just a portion.

Comment on the original post

SACS is not NWCCU

Edit: I believe they are somewhat farther down the assessment trail.

Can we ask this same question directly of our accreditation agency?

Yeidel, Joshua at 3/31/2010 5:39 PM

Engaging Employers and Other Community Stakeholders

Do you have ideas or examples of good practice of working with employers to promote workforce development? UK universities and colleges are under pressure to do “employer engagement” and some are finding it really difficult. This is sometimes due to the university administrative systems not welcoming non-traditional students, and sometimes because we use “university speak” rather than “employer speak”.
— a UK Colleague

Washington State University’s Office of Assessment and Innovation has been working on this question for several years. We presented this spectrum diagram as a way to think about how the more traditional Institution-centric learning differs from Community-based learning. It may point to some of the places where your programs get stuck when thinking about this question.

We have also been exploring methods to gather assessments from stakeholders (employers as well as others) about aspects of academic programs. This example shows twinned assessment: assessment of student work using a program rubric, paired with assessment of the faculty assignment that prompted the work. We invite stakeholders to engage in both assessments. In other implementations of this process, we have asked stakeholders about the utility of the rubric itself.

We are also finding differences in the language used by faculty, students, and employers. When we asked about the most important things to learn in a business program, we got this feedback.

Another example of different groups using different language is this one, where industry and faculty gave feedback to students using different language with different foci. In particular, we saw industry use “problem” as in “problem statement,” while faculty used “problems” as a synonym for “confused” and “incorrect.”

Our method for learning about both language and values is to run simple surveys of stakeholders while they are engaged with us in assessment activities. For example, here (In Class Norming Survey) we asked people who had just assessed student work using a program rubric about the importance of the rubric itself.

In this survey (AMDT Stakeholder Survey), a fashion design and marketing program asks industry partners about language and criteria as a precursor to building a program-wide assessment rubric. All these activities help programs understand the wider context in which they operate.

More on this work can be found in this article: Brown, G., DesRosier, T., Peterson, N., Chida, M., & Lagier, R. (2009). Engaging Employers in Assessment. About Campus, 14(5), Nov–Dec 2009. (NUTN award for best essay, 2009.)

It may help to understand that we define stakeholders broadly to account for the variation among academic programs: employers, alumni, students themselves, professional and graduate school admissions officers, audiences (as in performance arts), etc.

We have also developed a rubric to guide the assessment of the self-studies that our academic programs are doing as part of our University-wide system of assessment, a component of our institution’s regional accreditation activities. You can see a snapshot of how our Colleges are doing here.

Program starts assessment activities for May 17

Program Point writes:

>
> Hello Nils - just in case you thought we have fallen off the edge of the
> earth, I thought I would touch base with you to update you on the progress
> we are making vis-a-vis undergraduate assessment. First, I apologize for not
> being more proactive, but you have to realize we are a very small
> department/school, and under VERY much stress trying to juggle all the
> extra requests this semester without our Director [on sabbatical leave] -
> from the grad school, from this convoluted process of the merger with
> [another program], and trying to keep abreast of all the course
> justifications, etc. - and keeping up with research/teaching! Having
> said that, we of course recognize the importance of getting this process
> in place for the undergrad assessment.
>   Since to date we have no historical database, we are actively putting
> several measures in place – e.g., exit interviews, and staying in touch
> with our graduates as they join the workforce. We have started a database
> on their acceptance to graduate-level programs [we consider this one of the
> best ‘assessments’ of the success of our training]. We are asking all of
> our faculty [i.e., 8-9 in my program, 1-2 in the sister program] to submit
> within their course syllabus several assessment questions, specifically
> pertaining to university missions and goals, to be monitored throughout
> the course. We have no official assessment teams in place, but will
> specifically monitor all these data as a faculty in our regular meetings.
> We are looking to enhance the role of our Board of Visitors as the most
> obvious ‘stakeholder’.
>   If you have any comments or suggestions, I would very much appreciate any
> feedback. [We will try, incidentally, to put a like system in place in the
> sister program, as the sole tenured faculty member there has a
> tremendous load.]
>        best   ~~
>
>

On 3/29/10 10:37 AM, wrote:

Thanks for the reply.

I appreciate that your program is under stress and yet you are attempting to
work on these important issues.

I am available to assist.

For example, I’d like to encourage that you be clear about program goals and
their associated student learning outcomes. You may find the definitions on
the first page of the guide to assessment useful (see attachment in my
previous email).

I think I can get you a couple of sample exit surveys. You would want to adapt them, among other reasons because the samples are aligned with another program’s learning goals.

An alumni survey is another good indirect measure. I have sample surveys
that you could align with your program’s goals.

I’ve talked with your assistant Dean about her program, where she is exploring gathering assessments from admissions officers (taking it a step further than just the acceptance rate, which could be influenced by all kinds of factors having nothing to do with quality). You might want to chat with her, or I could help.

Looking at syllabi is great, especially asking to what extent they communicate program learning outcomes. We’ve also done work with programs wanting to assess a key assignment in a course — to what extent is the assignment prompting students to demonstrate the program’s learning outcomes?

The above are great indicators that can triangulate if you coordinate them, but none is a direct measure of a specific skill. Is there a capstone course that has a project where student work could be assessed with a program rubric?

This is a long list. You don’t need to undertake it all at once. Part of your plan may be to develop a schedule of when you collect what kinds of data: e.g., course evals each semester; exit interviews in May; an alumni survey every other fall; a syllabus review as part of your program’s 5-year professional review; etc. We know you need to keep the workload manageable. I would encourage that part of what you submit be a plan and schedule — even if it means your data is more of a pilot nature.

I’m happy to come over and talk.

A Program Point asks about Updating Accreditation Self-Studies for May 17

A Point asks:

Nils,

I am waiting for responses from the rest of the faculty.  I have requested that they submit their questions or comments by April 9th.

Just to clarify, the current action plan states that the department is undergoing a change of chairs.  Are you wanting something more?

Thanks,

On 3/26/10, wrote:

My track-changes edits of Feb 10 on your self-study document were based on what I heard from you when, on 1/28, you canceled the help I was offering with assessing the 2009 collection of theses:

You wrote: “After our meeting on Tuesday, I began recruiting reviewers for the norming session. Everyone is busy and no one volunteered.  So, I brought this up with the chair who is unwilling to commit any resources, including faculty time, to reviewing the theses.   Therefore, the rating of the theses is not an option at this time.”

I just skimmed the document I returned to you Feb 10 and the new rubric I sent you earlier this week. Section 3 now includes evidence as well as analysis and action; the thesis review could have constituted evidence. Shall we meet to brainstorm what evidence you do have available? As I review my proposed draft against the new rubric, it does not rate highly.

As an alternative or precursor to meeting, perhaps you would like to meet with your chair, examine your original draft (or my Feb 10 draft) against the rubric, and come to some decision about what the program is willing (and able) to do next.

I’m happy to sit in on that meeting or to try to assist in any other way you can propose.

Nils

Assessment design is messy: challenges to coordinating flexible and useful assessment for a college and its programs

Assessment planning, design, and implementation is a messy process, as are most authentic, responsive endeavors that involve and engage many people from a variety of roles and vantage points. Different people and programs have different concerns and genuine questions.

What’s the relationship between college and program assessment? Student learning goals identified by a college and by its programs are at the heart of assessing student learning and of planning useful, coordinated assessment activities. There are many challenges to designing a coordinated yet flexible assessment plan that balances multiple considerations.

OAI is addressing this question now with one WSU college, Spring 2010:

From OAI contact to a program head:

I’ll briefly summarize a) assessment efforts at WSU, b) what I understand of assessment efforts in your college, c) what I understand is on the table now for next steps, and d) my role as your OAI contact, as well as OAI’s role.

Assessment efforts at WSU: college-level and program-level

The current efforts for systematic assessment at WSU include two different approaches.

In some colleges, efforts are starting primarily at the program level, including developing an assessment plan, student learning goals, rubrics/tools, measures, and processes. This approach can create assessment that’s immediately meaningful and useful to a program — but it also brings significant challenges in terms of “rolling up” results into something coherent at the college level. With piecemeal efforts and dozens of unaligned rubrics and measures, a great deal of work is needed to make the results useful at the college (and institutional) level.

In other colleges, assessment efforts have been coordinated at the school or college level.  This approach identifies common student learning goals across programs, and seeks to develop common rubrics/tools, measures and processes for assessment.  This approach can provide useful assessment results at the college level, with aligned measures (such as course evaluations, senior exit surveys, alumni surveys, etc.) providing complementary data.  It also brings the opposite challenge — how to build in flexibility and give adequate agency at the program level so that programs can identify and focus on gathering useful assessment data that interests and informs them to guide ongoing improvement in learning and teaching.

Summary of your college assessment efforts to date (as I understand them, primarily involving OAI/CTLT)

For several years … the college has invested time and effort in developing, piloting, and refining a college-wide rubric, with input from faculty in the various areas, in collaboration with OAI/CTLT. The assessment team, in consultation with OAI/CTLT, adapted WSU’s Critical and Integrated Thinking Rubric and revised it with input from various faculty. Over two semesters, the rubric has been piloted and refined by assessing student work from five different courses in the college.

The assessment team has also developed and piloted a senior exit survey which aligns with the rubric and has started drafting an alumni survey that is similarly aligned.

Last semester, the Dean asked two faculty members to draft student learning goals for the college, in alignment with WSU’s Six Learning Goals of the Baccalaureate. This was done independently of the assessment team’s work.

In February 2010, OAI helped map the college rubric to these new learning goals and WSU’s Big 6, including adding another dimension for specialty (to be developed by each program).

Next steps

In March 2010, faculty in one program raised questions about the purpose of assessment in their area, the rubric, and the college’s new student learning goals and performance criteria created last semester. Their questions point to important issues about how to balance program agency and assessment needs with college-level needs, including:

  • What is the relationship between the new student learning goals and performance criteria and the rubric developed over the past two years, along with its complementary measures and data?
  • Going forward, how does the college want to balance college-level and program-level assessment to achieve a coherent and useful overall assessment effort?

These questions and their answers represent an important conversation among key players in the college: faculty and leadership.

From an assessment standpoint, there is no fixed path ahead.  Rather, those involved should identify goals, needs, constraints, and priorities. The college should probably outline a coordinated, yet flexible assessment plan that balances those considerations; it’s possible to try different approaches in different programs, to decide to focus on shared goals, or to do something in between.

As a note, some programs at WSU are in the process of developing staggered assessment plans, in which they assess only some goals each year.

OAI’s role

OAI has been working with this college’s assessment team for two years to develop, pilot, and refine a college-wide rubric and other complementary measures, including a senior exit survey and alumni survey.  OAI is also available to work with programs on their assessment plan, tools, and measures.

However, in the interest of making the best use of our collective time and efforts, before beginning to develop any program’s assessment plan, I suggest that the college consider the questions raised above.

If invited, OAI is available to consult with the college about different paths you might choose to take for useful, quality assessment.

Draft timeline for Assessment Self-study process

The draft timeline shows how the self-study data feeds in from programs to the template and flows back out from the template to various reporting needs.

This diagram was needed to explain the system to NWCCU and also to share with Liaisons and Program Points, to help them understand that the goal is focus and labor saving.

[We never finished this diagram or communicated it out. Ed.]

Structure of and Access to Rain-King-Related Sites

From: Joshua Yeidel
Date: Thu, 18 Mar 2010 17:33:07 -0700
To: “OAI.Personnel”
Conversation: Teamsite Migration Changes
Subject: Teamsite Migration Changes

NEW SITE STRUCTURE PLAN
This week, discussions about what-goes-where demonstrated (to me, anyway) that we didn’t have an easy-to-understand plan for universityportfolio.wsu.edu (“up.w.e.”) and assessment.wsu.edu (“a.w.e.”) with regard to restricted-access sites vs. public access. I worked out a somewhat-revised plan that can be summarized quickly:

  • All new sites on up.w.e. are public-access.
  • All sites on a.w.e. are restricted-access (in various ways), EXCEPT
  • All sites in a.w.e./public are public-access.
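For anyone who prefers the rule stated procedurally, here is a minimal sketch (Python, purely illustrative; the two host names come from the plan above, while the helper function, its path handling, and the treatment of unknown hosts are assumptions rather than part of the actual teamsite configuration):

    def is_public(host: str, path: str) -> bool:
        """Return True if a site would be public-access under the plan above.

        Hypothetical helper for illustration only; the real access control
        lives in the teamsite/web-server configuration, not in code.
        """
        if host == "universityportfolio.wsu.edu":   # "up.w.e."
            return True                             # all new sites are public-access
        if host == "assessment.wsu.edu":            # "a.w.e."
            # only the /public subtree is public; everything else is restricted
            return path == "/public" or path.startswith("/public/")
        return False                                # other hosts: treated as restricted

For example, a hypothetical site at assessment.wsu.edu/public/reports would come out public-access, while one at assessment.wsu.edu/oai would not.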

More information in the attached wiki page.

OUE Teamsite Migration 2010-New Site Structure – ctltwiki