Reporting Responsibilities to OFM for 2010

Re: Draft of Reported Changes
On Assessment Reporting Responsibilities to the Office of Fiscal Management:
For the limited purpose of next year’s budget submission to OFM, which only identifies targets two more years out, WSU has set targets of 50% and then 75% of programs being able to provide evidence of improvements in learning outcomes assessment.
After the first of the year, WSU will return to working on Performance Agreements, which will probably look out 6-8 years.  Our plan is to be able to predict with confidence when we will be able to report that 100% of programs have made improvements in learning outcomes assessment.

Building a learning community online

I have been thinking about dissemination and adoption of knowledge as our organization (formerly WSU’s Center for Teaching, Learning and Technology) is reorganized to become the Office of Assessment and Innovation (OAI). Our unit’s new challenge is to help the university develop a “system of learning outcomes assessment” in response to new requirements from our accrediting body, NWCCU.

We have captured a discussion about how our unit’s web presence might be changed in a series of notes and whiteboard shots attached to this blog post.

One part of our ideas for this “accreditation system” can be found in this presentation for the TLT Friday Live on harvesting feedback across multiple levels of the university.  We have a prototype of one of the middle tiers running now to test and refine the rubric.

Our system depends on OAI staff working with “Liaisons” for each College and their “Points” in each academic program, and on developing skills among the Liaisons and Points so that they can provide useful feedback to programs on the assessment activities the programs are undertaking. Because of the diversity of WSU, the specific learning outcomes assessment that programs undertake will need to vary by program. What the university seeks is “robust” assessment of student learning. Our method (links above) involves a meta-assessment of the assessment practices of the programs. For programs to understand and develop robust assessment strategies, I believe that the OAI’s, the Liaisons’, and the Points’ professional development needs to be a key component of the “system of assessment.” [That is, professional learning in the discipline.]

The challenge is to provide professional development in the context of a multi-campus university, with programs and learners at diverse places in their own learning.
We have advocated that learners find their community of practice, join it, and work on their problem in the context of that community. However, that assumes the community of practice exists in an organized way that can be joined. Presently, the CoPs I’m aware of are loose-knit collections of bloggers who have developed the skills of tracking one another. Novices would need to learn these skills as an entry requirement to their own participation. That barrier to entry/participation is probably too high for the WSU community we need to reach. I previously wrote a manifesto describing how our unit should change its web strategy. That proposal also included the concept of finding/building the community of practice online — but it did not solve the problem of how to build that community.

In 2006, Dave Cormier proposed the idea of a “feedbook” of readings based on an RSS feed, rather than on traditional paper media. The book was more dynamic because it could be based on blogs or other contemporaneous sources.  In his later reflections, Dave points to the interesting perspective that the feedbook is (or can be) a collaborative effort among a community of learners.

“In addition to the freshness of the material, the multiplicity of voice and perspective and the fact that your textbook will never be out of date, one of the first things that would happen is a decentralization of the instructor. While the instructor would usually be responsible for the basic set of links…gone will be the rabbit out of a hat magic that comes from controlling the flow of knowledge. Students will actually be able to add to that flow of knowledge as their research brings up new sources of course material.”
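To make the feedbook mechanism concrete, here is a minimal sketch (my own illustration, not part of Dave’s proposal) of pulling several blog feeds into one chronological reading list. It assumes the third-party Python feedparser library, and the feed URLs are placeholders.

```python
# Minimal "feedbook" sketch: merge several blog feeds into one
# chronological reading list. Assumes the third-party feedparser
# library; the URLs below are placeholders, not a real course list.
import feedparser
from time import mktime
from datetime import datetime

FEEDS = [
    "https://example.edu/instructor/blog/feed",   # instructor's base set of links
    "https://example.org/student-one/blog/feed",  # students add their own sources
    "https://example.net/outside-expert/feed",
]

def build_feedbook(feed_urls):
    """Collect entries from every feed and sort them newest-first."""
    entries = []
    for url in feed_urls:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            published = entry.get("published_parsed") or entry.get("updated_parsed")
            entries.append({
                "source": parsed.feed.get("title", url),
                "title": entry.get("title", "(untitled)"),
                "link": entry.get("link", ""),
                "published": datetime.fromtimestamp(mktime(published)) if published else None,
            })
    # Newest material first, so the "textbook" is never out of date
    entries.sort(key=lambda e: e["published"] or datetime.min, reverse=True)
    return entries

if __name__ == "__main__":
    for item in build_feedbook(FEEDS)[:20]:
        stamp = item["published"].date() if item["published"] else "undated"
        print(f"{stamp}  {item['source']}: {item['title']}  {item['link']}")
```

Technically, that is all the decentralization Dave describes requires: the instructor seeds the list of feeds, and students or outside experts can add their own sources to it.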

Dave’s thoughts about multiplicity of voice and perspective seem to fit with Lave and Wenger’s Situated Learning: Legitimate Peripheral Participation (Cambridge University Press, 1991).

John Seely Brown summarizes several of Lave and Wenger’s writings in a piece written for Xerox PARC:

“This work unfolds a rich, complex picture of what a situated view of learning needs to account for and emphasizes, in particular, the social, rather than merely physical, nature of situatedness…

“Next, a few clarifications are probably helpful. First, as Lave (1991) herself notes, the situation is not simply another term for the immediate, physical context. If it is to carry any significant conceptual import, it has to be explored in social and historical terms. Two people together in a room are not inevitably identically situated, and the situated constraints on practice do not simply arise in and through such isolated interactions. The people and the constraints importantly have social and historical trajectories. These also need to be understood in any situated account.

“Second, community of practice denotes a locus for understanding coherent social practice. Thus it does not necessarily align with established communities or established ideas about what communities are. Community in Lave & Wenger’s view is not a “warmly persuasive term for an existing set of relations” (Williams, 1977). Communities can be, and often are, diffuse, fragmented, and contentious. We suspect, however, that it may be this very connotation of warm persuasiveness that has made the concept so attractive to some.

“Third, legitimate peripheral participation (LPP) is not an academic synonym for apprenticeship. Apprenticeship can offer a useful metaphor for the way people learn. In the end, however, in part because of the way apprenticeship has historically been “operationalized,” the metaphor can be seriously misleading, as LPP has occasionally been located somewhere between indentured servitude and conscription.

“As Lave and Wenger put it:
‘Legitimate peripheral participation is not itself an educational form, much less a pedagogical strategy or a teaching technique. It is an analytic viewpoint on learning, a way of understanding learning. We hope to make it clear that learning through legitimate peripheral participation takes place no matter which educational form provides a context for learning, or whether there is any intentional educational form at all. Indeed, this viewpoint makes a fundamental distinction between learning and intentional instruction.’ [1991: 40]

The quote above is one that I’m still trying to fully absorb [np]

JSB continues:

“One of the powerful implications of this view is that the best way to support learning is from the demand side rather than the supply side. That is, rather than deciding ahead of time what a learner needs to know and making this explicitly available to the exclusion of everything else, designers and instructors need to make available as much as possible of the whole rich web of practice – explicit and implicit – allowing the learner to call upon aspects of practice, latent in the periphery, as they are needed.”

“… The workplace, where our work has been concentrated, is perhaps the easiest place to design [for legitimate peripheral participation] because, despite the inevitable contradictions and conflict, it is rich with inherently authentic practice – with a social periphery that, as Orr’s (1990) or Shaiken’s (1990) work shows, can even supersede attempts to impoverish understanding. Consequently, people often learn complex work skills despite didactic practices that are deliberately designed to deskill. Workplace designers (and managers) should be developing technology to honor that learning ability, not to circumvent it.”

Applying these ideas to OAI/WSU

In the process of becoming the OAI we are re-vamping our website and proposing that it contain several elements:

  • a branded page that provides a basic OAI presence within the university
  • an archive of the former CTLT site, with its various linked resources (many of which retain some value, and whose URLs have some reputation in search engines)
  • a university portfolio space (a showcase) where we assist the university in mounting its publicly viewable and assessable evidence for accreditation (the system demonstrated above)
  • an assessment workspace for collaborations on assessment activities with academic units (these collaborations may require managed authorizations)
  • a “social” space for collaboration around professional development related to the problems we are working on

It is the last space that presents interesting design challenges and an opportunity to facilitate LPP, and it is the cause of this reflection.

Given that the social space will be closely linked to the OAI’s main university presence, we have a requirement that it be “professional.” Dave’s comment above about multiplicity of voice and perspective is the opportunity/concern to manage. It’s possible that in a community of learners, some members (perhaps more novice ones) will make contributions at a lower level of professionalism or with less insight. The community, and the visitor, should have ways of both hearing, and not over-valuing, these comments, and the community should have ways of responding to them that facilitate learning for multiple players in multiple ways.

In face-to-face environments, there are various protocols for making contributions, and cues that orient an observer to the hierarchy of expertise and authority within the community. An observer can use these cues to judge which contributions carry the greatest authority in the group. Protocols for responding to contributions can also help organize the group dynamic. In online settings, these cues may be absent, and a visitor may sense immaturity or cacophony (e.g., in a feedbook’s content) where in fact novices are exploring their (partial) understanding by sharing it within the community.

The problems as I see them:

  • How can an open online community that embraces legitimate peripheral participation maintain a coherence that allows members and visitors to appreciate its professionalism/maturity while still making space for novices’ participation?
  • What mechanisms can be employed to create “emergent” authority in a web-based learning community without a central oligarchy? (A rough sketch of one possible mechanism follows this list.)
  • Much of the literature on LPP was developed in the context of face-to-face communities. How can an Internet-based community exploit “incidental” (e.g., drop-in) participation by experts who would not have appeared in face-to-face settings? This question recognizes that, via a feedbook mechanism, an item from an expert outside the community can be routed into the stream of the community’s readings (thus ‘incidental’).
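On the second of these questions, it helps to imagine what “emergent” authority could mean mechanically: authority that accrues from how the community responds to contributions, rather than from a designated hierarchy. The sketch below is purely illustrative; the members, endorsements, scoring rule, and numbers are hypothetical and not a feature of our planned site.

```python
# Toy illustration of "emergent" authority: a contribution's visible weight
# comes from who endorsed it, and an endorser's weight comes in turn from how
# the community received that member's own contributions. All names and
# numbers are hypothetical.
from collections import defaultdict

# contribution id -> (author, set of members who endorsed it)
contributions = {
    "c1": ("alice", {"bob", "carol", "dan"}),
    "c2": ("bob",   {"carol"}),
    "c3": ("erin",  set()),   # a novice post with no endorsements yet
}

def authority_scores(contribs):
    """Two-pass scoring: derive each member's weight from how their own
    contributions were received, then score each contribution by the
    summed weight of its endorsers."""
    member_weight = defaultdict(lambda: 1.0)      # everyone starts equal
    for author, endorsers in contribs.values():
        member_weight[author] += len(endorsers)   # endorsements raise the author's weight
    return {
        cid: sum(member_weight[e] for e in endorsers)
        for cid, (_, endorsers) in contribs.items()
    }

print(authority_scores(contributions))   # -> {'c1': 4.0, 'c2': 1.0, 'c3': 0}
```

The point of the sketch is the design stance, not the arithmetic: highly scored items surface as carrying more authority, while low-scored items remain visible, preserving room for legitimate peripheral participation.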

RE: Assessment Updates and Tasks

Just a quick note to say please hold off on using the Rocket Science model and ‘digest’ rubric.  We did our first round of experimenting with the process today with 15 raters and found we need to provide clarity in the model.  More importantly, and for our discussion, the shorter rubric, however much nicer it looks, also glosses over too many critical components.  Use it as a reference, but focus mostly on the expanded version for the near term.

We’ll have updated useful versions available this week.

Thanks,

Gary

Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/

Branch Campus note and attachment

The Vancouver liaison reports:

Most … understood that they should be proactive and get involved in the assessment plans that their departments in Pullman are developing now, rather than reactively respond to them later.

Also, there are three unique programs on this campus: the BA in Public Affairs, the MA in Public Affairs, and the School of Engineering and Computer Science.

I’ve attached the final version of the document I presented today and don’t see any problem with you posting it.

academic effectiveness liaison council background

Assessment Liaison Updates and Tasks

Folks,

Please share with your Program Assessment Point people:

1.      OAI contacts have been assigned for each program.
You can see the list:  https://universityportfolio.wsu.edu/20082009/Lists/Liaisons/Liaison%20Council.aspx
If you click on the names, you will find more contact information, but if that fails, all OAI contacts can be reached at 5-1355.

No doubt this list will be changing.  Please make changes or send them to me or Judy (judyrumph@wsu.edu) as they occur on your end.

2.      Attached is the 3rd version of the rubric for assessing assessment and the corresponding template.  The rubric has been rendered in three versions (not perfectly aligned in this draft just yet): the overview, the ‘digest’ version (by popular demand), and the expanded version, which we have found tends to produce better results, though it perhaps requires a greater initial investment.  We anticipate one more round of deep revision of the rubric and template following the activities associated with this release, but the principles will remain the same.  We are also close to releasing a timeline and checklist, but anticipate that document merits discussion at our next meeting.

3.      We strongly recommend you forward the text in blue below and encourage each of your teams to review the assessment process by using the rubric to evaluate the mock report (Rocket Science) before submitting their spring assessment plans.  We also recommend they consider their previous self-study reports in light of these criteria.

We also suggest that the assessment of assessment process be conducted synchronously in collaboration between your program assessment teams and their OAI contacts.

And we always welcome feedback, particularly at this juncture as we gear up for spring assessment.

Here is the link to the online Mock/Model Template-based report–Rocket Science.  The process is the same as the one we did with Honors.  If you log into this site, you will find the directions to guide you through the Rocket Science assessment process:

https://universityportfolio.wsu.edu/2009-2010/Pages/default.aspx [since retired]

Again, the process:

1.      Read over the Assessment Criteria (you can download the revised rubric at the link above or read it online).

2.      Read the Rocket Science self-study.

3.      Go through the online rubric and assign Rocket Science scores on each of the four dimensions of the rubric.

Meanwhile, we are also sharing this process statewide and with selected professional groups and have received very positive feedback.  More on that when next we meet, but the upshot is transparency.

4.      Finally, don’t forget the deadline for spring assessment plans.  We need plans for spring assessment activity from all programs before December 18th, 2009.  We will review (with the appropriate criteria of the rubric) and provide feedback for each plan as soon as possible but no later than early January. (The sooner we receive them, the sooner we can provide feedback.)

Don’t hesitate to contact me with any questions or concerns. And don’t forget the next Liaison Council meeting:  December 4th at 1:00 in Lighty 403.

Gary

PS, some program points have asked for more models.  A few from 2008 we can point to are here: https://teamsite.oue.wsu.edu/progeval/default.aspx
(follow ‘Assessment Highlights’ and the ‘case reports’ below the highlights)

Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/

Attachment with the original email: a of a set beta (3)

The email also references a mythical program, “Rocket Science,” used as a vehicle for testing the rubric. Those files are included here for completeness:

OAI website launched

Here is the new website in support of the new OAI organization. There is still transitional work to do to retire the CTLT website, and more work to migrate legacy SharePoint sites from Teamsite.oue. The parts of this web strategy are emerging:

  • A simple branded website (OAI.wsu)
  • A public portfolio of the university’s outcomes assessment work (UniversityPortfolio)
  • Private site(s) for assessment collaborations and OAI internal work (location presently unknown)

Update–what we’re learning already

Just a quick note to say please hold off on using the Rocket Science model and ‘digest’ rubric.  We did our first round of experimenting with the process today with 15 raters and found we need to provide clarity in the model.  More importantly, and for our discussion, the shorter rubric, however much nicer it looks, also glosses over too many critical components.  Use it as a reference, but focus mostly on the expanded version for the near term.

We’ll have updated useful versions available this week.

Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/

A of A Template and Rubric post-NWCC&U Meeting in Seattle (11.6)

From: Brown, Gary
Sent: Sunday, November 08, 2009 1:14 PM
To: Peterson, Nils; Ater-Kranov, Ashley
Cc: Green, Kimberly; Desrosier, Theron; Jacobson, Jayme K; ‘Jane Sherman’
Subject: a of a beta 2

Folks,

Attached is a working version of A of A Beta 2.  I’m trying to build into our Assessment of Assessment rubric and template some of Friday’s language from NWCC&U, focused so far mostly in the template (Nils, for the Rocket Science mock model report update).  I’m working mostly on the digest form right now, so more alignment with the expanded criteria will be necessary in the ongoing iterations.  I’ll be working on squeezing in more language from NWCC&U, though they are deliberate in the level of abstraction they use: a very clear position that we operationalize it ourselves, as one size will not fit all universities.  Our authority, such as it is, will come from the stance the WSU Executive Council takes and our ability to convey the principles of assessment as useful.

The meeting was literally a reading of the new standards along with Ron Baker’s interpretation of them; the standards are somewhat different from what was recently posted.  What will be key is clarifying for ourselves the language of:

Mission
Core Themes
Goals
Objectives
Outcomes

These terms are used everywhere and in different ways, so we will want our language to align as much as possible with the way NWCC&U is using the terminology. Ron Baker appears to have been the point person for revising and explaining the standards, so if we have questions we can go to him.  Jane will be here in December and we hope to confirm our approach to the language at that time, but the working clarification I have so far come to (awaiting confirmation) is that Core Themes equates to WSU’s four strategic “goals”: http://www.strategicplan.wsu.edu/ (Yes, we have lots to do to operationalize the four-goal language in ways that programs and faculty can really make use of in practice.)  Our institutional goals for meeting the core themes in the learning realm are still the six learning goals of the baccalaureate (the “six Gs of the B”).  Programs will have objectives, which are generally measurable (though they function more like discrete program goals in this scheme), and outcomes, which are what is actually measured.  We may have many objectives, for instance, but focus at any given time on actually doing the measurement to determine whether we have achieved our outcomes.

(Jane, please weigh in if this corresponds to your understanding.  Needless to say, terminology can be a real bottleneck, is debated among experts, and we have little time to count dancing pin-headed angels, or something like that…)

Anybody want to join me in the CLA point meeting Tuesday 10:30-12:00?   CLA has a mix of chairs and good folks identified to make assessment happen in their programs.

I also want to get the revised A of A and template out to all Liaisons this week, sooner rather than later, and start setting up meetings to walk them through the Rocket Science mock report.

Gary

Dr. Gary R. Brown, Director
The Office of Assessment and Innovation
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/
Attached draft reporting template and rubric: a of a set beta 2

Assessing Capstone Courses and Internships–Where is the Value?

A faculty member on point for her program’s assessment asks:

One of the group discussions at the last retreat was on choosing the capstone course. I understand that the major/core assignment in the capstone course needs to address all the program goals. That will be the course we assess (along with a 200-level course). The degree really has a natural capstone course built into its core, so they are all set.

The group also talked about using the internship experience as their capstone course. Since the majors in the program are fairly diverse, the teaching faculty thought the unique internship experiences would address that issue. For assessment purposes, it seems most efficient to have one 400-level course per degree program that we assess (as opposed to multiple ones, possibly a capstone course for each major).

What are your thoughts about 1) using the internship course as a capstone course and 2) having multiple 400-level courses that we assess?

Finally, are there other programs on campus that are using their internship as a capstone?

Thanks –
——————————————–

Combing through the new NWCC&U standards while prepping for a meeting with accreditors on Friday, November 6, 2009, I note Standard 2.C.5, which states:

“Teaching faculty take collective responsibility for fostering and assessing student achievement of identified learning outcomes.”

To me that means a capstone assessment gains formal utility when it involves many faculty in a program, and a better reason than the rule is that program improvements are most potent when the faculty in the program are engaged in the assessment.  When faculty actually assess the performance and debate the performance (inter-rater reliability), a better handle on what it takes to improve the student learning experience emerges.

Internships are also invaluable targets, especially when all or part of the same rubric might be used. We might engage internship supervisors and coordinators to also provide feedback on students’ performance during the internship.  That process helps corroborate or validate the capstone assessment and provides independent review that verifies our claimed outcomes. We’ve found that internship supervisors in other programs, in spite of some initial skepticism, were both willing and capable of applying rubric adaptations, and open to input from program faculty when they collaborate in the norming process.
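As a small aside on the inter-rater reliability mentioned above, here is a minimal sketch of how rater agreement on a rubric might be summarized. The dimension names, the scale, and the scores are hypothetical placeholders, not WSU’s actual rubric or data.

```python
# Hypothetical rubric ratings from three raters; summarize each dimension by
# its mean score and by the share of rater pairs scoring within one point of
# each other (a crude agreement measure).
from itertools import combinations
from statistics import mean

ratings = {
    "rater_a": {"goals": 4, "measures": 3, "findings": 2, "use_of_results": 3},
    "rater_b": {"goals": 5, "measures": 3, "findings": 3, "use_of_results": 2},
    "rater_c": {"goals": 4, "measures": 4, "findings": 2, "use_of_results": 4},
}

def dimension_summary(ratings, tolerance=1):
    """Return mean score and pairwise within-tolerance agreement per dimension."""
    dimensions = next(iter(ratings.values())).keys()
    summary = {}
    for dim in dimensions:
        scores = [r[dim] for r in ratings.values()]
        pairs = list(combinations(scores, 2))
        close = sum(1 for a, b in pairs if abs(a - b) <= tolerance)
        summary[dim] = {
            "mean": round(mean(scores), 2),
            "agreement": round(close / len(pairs), 2) if pairs else None,
        }
    return summary

for dim, stats in dimension_summary(ratings).items():
    print(dim, stats)
```

Dimensions with low agreement are the ones most worth debating in a norming session, which is where the better handle on improving the student learning experience tends to emerge.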

Comment added to the original of this post

Capstone and Internships

There are some terrific internships going on for WSU students, and some not terrific ones.  From my conversations with faculty in a couple of different programs, the internship experience can be one of the best things, but in reality it’s all over the map.

I think a great step, one that a number of programs may identify as a pressing need, is to better assess the internship experience and how students get feedback.  This could be rich assessment data.

However, because the internships are so varied in quality, I don’t see them as a good substitute for assessment of a capstone project (which should address all the learning goals, for example, while an internship may not).

Green, Kimberly at 12/18/2009 12:29 PM

Professional and Institutional Accreditation (ASPB View)

From the Executive Director
American Society of Plant Biologists

Hi again Gary.

I always enjoy learning more about topics that are unfamiliar to me, and accreditation is definitely one such topic! Clearly we could spend considerably more time in dialog (and I hope that we will!), but to directly answer your question, “What accountability bodies (or just pressures) are plant scientists responding to, if any?” I think the answer currently is ‘none.’

That said, you may be familiar with the (NSF-sponsored) “Vision and Change” exercise (see http://www.visionandchange.org/), which is one of the approaches to re-envisioning undergraduate biology education with which ASPB has been closely involved. Although it’s still somewhat fuzzy, it does seem to me to be coming into focus, and SoTL is definitely a major emphasis.

Although I am not aware that the NSF is actively pursuing accreditation metrics as (sort of) one end of an educational/research continuum, it is unusual among science research agencies in that it does have programs that focus on SoTL (in the Education and Human Resources Directorate), as well as the (perhaps better known) research programs (in the Biology, Geosciences, Math, etc. directorates). It is also clear to me that the NSF is making large strides, where appropriate, in interdigitating these programs. Which is to say that program officers are actively encouraged to work together across the directorates.

It is also pertinent, I think, that the NSF instigated a requirement four or five years ago that has had a profound impact on the way in which funded researchers approach the dissemination of their science. Known as “Criterion 2” or “broader impacts”, it obliges grantees to (in a nutshell) demonstrate to the NSF the ways in which they have engaged the public and/or educators and students around the objectives of the funded research project. This (of course) is not directly related to accreditation; my point, though, is that should the NSF so choose, it might be able to find ways to — er — induce more effective teaching among its grantees. (There’s a disconnect here, as I’m sure you appreciate. Organizationally, the role of a grantee as a teacher at his or her institution is largely distinct from their role as an NSF-funded researcher and governed by different structures. But just because it’s a tough nut doesn’t mean there won’t be people or organizations willing to have a go at cracking it.)

Getting back to ASPB — and recognizing that we are currently operating in an ill-defined ‘space’ and with (one) immediate goal of improving our members’ understanding and application of SoTL in their teaching — there is a raft of resources to which I could point you. I’ll start with just a couple, though, and copy in my colleague Katie Engen. Katie is a) more immediately familiar with ASPB’s efforts in this area, and b) in closer touch with our Education Committee (http://www.aspb.org/committees/education.cfm), which tends to pay closer attention to formal K-16 education, and with members of our Education Foundation board (http://www.aspb.org/education/foundation/board.cfm), which focuses more on informal, public education. I’m sure that she will be able to offer additional thoughts and links to resources, and she’ll be a good conduit – should such be needed – to members and leaders who are directly engaged in these efforts.

Speaking of which, we are encouraging members to become both more literate about SoTL and more willing to properly study the efficacy of their own teaching (see, e.g., http://www.aspb.org/newsletter/julaug09/09publish.cfm; please let Katie know if you can’t access this page and she’ll send you a pdf). We’re encouraging direct engagement by the society’s members in K-12 education (not necessarily your immediate interest, but the caliber of primary and secondary education has an obvious and direct impact on tertiary education); see http://www.plantcell.org/cgi/content/full/19/8/2311 for an article on this topic published recently in our top-notch research journal and
http://www.aspb.org/education/importance_statement.cfm for a statement on this topic that was ratified by the Society’s executive committee a couple of months ago.

We have also articulated some fundamental principles relating to knowledge of plants (see http://www.aspb.org/education/foundation/principles.cfm), and a project funded by ASPB’s Education Foundation is developing SoTL-informed hands-on modules around each of these principles.

I’ll stop there and invite both you and Katie to weigh in with any additional thoughts and comments.

Cheers,
Crispin

From: Brown, Gary [mailto:browng@wsu.edu]
Sent: Wednesday, September 02, 2009 5:30 PM
To: Crispin Taylor
Subject: RE: accreditation

Crispin,

Thanks for the quick response!

You have very acutely inferred the heart of my question (though I agree it was a bit muddled).

I’m looking at the way CHEA and almost every other regional and professional accrediting agency is in the process of revising standards, essentially raising the bar by requiring assessment to focus on outcomes (rather than coverage) and encouraging educators to establish systematic assessment (rather than the fire-drill events we are so adept at).  The goal of this across the USA has been to put a renewed focus on making changes in teaching and curricula based upon evidence.

I know that sciences are often without specific accreditors, though not without influencing agencies like NSF, NIH, and, presumably, ASPB.  At the same time, professional accreditation organizations like ABET (Engineering), AACSB (Business), NCATE (Education) etc. are also revising their standards to better align with regional accreditors.

So the question was what accountability bodies (or just pressures) plant scientists are responding to, if any.  I appreciate your answer.  Your response also raises a follow-up question: when you say you are ‘actively engaged with,’ I wonder how you (or I, in my role in the Office of Academic Effectiveness at WSU) can do more to engage and leverage the important influence of professional peers to encourage attention to the scholarship of teaching and learning.  As you can imagine, the challenge I face in my role is to keep the discussion focused on enriching the student learning experience rather than on perfunctory compliance with an annoying bureaucracy.

I am currently embarking upon a very exciting project with a group of plant scientists here at WSU, so any leads you might provide will be more than welcome as our team endeavors to expand and deepen our effort.  And, needless to say, as I anticipate a potentially terrific model of integrated research and assessment, done transparently online with tools that will be available, you and ASPB will certainly be welcome to join us.

Gary

From: Crispin Taylor [mailto:ctaylor@aspb.org]
Sent: Wednesday, September 02, 2009 12:18 PM
To: Brown, Gary
Subject: RE: accreditation

Hi Gary:

Apologies for being dense, but I’m not quite sure what your question is driving at. ASPB is well aware of — indeed, is actively engaged with — various efforts to re-envision the undergraduate biology curriculum, and we assuredly recognize the value of applying what is being learned through research on teaching and learning to improve pedagogy and instructional outcomes. We’re also investing in various online mechanisms and tools aimed at teaching content and/or process. I presume that many of these threads will come together in more formal accreditation programs/efforts, but at this point I do not believe that ASPB is promoting or participating in any such programs.

Having said all that, I am still concerned that I may be missing the point of your question. I think it’d help me do a better job answering that question (or referring you to someone who can) if you could provide me with some examples of the kinds of things you are referring to (e.g., examples from other disciplines), as well as some additional information regarding the context in which you are working.

Thanks for contacting me; I hope I will be able to help out, either directly or indirectly.
Cheers,
Crispin

Crispin Taylor, Ph.D.
Executive Director
American Society of Plant Biologists
15501 Monona Drive
Rockville, MD 20855-2768
Direct: 301-296-0900
Main: 301-251-0560
Fax: 301-251-6740
ctaylor@aspb.org
http://www.aspb.org/

It’s not too soon to save the dates…
PLANT BIOLOGY 2010
Montréal, Canada
Jul. 31 — Aug. 4, 2010

From: Brown, Gary [mailto:browng@wsu.edu]
Sent: Wednesday, September 02, 2009 12:08 PM
To: Crispin Taylor
Subject: accreditation

Hi Crispin,
I’m working with our plant biology programs here at Washington State, and I’m interested in learning more about various educational accreditation influences that may be looming relative to ASPB.  Do you know where the profession may be heading, or do you know somebody I might contact to learn more?

Gary

Dr. Gary R. Brown, Director
The Office of Academic Innovation & Effectiveness
Washington State University
509 335-1352
509 335-1362 (fax)
browng@wsu.edu
https://mysite.wsu.edu/personal/browng/GRBWorld/