Google Wave unifies Workspace and Showcase Portfolios

Google Wave was just announced at the Google I/O conference, video here. From minute 27:00 to 31:00 you will see a segment with important implications for the conversation we have been having with Helen Barrett about Showcase vs. Workspace ePortfolios. Our original, and her graphic. What seems important about Google Wave is the way it works as a PLE and, by adding new reviewers of selected works, as a showcase as well. This could address the challenge of making showcases, which might otherwise be an afterthought to the actual work.

The ability to embed a Wave in another environment, such as a blog, could be used to quickly publish work items and invite wider communities to engage with them.

The survey tool shown later in the demo appears able to integrate with other aspects of a document. If so, and if the survey tool is rich enough, it might provide embedded feedback, at least for the author's use, as a harvesting gradebook.

Critical Thinking Skills Upgrade and Prezi

WSUCTLT has published in multiple places on its critical thinking rubric. Our work on ePortfolios and the Harvesting Gradebook has us thinking that the Networked Learner needs some skills not well represented in that rubric. We hesitate to use the 21st Century moniker, but we need to update the rubric for a new era.

A while back, WSUCTLT created a “critical thinking action figure,” a tongue-in-cheek symbol for the skills we hope to develop in students. A desire to re-think the rubric, and a desire to use play as part of the vehicle for that work, has led to a plan for co-creating the 21st Century Global Action Figure based on some of the ideas from Jenkins, Pink, and Rheingold. We’ve got a tentative date of Friday, June 5, sometime in the afternoon, to run the first virtual back-of-the-napkin session. For fun, JaymeJ created an invitation in Prezi (an interesting tool with a lot of possibilities if you’re not familiar with it; think SeaDragon). Here is the link to the invitation.

It takes a few minutes to load; then you can click through the sequence using the right arrow in the lower right corner, or zoom in and out of objects by clicking on them.

The tool we are planning to use for the back-of-the-napkin session is Twiddla, possibly combined with another tool for the audio; we are still working out the kinks.

We’ll try to capture the sessions and post them.

Blackboard + Angel = reason for open learning

In response to the Wired Campus article about Blackboard’s acquisition of Angel Learning, Scott Leslie commented about moving beyond the LMS to networked learning options. His comment led me to this 2005 post, where he saw the social software light, and a later post looking for help making the case for “fully open” content.

I think it would be useful to go beyond open content to other aspects of openness, such as Downes’ open assessment. Beyond open content and assessment lie open problems worked in community. One of the problems, if students are working in the cloud, is how to manage the assessment of the distributed student work. WSUCTLT has been exploring how to gather assessments from the cloud with an idea called the Harvesting Gradebook. This has led them to look at teaching/learning practices along a spectrum from institution-based to community-based (PDF). That work began with perspectives shaped by institutional contexts, but is now branching out to examine open learning outside the university’s walls.
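The harvesting idea can be sketched in miniature: assessments of one piece of student work arrive from reviewers scattered around the cloud, and the gradebook pools them per rubric criterion. A minimal Python sketch, with reviewer names and rubric criteria that are hypothetical illustrations, not the actual WSUCTLT implementation:

```python
from statistics import mean

def harvest(assessments):
    """Summarize distributed rubric assessments of one piece of work.

    `assessments` maps each reviewer (peer, community expert,
    instructor -- wherever in the cloud they sit) to their scores on
    each rubric criterion.  Returns the mean score per criterion,
    pooling whatever reviews were harvested.
    """
    by_criterion = {}
    for reviewer, scores in assessments.items():
        for criterion, score in scores.items():
            by_criterion.setdefault(criterion, []).append(score)
    return {c: mean(s) for c, s in by_criterion.items()}

# Hypothetical reviews harvested for one student artifact (1-6 scale).
reviews = {
    "peer_1":        {"problem_id": 4, "evidence": 3},
    "community_rev": {"problem_id": 5, "evidence": 4},
    "instructor":    {"problem_id": 5, "evidence": 5},
}
print(harvest(reviews))
```

The point of the sketch is only that the aggregation is indifferent to where a review came from, which is what lets the assessment, not just the content, live outside the institution.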

Scott, as you suggest, part of making the case for open is the potential for greater scalability. When the institution locks up the content and locks up the expertise in a course, it's hard to scale. MIT’s open content takes one step. Western Governors University takes another step with its “bring your own content and experience” strategies. Community-based learning unlocks the content, the experts, and the assessment, making for the most scalable solution. (Diagram of WSUCTLT thinking on these four models (PDF).)

Google gadgets for presenting data

One of the issues in learning in an era of information abundance is the need for tools to help visualize data. Examples of this are emerging, including Blaise Aguera y Arcas' demo of Photosynth and Hans Rosling's “the best stats you’ve ever seen.” Our experiments are more modest, using Google to graph data and share it, below, and in a gadget being built by Corinna Lo in support of our Harvesting Gradebook work. In both the graph below and Lo’s work, data in a Google Docs spreadsheet is fed to a dynamic graphing tool that can then be mashed up into another presentation.
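The pipeline is simple: the spreadsheet publishes its data as a feed (CSV, for instance), and the graphing gadget reads that feed and reshapes it into chart series, so the graph updates wherever it is embedded when the spreadsheet changes. A minimal Python sketch of the read-and-reshape step, using an inline CSV stand-in for a live published-spreadsheet URL; the column names and numbers are purely illustrative, not the actual flu data:

```python
import csv
import io

# Stand-in for a CSV feed published from a Google Docs spreadsheet.
# (Illustrative numbers only, not the data graphed in this post.)
published_csv = """\
date,confirmed_cases
2009-04-28,64
2009-05-05,403
2009-05-12,1626
"""

def to_series(csv_text):
    """Reshape a published CSV feed into (labels, values) for a chart."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    labels = [r["date"] for r in rows]
    values = [int(r["confirmed_cases"]) for r in rows]
    return labels, values

labels, values = to_series(published_csv)
print(labels, values)
```

In the real gadget the `published_csv` text would be fetched from the spreadsheet's published URL and the series handed to whatever charting library renders the graph.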

Contrast the current data above with the idealized trajectory in this WHO/CDC composite.

Graph of levels of infection vs stages in flu pandemic

For-profit assessment solutions: expediency or folly?

In the May-June 2009 issue of Change, Pat Hutchings describes the growing for-profit assessment sector in “The New Guys in Assessment Town.”

Hutchings reports on asking a vendor about faculty push back. “ ‘Not so much,’ Galvin says, ‘not after faculty understand that the process is not intended to evaluate their work.’ ”

That is the big lie: the thought that teaching and assessment can be usefully separated, that assessing student performance is not also assessing teacher performance and course and program design and coherence.

We have argued a different perspective: Assessment is an occasion for learning on the part of both student and teacher, and it can be rolled up from student to course, to program, to institutional levels to have impacts that are richer and deeper than out-sourced (over-the-wall) assessments like the Collegiate Learning Assessment (CLA).
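The roll-up is just repeated aggregation: student-level criterion scores average into a course profile, course profiles into a program profile, and program profiles into an institutional one. A hedged Python sketch; the criteria and numbers are ours, for illustration only:

```python
from statistics import mean

def roll_up(profiles):
    """Average lower-level score profiles into one higher-level profile.

    Each profile maps rubric criterion -> score.  The same function
    rolls students into a course, courses into a program, and
    programs into an institution, so the data stays comparable at
    every level.
    """
    criteria = {c for profile in profiles for c in profile}
    return {c: mean(p[c] for p in profiles if c in p) for c in criteria}

# Hypothetical student profiles in one course (1-6 scale).
students = [
    {"analysis": 4, "communication": 3},
    {"analysis": 5, "communication": 4},
]
course = roll_up(students)          # course-level profile
program = roll_up([course,          # roll courses into a program
                   {"analysis": 3, "communication": 5}])
```

Because each level is built from the level below, a pattern visible in the institutional profile can be traced back down to the courses and students that produced it, which is what out-sourced, over-the-wall instruments cannot offer.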

This problem of separating assessment from teaching is further evidenced where Hutchings quotes English instructor Katie Hern: “I’m a little wary. It seems as if, in addition to the assessment feedback we are already giving to students, we might soon be asked to add a data-entry step of filling in boxes in a centralized database for all the student learning outcomes. This is worrisome to those of us already struggling under the weight of all that commenting and essay grading.”

Hern has, in our view, already separated grading students from assessing their performance in wider ways that are meaningful to the program, institution, and perhaps society. The ideas of “transformative assessment” that we endorse run counter to this. Smart assessments can do several jobs simultaneously.

Hutchings goes on: “…EduMetry also provides data-management solutions. But its special niche reaches back to its original business, the scoring of student work. …. ‘This is where we see many institutions struggling,’ Galvin says, adding, ‘Faculty simply don’t have the time for a deeper involvement in the mechanics of assessment. Many have never seen a rubric or worked with one, so generating accurate, objective data for analysis is a challenge.’” Our perspective is that, rather than learn to engage the process, campuses using this strategy out-source the work (and the learning) to India.

We have been exploring peer and community-based assessment, engaging the wider community in assessing the student, the assignment, and the validity of the assessment instrument itself. Working this way has significant impact on the faculty role (see “A Harvest Too Large? A Framework for Educational Abundance,” Batson, T., Paharia, N., and Kumar, M., in Opening Up Education) and this spectrum from institution-centric to community-centric learning.

(collaborative post by Gary Brown and Nils Peterson)