Blackboard + Angel = reason for open learning

In response to the Wired Campus article about Blackboard’s acquisition of Angel Learning, Scott Leslie commented about moving beyond the LMS to networked learning options. His comment led me to this 2005 post where he saw the social software light, and to a later post looking for help making the case for “fully open” content.

I think it would be useful to go beyond open content to other aspects of openness, such as Downes’ open assessment. Beyond open content and assessment lie open problems worked in community. One problem, if students are working in the cloud, is how to manage the assessment of their distributed work. WSUCTLT has been exploring how to gather assessments from the cloud with an idea called the Harvesting Gradebook. This has led them to look at teaching/learning practices along a spectrum from institution-based to community-based (PDF). That work began with perspectives shaped by institutional contexts, but it is now branching out to examine open learning outside the university’s walls.
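The harvesting idea can be illustrated with a small sketch. This is not the WSUCTLT implementation; the reviewers, URLs, rubric criteria, and scores below are all invented for illustration. The pattern is simply: each piece of student work, living wherever it lives in the cloud, accumulates rubric ratings from distributed reviewers, and the gradebook rolls them up per work and per criterion.

```python
from collections import defaultdict

# Hypothetical harvested ratings: (student_work_url, criterion, score).
# In practice these would be gathered from feedback forms embedded
# alongside the student's work out in the cloud.
harvested = [
    ("http://example.org/ann/essay", "evidence", 5),
    ("http://example.org/ann/essay", "evidence", 4),
    ("http://example.org/ann/essay", "clarity", 3),
    ("http://example.org/ben/blog",  "evidence", 6),
    ("http://example.org/ben/blog",  "clarity", 5),
]

def harvest_gradebook(ratings):
    """Roll distributed ratings up into per-work, per-criterion means."""
    sums = defaultdict(lambda: [0, 0])   # (work, criterion) -> [total, count]
    for work, criterion, score in ratings:
        cell = sums[(work, criterion)]
        cell[0] += score
        cell[1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

gradebook = harvest_gradebook(harvested)
for (work, criterion), mean in sorted(gradebook.items()):
    print(f"{work}  {criterion}: {mean:.1f}")
```

The same roll-up could continue upward: averaging works into a course view, courses into a program view, which is where the institution-based vs. community-based spectrum starts to matter.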

Scott, as you suggest, part of making the case for open is the potential for greater scalability. When the institution locks up both the content and the expertise in a course, it’s hard to scale. MIT’s open content takes one step. Western Governors University takes another with its “bring your own content and experience” strategies. Community-based learning unlocks the content, the experts, and the assessment, making for the most scalable solution. (Diagram of WSUCTLT thinking on these four models (PDF).)


Google gadgets for presenting data

One of the issues in learning in an era of information abundance is the need for tools to help visualize data. Examples are emerging, including Blaise Aguera y Arcas demoing Photosynth and Hans Rosling showing the best stats you’ve ever seen. Our experiments are more modest: using Google to graph data and share it, below, and in a gadget being built by Corinna Lo in support of our Harvesting Gradebook work. In both the graph below and Lo’s work, data in a Google Docs spreadsheet is fed to a dynamic graphing tool that can then be mashed up into another presentation.
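The mashup pattern here can be sketched in miniature, under some stated assumptions: the spreadsheet is published as CSV (a real feed would be fetched from the sheet’s published-CSV URL; a literal string stands in for it below), the data and column names are invented, and the rendering target is the Google Chart Image API of that era, which turned a URL into a chart any page could embed as an image.

```python
import csv
import io
import urllib.parse

# Stand-in for a published Google Docs spreadsheet feed.
CSV_DATA = """week,cases
1,2
2,5
3,9
4,14
"""

def rows_from_csv(text):
    """Parse the spreadsheet feed into (label, value) pairs."""
    reader = csv.reader(io.StringIO(text))
    next(reader)                      # skip the header row
    return [(row[0], float(row[1])) for row in reader]

def chart_url(pairs, title="Cases by week"):
    """Build a Google Chart Image API URL that another page can
    embed as an <img> -- i.e., mash up into its own presentation."""
    labels = [p[0] for p in pairs]
    values = [p[1] for p in pairs]
    params = {
        "cht": "lc",                                      # line chart
        "chs": "400x200",                                 # chart size
        "chd": "t:" + ",".join(str(v) for v in values),   # data series
        "chds": f"0,{max(values)}",                       # data scaling
        "chxt": "x,y",                                    # show both axes
        "chxl": "0:|" + "|".join(labels),                 # x-axis labels
        "chtt": title,                                    # chart title
    }
    return "http://chart.apis.google.com/chart?" + urllib.parse.urlencode(params)

print(chart_url(rows_from_csv(CSV_DATA)))
```

The point of the design is the decoupling: the spreadsheet remains the single editable source of data, while the chart is just a view of it that can live in any page.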

Contrast the current data above with the idealized trajectory in this WHO/CDC composite.

[Graph: levels of infection vs. stages in a flu pandemic]

For-profit assessment solutions: expediency or folly

In the May–June 2009 issue of Change, Pat Hutchings describes the growing for-profit assessment sector in “The New Guys in Assessment Town”.

Hutchings reports on asking a vendor about faculty pushback. “ ‘Not so much,’ Galvin says, ‘not after faculty understand that the process is not intended to evaluate their work.’ ”

That is the big lie: the thought that teaching and assessment can be usefully separated, that assessing student performance is not also an assessment of teacher performance and of course and program design and coherence.

We have argued a different perspective: assessment is an occasion for learning on the part of both student and teacher, and it can be rolled up from student to course to program to institutional levels, with impacts that are richer and deeper than out-sourced (over-the-wall) assessments like the Collegiate Learning Assessment (CLA).

This problem of separating assessment from teaching is further evidenced where Hutchings quotes English instructor Katie Hern: “I’m a little wary. It seems as if, in addition to the assessment feedback we are already giving to students, we might soon be asked to add a data-entry step of filling in boxes in a centralized database for all the student learning outcomes. This is worrisome to those of us already struggling under the weight of all that commenting and essay grading.”

Hern has, in our view, already separated grading students from assessing their performance in wider ways that are meaningful to the program, the institution, and perhaps society. The ideas of “transformative assessment” that we endorse run counter to this. Smart assessments can do several jobs simultaneously.

Hutchings goes on: “…EduMetry also provides data-management solutions. But its special niche reaches back to its original business, the scoring of student work. … ‘This is where we see many institutions struggling,’ Galvin says, adding, ‘Faculty simply don’t have the time for a deeper involvement in the mechanics of assessment. Many have never seen a rubric or worked with one, so generating accurate, objective data for analysis is a challenge.’” Our perspective is that, rather than learn to engage the process, campuses that use this strategy out-source the work (and the learning) to India.

We have been exploring peer and community-based assessment, engaging the wider community in assessing the student, the assignment, and the validity of the assessment instrument itself. Working this way has significant impact on the faculty role (see “A Harvest Too Large? A Framework for Educational Abundance,” Batson, T., Paharia, N., and Kumar, M., in Opening Up Education) and this spectrum from institution-centric to community-centric learning.

(collaborative post by Gary Brown and Nils Peterson)