Learning outcomes on the horizon

[Image: Iceberg tip in the ocean. Photo: Lieutenant Elizabeth Crapo, NOAA Corps, 2012, NOAA Photo Library (anta0027), Public Domain.]

Intended learning outcome descriptions present the distilled essence of a period of study in the form of a brief list of abilities that students “will be able to demonstrate” at the conclusion. The job of explaining the intended learning in clear, simple terms is often at odds, however, with the inherent complexity of the actual learning involved. This tension has been very much on my mind over the past few months, while working with new learning outcome specifications for a restructured undergraduate curriculum. I thought it might be worthwhile documenting and sharing some of the ideas and observations that began to emerge during this period.

The key problem in reading or writing intended learning outcome statements is their underlying uncertainty. The question I’d like to explore here is how well prepared we are to address that uncertainty in our current methods of creating and managing learning outcome information. The uncertainty is not merely about the chances of the outcomes actually happening as promised, but about how much it would really mean if they did. Are the listed outcomes all equally important? Are they all equally necessary for future success? Is everything that is important actually there?

In theory, the assessment tasks should be enough to answer any questions and put the learning outcomes in context. However, the assessment context is yet another problem, since assessment itself may be just as remote from the practice context of the subject matter studied, and from the personal context and individual needs of the learner. Beyond these considerations, the assessment context brings its own specific problems of interpretation: fragmented task breakdown, different marking instruments and marking criteria, different task weightings and different outcome weightings within tasks (or the lack of them), and heavy reliance on intuitive judgement in linking the various elements.

Trying to reach a coherent synthesis of intended outcomes amidst all these issues is a complex task, with many possible solutions but often no clear way forward. If anyone wants to know what the problem looks like at a practical level, try a Google search on learning outcomes for Fluid Mechanics (or any other standard offering) and note the degree of divergence in learning outcome descriptions. Universities and university staff around the world are clearly working hard on trying to provide greater depth and clarity on the intended outcomes of the university curriculum. However, the more we try to fill the information gap, the more we seem to drown in an overload of information revolving around the still unsatisfied question of what students should really expect to learn from their university degree.

The problem is explored in two sections below. The first section contains some thoughts on the current dominant approach in the management of learning outcome information. I use the label “external compliance approach” because external compliance is mainly what it seems to involve. The word “external” in the description carries equal weight with “compliance”. This is an externally triggered process, not one initiated independently by teacher or students through a sense of intrinsic need. The underlying argument is an old one, going back (under a variety of labels) to the questions of intrinsic versus extrinsic motivation, the role of explicit instruction versus immersive practice, and their connections with deep versus surface approaches to teaching. While these theoretical principles might be fairly well known, how they have played out in learning outcomes information management is less well known – and possibly worth some discussion. I am taking this opportunity to draw out some of my observations from the content developer level.

The second section is the more important bit: a set of specific suggestions regarding the requirements for doing learning outcomes differently. Online management of learning outcome details and other curriculum information creates new opportunities to re-shape the way learning outcome information is developed and used – with possibilities whose surface has scarcely been scratched. It is relatively easy to conceive of a future curriculum information model that largely replicates the top-down bureaucratic reflexes of the paper-based world. Using the new technology in ways that bring real, deep changes to the human dynamics around learning outcome content requires greater leaps of imagination. I put forward three suggestions for a different kind of curriculum information future.

In presenting these ideas, I move rather quickly over some large and complex areas of knowledge and reflection that cannot be given adequate justice in the space available. Academic references are avoided to keep the discussion light and free-flowing. I’m assuming that if anyone needs them, they will ask.

Compliance-driven learning outcomes development: progress comes at a price.

It is very common, almost pervasive, in learning outcomes development work for the task to be approached in terms of compliance with externally prescribed standards. A wide range of standards may or may not come into play: AQF, professional competencies, university graduate attributes, university graduate qualities, discipline learning outcomes, the general principles of constructive alignment, or official guides on learning outcomes writing technique. The point is that learning outcomes writing starts from an external need to comply with one or more of these, and then continues until there is agreement that compliance has been achieved. Compliance with external standards becomes the whole focus of learning outcome effort. Compliance failure becomes the automatic explanation of any obstacles or deficiencies encountered. Poorly drafted outcomes and/or missing outcomes are simply a case of somebody failing to act as required. Making sure that procedures are properly understood, through appropriate staff development, is seen as the key to successful, well-written learning outcomes. (Or alternatively: delegating the job to someone else more familiar with the procedures and better prepared to comply.)

In the abstract, the external compliance approach seems relatively benign and unremarkable. It is just about making sure that curriculum information meets a minimum standard of usefulness for everyone who needs to use it, including students and staff. Who could argue with that? Given that compliance needs are often the institutional trigger for learning outcome development, a compliance approach may often be a natural pragmatic response. With time and hindsight, however, a number of deficiencies become apparent.

  • The external compliance approach glosses over the difficulty and complexity of learning outcome decisions, reducing them to a technical and administrative level, and ignoring the critical uncertainties underlying them. Whether we use the more technically accurate term “intended learning outcomes” or simply “learning outcomes” ultimately makes little difference. The amount of uncertainty and intuitive guesswork behind the described intentions is nowhere recognised.
  • The external compliance approach demands an inflated (and somewhat ridiculous) omniscience in the learning outcome writer. The writer is expected to speak confidently not only as a subject matter expert but as an expert on the implications for the individual student and on what kinds of learning might be specifically important for them.
  • The external compliance approach encourages a naive confidence in the capacity of existing technical tools and procedures to address any difficulties arising. The deep dilemmas of what to include and what to leave out of the learning outcome description are met with brisk and breezy collections of basic writing tips (“keep it short”, “use action verbs”, “make sure it’s assessable”). In trying to understand long-term learning progression and the highest levels of university attainment, Bloom’s pyramid becomes a standard recommended solution for differentiating higher and lower level university outcomes – as if there were no conceptual problem in judging university outcomes by criteria that apply equally well at kindergarten or primary school.
  • Compliance pressures undermine teaching staff’s ownership of their outcome statements, and the authenticity of their communication. Outcome statements are composed for a management audience instead of for the student.
  • The external compliance approach brings an expansion in curriculum management and administrative workloads that is difficult to contain without substantial expenditure on additional non-academic admin and management personnel, compromises on compliance quality, or, more often, a combination of both. The flow of additional resources into curriculum support would count as a positive if compliance duties did not take overwhelming priority in their deployment, at the expense of broader educational design support. Management and administrative staff in learning outcome management roles work at an even greater distance from the student learning context than academic teaching staff, which further adds to the risks and uncertainties to be dealt with when responsibility for outcome content falls into their hands, for whatever reason.
  • The external compliance approach stifles engagement with learning outcomes for both students and teaching staff. Teachers who perceive learning outcomes in terms of administrative formalities are likely to produce outcome descriptions that generate similar perceptions in their students.

Creating a self-sustaining information ecology around learning outcome production and use

The alternative approach works through the information ecology surrounding learning outcome use and production, and the behavioural cues embedded there. The basic idea, in using the term “ecology”, is one of looking at learning outcome information in terms of the surrounding information environment (including human and technical systems as well as the information itself), and the opportunities to develop that environment on a more self-sustaining basis, rather than relying heavily on external inputs. Technology has an important part to play in this ecology, in providing new and more powerful online tools for managing learning outcome information, but technology itself is not a specific focus. The question is not what the technology is or does, but what kinds of things we need the technology to do. How can we orient the signals within the immediate environment towards an engagement with learning outcome information as an object of intrinsic value? How can we make learning outcome information more worthwhile for its own sake? How can we reduce the pressures of top-down direction and surveillance? Three examples are offered of how learning outcomes content might be structured for a more active engagement of teachers and students at the curriculum information interface.

1. A learning outcome dashboard with detailed outcome status indicators within unit outlines.

Learning outcomes need to look the part that they are supposed to play as the organising hub of curriculum design. They cannot be just one more block of text among many others (as in the traditional paper-based unit of study outline). Learning outcomes require a compelling visual presence. They must be centrally positioned and information rich. The information around them must forcefully address the central question of their existence: how can we trust what they say? If learning outcomes are to have any value as statements about student learning possibilities, there must be some indication of how probable the possibilities actually are. Confidence indicators alongside learning outcome statements might be set up in a variety of ways. Assessment transparency, assessment weighting, previous assessment records and content currency could all contribute. The indicators could be largely automated from existing data, avoiding manual data entry as far as possible.

The longer-term value of the current unit-level learning outcomes and qualities could be represented visually by structuring the learning outcome sequence under long-term learning outcome headings. The unit’s contribution to longer-term learning progression should be a headline, not a footnote, in the visual layout. Aggregate values for each long-term outcome category could be automatically generated from the confidence values of the individual outcomes specified within it. Aggregate values for the learning outcome set as a whole would be generated in the same way.
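To make the dashboard idea slightly more concrete, here is a minimal sketch (in Python) of how outcome-level confidence values might be derived from a few contributing signals and rolled up under long-term outcome headings. Everything here is an illustrative assumption: the field names, the equal weighting of the signals and the category labels are placeholders, not a proposal for any particular schema or system.

```python
"""Minimal sketch of a learning outcome confidence dashboard.

All field names, weightings and category labels are illustrative
assumptions, not part of any existing system.
"""
from dataclasses import dataclass
from statistics import mean
from collections import defaultdict


@dataclass
class UnitOutcome:
    text: str                  # the learning outcome statement itself
    category: str              # long-term outcome heading it sits under
    assessment_weight: float   # share of unit assessment linked to this outcome (0-1)
    transparency: float        # how explicit the marking criteria are (0-1)
    prior_attainment: float    # proportion of past students assessed as attaining it (0-1)
    currency: float            # how recently the content was reviewed (0-1)

    def confidence(self) -> float:
        """Composite confidence indicator: a simple unweighted mean of the
        contributing signals. Real weightings would need local calibration."""
        return mean([self.assessment_weight, self.transparency,
                     self.prior_attainment, self.currency])


def category_confidence(outcomes: list[UnitOutcome]) -> dict[str, float]:
    """Aggregate confidence per long-term outcome category, plus an overall value."""
    by_category: dict[str, list[float]] = defaultdict(list)
    for o in outcomes:
        by_category[o.category].append(o.confidence())
    summary = {cat: round(mean(vals), 2) for cat, vals in by_category.items()}
    summary["ALL"] = round(mean(o.confidence() for o in outcomes), 2)
    return summary


if __name__ == "__main__":
    outcomes = [
        UnitOutcome("Apply control-volume analysis to pipe flow", "Engineering analysis",
                    assessment_weight=0.4, transparency=0.8, prior_attainment=0.7, currency=0.9),
        UnitOutcome("Communicate design decisions to non-specialists", "Professional communication",
                    assessment_weight=0.1, transparency=0.5, prior_attainment=0.6, currency=0.6),
    ]
    print(category_confidence(outcomes))
```

The design point the sketch tries to capture is that the roll-up is automatic: once the contributing signals are captured, or derived from existing assessment data, nobody has to hand-maintain the aggregate figures.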

2. The visualization of long-term learning outcome progression

Longer-term, course-level learning outcomes could be organised within a similar dashboard set-up, where status reporting and access to supporting details are combined. The format could be a grid table that includes (a) a clearly designated direction of progression as the main axis, (b) progression staging and stage criteria consistent with that direction, and (c) the relevant outcome categories on the secondary axis. Every individual learning outcome would be specified as a set of grid points within a particular field of knowledge progression. Individual fields of knowledge progression would be cross-mapped against each other to facilitate cross-recognition of shared outcome components. The interlinked progression frameworks could evolve continuously through their interactions with each other, as well as with the individual learning outcome elements mapped within them.
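As a rough illustration of the grid idea, the sketch below places hypothetical outcomes onto (stage, category) grid points and records cross-links to outcomes mapped in other fields of progression. The stage names, categories and outcome identifiers are all invented placeholders.

```python
"""Minimal sketch of a course-level progression grid (all structures hypothetical)."""
from collections import defaultdict

# Main axis: direction of progression, expressed as ordered stages.
STAGES = ["foundation", "consolidation", "integration", "professional readiness"]

# Secondary axis: outcome categories relevant to the course.
CATEGORIES = ["technical analysis", "design practice", "communication"]


class ProgressionGrid:
    """Maps each learning outcome onto one or more (stage, category) grid points."""

    def __init__(self, field_name: str):
        self.field_name = field_name
        self.grid: dict[tuple[str, str], list[str]] = defaultdict(list)
        # outcome -> outcomes in other fields that share a component
        self.cross_links: dict[str, set[str]] = defaultdict(set)

    def place(self, outcome_id: str, stage: str, category: str) -> None:
        assert stage in STAGES and category in CATEGORIES
        self.grid[(stage, category)].append(outcome_id)

    def cross_map(self, outcome_id: str, other_outcome_id: str) -> None:
        """Record a shared outcome component mapped in another field of progression."""
        self.cross_links[outcome_id].add(other_outcome_id)

    def stage_profile(self, stage: str) -> dict[str, int]:
        """How many outcomes the course contributes at a given stage, per category."""
        return {cat: len(self.grid[(stage, cat)]) for cat in CATEGORIES}


if __name__ == "__main__":
    fluids = ProgressionGrid("Fluid Mechanics")
    fluids.place("FM-01 control volume analysis", "foundation", "technical analysis")
    fluids.place("FM-07 pump selection report", "integration", "communication")
    fluids.cross_map("FM-07 pump selection report", "THERMO-05 plant performance report")
    print(fluids.stage_profile("integration"))
```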

3. Student and teacher open access.

Learning outcomes are projections of learning priorities and possibilities into a largely invisible and continuously evolving future. The engagement of students and teaching staff with these priorities and possibilities must be similarly dynamic, sustained and open-ended. For optimum student and teacher engagement, we need to minimise the barriers on both sides. The capacity of teaching staff to review and update learning outcome details, wherever necessary and as often as necessary, is essential. Student access should be similarly unimpeded, from pre-enrolment through to graduation. Opportunities for direct student comments and queries on specific outcomes would be highly desirable. Version changes should be fully tracked. The frequency of independent updating would be an important metric for engagement levels and an indirect indicator of learning outcome quality. The notion that learning outcomes should represent some sort of fixed and immovable target is a legacy of compliance thinking. Recognition of their intrinsically dynamic character is critical to the development of intrinsic interest and motivation around them, for both teaching staff and students.
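A minimal sketch of what open, tracked updating might look like as data: a revision log per outcome, with update frequency over a recent window used as a rough engagement signal. The log format and the metric are illustrative assumptions only, not a description of any existing system.

```python
"""Minimal sketch of open outcome versioning with an update-frequency metric.

The revision log fields and the metric are illustrative assumptions.
"""
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class OutcomeRevision:
    outcome_id: str
    text: str
    author: str                # teaching staff member making the change
    timestamp: datetime
    student_comments: int = 0  # comments/queries attached to this version


def update_frequency(revisions: list[OutcomeRevision],
                     window_days: int = 365,
                     as_of: datetime | None = None) -> float:
    """Independent updates per year within the recent window: a rough proxy for
    how actively the outcome is being maintained (not a quality guarantee)."""
    as_of = as_of or datetime.now()
    cutoff = as_of - timedelta(days=window_days)
    recent = [r for r in revisions if r.timestamp >= cutoff]
    return len(recent) * 365 / window_days


if __name__ == "__main__":
    log = [
        OutcomeRevision("FM-01", "Apply control-volume analysis to pipe flow",
                        "unit_coordinator", datetime(2024, 3, 1), student_comments=2),
        OutcomeRevision("FM-01", "Apply control-volume analysis to open-channel and pipe flow",
                        "unit_coordinator", datetime(2024, 9, 15)),
    ]
    print(f"updates/year: {update_frequency(log, as_of=datetime(2025, 1, 1)):.1f}")
```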

And apart from that . . . ?

The list could easily be extended, but three items seem enough for now. If anyone is engaged with these questions at the moment, it would be great to see your thoughts.

Written by Tim Lever