From the early mechanical teaching machines of Sidney Pressey (1926) and B. F. Skinner (1953) through to the current spate of apps, games, VR and AI, educational technology has repeatedly traversed a well-worn cycle of hype and hope, fad and flop. As tools are built, claims are made, papers written and careers forged, it is easy to get lost in the maelstrom. Much like high-tech’s oft-parodied claims to be “making the world a better place” (thank you to HBO’s Silicon Valley for the parody), education is quick to declare how its new tools and approaches are building the “classroom/campus/university” of “the future/the 21st century/etc.”. At times this language can be so formulaic as to appear to follow a pro forma. For example, there is the perennial question of whether “higher education is ready for…” (insert: millennials, Gen Z, learning analytics, digital natives, etc.). Alternatively, there is continual recourse to claims of things ‘being dead’ (e.g. the lecture, the university, the campus). This combination of hype and speculation creates the impression that technological change is a tsunami that requires an advance warning system, revolutionary infrastructure or – for many – a retreat to higher ground.
Though big titular claims about the ‘future of learning’ are heavily caveated in the fine print, the chasm between the future as envisioned by experts and the current realities of the faculty floor can be vast. This gap is exacerbated by the swarm of buzzwords that heralds the arrival of yet another tool or future. And while all disciplines have their own jargon and terminology, in education this language can be especially excluding:
Few topics are as beset by jargon as education. If education were intended for a select few this would not matter. But when, rightly, we want everyone to be involved in education, it matters a great deal (Baker, 2001).
And this is the problem. Change, whether invited or not, affects us all. Academic and professional staff, casual tutor and tenured professor – the disruption of changing economic climates, industries and new technologies does not discriminate. It is both unrealistic and undesirable for universities to remain static or immobile. However, knowing what to do, whether as an individual or as an institution, is incredibly difficult. How do we know that today’s flipped classroom is not tomorrow’s 80s perm? The problem is that we can never be entirely sure.
In the case of technology, legitimate and significant changes to the world we live in can resemble fads in their infancy. While some trends and their buzzwords will pass into ignominy, others will leave the nest and take flight into both the mainstream and the dictionary. Knowing which is which is tough. For an example, look no further than the BBC’s 1999 Newsnight interview with David Bowie:
Paxman: You don’t think some of the claims about it [the internet] are being hugely exaggerated […]
Bowie: No, you see, I don’t agree. The internet – I don’t think we’ve even seen the tip of the iceberg. I think the potential of what the internet is going to do to society, both good and bad, is unimaginable. I think we’re on the cusp of something exhilarating and terrifying.
Paxman: It’s just a tool though, isn’t it?
Bowie: No, it’s not. No. It’s an alien life form […] Is there life on Mars? Yes – it’s just landed here.
Before adoption into the mainstream, innovations are alien. Yet for every Facebook and Twitter there is a litany of failures. Take 2007, when the belief arose that the future of the university was to be found in Second Life. While many enjoyed The Sims and Second Life precisely for doing things they’d never do IRL, some imagined that people would relish the opportunity to walk an avatar into a classroom and sit at a desk for half an hour. Though Minecraft has had some educational success, albeit in a different format, the Second Life university never really took off.
So how, as universities, faculties and educators, do we know where to place our bets? The answer, it seems, lies not in revolution but in evolution – in small changes, trial, error and dialogue. We know that technology is no silver bullet: no one size fits all. Flipped classrooms, for example, though supported by video technology, are not in themselves particularly revolutionary. Students have long had pre-work and homework, albeit in a written rather than a video format. Likewise, good lectures were always responsive to what was going on in the room, not rolled out in the unresponsive, one-size-fits-all manner that some of the ‘flipped classroom’ narrative suggests. What allows learning to happen is not the ‘flipped model’ but how and with whom it is applied. This has always been true. The difference between a good and a bad ‘chalk and talk’ lecture had little to do with the chalk or the blackboard. Instead of jumping on bandwagons, we need to do what good learning and teaching has always done: return to the needs of the student and work out how and why a new technology or method might suit this cohort, year group or student.
More widely, we need, at all levels, to do a better job of talking about the change that technologies bring. Of recognising that it can be tiring, frightening or overwhelming. That not all people are ‘techies’, and that what looks great on paper may fail to translate into the classroom. An activity or technology may work in one unit, only to flunk in another. Similarly, until new buzzwords, jargon and ‘education speak’ make their way into the mainstream, this language will continue to exclude and alienate. As the tools we use to learn and teach continue to shift, it is worth remembering that we have time to adapt, learn, test, fail, critique, revise and talk in full sentences. In this way we might see change not as a tsunami to resist or submit to, but as a wave that we, as the human authors of such tools, create, use and harness.