Er, what exactly does that mean? And why is it so important?
Apologies for the verbose title! My problem is that I am simultaneously trying to shift the conception of what problem we are trying to solve and suggesting a radically different approach to solving it. So, let me begin by breaking the title down into its components. Firstly, my big concern as an academic technologist is that the rate of shiny-new-tech-churn is far too high, both for individuals and for institutions. At the individual level, I see a lot of new software, web services and hardware being bought or signed up for, but not a lot of it actually “sticking”. In many cases (the iPad being the worst offender), a new technology is being exploited to only a small fraction of its potential. I see a lot of people using iPads to look at web pages, to watch TV and to read (not necessarily write) email. The rate of shiny-new-tech-churn might be high, but the actual rate at which technology change is leading to significant enhancement of practice is far too low. Everything and nothing changes. We are failing to maximise academic technology-practice adoption – failing to make change sticky. When we are choosing or designing technology-practices for ourselves or for others, we need to understand why innovation-adoption fails, and what we can do to maximise its chances of becoming sticky.
Personally my technology-practices (digital and analogue) have been completely transformed by the iPad’s capabilities. As a consequence, the kinds of project that form the basis for my life, and the concerns towards which they are addressed, have also changed for the better. For example, I have read more academic books, more deeply, and with greater consequence, since using the Kindle iPad app (annotation) and the Kindle web site (reviewing my notes). That is an example of maximum technology-practice “stickiness”. I found something of value, I adopted it and it stuck with me as a deeply embedded part of how I work. And furthermore, it was the springboard for the further development of my personal practices, projects and concerns. I am doing something right so as to maximise my “academic technology-practice adoption”.
Unfortunately, I know that I am unusual as an individual technology-practice adopter. Worse still, institutions are churning through new technologies, often at great expense and with little consequence. Like throwing mud at a wall to see what sticks. Teachers, trying to get the best out of new technologies, trying to find technology-practices that work for their disciplines and their students, are finding this to be a massive challenge. So difficult that many do not even actively try to address it. We seem to lurch between no-change and too much change. Surely there is a better way?
Yes there is: we can learn from professional designers. More specifically, we can learn from an interdisciplinary design methodology called Interaction Design (ID), and from a design strategy called Design Thinking (DT). My own research and development work aims to adapt these approaches for education. In my project, Interaction Design becomes Interaction Design for Learning (IDfL), and Design Thinking becomes Design Thinking for Learning (DTfL). What we need to do, then, is to use IDfL and DTfL to find technology-practices that fit and stick (“hitting the sweet-spot”) – technology-practices that become adopted with a greater degree of durability (and lead to even greater things). Fortunately there is a natural affinity between academia and these designerly approaches. I argue (philosophically, as I am a philosopher) that learning/teaching is exactly the same process as innovation and the diffusion-adoption of innovation. But that’s another essay altogether!
Why is successful change in technology-practice so hard?
I believe that six separate but interconnected trends are behind this.
1. Technologies and the practices that they are designed to serve are changing so fast that even experts are struggling to keep up. For example, going from a desktop-applications-with-local-storage model to cloud computing is hard enough. The many variations on this new idea, with varying implications for security, privacy and reliability, pose yet another fresh stream of challenges.
2. The strictly demarcated job-for-life is disappearing, along with rigid social roles and identities. Lives and careers are becoming more like loosely-coupled assemblages in which a single person has many different roles, both over time and all at the same time. In such complex conditions, technology-practices that fit a person’s own particular blend of activities, and which can be carried across different roles, have a better chance of sticking. This can have two consequences: go for a common denominator, or adapt the technology to your own unique needs. The dependence upon a ubiquitous personal platform might then encourage a resistance to change. New technology-practices in one role might be incompatible with a person’s other roles. Furthermore, when we try to introduce an innovation into a role, we might be faced with little uniformity between the role-transcendent blends of technology-practice that each different person has developed. These incongruities introduce hard-to-predict, and consequently difficult-to-support, cognitive challenges for users faced with new designs. We know that extraneous cognitive load is the enemy of design, especially learning design. When we work with a technology-practice that is unfamiliar, and therefore clunky, we have to actively think our way through it; this occupies our limited cognitive capability (attention, short-term memory) and impedes our ability to focus on the purpose of the technology-practice (for example, learning). Usually, when designing technology-practices, we take this into consideration. We apply the “don’t make me think” mantra – or, in its learning technology incarnation, “make me think about the things that I should be thinking about”. We can design to overcome this by getting to know how people already think (their existing knowledge schemas and expectations, held in long-term memory), and by easing the transition to new ways of working and the acquisition of new schemas.
However, when we are designing for people who are coming to our innovation with diverse backgrounds, using a wide range of technology-practices, it is not so easy for us to design with these transitions in mind.
3. To make innovation even more challenging, there has been an evolutionary explosion in the technosphere. We are seeing an extraordinary range of new hardware, software, web services and interoperabilities being created at ever lower cost (if not free) as tech providers search for markets. There is a complementary boom in the range of situations into which people put technologies, with ubiquitous computing (that is, devices and network connectivity anytime, anyplace) driving the ceaseless encroachment of the digital into the analogue. Ten years ago technology was a scarce thing, mostly provided by institutions. Now, most people choose their own blend of technologies to fit their own ways of working, for their own purposes. In return, technology providers are making yet more scope for personalisation and customisation. For example, whereas Web 2.0 made it easy for ordinary people to modify content within the templates provided by well known web applications (e.g. Wikipedia), the next generation of web technologies (Web 3.0) allows ordinary people to create their own applications, designing interfaces, workflows, data etc.
4. There is a growing variation in the social organisation of invention, reinvention and diffusion of innovations – the social means by which new technology-practices are created, modified and become adopted. The two classic models are becoming weakened. The diffusion (spread) of innovation from expert (governmental or commercial) to user (as described by Everett Rogers) is being reversed. Governments and businesses are increasingly having to respond to the unruly behaviour of unofficial experimenters and inventors, working in ad hoc collaborations whose shared interest is in tinkering with the same toolset rather than optimising a common profession or behaviour. Similarly, communities of practice (as described by Etienne Wenger) have been seen to offer a rational, effective means for innovation to happen and to spread. The idea is that people who share a practice, but who are dispersed across an institution or beyond, should take responsibility for developing their own practices, including technology-practices. Innovation would in this way diffuse horizontally amongst peers, through a community-managed workflow. But this can be weakened where practitioners find more success by forming vertical and heterogeneous collaborations – for example: an innovation in a school might be more effective where teachers, students, parents and support staff collaborate to find a local solution. As with individual emergent experimentation, these ad hoc heterogeneous collaborations are becoming much more common, if not the default mode for many people.
5. There has also been a change in how people reflect on the world and reflexively deliberate on their own history, choices and futures, and this affects both how they make choices and the outcomes of their deliberations (sociologists use the term “reflexive” to refer to a person reflecting on their own actions, choices etc). In a series of three books, based upon a study of people living in Coventry and a longitudinal study of Warwick University undergraduates, Margaret Archer identified four distinct modes of reflexivity, each with a different impact upon how people change their practices (adapted from Archer, 2007: p.93):
Communicative reflexives: those whose internal conversations require completion and confirmation by others before resulting in courses of action.
Autonomous reflexives: those who sustain self-contained internal conversations, leading directly to action.
Meta-reflexives: those who are critically reflexive about their own internal conversations and critical about effective action in society.
Fractured reflexives: those whose internal conversations intensify their distress and disorientation rather than leading to purposeful courses of action.
The response to innovations in technology-practice varies in relation to these at a generic level (e.g. the way in which each person responds to any new technology-practice depends upon their default mode). A specific technology-practice may also appeal particularly to people with a specific mode. Communicative reflexives will, for example, use social networking in a different way, and hence prefer different kinds of social networking platforms. Autonomous reflexives have a more instrumental approach, using whatever is necessary or useful at the time so as to get themselves closer to achieving the ambitions upon which they are focussed. Meta-reflexives will often be concerned about the ethical and aesthetic implications of adopting a technology. And fractured reflexives might be better off keeping away from Twitter!
6. Finally, there are significant variations in how well individuals are able to recognise the designs that make up their technospheres, how effectively they can reimagine the design of their technology-practices, and their subsequent ability to implement change. We might call this capability designerliness (following the title of a book by design theorist Nigel Cross called Designerly Ways of Knowing). The levels of designerliness can be roughly described as:
Design aware (notices the design of things and practices, notices good and bad design).
Design active (acts to improve the design of things).
Design reflexive (thinks about and tries to improve how they do design, selects from a range of techniques and strategies).
It is also important to consider not only the designerliness of individuals but also the designerliness of the cultures and institutions in which they are embedded. It can be hard for a person to be design active in some institutions. Ultimately, the extent of designerliness will radically alter the invention, reinvention and adoption of new technologies – but this is dependent on all of the other factors, especially social organisation. If we get social organisation right, then we can enable the highest level of designerliness, and make effective invention, reinvention and diffusion of innovation happen. Interaction Design for Learning (IDfL) is a method that aims to achieve this. Design Thinking for Learning (DTfL) is a strategy for making IDfL work. The strategy is to enhance the designerliness of communities and heterogeneous collaborations.
Coming soon – part 2 of this article: IDfL and DTfL and how to use them to maximise technology-practice adoption.