With huge advances in technology throughout the years come huge expectations for how every aspect of our lives will change for the better. But in the world of education, what happens when the realities fall short? Here, Phillip Lupton traces the evolution of technology in education thus far, and discusses what it means to successfully implement technology in the classroom to truly improve teaching and learning in the modern world.

The next big thing?

“The motion picture is destined to revolutionise our education system…”
– Thomas Edison, 1922

“The use of videodiscs in classroom instruction is increasing every year and promises to revolutionise what will happen…”
– Semrau and Boyer, 1994

“…the iPad 2 will revolutionize education…”
– Gregory Ferenstein, 2011

Even before the dawn of the digital revolution, all manner of prophecies had been made as to how technology would transform teaching and learning. Over the last thirty years, key inventions have been in the spotlight: interactive whiteboards (c.1990), the World Wide Web (1991), Windows 95 (1995) – which encouraged adoption of the home computer – the iPhone and iPad (2007 and 2010, respectively), and mobile applications (2008). Each of these technological advances was expected to change the face of education as we knew it.

It is true that all have been integrated into schools and home learning to some extent and, for some, they are used daily. However, no hardware or software seems to have triggered the technological revolution in education that we were expecting. In fact, a 2015 OECD global study found that “Those students who use tablets and computers very often tend to do worse than those who use them moderately.” At first read, one might assume that technology itself was having an adverse effect on education, but perhaps the issue is really one of harnessing its power effectively.

There seem to be two questions to explore here: why has the digital revolution not changed teaching and learning as predicted, and, given the ever-increasing range of technology available, how can educators make better use of it?

Why has the digital revolution not changed teaching and learning as predicted?

To answer this, we need to look at the basis for the prophecy of a digital revolution in education, and it turns out that two large (and optimistic) suppositions were made. Firstly, it was assumed that all those born after 1980 would spend their early years utterly steeped in technology and, as such, would arrive at school as ‘digital natives’, able to turn their skilled, cyber-savvy hands to any new technology with ease. Secondly, it was assumed that these students would be able to multi-task to a greater degree than their elders.

Paul A. Kirschner and Pedro De Bruyckere deliver something of a broadside to these ideas in their paper ‘The myths of the digital native and the multitasker’, concluding that “…there is quite a large body of evidence showing that the digital native does not exist nor that people, regardless of their age, can multitask.”

It seems that students, just as they might when learning a new language or sport, require explicit instruction in how to use any new form of technology effectively. Assuming that students are innately able to use technology to understand and assimilate knowledge in another subject has meant that many ICTs have not had the desired effect on learning.

How can educators make better use of technology?

To find an answer to this question, one must remember the first rule of teaching: know your students. To discern whether new technology will help or hinder a student, a teacher must have a good handle on a student’s capacity to take on new information at that time. Once this is understood, it’s possible to start planning the next learning step and assessing whether technology might support that step.

It is useful to introduce cognitive load theory at this point. Highly simplified, cognitive load refers to the amount of working memory required to take on new knowledge and to eventually create a permanent store of that knowledge. Why is this useful? If the cognitive load associated with learning to use new technology is too high, little mental bandwidth is available for the knowledge that is being delivered by that technology. Conversely, if the cognitive load is too low, students become easily bored and lose focus.

The Education Endowment Foundation offers four recommendations as to how technology can support learning: 

1. Consider how technology will improve teaching and learning first 

In a 2015 survey by Canvas, almost half of primary and secondary teachers admitted that they rarely used technology that had been bought for the school. When asked about this, David Fairbairn-Day, the Head of Education Strategy and Business Development at Promethean, said “I see schools putting in policies for every student to have access to a tablet. Sometimes, once the tablets arrive, they are scratching their heads. Now we have them, what do we do with them?” Before investing in new technology, schools should consider how it will support students’ learning and, indeed, educators’ teaching, as it must do both to be effective.

2. Consider how technology can aid explanation and modelling

Through animation, video and other multimedia experiences, technology has the potential to enhance the way educators explain difficult concepts to students. The point to remember here, however, is that extraneous information may heighten the cognitive load and detract from the learning potential. 

3. Use technology to improve the impact of pupil practice 

Technology can improve how often and how well students practise new skills and revisit learned information, both at home and in the classroom. Self-marking assessment allows students to work independently, wherever and whenever they wish, while providing teachers with the peace of mind that students are receiving the correct information at all times.

4. Use technology to provide instant assessment feedback

This form of feedback supports but does not replace teacher feedback. Technology can make collecting assessment data easier, faster and more accurate, leaving teachers free to interpret that data effectively and act on it to improve the learning experience.

When used ineffectively, an interactive whiteboard is no better than a chalkboard. However, when educators use an interactive whiteboard to facilitate engagement with new information in a way that encourages participation and focus, this piece of hardware transforms the learning experience. In addition to the humble whiteboard, toy robots can teach the fundamentals of coding, spreadsheet programs can teach scientific data processing and analytical skills, music software can teach recording and composition, and console games that require the user to physically dance can improve gross motor skills.

What we must therefore be mindful of is how we adapt our pedagogical practice to integrate technology in a way that allows students to be active participants in their learning. To support such integration, governments and school leadership teams can create a supportive culture in which teachers are equipped with both the technical ability to use new technology and the pedagogical expertise to decide what works for their students and how, together, they can make the most of it.

Looking to the future though, I ask myself the same question that many before me have asked: what is the next big thing? 

Just last month, Elon Musk announced a $27 million investment in Neuralink, a venture with the bold mission of developing a brain-machine interface intended to improve human communication, even for non-verbal users. If that weren’t enough, earlier this year The Open University in the UK went on record to predict that brain-to-brain learning will be realised by 2070.

Maybe these will be the next technologies to join the learning revolution.

Author: Phillip Lupton is a British Education Advisor specialising in educational reform, curriculum and assessment development, teacher training and school leadership.