Using Technology in the Classroom: How can we do it better?


Written by Phillip Lupton

Phillip Lupton has 20 years of experience leading improvement in schools. He is an expert in teacher CPD, curriculum and assessment development, school leadership and educational reform.

1st October 2019

In this blog post, Phillip Lupton assesses the impact that technological advances have had on education so far, before considering a few principles that can help teachers to put these innovations to better use in the classroom.

In many ways, technology has yet to live up to expectations that it would provide ‘silver bullet’ solutions to challenges in education. But recently, many teachers have immersed themselves in the world of online teaching and learning like never before. So… now seems like the perfect moment to reflect on two big questions raised in this post:

  • Why hasn’t the digital revolution transformed teaching and learning as predicted?
  • How can teachers use technology more effectively in the classroom?


The next big thing?

“The motion picture is destined to revolutionise our education system…”
– Thomas Edison, 1922

“The use of videodiscs in classroom instruction is increasing every year and promises to revolutionise what will happen…”
– Semrau and Boyer, 1994

“…the iPad 2 will revolutionize education…”
– Gregory Ferenstein, 2011

Even before the dawn of the digital revolution, all manner of predictions had been made about how technology would transform teaching and learning. Over the last thirty years, a succession of key inventions has been in the spotlight: interactive whiteboards (c.1990), the World Wide Web (1991) and Windows 95 (1995). These in turn fuelled the proliferation of the home computer, the iPhone and iPad (2007 and 2010, respectively) and mobile applications (2008). Each of these technological advances was expected to change the face of education as we knew it.

It is true that all of these have been integrated into schools and home learning to some extent and, in some cases, are used daily. However, no single piece of hardware or software seems to have triggered the technological revolution in education that we were expecting. In fact, a 2015 OECD global study found that “Those students who use tablets and computers very often tend to do worse than those who use them moderately.” At first read, one might assume that technology itself was having an adverse effect on education, but perhaps the issue is really one of harnessing its power effectively.

There seem to be two questions to explore here: why has the digital revolution not changed teaching and learning as predicted and, given the ever-increasing range of technology available, how can educators make better use of it? 

Why has the digital revolution not changed teaching and learning as predicted?

To answer this, we need to look at the basis for the prophecy of a digital revolution in education, and it turns out that two large (and optimistic) suppositions were made. Firstly, it was assumed that all those born after 1980 would spend their early years utterly steeped in technology and, as such, would arrive at school as ‘digital natives’, able to turn their skilled, cyber-savvy hands to any new technology with ease. It was also assumed that these students would be able to multi-task to a greater degree than their elders.

Paul A. Kirschner and Pedro De Bruyckere deliver something of a broadside to these ideas in their paper ‘The myths of the digital native and the multitasker’, concluding that “…there is quite a large body of evidence showing that the digital native does not exist nor that people, regardless of their age, can multitask.”

It seems that students, just as they might when learning a new language or sport, require explicit instruction in how to use any new form of technology effectively. Assuming that students are innately able to use technology to understand and assimilate knowledge in another subject has meant that many ICT initiatives have not had the desired effect on learning.

How can educators make better use of technology?

To find an answer to this question, one must remember the first rule of teaching: know your students. To discern whether new technology will help or hinder a student, a teacher must have a good handle on a student’s capacity to take on new information at that time. Once this is understood, it’s possible to start planning the next learning step and assessing whether technology might support that step.

It is useful to introduce cognitive load theory at this point. Highly simplified, cognitive load refers to the amount of working memory required to take on new knowledge and to eventually create a permanent store of that knowledge. Why is this useful? If the cognitive load associated with learning to use new technology is too high, little mental bandwidth is available for the knowledge that is being delivered by that technology. Conversely, if the cognitive load is too low, students become easily bored and lose focus.

The Education Endowment Foundation offers four recommendations as to how technology can support learning: 

1. Consider how technology will improve teaching and learning first 

In a 2015 survey by Canvas, almost half of primary and secondary teachers admitted that they rarely used technology that had been bought for the school. When asked about this, David Fairbairn-Day, the Head of Education Strategy and Business Development at Promethean, said “I see schools putting in policies for every student to have access to a tablet. Sometimes, once the tablets arrive, they are scratching their heads. Now we have them, what do we do with them?” Before investing in new technology, schools should consider how it will support students’ learning and, indeed, educators’ teaching, as it must do both to be effective.

2. Consider how technology can aid explanation and modelling

Through animation, video and other multimedia experiences, technology has the potential to enhance the way educators explain difficult concepts to students. The point to remember here, however, is that extraneous information may heighten the cognitive load and detract from the learning potential. 

3. Use technology to improve the impact of pupil practice 

Technology can improve how often and how well students practise new skills and revisit learned information, both at home and in the classroom. Self-marking assessment allows students to work independently, wherever and whenever they wish, while reassuring teachers that every student is receiving accurate, consistent feedback.
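To make this idea concrete, here is a minimal sketch of how a self-marking exercise with instant feedback might look, assuming a simple command-line quiz written in Python. The Question structure and the sample questions are hypothetical, invented purely for illustration; they are not drawn from any particular product.

    # Minimal sketch of a self-marking quiz with instant feedback.
    # The Question structure and sample questions are hypothetical,
    # invented purely to illustrate automatic marking.
    from dataclasses import dataclass

    @dataclass
    class Question:
        prompt: str    # the question shown to the student
        answer: str    # the accepted correct answer (lower case)
        feedback: str  # explanation shown immediately after marking

    QUESTIONS = [
        Question("What is 7 x 8?", "56", "7 x 8 = 56: think (7 x 4) x 2."),
        Question("What is the capital of France?", "paris",
                 "Paris is the capital of France."),
    ]

    def run_quiz(questions):
        """Ask each question, mark it on the spot and return the score."""
        score = 0
        for q in questions:
            response = input(q.prompt + " ").strip().lower()
            if response == q.answer:
                score += 1
                print("Correct! " + q.feedback)
            else:
                print("Not quite. " + q.feedback)
        print(f"Score: {score}/{len(questions)}")
        return score

    if __name__ == "__main__":
        run_quiz(QUESTIONS)

The point is not the code itself but the principle: the marking logic and the explanatory feedback are written once by the teacher, and every student then receives them instantly and consistently, however often they practise.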

4. Use technology to provide instant assessment feedback

This form of feedback supports but does not replace teacher feedback. Technology can make collecting assessment data easier, faster and more accurate, leaving teachers free to interpret that data effectively and act on it to improve the learning experience.
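As a toy illustration of the data-collection side, the sketch below aggregates invented quiz scores into a simple per-class summary. The names, scores and support threshold are all hypothetical; the point is only to show how routine aggregation can be automated.

    # Toy sketch: aggregating (invented) quiz scores so a teacher can
    # see at a glance which students may need support.
    from statistics import mean

    results = {
        "Aisha": [8, 9, 7],
        "Ben": [4, 5, 3],
        "Chloe": [9, 10, 9],
    }

    def summarise(results, threshold=6.0):
        """Print each student's average score, flagging any below the threshold."""
        for student, scores in sorted(results.items()):
            average = mean(scores)
            flag = "  <-- may need support" if average < threshold else ""
            print(f"{student:10s} average {average:4.1f}{flag}")

    summarise(results)

Collecting and summarising the raw scores is the easy part for a machine; deciding what a flagged average means for each child, and what to do next, remains the teacher's job.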

When used ineffectively, an interactive whiteboard is no better than a chalkboard. However, when educators use an interactive whiteboard to facilitate engagement with new information in a way that encourages participation and focus, this piece of hardware transforms the learning experience. In addition to the humble whiteboard, toy robots can teach the fundamentals of coding, spreadsheet programmes can teach scientific data processing and analytical skills, music software can teach recording and composition, and console games that require the user to physically dance can improve gross motor skills.

We must therefore be mindful of how we adapt our pedagogical practice to integrate technology in a way that allows students to be active participants in their learning. To support such integration, governments and school leadership teams can create a supportive culture in which teachers are equipped with the technical skills to use new technology and can return to the best pedagogical practices to decide what works for their students and how, together, they can make the most of it.

Looking to the future though, I ask myself the same question that many before me have asked: what is the next big thing? 

Elon Musk recently announced a $27 million investment in Neuralink, a venture with the bold mission to develop a brain-machine interface that could improve human communication, even for non-verbal users. If that weren’t enough, The Open University in the UK then went on record to predict that brain-to-brain learning will be realised by 2070.

Maybe these will be the next technologies to join the learning revolution.
