This posting is long overdue; it has sat on my 'to comment' list for several months. Having just completed some internal research on undergraduate student use of Massey University's Virtual Learning Environment (VLE), it seems timely to comment on the myth of the so-called digital native. The term was coined by Marc Prensky (2001) when he observed:
"Our students have changed radically. Today's students are no longer the people our educational system was designed to teach. Today's students have not just changed incrementally from those of the past, nor simply changed their slang, clothes, body adornments, or styles, as has happened between generations previously. A really big discontinuity has taken place. One might even call it a 'singularity' - an event which changes things so fundamentally that there is absolutely no going back. This so-called 'singularity' is the arrival and rapid dissemination of digital technology in the last decades of the 20th century" (p.1).
The distinction Prensky and many followers made between digital natives and digital immigrants quickly became uncritically accepted throughout the popular literature. That said, I'm pleased to report that in 2005 I recorded my misgivings about this crude binary distinction. In an editorial, entitled 'The Next Generation: Looking into the Future', in the journal Computers in New Zealand Schools, I wrote:
"The notions of digital native and digital immigrant may be useful slogans for provoking debate but the distinction does not stand up to inspection."
Drawing on the work of Owen (2004), I pointed out:
• The notion of a teenager glued to the phone endlessly speaking to their friends as an illustrative concept pre-dates the mobile phone.
• The vast majority of children in developed countries spend less than 30 minutes a day on computer games. By all accounts, the main demographic for computer game players is in fact 20-35 year-olds.
• In the US, the highest usage of the Internet at home is among 35-44 year-olds. Professional adults actually make more significant use of the different capabilities of ICT than anyone else.
• There is plenty of evidence that not all teenagers spend lots of time with technology (e.g., Facer, Sutherland, Furlong & Furlong, 2003). Many do other things instead, such as riding bikes, playing music, or skateboarding.
I went on to say that while profound changes may be taking place, these need to be situated in diversity rather than dichotomy, and described in less technologically deterministic language. Moreover, such changes need to be treated as problematic, in the spirit of C.P. Snow (1971):
"Technology... is a queer [strange] thing. It brings you great gifts with one hand, and it stabs you in the back with the other" (cited in Owen, 2004).
The concept of the digital native implies that digital technology is an entity and influence independent of external social forces—a classic example of technological determinism. Thus, as Owen (2004) observes, uncritically embracing slogans like digital native can lead to bad decision-making based on erroneously simplified and unrealistic expectations of what our future will be like in the so-called Digital Age.
With the benefit of hindsight, our concerns were solidly grounded and have been confirmed in a number of recent studies and journal articles. Part 2 of this posting reviews and explores some of these publications.