I don't like the term 'digital native'. It's not that there aren't generational differences or that there isn't something in the concept: it's more that it strikes me as a horrible simplification of what is actually going on.
My first issue with the term is that it is very poorly defined. How do I tell if somebody is a digital native? Indeed, am I one? (Oh, and do people really exist who print out all their e-mail rather than reading it on the screen?) If it is based on technology exposure and use, are there different types of digital natives? An obsessed computer game player might not be a MySpace addict and vice-versa. Is the concept about literacies more than computer usage?
Or would the definition be more meaningful if it were related to psychology instead? Do you require some sort of addiction to instant gratification to be a digital native? A tendency towards continuous partial attention? A certain lack of fear of experimenting and failing at things? A preference for inference over deduction and a dislike of learning step-by-step? And are these things independent or not? Can one prefer 'random access' but not operate at twitch speed? Is there a binary divide or are there different degrees of ‘digital nativeness’? There's also the question of whether there is a definite link between technology exposure and these psychological aspects. It's perfectly plausible that playing lots of computer games when young might change your brain, but that's different from it actually being true.
The next thing that bothers me is that much of the evidence for the prevalence of digital nativeness in our youth is anecdotal. Somebody's kid has twenty windows open at once on their computer, or a student listens to music while doing their homework (didn't we do that fifteen years ago?). We certainly know that pretty much everyone of a certain age uses MySpace and a few other things, but I haven't seen anything at the level of detail that really tells us what the situation is. Without a sensible way of deciding whether somebody is a digital native or not, it's also hard to tell the extent to which digital natives exist in other generations. I've certainly got friends of my age who are constantly on Facebook, who play lots of computer games or who find it hard to put their mobile phone down. Teenagers don't necessarily have a monopoly on such things. I'm also a little sceptical because of some of my own (equally anecdotal) experience. I remember discovering a couple of years ago that nobody in my class of first-year students had heard of Wikipedia. One of my students didn't have a clue what to do when I told the class to open the web browser and go to a particular URL. I've also encountered teenagers who didn't live on a diet of computer games and are slightly nervous about using computers.
Suppose we can come up with a sensible way to frame digital nativeness, which I don't think is inconceivable. Then we have the issue of whether there are differences in how we should teach digital natives compared to digital immigrants. Again, I get frustrated by the lack of concrete evidence that I've been able to find (if there is any, I'd love to know!). My instinct is that engaging activities and quick feedback cycles would probably be appreciated by everyone; the digital natives might just be less tolerant of bad teaching. It's also important to realise that many computer games may be ‘random-access’ in some senses but highly structured in others. Proper research is important because without it, there's a danger that digital immigrants will read that digital natives behave in a certain way and then make incorrect inferences - the ‘they send texts a lot, therefore we must teach by text’ syndrome. Personally, I feel that from a practical perspective, if you do want to figure out how to teach digital natives, you probably have to try to become one rather than observe from afar. Otherwise, it's a bit like trying to understand French culture without ever having learned any French.
Even if there is a difference, there's the political issue of how you deal with it. Do you have separate classes, forever discriminating against people not privileged enough to have had access to computers during their youth, or those re-entering education at an older age? And to what extent should we be preparing our students for the huge majority of workplaces that aren't yet ‘digital native’-friendly? Just as importantly, although you may learn useful things from computer games and the like, it's not totally clear that every aspect of digital nativeness is automatically to be desired. Mindfulness has its place, an appreciation of when delayed gratification is worthwhile is worth learning, and multitasking often reduces productivity. Sometimes it's better to stop and think about what caused that bug rather than putting the debugger onto it, or to think about the issue that everybody is blogging about rather than just reading what everybody says in search of the holy grail. Maybe it's better to concentrate on the conversation you're having with the person in front of you than to answer your mobile. If this phenomenon is partly some sort of Skinnerian addiction, then maybe we shouldn't be encouraging it - we might want to avoid the ‘institutionalisation of the short attention span’. If they are having withdrawal symptoms, it's not surprising that today's generation don't like the slow pace of much teaching. Students may not like following through logical arguments, but is that a good reason why they shouldn't learn to do so?