Over the weekend I attended a conference in San Francisco on the iGeneration: How the Digital Age is Altering Student Brains, Learning and Teaching. Kevin Kelly's conclusion in What Technology Wants, that the Technium continues to shape human existence, is supported by the most current research on the brain function of Digital Natives. What, I wonder, are the implications for teachers and learners in the future?
Comments
Thinking not of yesterday, but of tomorrow
Hi Louise,
I think you did a great job with your podcast, and the conference sounded really interesting! Listening to your podcast, I was particularly struck by the perceived positive and negative effects multitasking and technology are having on both younger and older users. I find it hard to believe that there are tweenage girls who effectively do homework, watch TV, listen to music, use the Internet, and text at the same time -- maybe all those things are happening in the background and are distracting, but using all these technologies at once? I don't think this is multitasking at all -- I think it's just distraction and interruption in the course of "doing homework." I often employ these "multitasking" methods myself, and I can tell you that Facebooking while I'm writing a paper isn't multitasking; it's procrastinating.
I do believe that all this distraction is negatively affecting their development -- after all, focus is an enormous part of studying and of performing any sort of task, especially school assignments like the five-paragraph essay you mentioned. What I found especially interesting is that their linguistic focus has shifted from the past and present tenses toward the future tense -- hard to imagine outside of a proposal or creative writing. But I don't think this should be especially troubling; I just think that teachers will have to, as in so many generations before, revitalize the techniques they use to teach writing to accommodate the eccentricities of Gen Z. Future tense... I find that so compelling, to think that technology has brought about a change in the way children think of time. We think of studies as being grounded in the past, of looking back, but now children are looking forward. I don't think this is a negative development -- I think it's more of a reorganization of temporal constructs, of the way we relate to society through technology, and I find it interesting that it's happening to this generation, this age group. Just think of the kinds of changes they'll bring about as they grow -- our very concept of time could be changing as we speak.
Pedagogy, Multitasking, Focus, and ... Hey, Wanna Go Ride Bikes?
Louise,
Nice work on this podcast; the peaceful background music complemented your discussion nicely, although the audio levels were occasionally a bit low. Sounds like a very cool conference! Color me jealous. I think these issues are all extremely relevant to our discussion, both as graduate students and as instructors. Our communications as students and teachers are always/already mediated through our technological interactions within Kelly's "technium," a point that's concretely enacted by this course. While much of our reading centers on how these interactions change, it behooves us as current (or future) instructors to also consider how these changes affect our students... or indeed anyone occupying the discursive position of "student." We could consider this an act of praxis, one that allows our rhetorical inquiries to inform our teaching. Based on your podcast, I would tentatively break this line of thought down into two categories: changes in cognition and changes in curriculum.
When describing changes in cognition, it seems difficult to contest established neurological findings. Perhaps, as Kelly suggests, what can be critiqued are the types of questions asked, or the range of findings. Is the evidence from Dr. Nash's study of "tween" girls generalizable to all teenagers, female or male? Your sources suggested that internet use can benefit adults but distract teenagers; how do they define the cut-off point (i.e., when does one stop and the other begin)? Do these benefits or distractions taper with age? Further, is this data conclusive, or merely correlational? I feel this also applies to those researchers who cited a rise in stress and A.D.H.D. among students of the digital generation; to what extent are these findings merely the result of increased awareness and better testing capabilities? As you state, this data raises more questions than it answers.
The point about synapses being rewritten by repeated use returns me to our continual feedback loop, in which use inscribes knowledge and knowledge defines use. The deep grooves of synaptic channels worn by multitaskers may indeed be different from those of non-multitaskers, but I would be curious to know whether these two groups demonstrate different understandings of, or experiences with, the technologies they use. Further, at what point do we set a threshold for "multitasking"? Where is the dividing line between these groups? Simultaneously doing two or three things at a time? What element of usage or technological interface differentiates these actions from others, such as walking, chewing gum, and talking at the same time? These and other related questions point to the fallacy of over-determining cognitive cause and effect without adequately accounting for contextual complexity. A recent article in The New Yorker examines this topic tangentially by reviewing a number of works that try to weave their own grand meta-narratives about technology use and socio-cognitive change: How the Internet Gets Inside Us.
In terms of changes in curriculum: as Kelly points out numerous times, what one generation invents, the next takes for granted. Digital technologies are certainly no exception, and the earlier we become socialized with and through them, the more imbricated we become. Many of these skills are naturalized to the point of second nature in younger generations, such as multitasking, or the act of seeking dynamic agency or individuation within traditionally static modes (such as wanting to personalize off-line media, humorously demonstrated by this real-life Facebook "like" stamp). Teaching students brought up entirely within a digital environment will certainly be (and already is) different, whether we want to categorize them all as multitaskers or not. As Louise points out, this affects teachers and course designers, distance technicians and textbook authors... and anyone else trying to get (and keep) a student's attention. But perhaps this is the wrong focus; maybe it is too biased toward traditional pedagogical approaches. What if, instead of expending our energy trying to hold their attention, we allow students to direct their own efforts by engaging them through multiple outlets? This is certainly the practice currently being enacted by our own course experience. Some of you may be familiar with the concept of multiple intelligences or with differences in learning styles (such as VARK: visual, aural, read/write, kinesthetic); we may shortly be adding uni-focal or multitasking to those lists!
One closing question, or cognitive food for thought: if we tentatively accept the premise that millennial students brought up within the digital age must by necessity be multitaskers (whether shallow or focused), what of those who aren't? Many career fields already actively seek multitaskers in order to enhance productivity, a trend that will undoubtedly continue; won't this desire spread, providing further incentives for multitasking? My thought here is that regardless of whether we think the use of digital technology causes increased multitasking (and setting aside the neurological debate about the effects of multitasking), the professional, social, and personal imbrication of technology seems to create an incentive structure for multitasking. The number of demands on our attention will almost certainly continue to increase, so given these factors, wouldn't students be more likely to teach themselves to multitask, even if they were not already predisposed to doing so? This again points to the difficulty of determining cause and effect, but it also points to the problem of cognitive escalation. Will the more focused among us have to start faking it?
For a separate but related discussion of escalation involving cognition-enhancing drugs, there's a great article, "Brain Gain" by Margaret Talbot, in an earlier issue of The New Yorker (April 2009). I'll try to link to it, but it might be paywalled: Brain Gain?
- Jeff
PS: Since we've been looking at cognition, technology, and pedagogical praxis, I thought I'd share a pertinent debate that I use to (attempt to) engage my intro-to-writing freshmen:
WSJ- Does the Internet Make Us Smarter? (Clay Shirky)
WSJ- Does the Internet Make Us Dumber? (Nicholas Carr)
"Legen...wait for it, and I hope you're not lactose-intolerant because the last part is...dary!"