Apparently, an increasing number of colleges and professors are banning laptops in their classrooms, citing poor grades and general distraction. These are issues worth considering, but the article raises a good counterpoint:
For years, educators have been clamoring to put technology in the hands of young students through partnerships with big tech companies, best symbolized by the One Laptop Per Child initiative. But by the time those kids grow up, they might well find university authorities waging a war on laptops in the classroom.
Is a blanket ban on laptops really a realistic means of dealing with perceived digital distractions in the classroom—or does it actually create a deficit in terms of learning?
Banning laptops in the classroom is a position many seem to share. However, doing so at a time when so many students are being trained to think in conjunction with digital media tools seems counterintuitive. Granted, instructors are charged with educating students and any distraction is an unwelcome one, but in the absence of laptops students will undoubtedly find some other means of distracting themselves during monotonous lectures. Thus the introduction of laptops to the classroom does not change the interaction between the student and the material—which is unfortunate, and we'll get to why in a moment—but simply gives school administrators and instructors a physical manifestation of "distraction" (i.e., boredom) to single out.
For example, research by Anderson and McClard (1993) demonstrates that students perceive the boundary between study time and social time differently than adults do. Essentially, much of the time an outsider would classify as "social time" is, for students, actually study time: during study breaks and other social activities, students often collaborate and discuss difficult assignments:
For example, during a pizza study break students covered a variety of topics related to their current class assignments. One student discussed a political science paper, another tried to figure out how to approach a paper topic in a literature course, and yet another student discussed topics in a geology course ... Because study breaks were a social activity, it appeared to the outsiders that students goofed off a lot. Clearly, the study break, as defined by the student, was actually study time. The academic problems that the students worked out during a break could not be worked out in more formal settings or on their own. The study break was a secure environment for testing ideas (Anderson and McClard 1993: 165-166).
Anderson and McClard demonstrate how learning can occur despite potential distractions. That is not to say that some distractions aren't detrimental. When grades begin to slip as a result of too many study breaks and preoccupation with digital devices, a warning is certainly necessary. It is then up to the student to take the necessary steps to correct issues. Hopefully, as our society becomes more tech-friendly, campus resources will also include support for those looking to better manage tech resources.
The "blanket ban" position ignores our changing relationship with technology—particularly relevant to the Millennial generation and those who will follow them—because it ignores the ways perceived distractions may actually strengthen the learning process. These individuals grow up completely immersed in technology: according to one source, they consume at least 10 hours of media a day between texting, e-mailing, tweeting, posting on Facebook, catching a favorite show on hulu.com, and playing games online. And they manage it all with a sense of ease that few beyond their generation can understand.
How? Their brains are changing. Don Tapscott, author of Grown Up Digital, uses findings from a large research project to suggest that the Net Generation will have spent more than 30,000 hours on the Internet and playing video games by the time they reach their twenties. Consequently, "this has changed their mental reflexes and habits, the way they learn and absorb information." While some view this reliance on technology and media as a detriment to logic, Tapscott disagrees:
Net Geners are faster than I am at switching tasks and better at blocking out background noise. They can work effectively with music playing and news coming in from Facebook. They can keep up their social networks while they concentrate on work—they seem to need this to feel comfortable. I think they've learned to live in a world where they're bombarded with information, so that they can block out the TV or other distractions while they focus on the task at hand. This is a powerful advantage in a digital environment that's buzzing with multiple streams of information.
Tapscott's assertions are supported in part by John Seely Brown (2000), who noted that the attention span of teens, which varies between 30 seconds and five minutes, is comparable to that of top managers operating in industries that require fast context-switching. "So the short attention spans of today's kids may turn out to be far from dysfunctional for future work worlds" (Brown 2000: 13). Brown describes the emergence of a new form of literacy in which the individual needs to know how to "navigate through confusing, complex information spaces and feel comfortable doing so" (2000: 14). Thus it's not that today's students—and those of the future—are incapable of learning; they are learning in new ways, and perhaps it's time for the academy to shift its pedagogical methods.
Today's students are in a better position to learn how to learn at an earlier stage, rather than just memorize and repeat facts. And hasn't this always been the goal of the academic institution? To teach individuals how to find, access, and interpret information for themselves? Consider the chart at right, taken from Brown's study. If we think about knowledge as an iceberg, academic learning tends to emphasize just the tip, whereas the bulk of learning lies beneath the surface when the individual is immersed in the practices that they wish to learn (i.e., discovery-based learning):
Curiously, academics' values tend to put theory at the top in importance, with the grubbiness of practice at the bottom. But think about what you do when you get a PhD. The last two years of most doctoral programs are actually spent in close work with professors, doing the discipline with them; these years in effect become a cognitive apprenticeship (Brown 2000: 15).
The web is changing this by encouraging cognitive apprenticeships at earlier stages so that students are more adept at the management of real world data, and not just theories and facts. [Right: Knowledge as Iceberg chart from Brown 2000. © Brown]
My concern about the removal of laptops from the classroom is linked to the double digital divide. As we create a digital world that frequently and increasingly overlaps with the non-virtual one, it is important that individuals not only have access to technology but understand how to use it. This should certainly be a part of any college education. Instead of banning these "distractions," instructors need to take into account how best to use them—to encourage learning and to prepare students for the world beyond academia.
Let me also say this: I know firsthand how pressed instructors are for time; however, I also know from my own experience that students expect Web 2.0 to be a part of their lives in the classroom. As devices like the iPad become common, and textbooks move to digital formats that allow for digital note-taking and increased sharing of information, this can be a boon to instructors. I understand the concerns about students not getting the information they need, but it is time to think about how that information is presented and whether this is in line with how students of today and tomorrow are actually obtaining and processing information. A blanket ban on laptops is not a solution—it is a step backwards and a disservice to students of the digital age.
Are you a current or past classroom laptop user? What was your experience? Share your thoughts below.
Credit where credit is due: @tadmcilwraith posted a link to this article on 4/26/10 which inspired this response.
Anderson, K., & McClard, A. (1993). Study Time: Temporal Orientations of Freshmen Students and Computing. Anthropology & Education Quarterly, 24(2), 159-177. DOI: 10.1525/aeq.1993.24.2.05x1119a
Brown, J. S. (2000). Growing Up Digital: How the Web Changes Work, Education, and the Ways People Learn. Change: The Magazine of Higher Learning, 32(2), 11-20. DOI: 10.1080/00091380009601719