Recently, WBUR ran a selection of articles on what it calls "Digital Lives." The series thus far covers relationships, video games, and multi-tasking. The series as a whole plays upon fears and leaves one with more anxiety than substantive tools and ideas about how to live a digital life. I'm going to focus on the multi-tasking article, as prompted by a peer of mine, but needless to say all of the articles feel more superficial and fearmongering than conducive to an in-depth discussion. For instance, the video-games article spends much of its discussion on the antiquated concerns about video games and violence and only very briefly mentions the benefits of video games (almost as an afterthought). And any article on video games in our culture that doesn't mention the likes of Jane McGonigal or Tom Chatfield is one that has been poorly researched.
So, 10 reasons why the multi-tasking article annoyed me:
1. I'm always dubious of a media outlet that uses all the various forms of media (including several forms of electronic media) to tell us how dangerous digital media is. They present an article that is filled with links (i.e. distractions)--text, tests, and video--as well as a slew of distractions surrounding the article: ads and additional links. Clearly, if they believed in the truthiness of their words, they wouldn't engage in the same distraction-generating machinations.
2. Peter Sagal is held up as some kind of authority worth consulting on this. Why is a game-show radio host the go-to person on digital media and multi-tasking? Does that mean I can qualify Marc Summers (host of Nickelodeon's Double Dare game show) as a military consultant for a discussion on the risks of further involvement in the Middle East?
3. Eyal Ophir as a consultant is equally dubious, since many of his suggestions focus on redirecting digital interaction toward a new or different experience than what it is now--and he just so happens to be working for a company that is trying to reinvent the browser. That is, his recommendations steer readers toward believing his product is relevant.
4. No real mention of evidence or counterpoints that have been made against the arguments about multi-tasking (e.g. Brian Chen, in his book Always On, rips large holes in the research).
5. No clear definition of what multi-tasking is, nor any acknowledgment that we regularly multi-task successfully. The closest they get is that "multi-tasking is believing that we can think about several things at once." Qualifying "think" is problematic and ill-defined. I can cook and listen to an audiobook at the same time; I can run on a treadmill and watch a movie at the same time. Hell, I can even walk, talk, and chew gum at the same time--all of which require conscious action and thought to some degree. We regularly drive and talk to someone in the car.
6. The metaphor of the brain as a computer (using "working memory") is ironic because, of course, computers can do multiple things at the same time (look at the bottom of your screen--how many programs do you have open?).
7. The primacy of conscious thinking for creativity, deep thoughts, etc. is a bit problematic too, since the subconscious also works substantively to push us to that level. There's much to be said about how our subconscious gives ideas space to develop, and, equally important, as Steven Johnson points out in his book Where Good Ideas Come From, a good amount of creativity is also generated through engagement and interaction with others (e.g. this post was generated by an interaction with a peer).
8. "A recent study, No A 4 U, shows that students who use Facebook or email even as they do their homework earn lower grades than those who don’t." The study shows correlation, not causation--which could easily mean those students were predisposed to do poorly in those classes and to give themselves over to social media (and in the absence of social media might have done just as badly).
9. The "landmark study" that they promote assumed properties of multi-tasking but did not actually measure or assess the kind of multi-tasking we're talking about. It had participants look at blue and red rectangles shifting about on a screen, and register changes only in the target color (red) while both the red and blue rectangles changed. This may be a type of multi-tasking, but it is seriously decontextualized from the multi-tasking we're talking about and very much a different beast. Asking someone to engage with an unfamiliar screen and play a game they are not likely to have played before is not the same, on a physical or conscious level, as what someone does when absorbed in their own screen. That's like assessing someone's cooking skills by putting them in the bathroom with a random assortment of ingredients...as opposed to their own kitchen with their own supplies.
10. They anthropomorphize technology ("Technology has brought us all of these amazing rivers of news and updates and communication. But Ophir says technology is doing a terrible job of filtering out what’s irrelevant, or knowing the best time to show you what you do want to see.") as if they themselves (the writers, editors, producers, companies) have no connection to what's going on and why there are so many distractions vying for our attention (going back to point #1). Furthermore, we already have tools within the technology to use it better and be less distracted by it--more tools aren't necessarily needed.
There's always more to say on the topic, but I thought I'd get my initial response out first and go from there. At the end of the day, I want us to be critical of technology and its uses, but with that criticism should also come guidance, purpose, and a balance of viewpoints. Too often we react as if we've unleashed Frankenstein's monster; of course, just as in Frankenstein, if we took more care and consideration with the monster, foreboding disaster might not actually be around the corner.
By Any Other Nerd Blog by Lance Eaton is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.