Ruchira nudged the A.B. authors to respond to a recent story in Wired, the magazine that by my lights utterly confuses its content with its ads. "Tell us if you agree or disagree" with the story, she asks, noting incidentally that she agrees. I’m pretty sure I disagree.
First, however, I have to decide what it is I’m disagreeing with. The story, like the layout of the print edition of the magazine, is confusing. Is it suggesting that our ability to memorize basic data like "standard personal info" is declining? Or that we are simply learning to use technological devices to outsource memory tasks? Or both? It appears, contrary to the lede, that author Thompson is not in fact arguing that "we’re running out of memory," but that some of us are growing more adept at interacting with storage and retrieval devices in lieu of taking care to memorize simple data. Unfortunately, when he asks, "Does an overreliance on machine memory shut down other important ways of understanding the world?" I’m afraid it’s too late. As with jazz, if you have to ask, you’ll never know. In Thompson’s case, I’m afraid the answer to his query is "yes."
I prefer to avoid characterizing our immersion in technology in terms of over-reliance, which suggests there is some proper degree of reliance independent of broader cultural circumstances. It is more likely that we rely improperly on technologies. For instance, one problem lurking in Thompson’s scenario is ordinary and far less attuned to the science-fiction vibe informing his story: failure to back up—and test restoration of—all of our outboard memory. In fact, silicon memory does not have perfect recall, a lesson that is no fun to learn the hard way. A related point concerns Thompson’s so-called standard personal information. Technologies shift the boundaries of what we regard as standard. Thus, elderly people—Robertson’s proxy for folks who tend not to engage with new technologies—consider relatives’ birth dates "standard," while the pre-30 crowd does not. It has nothing to do with memory, and everything to do with the cultural significance of these data.
I, for one, can fill out forms requiring addresses, phone numbers, SSNs, birth dates, license plate numbers, driver’s license numbers, and the like without stopping to think. I still remember the license plate on my parents’ long-gone 1968 VW Beetle: WAP 772. I have amazed myself and others with such moronic feats, although once I genuinely regretted the skill. Back in the late ’70s, listening to one or another of Savoy Brown’s LPs, I read the recording details, as usual. In a flash, I recalled that the producer and studio were the same as those of one of Caravan’s LPs, and that the dates of the recordings were proximate. I mused upon the two bands crossing paths as they independently crafted the great works I had come to enjoy. And then I wondered why the hell I cared at all about such nonsense. Shortly afterward, I got rid of a large part of my record collection. I felt I had been preempting "other important ways of understanding the world," and so in one respect I would have been in agreement with Thompson’s notion that trivia was crowding out intelligence.
My fundamental complaint with the story, however, has to do with the institutionalized Wired ethos it purveys, its cool hacker party line. Here’s a story nominally about memory, intelligence, and epistemology…but really only about gee-whiz boys-and-their-toys bells and whistles, hyperbolically celebrated with the silly claim that "the cyborg future is here." Thompson does want it both ways, but not just in terms of varieties of memory. He wants to maintain a cultural standard—a baseline drawn roughly where elderly people, for example, take matters for granted—and also to validate behaviors that fall short of or exceed that baseline: his proud forgetfulness, his "genius…on the grid." But the baseline is shifting while he’s busy congratulating himself for adopting Google and Wikipedia into his routine. And he’d better dock the outboard and back it up.
10 responses to “Read Only (Dean)”
Depending on how you understand the phrase “running out of memory,” I’m inclined to agree with it. Say my brain is like a $5 flash drive that you get at Target. If you think that the digital age has given us more information, then well, I’ve only got 128 MB of memory, so something’s got to go. Combine that with changed cultural norms, such that memorizing birthdays is no longer important, and changed technology, such that memorizing addresses is no longer necessary, then yeah, I’ve got less memory to commit to these “measures” of “memory.”
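To push the cheap-flash-drive analogy a bit further, here is a minimal sketch (Python, purely illustrative; the capacity and the "facts" are invented) of a fixed-capacity memory that must forget something old whenever something new arrives:

```python
# A toy model of the "$5 flash drive" analogy: a fixed-capacity store
# that evicts its least-recently-refreshed entry to make room.
from collections import OrderedDict

class TinyMemory:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def remember(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)  # refreshing a memory keeps it alive
        self.store[key] = value
        if len(self.store) > self.capacity:
            evicted, _ = self.store.popitem(last=False)  # oldest goes first
            print(f"forgot {evicted!r} to make room")

    def recall(self, key):
        return self.store.get(key)  # None means: reach for the outboard memory

mem = TinyMemory(capacity=3)
for fact in ["mom's birthday", "home phone number", "a friend's address", "wifi password"]:
    mem.remember(fact, "...")
# prints: forgot "mom's birthday" to make room -- what gets refreshed
# (and therefore retained) is set by changing priorities, not by the hardware.
```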
If you understand the phrase to mean that memory is actually getting worse, however, then I disagree. There’s no evidence that we’re remembering less, only that we’re remembering differently. This could be explained simply by, as suggested above, different priorities in what needs to be stored and available for recall.
Of course, if the author genuinely can’t remember a single friend’s email address, then he’s just an idiot. Or this is a symptom of stress; the world we live in is stressful, and technology exacerbates that, so it sort of fits.
I wouldn’t call 9/25/07 a ‘recent story’, Dean. I tried to find the original Ian Thompson article that Clive Thompson quotes, but couldn’t locate it. I’m off on a tangential obsessive search again… Maybe a blog post will result from it.
Many apologies. In my great hurry to report for jury duty on time, I sent out the wrong link.
Here is the more “recent story.”
Joe pretty much speaks for me, but I have to put pressure on the hypothesis that we are now immersed in more information. My take is entirely counter-intuitive. I believe the amount of information is static, and I illustrate the point by noting that the weather tomorrow was, is, and always will be what the weather tomorrow turns out to be. We flatter ourselves that we are contending with more information than our predecessors, but as I see it we are merely obsessing more about details (or trivia). See? I warned you it would be counter-intuitive.
Ironically, I’m having a hard time concentrating on the real recent story, the one by Nicholas Carr, but not because the ‘net has made me stupid. It’s because there’s something stupid about Carr’s essay, not least its echo of Thompson’s “Golly gee! The world after the ‘net is on Extreme Makeover!” Carr’s musings strike me more as symptomatic of emotional, rather than cognitive, responses to the brave new world, which challenges our ambitions, our sense of fitting in, our with-it factor. His thesis highlights uncertainty, and he wonders whether he’s a “worrywart.” This is fair territory for literary production, but I read Carr’s article as something of an allegory, a tale about dread and worry translated into a safe account of mental algorithms. Readers of Angus Fletcher’s marvelous study of allegory will recognize the common motivation for recourse to the “symbolic mode.”
By the way, isn’t it funny that Sujatha caught the mistaken reference because September 2007 seemed not recent (enough)? Had I paid closer attention while I was writing the post, I would not have characterized it as such, but I take this to be an instance of a shifted standard. It’s reasonable for Carr not to have written, "In a recent work by Nietzsche…," but a nine-month-old article out of Wired?
What is it with War and Peace already?! Thompson and Carr both mention it, and now this. (Thanks to The Liminal Librarian for the tip.) The gist of this Newsweek story is “Gen Y cares less about knowing information than knowing where to find information,” but “[T]here is no empirical evidence that being immersed in instant messaging, texting, iPods, videogames and all things online impairs thinking ability,” and “Insofar as new information technology exercises our minds and provides more information, it has to be improving thinking ability.”
Why, I wonder, has reading War and Peace come to represent the paradigm of a "difficult task"? How odd that three English-language articles resort to a translation of a work originally written in Russian! Why not Joyce’s Ulysses or, God forbid, Finnegans Wake…or Beckett’s novels, or Milton, Gertrude Stein…
Dean:
The amount of information is NOT static. Forget the weather: "it is what it is" whether we know it or not, as you say. But there are numerous man-made gizmos and transactions out there that were not around only a decade ago.

Take the case of passwords. A few years back we had to remember our name, address, phone numbers, and SSN for most official transactions. Now we are entangled in a maze of an ever-increasing number of "passwords" for accessing the ever-increasing amount of information and data available to us at the touch of a button or a keystroke. You set up one easy-to-remember password for one account, and most of us would like to keep the same one for everything if possible. But that is nearly impossible. You may have begun by picking a password with just six letters. Along comes a new business which needs seven. Some allow only numerals, and another requires a combination of letters and numerals. The correct answer to a "secret question" or a mother’s maiden name is the "Open Sesame" to certain well-guarded data; the problem arises when you can’t remember whether it is your mother’s or your spouse’s.

All this has added to the load on our memory. We used to have to remember only the telephone numbers of our friends and family. Now we also record their cell phone numbers and email addresses. That is already a lot of extra info in our personal day-to-day dealings. So our memory is not necessarily getting worse, but we are having to store more in our brains, and not always so successfully. Hence the dependence on silicon-based memory (or plain old paper).
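To make the password tangle concrete, here is a toy sketch (Python; the three site names and their rules are invented for illustration) of why one easy-to-remember "Open Sesame" rarely satisfies every policy at once:

```python
# Invented policies for three hypothetical accounts; no single password
# can satisfy all of them, so the memory load multiplies.
import re

POLICIES = {
    "bank":    lambda pw: len(pw) >= 7 and bool(re.search(r"\d", pw)) and bool(re.search(r"[A-Za-z]", pw)),
    "utility": lambda pw: pw.isdigit() and len(pw) in (4, 6),   # numeric PIN only
    "forum":   lambda pw: pw.isalpha() and len(pw) >= 6,        # letters only
}

def accepted_by(pw):
    return [site for site, ok in POLICIES.items() if ok(pw)]

print(accepted_by("sesame"))    # ['forum']
print(accepted_by("sesame7"))   # ['bank']
print(accepted_by("123456"))    # ['utility']
# The "letters only" and "digits only" rules are mutually exclusive,
# so three accounts already demand three different passwords.
```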
I found a lot of familiar territory in both the Carr and the Thompson articles – about my ability to read in one sitting and the propensity for forgetting what I had posted myself on A.B. a few years or even a few months ago. There is however one difference between me and Carr. Although the “volume” of my book reading has decreased considerably in the past five years since I became a serious net surfer, I am still capable of “deep” reading without distractions while faced with print material. My choppy and fidgety reading is confined entirely to reading on the web. I cannot read a lengthy piece on the computer screen in one go. (While writing this comment, I have checked the blog traffic thrice, my email half a dozen times and the NYT headlines twice so far.)
There is some truth to our seeming forgetfulness of simple things like phone numbers and birthdays. My mother did not just remember the birthdays of the entire clan but also the months in which most weddings took place among the extended family. She remembered names and other details of relatives three and four times removed, and could recite from memory lengthy poems she had learnt in her youth. I cannot do any of that very well. But I "know" much more trivia about the world in general than she did.

There is a Chinese saying which goes: the longest memory pales before the faintest ink. We all know that when we write something down, the information is more accurate and more easily retrievable than when we commit it to memory. However, writing something down plays a different trick on the brain: we tend to forget the information more easily in the long run. It is well established that illiterate people often have far better navigational skills than those who can read maps and road signs. The former, compelled to depend on their memory alone, take better note of landmarks.

Now that we have phone memory and computer memory and depend on the GPS to find our way, we do not remember those pieces of simple information as well as we did just a few years back, because we are secure in the knowledge that we can "look it up" as needed. The brain doesn’t have to be taxed into storing that info. I don’t know if Google (and the internet) is making us stupid. But yes, we are definitely remembering and processing information differently.
I too was struck and amused by Sujatha’s assertion that a seven-month-old article is not "that" recent. I ascribe that to her youthful disdain for even the "slightly stale" in a medium where new material is ever sprouting up like mildew in a dark, damp bathroom. That is why I used the quotation marks around "recent story" in my previous comment. :-)
And why is War and Peace the gold standard for dense and difficult reading, when the others you mention would serve the same purpose admirably? Who knows? It is probably because more people have heard of or attempted to read W&P (there is a movie version) than have tackled Finnegans Wake. It is a bit like what the "rocket scientist" is to tricky mechanical tasks and Hitler to evil.
In reverse order:
Yes, but "rocket scientist" and Hitler are more purely figurative than W&P, it seems to me. That is, I don’t think W&P is being used here merely as a figure of speech in the way "rocket scientist" is; people don’t literally mean a rocket scientist when they say the phrase, whereas here W&P is the very referent. I’m just amazed by the coincidence of these three stories alighting upon the book for an example (albeit not entirely independently, since Carr obviously read Thompson’s).
When you write, “we are definitely remembering and processing information differently,” I wonder where lies the difference. In the “remembering and processing”—as if the brain actually operates in a different fashion, a way in which it had never operated before—or in the fact of our bothering to process different information or a different distribution of kinds of information? Alright, so now we spend more intellectual time and energy remembering passwords. That doesn’t mean the activities of remembering and processing occur any differently. It’s like saying, “I used to walk to the post office, then to the market, then home. Now I first walk around the block seven times, then to the market, then the post office and home. I’m walking differently these days.” Of course I’m not. The path is different, but the operation of walking is identical.
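In code terms, the walking analogy looks like this (a trivial Python sketch; the routes are invented): the operation is the same function both times, and only its argument, the itinerary, has changed.

```python
# Same operation, different data: walk() is unchanged; only the route grows.
def walk(stops):
    for stop in stops:
        print(f"walking to {stop}")

walk(["post office", "market", "home"])                              # the old routine
walk(["around the block"] * 7 + ["market", "post office", "home"])   # the new one
# Calling walk() with a longer list is not a new way of walking.
```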
This gets to the matter of the constancy of information, where I get all metaphysical and weird, even Borgesian. I refuse to live exclusively in the moment. The fact of the explosion of passwords is a characteristic of our environment…exactly like the weather. Precisely the same quantum of information suffices to represent this world today as would have sufficed when I was born in ’59. Limiting the universe of information to what would have sufficed to represent the world in ’59 and comparing that to what suffices with respect to 2008 neglects, for one, that there will be a ’60, and so on.
I’m doing a miserable job of arguing my point, because I’m not really interested in trends in information. (Shocking, I know, coming from a librarian.) I’m interested in imposing a perspective that reminds us what a singular information scientist, Peter Allen, taught when he penned “Everything Old Is New Again.” The Bible said something similar. Take that Newsweek article thesis that pretends to reveal a heretofore neglected distinction between information and its embodiment: “Gen Y cares less about knowing information than knowing where to find information.” To the extent it describes a preference of a generation of youth, it’s a testable thesis. But the distinction itself has nothing to do with new technologies. Samuel Johnson (1709-1784) already famously established it: “Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information upon it.”
Speaking of mistaken references to the ‘recent story’ term, how is it that no one caught my blooper about ‘Ian Thompson’ instead of ‘Ian Robertson’? Never fear, I will have a profile post coming up about that interesting gentleman. Just wait till I manage to track down his much-quoted 3000 people polled on relative’s birthdays and phone numbers.
Ruchira, Dean,
‘Recent’ is relative. Given my addled state of mind, it usually translates to anything in the last 2 hours. Anything older than that is ancient and consequently consigned to the dustbin of history by my brain. Which is why I get through life with virtual and real post-it notes.
‘Rocket scientist’ is a phrase that has always given me a secret kick: I was a real one in the Paleozoic past, before I arrived in the US and started losing all my mental acuity. That’s what Pepsi and pizza will do to you! So when someone says something like "That needs a rocket scientist to figure it out," I think to myself, "Yes, and I’m the rocket scientist who will!" Of course, I make an exception for facetious uses of the phrase; those do not refer to me.
More evidence that my brain is turning to mush despite my being a cell-phone and Blackberry luddite: I meant to say ’till I manage to track down his much-quoted paper about 3000 people polled on relative’s birthdays and phone numbers.’
I’ve read this exchange with interest, as the decline in readership of novels has long been something I’ve thought about. My concern with information technology is not so much its sprawl as the constant interruptions and obstacles to concentration posed by incoming emails, web ads, and IM prompts. I’m glad that industry at last seems to be figuring this out: http://www.nytimes.com/2008/06/14/technology/14email.html
Working with people who expect constant monitoring of email makes it impossible to get anything done!