Saturday, July 16, 2011

Net-Gen Memory Sinks and Digital Cautionaries

Nick can certainly hold his own in idea-court, but I felt it would be a useful springboard for me to cross-examine Futurismic editor Paul Raven’s examination of Nicholas Carr’s sober rumination on a newly minted study suggesting that our milieu of omnipresent search access may be making us much better at finding information while detracting from our ability to actually remember said info.

Just to give you the thrusts of each post, here is the essential thesis from Carr:

“There's a fascinating - and, to me, disquieting - study on the internet's effects on memory that's just come out in Science.* It provides more evidence of how quickly and flexibly our minds adapt to the tools we use to think with, for better or for worse. […] Does our awareness of our ability to use Google to quickly find any fact or other bit of information influence the way our brains form memories? The answer, they discovered, is yes: "when people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it."”


And Raven’s snark:

“Ah, here we go again – another study that totes proves that the intermawubz be makin’ us dumb. Perfect timing for career curmudgeon Nick Carr …”


Early in Bruce Lee's tour de force "Enter The Dragon", he attempts to explain an essential concept to a troubled martial arts pupil: "It is like a finger pointing away to the moon." The student, befuddled, stares at Lee's outstretched finger. The master whaps the young padawan on the head, correcting him: "Don't concentrate on the finger or you will miss all that heavenly glory." He indicates the spectacular full moon, which the student had ignored, so engrossed was he with the extended pointing digit.

I think Raven is obsessing over Carr’s finger as it points away to the moon, all the while missing all that heavenly glory. He focuses, as the study suggests we do, on the "access point", the link-provider, with which he seems to disagree a priori, rather than on the actual *information* and arguments provided by the access point. Or, more specifically, Raven focuses on the many fingers of a projected Jerry Springer-esque shouting match (good for clickthrough counts, I guess) and misses the more subtle, deeper, and untribalized discussion of unfettered ubicomp’s effects on human memory. A discussion which requires a bit more neuronal RAM and drive-space to run the large flows of data processed through the webwork of logic/concept-chains. More memory than we’re used to needing for our one-hit tweet-zingers and three-chord napkin-length articles necessary to ensnare the hyperneurotic eyeballs and their preschool-length attention spans for some PPM digital nickels before they alt-tab to greener lulz and hurl birds at buildings. Haute post-moderne and charlatan-like as it is to proclaim the 'death' of a cultural phenomenon, it's difficult not to see the Atlantean sinking of book giant Borders as an ominous reaper's knell.

In response to Carr’s article, Raven asks:

“Do we actually store fewer and fewer memories, though? Or do we perhaps store the same amount as ever, while having an ever-growing external resource to draw upon […] Or, to use web-native vernacular: citation needed”


Perhaps this comment is the apotheosis of Carr’s concept of the shallow mind, pancaked to a great breadth of trivia and buzzwordage, yet sporting a retention window shorter than what used to pass as a long-form article and is now considered an unpublishable, long-winded novel on the ‘net scene. So short, in fact, that complex arguments built on layers of evidence and inference (including non-Wikipedia, non-churnalism citations), located in paragraphs a mere page-down keypress away, readily evaporate from our atrophying prefrontal lobes in the time it takes to read and copy-paste a selection into our blogpost editor. From earlier in Carr’s post:

“Does our awareness of our ability to use Google to quickly find any fact or other bit of information influence the way our brains form memories? The answer, they discovered, is yes: "when people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it."”


It’s not a reckless intellectual quantum leap to suggest that we’re remembering less of the important info, unless you’re one of the owners of the Master Nodes (Google, Wikipedia, Apple, et al.), whose marketing e-pamphlets trumpet the importance of memorizing their URL over storing actual relevant information in your head.

Raven continues:

“I submit that we form similar consolidations on a collective basis using the internet as a substrate; hyperlinks, aggregation blogs, tranches of bookmarks both personal and public.”


Again, I believe Raven is missing the crux here and putting the digital cart before the human horse. The consolidations, embellished by the Emerson hat-tip earlier in Carr’s article (long-filtered through the noodle-sieve, I know), clearly underline the point: the influx of sensory and linguistic stimuli we receive is not interpreted in our minds as a Netflix streaming projection on some internal real3D IMAX, watched by a homunculus (AKA the Cartesian Theater). Rather, we take input through the “webcams” of our senses, but these streams (or “feeds” to the ‘wubzkins) of raw RGB light and sound are then used by our minds to *create* our experiences, accessing the sum total of our memories, our whole human being, our strange-loopy “selves”: and what are we but our remembered experiences? We hear a high trill of frequencies up in the kilohertz range and say, “I hear a thrush singing”. But in actuality, all we have “heard” as input from our ear “microphones” is sound; through our consolidated memories of that sound, of experiences with these whimsical little creatures, we actively construct a complex picture of reality: that there is a ‘bird’, with a ‘spotted chest’, sitting in a ‘tree’ nearby, ‘singing’, and that the bird is a ‘thrush’. In this sense, the mind-as-computer metaphor is perhaps less accurate than the mind as a Cold War spy submarine, decked out with sensors and data-collection gear, attempting to construct a blueprint, based on the incoming numbers, of what is actually out there.

It stands to reason, then, that if our memories are increasingly neuroplasticizing to remember the “info faucet” sources* rather than actually storing the information in our wetware, then our “whole human beings”, our submarine intelligence, that library of dossiers collected on the world, may be shrinking and growing dimmer as well.

(*Or source: Google’s got that whole antitrust-shenanigans thing on the front-burner now… hate to say I told you so, but…)

It's either naive or a decimation of human potential (gadgetization, if you're Jaron Lanier) to believe it's enough to be able to access bits of memory from the 'net to answer the pressing question of the moment, then forget again like a vacuum-tube calculator with a three-item memory stack, the data vanishing into the ether after the equation is "solved". Assembly-line robots, phone-calcs and Google can already do that. Any significant thought -- be it a college thesis, an in-depth political analysis, a biting, pithy journalistic exposé, a novel, a new scientific theory -- requires vast stores of interlinked memories *inside the neuron constellations*. Why? Because our thought is not a Turing-machine algorithm of bit-by-bit search-and-grab, fill in the blank, then erase, but rather a holistic electromagnetic symphony of information integration, transformation, and permutation happening simultaneously. We must have all the local random-access memory "in mind" precisely because we don't know when our ruminating synaptic storm will pull on this stored memory of the concept of RAM, another brain-file labeled "symphonies", and several neuroscience white papers I read millennia ago in 'net-years, bouncing, mulling, and weaving these memories together, organizing them sequentially and finally producing a cohesive argument -- as I'm attempting to do here.

We don't sit down at the computer and, like some GM robotic factory line, pull up our "collective synapses" of hyperlinks, aggregates, and bookmarks, then screw, bolt, and mash up the parts together, post, and finally clear it all from our empty sieve-brains for the next unit of "thought production". Or perhaps that's just become the modus operandi of much of our infosphere and its aggregators: just steal Google's bribe-weighted top-10 list for the Hot Topic of the day, round up the last few good tweets, toss it into a blogpost, pepper with a few twenty-byte pseudo-pithy wisecracks, and send it off to the presses, er, Tumblr. We think with our memories, and if our memories shrivel down from muscular Blue Gene/Ls to scant 8-bit registers that hold only "bookmarks" of Google, Twitter, and the other aggregators, then it stands to reason our thought-programs will decrease in power, depth, and greatness to pixelly shits-n-giggles and barfights.

None of the major breakthroughs in science, our Man-on-The-Moon achievements, our greatest literary works, the world-changing paradigm shifts, have occurred through some direct problem-solution, search-and-answer, or “hyperlinks, aggregation, and tranches of bookmarks” read and promptly forgotten, left for some Watson 3.0 expert system to sort out. Rather, they came through grand syntheses, imaginative leaps across the entire sweep of human experience and human knowledge, recalled and recombined, the complex interweavings of lenses shifting until a clearer perspective emerged. Einstein’s relativity crown-jewel could not have been brought to fruition without the vividly remembered thought experiments of trains and light beams, along with countless other segments of his life’s journey, not to mention decades' worth of mind-melting math and physics stored in his “brainmeat”. Shakespeare could not have spliced his wondrous chimeras of metaphor without a vast and deep catalogue of raw linguistic DNA. Had Joyce offshored his encyclopedic and deeply webbed knowledge to Google, he would’ve been just another bourgeois Dubliner; without that instantly accessible and vast memory stew, that deeply interconnected “meatmap” of myriad territories, he could never have erected his labyrinthine edifices tracing an Ariadne thread through mazes of ancient Greek mythology, the Swiftian streets of Dublin, and countless other realms of human experience.

So what happens when we remember only where to find the infoteats, the all-knowing Google and its pantheon of aggregator givers of information, focusing our mental powers on the directing fingers till our heads are filled only with pointers pointing to other pointers, yet signifying nothing? Sure, for us, the digital immigrants, the impact is arguable. But what of a generation who grows up phoning in the answers to their math problems, cliff-noting their way with Google, to whom the “songs of thrushes” are but “sounds”? And what happens later, when these individuals need to synthesize large amounts of info simultaneously into coherent theses and arguments and dissertations and sophisticated political opinions, if they are forgetting every bittified tidbit soon after it is read? What will become of these kids, disconnecting from and forgetting the finer details of their train rides and other real-world experiences, heads buried in the unending firehose of Chattering Class babble, claiming, “It’s ok, I can always watch the YouTube of it later”? Again, it’s not the end of the world. Just food for thought, hopefully still digestible.

Unlike the straw golem of Nick which some have molded to slap around using Twittified snippets, conveniently (or involuntarily) letting his more balanced arguments slip through their porous noggins, the real Carr readily admits that the digital revolution(s), the internet, even search engines all provide us with great benefits, on a par with the invention of cities or the combustion engine. From the article:
“Human beings, of course, have always had external, or "transactive," information stores to supplement their biological memory. These stores can reside in the brains of other people we know (if your friend John is an expert on sports, then you know you can use John's knowledge of sports facts to supplement your own memory) or in storage or media technologies such as maps and books and microfilm.”
It is certainly arguable that “we form similar consolidations on a collective basis using the internet as a substrate; hyperlinks, aggregation blogs, tranches of bookmarks”, and neither I nor Carr would disagree, except perhaps as to the degree of isomorphism between a human mind and the internet. However, this is orthogonal at best, and even bolsters Carr’s point. Certainly there is something to be said for the intelligence of crowds and expert systems for specific species of questions, particularly those with single-output answers such as a stock price, the outcome of an election, or a trivia question (see: IBM’s Watson, Who Wants To Be A Millionaire lifelines). The quantity and ease of accessing info is unprecedented, leading to many positive outcomes such as mass awakenings and revolutionary Twitter grapevines used to help overturn a number of oppressive regimes.

However, unless you’re from the woo-woo recesses of the Singularitarian cult-cum-nerd religion, you’re going to be hard-pressed to prove that the “Global Brain” is transmuting into anything like a human mind just because we’ve got a lot of URLs, blogs and bookmarks floating around our glorified telephone network. Complexity of a system does not automatically equal intelligence, despite all the evolutionary tendencies we Homo Superstitiens have to anthropomorphize and attribute agency to everything from lightning and ocean gods to robots to the sum total of treehugged “Gaia” to teh internets. Transmission of “information” does not necessarily equal intelligence either. Human telegraph and phone networks transmitted a great deal of information, and simply increasing the amount and even the complexity of information transmission does not automatically make the system as a whole an intelligent gestalt with its own agency. Pure complexity is in fact selected against in evolution, as illustrated in digital evolution simulations such as Polyworld, where the most complex-brained organisms succumbed to the dire and ever-circling wolves of competitive pressure as they lugged their ponderously overgrown, energy-sink grey matter about. A brain’s thought processes can be very complex, such as a paranoid schizophrenic’s intractably intricate conspiracy theories of Illuminati-driven Area 51 cover-ups. We do not give these people Nobel Prizes and let them run nations, precisely because they are spewing complex yet *unintelligent* gibberish. If it exists, 90% of this supposed ’net brain is Russian botnet spam and pirated bootlegs of Transformers, and the next 9% of our new robot Overmind’s neuron firings consist of banal Facebook Likes, sub-literate flame wars, regurgitated zero-sum retweets aggregated by aggregations of aggregations vying for Master Nodehood, and shallow lolcatey mash-ups. If the internet substrate has an emergent conscious personhood, it is a raving, spam-addled schizophrenic.

And if we accept the premise that this massive internetwork of links and penis-enlargement spam and PageRanks and Tumblr feeds and auctioned-off Facebook social graphs really is consolidating all of this info for us, then we’re implicitly admitting that we indeed are outsourcing our mental faculties, our memory and all it enables, to “The Interwubz”. To our privately owned search-engine overlords under antitrust investigation for anticompetitive behavior, and to their technovangelist-overhyped “collective intelligences”, which are nothing more than very good indexing and expert systems. Systems which enable exponential communication/info explosions, but which are incapable of inventing a cure for cancer, kicking the climate problem for us, writing a masterpiece of a novel, or doing any of the other heavy mental lifting which requires a fully developed, whole human being with enough human-unique RAM, and the ability to turn off the infotainment bombardment, focus, and actually do some of that archaic “deep thinking” thing required for said truly significant accomplishments beyond the hurly-burly of sharing, tweeting, tagging, ranking, remixing and slapfight commentary.

Ironically, what the Digital Revolution guarantees us is not that deep, game-changing thought but instead really kickass Jeopardy-playing AIs and the ability to never again be stumped at a party-night game of Taboo, thanks to our “smart” phones. If the web really is doing our thinking for us, then, as they say in pre-web vernacular, that’s ‘game, set, and match’ in favor of the so-labeled “Rejectionistas”. Though “digital cautionary” may be a preferred neologism. Nobody’s setting off EMPs in downtown San Fran and escaping, Walden in hand, to some un-GPS-located South Pacific isle to live out some midlife-crisis Rousseau fantasy. It’s about conscious consumption, drinking responsibly. And, maybe, a little biting of the finger-pointing hands that tube-feed our brains a corporate-filtered version of their aggregated knowledge (and thus hold the control), something we might do well to keep in mind ourselves, as independent thinkers and rememberers.

And if the results of the study and Carr’s inferences prove true, I don’t believe it’s a trivial phenomenon we can just brush off by tagging it “handwringing” and handwaving it out of our mental filter bubble as Luddite fear-mongering. Not when the actual journalism, hard won by the few journalists left after the digital revolution’s newspaper Holocaust who still walk the lonely beat at state capitols and Big Biz, is increasingly soundbitten to meaningless shreds by bottom-dwelling clickwhore feeding frenzies, cynically and pointlessly aggregated by aggregates of aggregators, “optimized” out of our single-serving universes by Search Giant/Soc-Net filter bubbles, and finally erased by corporate reputation-scrubbers whose raison d’être is to intentionally help us forget, as if the weak digital media imprints of events need help vanishing beyond our one-hundred-forty-character memory horizon, eroded by the offshoring of our minds to our “lickable” gadgets.
