Friday, July 29, 2011

Dystopia Climate Control: Why A Sustainable American Revolution Will Take More Than Economic Pain

In response to Martin Ford's thoughtful article/challenge: Could Income Inequality Lead To Civil Unrest in the United States?

Civil unrest on a small scale may occur, but revolution is unlikely, and here is why: the puppetmasters, whose Right and Left hands are the government and the corporate world respectively, have long been well aware of the potential for revolution and shadow-regime change, and adjust the Dictator-ishness accordingly. The Big-Time plutocracies are too intelligent to let full dictatorship materialize; it is just a matter of how much pain they can cause the populace, how much daily bread they can steal and get away with. How much corporate tax they can offload into the Caymans, how long they can Kangaroo Court the Debt Ceiling debates, how much wealth they can strip via financial shell games and oligopolies. They've got whole armies of financial consultants, pollsters, and think tanks dedicated to taking the temperature of the public's anger and making sure it never reaches a boil. We're living in an economic climate-controlled room where they are constantly fine-tuning the fuckedness quotient; they've learned from old King Georgy Porgy and Haiti circa 1791, saw the writing on the Tunisian walls long before Mubarak painted them with his own people's blood, dark and anemic from malnutrition. How else do you think The Money People have kept the majority of Americans believing they're doing better than they actually are, anesthetized by the fading fumes of "The Greatest Nation on Earth" mythology for the past half century, despite a constant decline of the average American living standard down the economic spiral towards the Third World? We're rats in lab cages, watching shadows on our cave walls of flatscreens and touchscreens telling us how much "opportunity" is out there, keeping us disoriented by disinfotainment and addicted to junk-food media as we're fed a precisely measured dosage of well-being such that we never reach that Jeffersonian threshold of Despotism, never reach critical mass and rise up off our couches and Netflix streams of Jersey Shore to "revolt".

Maybe the Overlords have lost control of their capitalism machine; maybe the existing webwork will fail. And the HFT shorters and subprime barons with their blood money, like their predecessors, the IBMs and Coca-Colas who lifted the Third Reich's gold bars, encrusted with the skin of murdered Jews, from that burning viper-den and invested them in America, will take their spoils and plant the seeds of the next Goldman Sachs, or the next Stalin, to come a few decades down the line.

The human desire to compete for wealth, status, and prestige, and its second-order effects within complex socio-political configurations, are routinely glossed over by the Platonic hallowed Shangri-Las that permeate much of the Marxist ethos and the abundantists, who are its spiritual progeny. You can try to deny that desire to compete, like some communist states and flower-throwing communes and naïve Silicon Valley programmers (now under antitrust investigation), and believe that millions of years of psychological evolution are just going to go away, but we've all seen how the Soviet experiment turned out.

Somebody will always come out on top, whether that's a 500-frat-brother network of CEOs, a genocidal dictator, an authoritarian People's Republic, or a gated enclave run by robots and AI, defended by uber-Predator drones. Today's cute anti-authoritarian revolutionary in the bushes is tomorrow's tyrant on the throne. Today's "hope n change" presidential candidate is tomorrow's Wall Street-owned puppet. Today's grass-roots take-back-the-government tea party is tomorrow's "guns gods n gays" Republican freakshow. Look at Iran overthrowing Teh Dictators in '79, jump-cut to them now repeating history with underground Twitter railroads, riding Ouroboros. Revolution may come, but it is not enough to let the Tzar's head roll under your Bolshevik revolution if you wake up the next morning, rub the party hangover out of your eyes, and discover a Stalin at the podium.

It takes, at minimum, an educated, active populace engaged in attempting to create the best of all possible worlds. Closely examine Sweden, Canada, any of the countries with consistently higher well-being and happiness levels than the US. The pattern is not "abundance" or per-capita output but rather the cultural zeitgeist of the general population as a whole: collectively policing their governments and corporations, making sure the next malignant tumor doesn't come into power, actively demanding that their social safety nets remain in place. A populace that is not checked out of reality into "reality TV" and cyberfantasy worlds, distracted by the nerd-neuroses of Angry Birds and maintaining virtual personas, constantly awaiting the next pellet of online "social" interaction like some wirehead lab rat, and spending the rest of its time watching ten hours of media a day. Marxists/abundantists who prefer to leave these fundamentally non-linear x-factors out of the perfect equations for their model worlds are making their own Procrustean beds (and the similarly deadly beds of innocent and/or ignorant bystanders). Not unlike the hordes of economists flaunting Gaussian copulas and the "Great Moderation," who tried to fit the world to their theories and not vice versa. And we've all seen starkly and painfully how that 2008 economic Singularity worked out.

At any rate, dictatorship is a crude, medieval, and ultimately ineffective technique for maintaining control over a large population. A method for South American banana republics and strategically placed Arab resource-cows, but not the hometown. You don't shit where you eat, or at least you don't shit so much that the flies start attacking you. Trust that if there are Masters of the Universe -- as there likely always will be, lest we genetically hack our brains to quit desiring status, prestige, and wealth (unlikely) -- they will choose Brave New World over 1984. Much more efficient.

Sunday, July 24, 2011

Technium: God/Smack

Reflections in response to the Kevin Kelly vs. Nick Carr discussion on whether technology is intrinsically moral, even Godly, with its own "will," and whether it is our sacred duty to "be fruitful and multiply" as much technology as we can.

It's a bit worrying to see technology elevated to a religious status, an uneasy feeling similar to the one I get when I read Singularitarian Time Magazine spreads.

It's difficult to argue that technology, viewed at geological scale, has not increased human options and the possibility for utility. But "happiness" and utility are sticky Gordian issues. As Warren Buffett, one of the greatest beneficiaries of modern technology, has said, "Time is the most valuable asset." If this is true, perhaps we apotheoses of tech, with our realtime-media-absorbed neuroses, overloaded schedules, and eternal scarcity of time, are really far poorer, ultimately, than the Bushmen with their two-to-four-hour mongongo-nut-gathering day and all the time in the world to relax and spend with family. Perhaps the "technium" is closer to a nerd-equivalent of cocaine. It looks so great on the missionary's TV, so you leave the rainforest to score some. At first you get a euphoric rush and feel like anything is possible. But over time, your high-paying, high-tech digital addiction becomes a cage of Angry Birds, virtual personas you must maintain, social ladder-climbing, and general keeping up with the Hamptons. Soon you discover your most precious asset -- time -- sucked into the black hole that the "technium" drug has left in you, feeding it just to stay afloat.

While many rural dwellers flock to cities, it is equally true that many of the highest-powered cityslickers seek out more tranquil, more sane, lower-tech lives, moving out into the country and trying to live off the land, i.e., the organic, local, and resilient-community movements. It's also debatable whether the millions of Chinese who leave their rice paddies to become slave-labor iPad makers, working 16-hour days for a dollar while inhaling asbestos, are improving their lives.

But ultimately, I think my fear is of projecting a "morality" onto technology such that it is good in and of itself. A hammer is a tool which can be used for good or evil: you can build a house or you can smash in a skull. Atomic energy can be used to annihilate all life on Earth many times over, or it can be used to keep your lights on. Were the hundreds of thousands of people dying horribly in a nuclear blast, and the many more suffering cancers and other effects of fallout, an expression of God's love? Computers can be used to help us communicate more easily, or they can be used for high-frequency trading, creating a malignant financial system disconnected from reality that ultimately helped drive a near-global economic meltdown. Games can help surgeons improve success rates, or they can become a distracting time-sink of techcrack. In short: technology is not "love" or "godly," but rather a set of inert instruments which open up options that can have both positive and negative effects. Just as "guns don't kill people," so technology is neutral: it is people who give it a moral valence through their use of it.

Saturday, July 16, 2011

Net-Gen Memory Sinks and Digital Cautionaries

Nick can certainly hold his own in idea-court, but I felt it would be a useful springboard for me to cross-examine Futurismic editor Paul Raven's take on Nicholas Carr's sober rumination on a newly minted study which reveals that our milieu of omnipresent search access may be making us much better at finding information while detracting from our ability to actually remember said info.

Just to give you the thrust of each post, here is the essential thesis from Carr:

“There's a fascinating - and, to me, disquieting - study on the internet's effects on memory that's just come out in Science.* It provides more evidence of how quickly and flexibly our minds adapt to the tools we use to think with, for better or for worse. […] Does our awareness of our ability to use Google to quickly find any fact or other bit of information influence the way our brains form memories? The answer, they discovered, is yes: "when people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it."”


And Raven’s snark:

“Ah, here we go again – another study that totes proves that the intermawubz be makin’ us dumb. Perfect timing for career curmudgeon Nick Carr …”


Early in Bruce Lee's tour de force "Enter The Dragon", he attempts to explain an essential concept to a troubled martial arts pupil, "It is like a finger pointing away to the moon." The student, befuddled, stares at Lee's outstretched finger. The master whaps the young padawan on the head, correcting, "Don't concentrate on the finger or you will miss all that heavenly glory." He indicates the spectacular full moon, which the student had ignored, so engrossed with the extended pointing digit.

I think Raven is obsessing over Carr's finger as it points away to the moon, all the while missing all that heavenly glory. He focuses, as the study suggests, on the "access point," the link-provider, which he seems to disagree with a priori, rather than on the actual *information* and arguments provided by the access point. Or, more specifically, Raven focuses on the many fingers of a projected Jerry Springer-esque brawl (good for clickthrough counts, I guess) and misses the more subtle, deeper, and untribalized discussion of unfettered ubicomp's effects on human memory. A discussion which requires a bit more neuronal RAM and drive-space to runtime the large flows of data processed through the webwork of logic/concept-chains. More memory than we're used to needing for our one-hit tweet-zingers and three-chord napkin-length articles, built to ensnare hyperneurotic eyeballs and their preschool-length attention spans for some PPM digital nickels before they alt-tab to greener lulz and hurl birds at buildings. Haute post-moderne and charlatan-like as it is to proclaim the 'death' of a cultural phenomenon, it's difficult not to see the Atlantean sinking of book giant Borders as an ominous reaper's knell. In response to Carr's article, Raven asks:

“Do we actually store fewer and fewer memories, though? Or do we perhaps store the same amount as ever, while having an ever-growing external resource to draw upon […] Or, to use web-native vernacular: citation needed”


Perhaps this comment is an apotheosis of Carr's concept of the Shallow mind: pancaked to a great breadth of trivia and buzzwordage, yet sporting a retention window shorter than what used to pass for a long-form article and is now considered an unpublishable, long-winded novel on the 'net scene. So short, in fact, that complex arguments built on layers of evidence and inference (including non-Wikipedia, non-churnalism citations), located in paragraphs a mere page-down keypress away, readily evaporate from our atrophying prefrontal lobes in the time it takes to read and copy-paste a selection into our blogpost editor. From earlier in Carr's post:

“Does our awareness of our ability to use Google to quickly find any fact or other bit of information influence the way our brains form memories? The answer, they discovered, is yes: "when people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it."”


It's not a reckless intellectual quantum leap to suggest that we're remembering less important info, unless you're one of the owners of the Master Nodes, i.e., Google, Wikipedia, Apple, whose marketing e-pamphlets trumpet the importance of memorizing their URL over storing actual relevant information in your head.

Raven continues:

“I submit that we form similar consolidations on a collective basis using the internet as a substrate; hyperlinks, aggregation blogs, tranches of bookmarks both personal and public.”


Again, I believe Raven is missing the crux here and putting the digital cart before the human horse. The consolidations, embellished by the Emerson hat-tip earlier in Carr's article (long-filtered through the noodle-sieve, I know), clearly underline the point: the influx of sensorineural and linguistic stimuli we receive is not interpreted in our minds as Netflix streaming projections on some internal real3D IMAX watched by a homunculus (AKA the Cartesian Theater). Rather, we take input through the "webcams" of our senses, but these streams (or "feeds" to the 'wubzkins) of raw RGB light and sound are then used by our minds to *create* our experiences, accessing the sum total of our memories, our whole human being, our strange-loopy "selves" -- and what are we but our remembered experiences? We hear a high trill of frequencies in the 800-2,000 hertz range and say, "I hear a thrush singing." But in actuality, all we have "heard" as input from our ear "microphones" is sound; through our consolidated memories of the sound, of experiences with these whimsical little birds, we are actively constructing a complex picture of reality: that there is a 'bird', with a 'spotted chest', sitting in a 'tree' nearby, 'singing', and that the bird is a 'thrush'. In this sense, the mind-as-computer metaphor is perhaps less accurate than the mind as a Cold War spy submarine, decked out with sensors and information-collection instruments, attempting to construct a blueprint, based on the incoming numbers, of what is actually out there.

It stands to reason, then, that if our memories are increasingly neuroplasticizing to remember the "info faucet" sources* rather than actually storing the information in our wetware, then our "whole human beings," our submarine intelligence, that library of dossiers collected on the world, may be shrinking, growing dimmer as well.
(*Or source: Google's got that whole antitrust shenanigans on the front burner now… hate to say I told you so, but…)

It's either naive or a decimation of human potential (gadgetization, if you're Jaron Lanier) to believe it's enough to be able to access bits of memory from the 'net to answer the pressing question of the moment, then forget again like a vacuum-tube calculator with a three-item max memory stack, the data vanishing into the ether after the equation is "solved." Assembly-line robots, phone-calcs, and Google can already do that. Any significant thought -- be it a college thesis, an in-depth political analysis, a biting pithy journalistic exposé, a novel, a new scientific theory -- requires vast stores of interlinked memories *inside the neuron constellations*. Why? Because our thought is not a Turing-machine algorithm of bit-by-bit search n grab, fill in the blank, then erase, but rather a holistic electromagnetic symphony of information integration, transformation, and permutation happening simultaneously. We must have all the local random-access memory "in mind" precisely because we don't know when our ruminating synaptic storm will pull on this stored memory of the concept of RAM, another brain-file labeled "symphonies," and several neuroscience white papers I read millennia ago in 'net-years, bouncing, mulling, weaving these memories together, organizing them sequentially, and finally producing a cohesive argument -- as I'm attempting to. We don't sit down at the computer and, like some GM robotic factory line, pull up our "collective synapses" of hyperlinks, aggregates, and bookmarks, then screw, bolt, and mashup the parts together, post, then finally clear it all from our empty sieve-brains for the next unit of "thought production." Or perhaps that's just become the modus operandi of much of our infosphere and its aggregators: just steal Google's bribe-weighted top-10 list for the Hot Topic of the day and round up the last few good tweets, toss it into a blogpost, pepper with a few twenty-byte pseudopithy wisecracks, and send it to the presses, er, Tumblr. We think with our memories, and if our memories shrivel down from muscular Blue Gene/Ls to scant 8-bit registers that hold only "bookmarks" of Google, Twitter, and the other aggregators, then it stands to reason our thought-programs will decrease in power, depth, and greatness to pixelly shits-n-giggles and barfights.
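To torture the pointer metaphor in actual code: a throwaway Python sketch (every structure and value below is invented for illustration; this is a metaphor, not a model of cognition) of the difference between a mind that stores ideas and one that stores only bookmarks.

    # Throwaway illustration: two "minds" asked to synthesize a thought.
    # One holds ideas locally; the other holds only pointers to them.

    local_mind = {
        "RAM": "fast memory held close, ready to recombine at any moment",
        "symphony": "many independent voices integrated into one whole",
    }

    bookmark_mind = {
        "RAM": "https://example.com/wiki/RAM",
        "symphony": "https://example.com/wiki/Symphony",
    }

    def synthesize(mind):
        # Crude stand-in for thought: weave the stored contents into a new whole.
        # If all that is stored is URLs, all that can be woven together is URLs.
        return "Thought: " + " ...crossed with... ".join(mind.values())

    print(synthesize(local_mind))     # recombines actual ideas
    print(synthesize(bookmark_mind))  # pointers pointing to pointers, signifying nothing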

The major breakthroughs in science, our Man-on-the-Moon achievements, our greatest literary works, the world-changing paradigm shifts: none occurred through some direct problem-solution, search-and-answer process, or "hyperlinks, aggregation, and tranches of bookmarks" read and promptly forgotten, left for some Watson 3.0 expert system to sort out. Rather, they came through grand syntheses, imaginative leaps across the entire sweep of human experience and human knowledge, recalled and recombined, the complex interweavings of lenses shifting until a clearer perspective emerged. Einstein's relativity crown-jewel could not have been brought to fruition without the vividly remembered experiences of his train rides, along with countless other segments of his life's journey, not to mention decades' worth of mind-melting math and physics stored in his "brainmeat." Shakespeare could not have spliced his wondrous chimeras of metaphor without a vast and deep catalogue of raw linguistic DNA. Had Joyce offshored his encyclopedic and deeply webbed knowledge to Google, he would've been just another bourgeois middle-class Dubliner; he could never have erected his labyrinthine edifices, tracing an Ariadne thread through mazes of ancient Greek mythology, the Swiftian streets of turn-of-the-century Dublin, and countless other realms of human experience, without that instantly accessible and vast memory stew, that deeply interconnected "meatmap" of the myriad territories. So what happens when we remember only where to find the infoteats, the all-knowing Google and pantheon of aggregator givers-of-information, focusing our mental powers on the directing fingers till our heads are filled only with pointers pointing to other pointers, yet signifying nothing? Sure, for us, the digital immigrants, the impact is arguable. But what of a generation that grows up phoning in the answers to their math problems, cliff-noting their way with Google, to whom the "songs of thrushes" are but "sounds"? And what happens later, when these individuals need to synthesize large amounts of info simultaneously into coherent theses and arguments and dissertations and sophisticated political opinions, if they are forgetting every bittified tidbit soon after it is read? What will become of these kids, disconnecting from and forgetting the finer details of their train rides and other real-world experiences, heads buried in the unending firehose of Chattering Class babble, claiming, "It's ok, I can always watch the YouTube of it later"? Again, it's not the end of the world. Just food for thought, hopefully still digestible.

Unlike the straw golem of Nick which some have molded to slap around using Twittified snippets, conveniently (or involuntarily) letting his more balanced arguments slip through their porous noggins, Carr readily admits that the digital revolution(s), the internet, even search engines all provide us with great benefits, on par with the invention of cities or the combustion engine. From the article:

“Human beings, of course, have always had external, or "transactive," information stores to supplement their biological memory. These stores can reside in the brains of other people we know (if your friend John is an expert on sports, then you know you can use John's knowledge of sports facts to supplement your own memory) or in storage or media technologies such as maps and books and microfilm.”

It is certainly arguable that “we form similar consolidations on a collective basis using the internet as a substrate; hyperlinks, aggregation blogs, tranches of bookmarks,” and neither I nor Carr would disagree, except perhaps on the degree of isomorphism between a human mind and the internet. But this is orthogonal at best, and even bolsters Carr's point. Certainly there is something to be said for the intelligence of crowds and expert systems for specific species of questions, particularly those with single-output answers such as a stock price, the outcome of an election, or a trivia question (see: IBM's Watson, Who Wants To Be A Millionaire lifelines). The quantity and ease of accessing info is unprecedented, leading to many positive outcomes, such as mass awakenings and the revolutionary Twitter grapevines used to overturn a number of oppressive regimes.

However, unless you're from the woo-woo recesses of the Singularitarian cult-cum-nerd religion, you're going to be hard-pressed to prove that the "Global Brain" is transmuting into anything like a human mind just because we've got a lot of URLs, blogs, and bookmarks floating around our glorified telephone network. Complexity of a system does not automatically equal intelligence, despite all the evolutionary tendencies we Homo Superstitiens have to anthropomorphize and attribute agency to everything from lightning and ocean gods to robots to the sum total of treehugged "Gaia" to teh internets. Transmission of "information" does not necessarily equal intelligence either: human telegraph and phone networks transmitted great quantities of information, and simply increasing the amount, and even the complexity, of information transmission does not automatically make the system as a whole an intelligent gestalt with its own agency. Pure complexity is in fact selected against in evolution, as illustrated in digital evolution simulations such as Polyworld, where the most complex-brained organisms succumbed to the dire and ever-circling wolves of competitive pressure as they lugged their ponderously overgrown, energy-sink grey matter about. A brain's thought processes can be very complex, such as a paranoid schizophrenic's intractably intricate conspiracy theories of Illuminati-driven Area 51 cover-ups. We do not give these people Nobel Prizes and let them run nations, precisely because they are spewing complex yet *unintelligent* gibberish. If it exists, 90% of this supposed 'net brain is Russian botnet spam and pirated bootlegs of Transformers, and the next 9% of our new robot Overmind's neuron firings consist of banal Facebook Likes, sub-literate flame wars, regurgitated zero-sum retweets aggregated by aggregations of aggregations vying for Master Nodehood, and shallow lolcatey mash-ups. If the internet substrate has an emergent conscious personhood, it is a psychotic schizophrenic imbecile.
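The selection logic is easy to demo, for the curious. Below is a toy Python sketch -- emphatically not Polyworld's actual model, and every parameter is a number I made up -- in which bigger "brains" earn diminishing foraging returns but pay a linear metabolic tax. Selection settles at the cost/benefit sweet spot, not at maximum complexity.

    import random

    POP, GENERATIONS, ENERGY_TAX = 200, 100, 0.15

    def fitness(brain_size):
        # Diminishing cognitive returns minus a linear metabolic cost.
        return brain_size ** 0.5 - ENERGY_TAX * brain_size

    population = [random.uniform(1.0, 50.0) for _ in range(POP)]

    for _ in range(GENERATIONS):
        # Truncation selection: the fitter half reproduces, with mutation.
        parents = sorted(population, key=fitness, reverse=True)[:POP // 2]
        population = [max(0.5, p + random.gauss(0.0, 1.0))
                      for p in parents for _ in range(2)]

    # With these made-up numbers the optimal brain size is about 11;
    # the population converges near it, and the overgrown get culled.
    print(f"mean brain size after selection: {sum(population) / POP:.1f}")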

And if we accept the premise that this massive internetwork of links and penis-enlargement spam and pageranks and Tumblr feeds and auctioned-off Facebook social graphs really is consolidating all of this info for us, then we're implicitly admitting that we indeed are outsourcing our mental faculties, our memory and all it enables, to "The Interwubz." To our privately owned search-engine overlords, under antitrust investigation for anticompetitive behavior, and to their technovangelist-overhyped "collective intelligences," which are nothing more than very good indexing and expert systems. Systems which enable exponential communication/info explosions, but which are incapable of inventing a cure for cancer, kicking the climate problem for us, writing a masterpiece of a novel, or doing any of the other heavy mental lifting that requires a fully developed, whole human being with enough human-unique RAM, and the ability to turn off the infotainment bombardment, focus, and actually do some of that archaic "deep thinking" required for said truly significant accomplishments, beyond the hubbub of sharing, tweeting, tagging, ranking, remixing, and slapfight commentary.
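And "very good indexing" is not a hand-wave; strip away the ranking secret sauce and the core trick fits in a dozen lines. A minimal Python sketch, with toy documents invented for illustration:

    # A toy inverted index: the skeleton of what a search engine "knows."
    # Flawless retrieval of where words live; zero comprehension of them.
    from collections import defaultdict

    docs = {
        1: "a thrush singing in the tree",
        2: "singing lessons online cheap",
    }

    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.split():
            index[word].add(doc_id)

    print(sorted(index["singing"]))  # [1, 2] -- it can point, but it cannot think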

Ironically, what the Digital Revolution guarantees us is not that deep, game-changing thought, but instead really kickass Jeopardy-playing AIs and the ability to never again be stumped at a party-night game of Taboo, thanks to our "smart" phones. If the web is really thinking, then, as they say in pre-web vernacular, that's game-set-and-match in favor of the so-labeled "Rejectionistas." Though "digital cautionary" may be a preferred neologism. Nobody's setting off EMPs in downtown San Fran and escaping, Walden in hand, to some un-GPS-locatable South Pacific isle, there to live out some mid-life-crisis Rousseau fantasy. It's about conscious consumption, drinking responsibly. And, maybe, a little biting of the finger-pointing hands that tube-feed our brains their corporate-filtered version of aggregated knowledge -- and thus the control -- something we might do well to keep in mind ourselves, as independent thinkers and rememberers.

And if the results of the study and Carr's inferences prove true, I don't believe it's a trivial phenomenon we can just brush off by tagging it "handwringing" and handwaving it out of our mental filter bubble as Ludditic fear-mongering. Not when the actual journalism, hard won by the few journalists left walking the lonely beat at state capitols and Big Biz after the digital-revolution-produced newspaper holocaust, is increasingly soundbytten to meaningless shreds by bottom-dwelling clickwhore feeding frenzies, cynically and pointlessly aggregated by aggregates of aggregators, "optimized" out of our single-serving universes by Search Giant/Soc-Net filter bubbles, and finally erased by corporate reputation-scrubbers whose raison d'être is to intentionally help us forget -- as if the weak digital media imprints of events needed help vanishing beyond our one-hundred-forty-character memory horizon, eroded by an offshoring of our minds to our "lickable" gadgets.

Terminus Machina: Kennedy High

Excerpt: "Kennedy High"

The school was a sagging public husk, the color of dead trees, and now served as the scratch paper for urban-tribal graffiti tagging battles and squatting cubbyholes for the Deadweight refugees of the endless economic recess. Frayed steel reinforcements wormed out like bone fragments from fractures in the concrete. The layout itself was indistinguishable from a prison: high barbed walls and claustrophobic passages, the CCTV panopticon of a since-failed mini police-state that had since handed the torch of civ-lib infringement to the far more menacing off-balance-sheet tentacles of Gnossis and the Olympian cabal of information-financial empires who'd hollowed their nation-state Titan-parents into a North American Weimar Republic. The school was an ironic apotheosis of '80s apocalyptic riot paranoia and the resultant fear for children's safety, infamously designed, like other Cali schools, by the San Quentin architect. Bulletproof polycarbonate windows the size of police-car dividers. Several glass panes had been chiseled out of their cavities, leaving random empty sockets of jagged concrete, like the methed-out mouth of a stage-four wyre addict; the impervious quartz was probably serving now as anti-riot-police riot shields in guerilla firefights against Cybersec's drone paramilitary. The school's star-spangled banner was somehow yet waving despite the surrounding entropy, flying bright amongst the red glare of car fires and distant artillery skirmishes burning the night an eternal crimson twilight, some grand sardonic District Ten joke biled up by its collective subconscious. It was nice to see that Americans could still pull together to accomplish great things.

The bombed-out factor of Kennedy High was extreme, even amid the municipal dereliction all around, and the rubble lay with a certain archaic mothy stillness that suggested it had been hit long before the Intellectual Property Wars or the Plebland Riots. Caused, perchance, by some unforeseeable Gaiagenic Black Swan disaster turned armchair-humanitarian sink, or, more likely, a stray Killerhunter missile whose iffy slave-labor North Korean QA had allowed a buggy AI to swing AWOL. Or, perhaps, active human malfeasance, the only commodity in abundance nowadays. Whatever the culprit, the truth was now redacted from the universe's cache by the sandblasts of time. The digital media footprint of the event had equally vanished beyond the one-hundred-forty-character memory horizon, eroded by a half-century's outsourcing of mental faculties to ubicomp. History washed away like unnoticed dead bodies swept down the Hudson amongst shoals of shredded bank paper, to be swallowed by the vast sea of last-moment's Shiny #trend, the bitrotting remains of actual journalism soundbytten to meaningless shreds by bottom-dwelling clickwhore feeding frenzies and corporate reputation-scrubbers.


KrashKourse

Dad was a company man. Put in the hours to keep our sterile McBoat afloat. Grafted his brain onto his Flexbook 'Maginepad. Came up with new names for the same flavor of coffee. Played checkers with spreadsheets, trying to eat the red tokens with the black. Trim the fat. "Stanford and Dillingham are the bluest of blue chip. A collapse there has virtually negative probability; it's a sigma-seven-plus event. It is more likely that the sun won't come up tomorrow than that Stanford and Dillingham will go under." I looked outside at the sun buried under metric fuckloads of pollution and the black fog of class war, devouring the sky like nuclear winter. Thought, "Gee, maybe the house gave you the wrong odds on that one." Maybe that was the point. Perhaps for a moment I understood what it meant to "sell short": to gamble that an enterprise will fail. The Real Money People had their chips stacked on civilization rolling snake eyes.

You knew it was bad when the power realty agents, in their pink Miu Miu with their emu-egg frittata open-house hors d'oeuvres, were pulling up like lost Eurasian-Agg tourists in District 10, polishing the turd-colored Section 8 housing with pointless Premium Web holotours, charging more TorrentCoins for a three-mouseclick credit check and "processing" fee than it would cost to buy all four stories of rust-bleeding cinderblock.

He tried to explain "net present value" to me once. The memory itself is a smear of numerati-speakage: labyrinthine CDO cash flows deluging mean-variance voodoo wrapped around paradoxes like the "time value of money," and, somehow extricated from the intractable Cthulhu-knottage of number theoretics, a single dollar output, like some systems-analyst's Houdini escape act. It was at that moment I came, fully and irrevocably, to the conclusion that the art of finance was fundamentally the art of escape, an axiom corroborated by the fact that the previous two false-alarm Global Meltdown/Looting double features had gone off without a single senior-tranche indictment.

“See, Krash, this is how much we'll be worth in 30 years, adjusted for 2050 dollars.” So many trailing zeros, like strung pearls, typefaced in sixty-point custom-minted "Legal Tender" font, as if dressing the projection in a dark suit and red tie might infuse it with truth. The portfolio graphed in interlocking Gaussian-copula curves of gold and azure, accelerating exuberantly beyond the bounds of the finite spreadsheet's gridded money-time continuum, like the exponential stairways to Singularity heaven trumpeted at some Silicon Valley technovangelist summit before Tech Collapse 2.0 and the San Fran Revolts. Poetically, by 2050 there would exist no dollars with which to measure one's "net present value," and all those dead-president-and-cryptohologram-etched cotton rectangles would be cold ash residue left by a trashcan fire, or woven into post-post-modernist kilts, or preserved in vacuum-sealed zero-humidity Pluto museum vaults like original papyrus gospels archived in the bowels of the Vatican. Money to be preserved, rightfully, as the most holy documents of 20th Century America's true religion.

Dad dealt in, traded, futures. Like the financial barons, the bankers, who were also in the business of trading futures. They traded the futures of the next generation away for a few more billions, a few more zeros on the Excel scoreboard, a few more pearls to string on the plastic-stuffed empty manikin of a trophy wife, a few more summerings in Cayman Islands tax-haven resorts. That was his problem. He was in the business of futures in a world that was rapidly becoming very post-future. Generation Hex was the sledgehammer pendulum swing about to come back around on our so-called "parents," who had left us to wallow in the ash of a squandered Golden Age. The life-draining Great American Lie of The Future was overdue for a hard-takeoff iconoclasm by the post-future bastard children.

Later, mom and I discovered he wasn't worth those 2.4 million 2050 dollars. Dad was very mark-to-model. He ran off to mark Icelandic models in some Plutocrat Enclave playland he'd managed to cocksuck his way into, discarded mom and me like last season's smartphone, and left us to pick up the pieces of his mid-life-crisis "family phase." I was fourteen.

He never called me Krash; I made that part up. I was just another statistic in an itemized list of yearly deductibles, a "1.2 kids" checkmark in some misplaced aspiration toward American-gothic middle-class normalcy that had curdled into mocked TV cliché decades ago. But in Generation Hex, I wasn't just the wrong bubble in some multiple-choice exam. In Generation Hex, I was KrashKourse, one of the elite hackers resisting the Puppetmasters, the future-selling bastard fathers of the world. In Generation Hex, I mattered.

Saturday, July 2, 2011

Published!

The Ghosts of Cloud City, a post-apocalyptic short-storified episode of my earlier fiction, is up over at the scintillating bunker of wonders that is Mythaxis. And a special shout-out to Gil, webmeister and editor thereof, a man who truly does his darndest out of curatorial love. For while we highfalutin 'authors' lounge about in Starbucks spewing daydreamt hallucinations, those in the editorial cutting room toil unceasingly to polish the material into gleaming publishability!

I also particularly liked the header art he created for my story. Spot-on interpretation.

Be sure to check out some of the other stories as well; I am truly honored to be situated amongst such talent.