Thursday, June 23, 2016

This is what ideology looks like

Lev Manovich had some interesting takeaway points from his recent visit to Facebook Korea that highlight familiar tendencies in contemporary media studies. The scare quotes serve as signposts for where he's headed in his post, as they designate the terms he deems obsolete in the Facebook era: "ideology," "control," "dominant logic," and, of course, "global capitalism" (as in: "There is no 'master plan' or 'global capitalism'"). This is a short step away from the familiar Thatcherite observation about "society." All there are, in the end, are particularities combining in assemblages whose activities are spontaneous, emergent, and unpredictable -- irreducible to the crude terminology of critical theory and free of any discernible structuring logics. Ideology is dead: long live the new (?) ideology of new materialist pluralism.
I suppose there are two ways to take these claims: the more reasonable (that ideology is complex and multi-faceted, but it still exists; that abstractions always leave something out, but retain a certain utility) or the wholesale ingestion of the Kool-Aid (once upon a time people may have been duped and propaganda existed, and capitalism was a thing, but now everything is so complex and particularized that abstractions themselves no longer have any use at all, everything is up in the air and free -- and because of that wonderfully liberating). There is certainly plenty to be said in support of the first interpretation, but the second one seems to fit better with the conclusion of Manovich's post:

"The future is open and not determined. We are all hacking it together. There is no "master plan," or "global capitalism," or "algorithms that control us" out there. There are only hundreds of millions of people in "developing world" who now have more chances thanks to social media and the web. And there are millions of creative people worldwide adapting platforms to their needs, and using them in hundreds of different ways. To connect, exchange, find support, do things together, to fall in love and to support friends. Facebook and other social media made their lifes more rich, more meaningful, more multi-dimensional. Thank you, Facebook!" 

Wow -- this is a veritable paean to Facebook. Clearly there are interesting things taking place on Facebook, and there are plenty of constructive uses for it, but it seems a bit extreme to portray it as the savior of love, support, and the meaning of life. Not long ago, it seemed to me that the moment for emphasizing a critique of the flip side of the benefits and conveniences of the online commercial world had passed, because unquestioning cyber-utopianism was on the wane -- but apparently it is alive and well.

To paraphrase Adorno:
Just as the ruled have always taken the morality dispensed to them by the rulers more seriously than the rulers themselves, the defrauded new media enthusiasts today cling to the myth of success still more ardently than the successful. They, too, have their aspirations. They insist unwaveringly on the ideology by which they are enslaved. Their pernicious love for the harm done to them outstrips even the cunning of the authorities.

Tuesday, August 18, 2015

Why do people like to pretend that Trump has a chance?

For the record: Donald Trump will not be the next President of the U.S. He won't even be the Republican nominee. Not this time.

Nevertheless, plenty of reasonably respectable pundits, commentators, and observers continue to pretend that he's a credible contender. This is perhaps in part because he lives up to the caricature of the United States embraced both at home and abroad by those who really, seriously, worry about just how crazy the country is becoming. Europeans and other assorted overseas commentators love the idea of Trump -- not because they like him -- but because he is a distillation of what America has come to stand for: a somewhat absurd combination of ignorance and self-confidence, backed up by tremendous wealth, sheer bluster, and arrogant disregard for the concerns of others. He fulfills their understanding of just how disturbingly awry things have gone in the United States. This is why they believe he has a chance -- because he seems to cut through the clutter of pretension and spin to reveal the true face of what America is, what it has become, what it represents as a character on the world stage. Fair enough, that might be a good enough reason to take his campaign seriously -- after all, he is cutting through the code of politics to say what so many Republican candidates really mean. Racist right-wing populism is not a novelty in today's GOP -- it's the coin of the realm, and Trump does not have a monopoly on it. He has succeeded in portraying himself as the craziest contender in a roster of extremists -- and this should provide the likes of Ted Cruz and Scott Walker with some comfort. By contrast with Trump's bluster, the rest of the disconcertingly extreme GOP roster appears almost moderate.

There are some compelling symbolic reasons for thinking Trump might have a chance. There are also some understandable practical ones, at least from the perspective of the commercial media. Trump makes good copy in the era of 24-hour total tabloid coverage. He doesn't require much in the way of expertise to analyze, because he doesn't have any actual policies. He represents the Holy Grail of cable news and the blogface-twittersphere: the affective charge of politics without the actual work of politics. The trappings of the issues are there and he's lusciously quotable and tweetable: he makes sensationally controversial off-the-cuff remarks and then doubles down on them. Perhaps most compellingly -- and tellingly -- he doesn't give a damn. Unlike political hustlers from across the spectrum, he does not come across as desperate for the job -- on the contrary, he acts like he's doing us all a favor by bringing his brand of deal-making to the low-paid (for him) office of the Presidency. This attitude plays out as an effectively refreshing rejoinder to the coded dance of political campaigning. He promises the reality TV novelty of a "politician" saying what he actually thinks: an example of what happens, to paraphrase MTV, when the candidates stop being polite and start "getting real."

Trump's thoughts aren't particularly interesting or novel -- which is perhaps why they feel so familiar to those who like to say that he's just like them -- but he is talented at staging the spectacle of correctness-busting candor. The public response is not to his political platform (he doesn't have one), nor even to the originality of his insights (come on!), but to the role he is playing of the politician unfettered. Popular American lore reveres the nation's businessmen above its politicians and it embraces the image of the maverick, liberated from convention by dint of breathtaking wealth. In terms of cable TV entertainment, Trump is a spectacle who will continue to attract high ratings, saturation media coverage, and popular attention.

But he will not "go the distance" -- in part because he's not really interested. Undoubtedly he'd love to be President, if someone would just hand the thing to him on a platter. But he's not willing to do the work of really figuring out how to run the country (Dubya wasn't either, but he was content to play the role of figurehead). His campaign is, fundamentally, lazy -- in part, we get the impression, because Trump is sure he has better things to do with his time than to do the work of an actual politician. From a business perspective, politics is, even for the winners, a loser's game. This is what frees Trump up: unlike the pundits, he knows he's not going to win the nomination. Not a chance. But he's been given a free media card, and he's going to play it until it's time to go home. The remarkable thing about his campaign -- what differentiates it from those of his rivals -- is just how much he's enjoying it. And why not -- this is what he does. This is better than reality TV -- he can corner the entire US news media whenever he feels like it.

After making a lot of noise about not being beholden to anyone because he's rich, it is unlikely Trump will spend heaps of his own money on a long-shot bid. But what about his claim that he would be willing to spend $400 million on his campaign if he's "doing well" (which he is, for the moment)? That would be such a bad business decision, even Trump is unlikely to make it.

You need a lot of money to get elected in this country, and there is no clear route for Trump to raise enough cash from other people to make a credible run. The establishment GOP has already chosen its money candidate -- and it's not Trump. Despite all the free media he gets, Trump is unlikely to mount a workable grassroots campaign -- it's one thing to draw crowds with the spectacle of the freedom of obscene wealth, but quite another to then ask them to open their wallets for you.

Trump will ride the free media bandwagon and the poll lead for as long as he can -- which is highly unlikely to be all the way.  He will achieve what he set out to do: increase the value of his brand. The media will give him as long a ride as they can, because he trails record-breaking ratings along the way -- and because politics without politics is so much easier and more profitable to cover than the real kind.

By this time next year, Trump will be a Giuliani-like memory of the irrational exuberance of the campaign's early days. This might sound like wishful thinking -- an attempt to comfort myself in the face of the specter of a Trump Presidency -- but the truth of the matter is that the other Republican contenders are likely to pursue equally destructive and regressive policies. We would be foolish to talk ourselves into believing that with Trump's departure from the field, right-wing populist extremism will have been rejected or publicly discredited. It's gone mainstream. Trump's signal achievement will have been to have provided the media with a great excuse to do what they do best: spend their time on outrageous comments and manufactured controversies. And the pundits who spent so much time pontificating on the meaning of Trump can get to work pontificating on the meaning of his departure, not quite noticing what an empty creature of their own creation he has turned out to be.

Saturday, May 16, 2015

The Fate of Art

It was very strange to see BoingBoing promoting this hackneyed critique of contemporary art by "artist" and illustrator Robert Florczak for Prager "University." More on the scare quotes in a second. Why strange? Maybe it's the blender effect of Twitter, which constantly recirculates the old as if it's new and the new as if it's already been around the block so often that by the time you get to it it's old news. Maybe it's because former WIRED editor Chris Anderson retweeted it with the following observation: "Well argued and brave. Plus fun prank on his grad students." Really? Let's start with the last bit first. The prank that Anderson thought was so fun: giving his students a close-up photo of a painting he claims to be by Jackson Pollock (but is actually a close-up of his studio smock) and making them explain why it's so great, so that he can then humiliate them by revealing the true source of the image. This raises some interesting questions about his grad students (at Prager University?), who seemed to think that this:
was pretty much indistinguishable from this:
Ok, I get it, squiggles are squiggles, but these are supposed to be graduate students in art (history? studio art?) of some kind. Which makes one wonder what kind of university this is. Apparently it's the online creation of conservative talk show host Dennis Prager -- a venue for right-wing, low-budget TED-type talks devoted to topics like "Feminism vs. Truth" and "The War on Boys," and why Christians are the "Most Persecuted Minority." Maybe the inability to tell the difference between these two images helps explain why Florczak, who paints things like this:

seems to think that he's working in the tradition forged by the painters of images like this: 

and this: 

Rather than the tradition forged by the creators of images like this: 

and this:
Florczak's claims seem to have something to do with technique and skill -- things that, for example, both Kenny G. and John Coltrane have mastered, but that don't make them the same type of artist. That this distinction is lost on the likes of Anderson and BoingBoing's Mark Frauenfelder (another former WIRED editor) is an indication of the cultural confidence of the tech world, in which expertise becomes fungible and the perpetual vindication of financial success a kind of all-purpose cultural qualifier.

Wednesday, February 12, 2014

Post-Critical Theory: Desire and New Materialism

What to make of the recurring claim that matter "desires" -- articulated perhaps most passionately by Karen Barad: "Matter feels, converses, suffers, desires, yearns and remembers"? I suppose the real question here is what one might mean by "desire" in this context (or "converse," for that matter). I suggest that these are metaphorical uses of the terms -- matter (except for that which takes the form of human sociality) does not have recourse to language even though it may "communicate" in the archaic sense of a physical transfer (heat can be communicated, so too electrical signals -- even quantum states). Without access to language, matter can no more desire, in a psychoanalytic sense, than it can converse. Surely it can be entangled, embedded, or otherwise caught up in some form of relations with other entities and with itself -- indeed it cannot not be.

But that is something altogether different from the dimension opened up by language (as might be demonstrated in negative fashion, for example, by Ian Bogost's dismissal of linguistic forms of production as not being on a par with more properly material ones. For more on this point, see my critique of Alien Phenomenology). This is perhaps where the pendulum swing away from discourse represented by "new materialism" goes a bit too far: in conserving notions like desire while simultaneously setting aside any engagement with the dimension of language (and, consequently, that of the subject).

This setting aside has ramifications for the fate of critique, as suggested by Barad's vociferous dismissal of critical approaches: "I am not interested in critique. In my opinion, critique is over-rated, over-emphasized, and over-utilized...Critique is all too often not a deconstructive practice, that is, a practice of reading for the constitutive exclusions of those ideas we can not do without, but a destructive practice meant to dismiss, to turn aside, to put someone or something down." This is a response that reveals much about the stakes of critique in contemporary academic (primarily literary-theoretic) circles. Critique has become a game of one-upmanship and can have unconstructive rather than deconstructive results. If, once upon a time, the point of critique was to address human suffering, reflexive critique can, apparently, exacerbate it -- at least in certain circles. Someone's (or something's?) feelings might get hurt.

For Bogost, the concern is somewhat different: overly humanistic thinking -- even of the ostensibly critical kind -- can get a tad boring: "Just as eating only oysters becomes gastronomically monotonous, so talking only about human behavior becomes intellectually monotonous.”  

It is hard not to read such observations as registering the level of contemporary academic alienation. I'm worried that these are the types of concern ("I'm bored" or "If you critique my argument, then you're putting me down") that come to the fore when you've lost any urgent sense of the point of what you're doing beyond constructing an argument for argument's sake -- what Adorno might call the wholesale aestheticization of theory. It seems absurd to even say this in the current conjuncture, but what if social theory were, on some level, actually about working toward making the world a better place for humans? OK, that might bore some people who've eaten too many oysters, but presumably they have the luxury not to worry about where the next oyster is coming from, and perhaps the lack of imagination to consider the fate of those who do not.

It is this alienation that, I think, characterizes the critical inertness (or refusal) of what passes for "new" materialism these days. I put "new" in scare quotes, since there is a strong affinity between this version of materialism and what Zizek describes as "Althusser’s materialist nominalism of exceptions (or 'clinamina'): what actually exists are only exceptions, they are all the reality there is. (This is the motif of historicist nominalism endlessly repeated in cultural studies...) However, what nominalism does not see is the Real of a certain impossibility or antagonism which is the virtual cause generating multiple realities." This structuring or generative antagonism -- and for Zizek it is, of course, the constitutive rift of capitalism -- is what falls by the wayside in such materialist nominalisms. One symptom of this loss is the sidestep away from the register of language and its deadlocks -- and thus, of course, from an engagement with the question of desire. Matter may desire -- in some reconfigured, alinguistic conception of the notion -- but desire does not matter.

Monday, August 26, 2013

Drone Theory and Goldfish Crap

Generally I like the idea of dividing academic labor up so I can read the theory I like and apply it to things I don’t (“symptoms” of a damaged world). But these days, that division is breaking down, and some of the hip “new” theories are creeping disconcertingly into the symptomatic realm. In particular, some recent work on “object oriented ontology” and new materialism leaves me trying to figure out why those whose critical commitments I share might find them interesting or useful. 

The problem is not so much how to work out the theory, but to make sense of its uptake. The more I engage with this work – and, I’m not sure how much more time I really want to spend on it – the more it looks to me like a close relative of the enthusiasm over data mining and the forms of “knowledge” it generates. The logics align with one another – post-narratival, post-subjective, post-human – even though the sensibilities are ostensibly opposed. The following is a bit of a rant that emerged as a by-product of an offer to collaborate on a review of Ian Bogost’s Alien Phenomenology, a symptomatic book if ever there was one. The invitation meant having to read the book, which I found largely a frustrating endeavor, as evidenced by the following observations (all citations are from the book, which I read on Kindle without pagination):

Ian Bogost’s paean to the pleasures of the great outdoors – the “grassy meadows of the material world” casts poor old Immanuel Kant in the role of the stereotypical video gamer tethered to the tube.  It is hard not to hear in Bogost’s call to flee the “rot of Kant” seeping from the “dank halls of the mind’s prison” the all-too-familiar admonition to video game geeks to “get out of the house.” Perhaps this is a call Bogost has heard so frequently that he has internalized it sufficiently to wield it against others: the call of the great outdoors is a recurring refrain in his celebration of the mysteries of the object world – primarily and paradoxically incarnated for him in the form of high-tech electronics: digital cameras, computer games, and cathode ray tubes. “Let’s go outside and dig in the dirt” he enjoins us, but only metaphorically, really. 

In a sense, the entire book is a rejoinder to the call to get out of the house: “I’m already outside -- that’s where I’ve been all along.” Bogost’s interpretation of what, following Meillassoux, he calls “correlationism” (which he equates with seeing things through the lens of how they impact humans) pits him firmly against any attempt at developing an analysis that “still serves the interest of human politics” (a charge he levels at Latour for not being anti-correlationist enough). But this opposition runs headlong into the repeated theme of his urgent (though largely unexplained) claim that “to proceed as a philosopher today demands the rejection of correlationism”: we need to get outside and romp in the “grassy meadows” so we can collect the “iridescent shells” of realism and so on. If we chose to do so because it turned out to be good for us, of course, we would have succumbed to the trap of correlationism. Even animal studies is too anthropocentric for Bogost’s tastes because, “we find a focus on creatures from the vantage point of human intersubjectivity, rather than from the weird, murky, mists of the really real” – what we might otherwise describe as “the view from nowhere.” Much the same goes for Michael Pollan’s attempt at a “plant’s eye view of the world” – for “he too seeks to valorize the apple or the potato only to mobilize them in critiques of the human practices of horticulture, nutrition, and industrialism.”

We get the message: any perspective that is in any way articulated to a human interest is ruled out in advance.  There is something disconcertingly incoherent about the Bogost two-step: step one is the unquestioned assumption that we might “wish to understand a microcomputer or a mountain range or a radio astronomy observatory or a thermonuclear weapon or a capsaicinoid [he apparently loves peppers] on its own terms.” Step two rules out the appeal to a subject who might wish to do something like this. He writes off science studies, for example, for retaining “some human agent at the center of the analysis.” OK, we get the point, Bogost wants to think about really thingy things and not those other things called human scientists or engineers.  But it’s pretty clear that what’s driving the whole show is the desire on the part of humans to experience things as things (other than human things) – even if this desire is anthropomorphically projected upon (non-human) things.

And so we are left with the thorny question of why such a perspective might be interesting. The philosopher Theodor Adorno neatly described the dialectic of autonomy: a fantasy of independence combined with the utterly irrational form this had taken. For Adorno, the autonomous artwork rehearsed capitalism’s crazy (aestheticized) embrace of production for production’s sake. What is left but to read Bogost’s injunction along the same lines: theory for the sake of everything and thus for nothing. It is a pure position, perhaps too pure, insofar as it does little to interrogate the goal of purity itself. The result is that the argument’s normative framing takes the form of recurring and somewhat mysterious demands on the reader: “the heroin spoon demands as much intrigue as the institutional dysfunctions that intersect it.” Why? To whom? These are questions that go unanswered – or perhaps such demands are only available to those who hear them, which poses a challenge for any attempt to impose them on the rest of us. 

In the book’s conclusion, Bogost briefly nods towards Levi Bryant’s claim that Object Oriented Ontology envisions “a new sort of humanism” in which “humans will be liberated from the crushing correlational system.” But after the wholesale dismissal of any attempt to frame his approach in terms that serve human interests, it’s difficult to buy into this meta-correlational gesture: the claim that we should surpass the attempt to relate knowledge to human interests, because it might be in our interest to do so (!?). Bogost slips this in so close to the final downhill run toward the blissful prospect of his argument’s end that the reader’s tendency is to just coast through it rather than to give it the double-take it deserves. He follows with an explanation that sounds a bit more like the one that characterizes his own affinity for the extra-human – the “bored consumer” rationale: “Just as eating only oysters becomes gastronomically monotonous, so talking only about human behavior becomes intellectually monotonous.” This is not a particularly rare claim in some circles of the humanities, although one wonders just how widely distributed is the subject position that would take it as the most compelling reason to embrace a shiny new, if somewhat nonsensical, perspective: a kind of intellectual ennui in search of the next big thing. Such a stance is surely associated with the somewhat sheltered subject position of gastronomic satiety, or surfeit. There is a certain luxury or self-anesthesia associated with the charge that thinking about humans and their problems is just a tad dreary. (“Why is it that one’s disregard for laundry, blogs, or elliptical trainers entails only metaphorical negligence,” Bogost asks, “while one’s neglect of cats, vagrants, or herb gardens is allowed the full burden of general disregard?”).

It is telling that Bogost’s ostensibly random lists of beings in the object world so often emphasize interesting sounding objects and words, both technical and natural. He lures the reader with bright, shiny, and mysteriously magical objects: “the obsidian fragment, the gypsum crystal, and the propane flame” (these are a few of his favorite things: musket buckshot,  gypsum, and space shuttles, redwoods, lichen and salamanders, Erlenmeyer flasks, rubber tired Metro rolling stock, the unicorn and the combine harvester, the color red and methyl alcohol, mountain summits and gypsum beds, chile roasters and buckshot, microprocessors, Harry Potter, keynote speeches, single-malt scotch, Land Rovers, lychee fruit, love affairs, asphalt sealcoat, and appletinis). We don’t hear much about toxic waste or shit stains. The object world is by definition an intricately rich and edifying one compared to that nasty, dank world of our own mind – an object still, to be sure, but not so salubrious or interesting  as the grassy meadows, iridescent shores and scoria cones. If “everything exists equally” for Bogost, some things clearly exist more equally than others.

Conspicuously absent from Bogost’s account is any explanation as to why being a philosopher today demands the rejection of what he terms correlationism. From what position is this demand made? Surely, given his round denunciation of “correlationist” tendencies it cannot be made on the basis of anything having to do with us humans (despite the supposed benefits of escaping the dank prison of our minds). Such a perspective is ruled out in advance by the hubris-slaying, egotism-deflating thrust of anti-correlationism. Is the demand, then, made from the perspective of truth, based on the claim that this way of thinking accurately reflects the way things are for everything, everywhere, forever, and that therefore we must adjust our own way of thinking to match the world (damn you, correlationism! Back again!)? Well, why? What claim does reality have on us in Bogost’s universe? Perhaps the claim is less a normative one (we should adopt the stance of anti-correlationism) than a descriptive one: inevitably we will come to think this way thanks to the predictable and inexorable flow of certain types of entities called thoughts (and the claim exerted upon them by other beings). Such a perspective would embrace not a “new” materialism but the very oldest. The use of the word “must” would imply not an injunction but an inevitability: we must embrace object oriented ontology the way a stone in the earth’s gravitational field must, absent any obstruction, fall to the ground. Such a formulation would certainly obviate the need for any kind of manifesto (“a specter is haunting the object world: the specter of gravity!”).

After ditching this book several times for failing to pass the basic coherence-of-thought test, I came to the realization that it is modeled much like the things it describes: unable to truly interact with other beings (like me), it simply recedes infinitely into itself. How else to understand statements like, “The construction and behavior of a computer system might interest engineers who wish to optimize or improve it, but rarely for the sake of understanding the machine itself, as if it were a buttercup or a soufflé.” He seems to be making an aesthetic point (along the lines of Kant, that dankest of thinkers): that his proposed way of understanding a buttercup is different from figuring out how a computer works because it is an apparently disinterested understanding – and emphatically not one that reflects what Kant described (realizing a certain logical necessity) as a disinterested interest. Once again, any sign of interest on our part runs the risk of channeling us back into a retrograde correlationism.

It is not surprising that one of the paradigmatic examples of the wonders of “the list” world invoked by Bogost is that of Roland Barthes’s likes and dislikes, taken from his autobiography. Ontography, Bogost style, takes the form of the database, and what is more characteristic of the database in its current market-driven configuration than the preference list? Facebookers with their endless “likes” rehearse this list-building activity, as do databases of purchases, search terms and so on. IBM tells us that various digital sensors of all kinds gather the equivalent of 4,000 Libraries of Congress worth of data a day. But these are not books, poems, maps, plays, biographies, etc. Rather the data comes in the form of list-like collections in which the human and nonhuman mingle with the promiscuous abandon celebrated by Bogost: credit card purchases, airline seating preferences, underground tremors, EZ Pass records, atmospheric pressure, geo-locational data, levels of particulate matter, stock market fluctuations, and so on. Such data collection rehearses the “virtue” espoused by Bogost: “the abandonment of anthropocentric narrative coherence in favor of worldly detail.” And, of course, experiencing this data flow becomes, necessarily, the job of various kinds of high-tech objects. Perhaps this is the appeal of Bogost’s theory in the digital era: the celebration of the very forms of post-human experience that characterize automated data collection (and the simultaneous de-valuation of narrowly human experiential and narrative alternatives).

Suggestive in this regard is Bogost’s explicit rejection of the pursuit of knowledge as “metaphysically undesirable” because it violates the adherence to “A fundamental separation between objects…the irreconcilable separation between all objects, chasms we have no desire or hope of bridging – not by way of philosophy, not through theism, not thanks to science.” With a tweak to include information about things as well as humans, this formulation readily recalls Chris Anderson’s manifesto on “the end of theory” in the big database era: “Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people [and things] do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity.” 

Of course, “we” are not really doing the tracking here but are offloading it onto machines that do the work for us, offering up their experience of the endless litanies of information captured by a proliferating array of sensors. We might describe this reliance on the prosthetic extension of sensing, combined with its offloading onto the sensor array, as a process of dronification: we oversee seemingly endless databases of information collected by remote sensing devices about everything from the online activity of consumers, to tweets, volcanic activity, carbon monoxide levels, ocean currents, subway locations, factory emissions, sales records, and on and on. The experience of our sensors can be summed up in terms of Bogost’s broadened definition: “The experience of things can be characterized only by tracing the exhaust of their effects on the surrounding world.” Such a formulation has been explicitly embraced by the data mining world in the term “data exhaust” – which does the added work of treating data as something cast off, an almost passive byproduct (but something that can be captured and recycled by those with the resources).

Bogost goes on to suggest that the tracings of thing-exhaust can serve as the basis for speculation, “about the coupling between that black noise and the experiences internal to an object.” This is the part that is lopped off by Anderson’s formulation, which in its fascination with instrumental efficacy has little interest in such speculations. Rather the interest in capturing all available data embraces what Bogost describes as “a general inscriptive strategy, one that uncovers the repleteness of units and their interobjectivity.” He calls this process one of ontography: the writing of being, which “involves the revelation of object relationships without necessarily offering clarifying description of any kind.” Isn’t this the logic of big data mining, which unearths patterns of relationship without explanation?  
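The analogy with data mining can be made concrete. A pattern miner surfaces relationships among variables without offering any account of why they co-occur. A minimal sketch in Python (the sensor streams and correlation threshold below are invented purely for illustration):

```python
# Illustrative sketch: "ontography" as pattern-mining -- surfacing
# relationships among variables without any explanatory model.
# The data streams here are invented purely for illustration.

from itertools import combinations
import math

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def mine_patterns(series, threshold=0.9):
    """Return variable pairs whose correlation exceeds the threshold:
    the revelation of relationships without clarifying description."""
    return [
        (a, b, round(pearson(series[a], series[b]), 3))
        for a, b in combinations(sorted(series), 2)
        if abs(pearson(series[a], series[b])) > threshold
    ]

# Hypothetical sensor streams: some move together, some do not.
streams = {
    "ez_pass_hits": [10, 12, 15, 18, 22],
    "toll_revenue": [50, 61, 74, 90, 110],  # tracks ez_pass_hits
    "tremor_count": [3, 1, 4, 1, 5],        # unrelated noise
}
print(mine_patterns(streams))
```

The routine reports that toll revenue and EZ Pass hits move together; it has nothing whatsoever to say about why. That silence is the point of comparison with ontography.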

Clearly, Bogost would differentiate his goal of pure philosophical reflection from those of data mining, insofar as the latter (as outlined by Anderson) are crassly correlationist since the generated patterns are only of interest to the extent that they serve human interests (epidemiology, earthquake modeling, threat detection, marketing, etc.). And yet, the form of “knowledge” on offer, reduced to an object-agnostic tracing of the impact of objects “on the surrounding ether,” models the “knowledge” generated by the database. Indeed, if we could imagine a data mining operation devoted to simply generating patterns independent of their utility to humans, we would come quite close to the process of ontography described by Bogost. He calls it alien experience, but given the ongoing development of new forms of object sensors (which preoccupy Bogost in his discussion of digital photography), we might call it simply drone experience.

One of the more baffling – and perhaps telling – moments in the book is Bogost’s diatribe against academic writing. In tone, his critique takes the familiar form of charges against pedantry, obscure writing, and, predictably, a cloistered reluctance to pry one’s head out of the books and “visit the great outdoors” (that again!). Academics, he tells us, are relentlessly crappy writers who, even in public, insist on, “reading esoteric and inscrutable prose aloud before an audience struggling to follow, heads in hands.” He preempts the ready rejoinder – that such critiques rehearse a familiar and fatigued set of clichés – with the observation that, “Clichés also bear truth, after all.” Fair enough, but not truths interesting enough to warrant a multi-page chapter introduction.

Things start to get a bit dicier when he proposes his alternative: we need to start relating to the world not only through language but through the things that we make, through our practice in the world (as if language, writing, etc. were not really real practices): “If a physician is someone who practices medicine, perhaps a metaphysician ought to be someone who practices ontology.” Academics, he suggests, in a distant echo of Thesis Eleven, spend too much time writing and not enough time doing. He notes in passing that it seems “ironic” to even suggest such a thing in a book (rather than simply doing it, perhaps). We might take this as a call for diversity – let’s not limit ourselves to just one mode of object production (books); rather, let’s make other kinds of objects (computer programs, motorcycles, maybe even some sturdy walnut shelves for all those books).

But the argument does not stop at the call for diversity – it actively disparages writing (as a form of doing that doesn’t quite count as one) by comparison with other forms of doing. At this point a somewhat confounding binarism slips into the argument. Why might it be “ironic” to advocate the making of things in a book? Isn’t making a book just as much a form of doing as other forms of doing? For Bogost, a book (or at least its ideational content – as opposed to, say, its binding) turns out not to be really a thing in the way that other things (tables, motorcycles, computer programs, unicorns?) are. Why not? According to Bogost, “carpentry” (by which he apparently means making something out of anything other than words), “might offer a more rigorous kind of philosophical creativity, precisely because it rejects the correlationist agenda by definition, refusing to address only the human reader’s ability to pass eyeballs over words and intellect over notions they contain.”

Unlike really thingy things, moreover, “philosophical works generally do not perpetrate their philosophical positions through their forms as books” (that is, through their more material attributes: page texture, shape, binding glue, etc. For a book to really perpetrate its position this way, you’d have to be literally struck by it). By contrast, the maker of material things (like software?) “must contend with the material resistance of his or her chosen form, making the object itself become the philosophy.” We might describe this set of oppositions as the “separate but equal” clause of Bogost’s book. He puts it this way: “all things equally exist, yet they do not exist equally,” by which he means, from what I can gather, that although things do not exist in precisely the same way, one group cannot be privileged over another – or, more specifically, human beings ought not to be privileged over other entities from a philosophical perspective, and vice versa.

And yet, why are those beings called books less “philosophical” in their construction than objects (like bookshelves and computer applications) crafted by philosophical “carpenters”? What makes the “immaterial” object less philosophical than the material? It is hard to extract any answer from Bogost’s argument other than that ideas are less philosophical than things precisely because their significance emerges through their relationship to humans (whereas material things relate not just to humans but to other things as well). In other words, humans are less equal than other things from a philosophical perspective, because their form of relating (as opposed to that between, say, a stone and a stream) invokes a particular relation in which the mental capacity of humans is involved.

Perhaps the thrust of the argument here is corrective: we spend too much time thinking of beings for humans and not enough of beings of all kinds for one another. But the substance outstrips the tone of the argument, suggesting that as soon as humans enter the equation in their ideational (as opposed to material) form of relating, a relationship becomes necessarily less philosophical. Software (Bogost’s chosen form of “carpentry”) escapes the fate of writing because it is more “material” – that is, there is apparently more resistance in the symbolic substrate of machine language than that of human language. As in the case of, say, truing a bike wheel, or building a bridge, it is harder to make things work at a basic level when writing code than when writing theory. And yet, Bogost’s own book provides a compelling example of how, even in the realm of ideas (as in that of more material things), “simply getting something to work at the most basic level is nearly impossible.” It turns out that arguments and words can be just as recalcitrant as more material things.

Even as human cognitive experience gets devalued vis-à-vis that associated with the objects of “carpentry,” philosophers’ products come in for a healthy degree of scorn: “For too long, philosophers have spun waste like a goldfish’s sphincter, rather than spinning yarn like a charka.”

Crap, it turns out, is less equal than yarn in the court of flat ontology, although this valuation reeks of an allegedly surpassed anthropocentrism: by what measure other than some presumably surpassed correlationism is yarn more desirable a product than goldfish waste? What does the comparison even mean from the viewpoint of flat ontology – is there a ready-made imperative that differentiates spinning yarn from spouting crap? If Bogost imagines he’s doing the latter when he writes books, it would have helped to warn the reader at the outset.

Monday, August 5, 2013


One of the distinctive punditry patterns to emerge from the response to the Snowden revelations (a pattern that also recalls the response to Manning's leaks) is the focus on the "narcissistic arrogance" of some young whipper-snapper who thinks he knows better than all those four-star generals and security muckety-mucks. This approach clearly marks the pundit who is not particularly interested in addressing the question at hand: whether a democratic society can still live up to the name when it starts promulgating secret interpretations of laws that amount, in the end, to secret laws. Or, more specifically, whether the decision to implement a plan of total information surveillance merits public deliberation or, on the contrary, is best left in the hands of those who like creating secret laws. Two classic examples of the "whipper-snapper" dismissal are Jeffrey Toobin's CNN takedown of Snowden and, perhaps somewhat more surprisingly, Josh Marshall's reaction on Talking Points Memo. Both follow more or less the same pattern, but Marshall's starts off by sounding a bit more even-handed and nuanced. In the end, however, it boils down to this for him:
"Who gets to decide? The totality of the officeholders who’ve been elected democratically - for better or worse - to make these decisions? Or Edward Snowden, some young guy I’ve never heard of before who espouses a political philosophy I don’t agree with and is now seeking refuge abroad for breaking the law?"
 Toobin makes a bit more explicit the appeal to patriotism that infuses Marshall's account:
Every 29-year-old who doesn't agree with what the government is doing doesn't get permission to break the law, damage national security, and then run off to China when it’s done. It is not the way you protest in the United States...stealing documents from the NSA and turning them over to Glenn Greenwald is simply not the American way.
Well, OK, he's a relatively young guy (younger than them, in any case), and they don't know him from Adam, and he clearly has some instincts of self-preservation combined with an awareness of what happens to national security leakers these days. But that does not quite get to the heart of the matter: were the facts he disclosed legitimate instances of government malfeasance, including over-classification, promulgating secret laws, and creating a surveillance state behind the backs of the American public? Do these matters rise to the level of whistle-blowing? Might they be discussed without harming national security? Would it harm the nation not to discuss them? The "whipper-snapper" dismissal works to background these questions and foreground a sense of indignation over the sheer gall of today's youth. Moreover, it works to background the concerns about total surveillance that were once part of the political mainstream but have since faded behind the promised wizardry of digital surveillance. In his recent article on the NSA, James Bamford recalls the Nixon-era concerns raised by Sen. Frank Church:
That capability at any time could be turned around on the American people and no American would have any privacy left, such [is] the capability to monitor everything: telephone conversations, telegrams, it doesn’t matter. There would be no place to hide. If this government ever became a tyranny, if a dictator ever took charge in this country, the technological capacity that the intelligence community has given the government could enable it to impose total tyranny, and there would be no way to fight back, because the most careful effort to combine together in resistance to the government, no matter how privately it was done, is within the reach of the government to know. Such is the capability of this technology…. I don’t want to see this country ever go across the bridge. I know the capacity that is there to make tyranny total in America, and we must see to it that this agency and all agencies that possess this technology operate within the law and under proper supervision, so that we never cross over that abyss. That is the abyss from which there is no return.
Who are these young whipper-snappers to remind us of such concerns?  

Sunday, August 4, 2013

The Target is the Population

During his remarks on the Senate floor about the National Security Agency’s domestic surveillance program, Senator Ron Wyden exhibited a characteristic and well-meaning, but somewhat misleading, response to mass data collection and data mining for security purposes. The substance of his challenge to the “dragnet” collection of data about Americans (and others) was that it was unnecessary because, “In every instance in which the NSA has searched through these bulk phone records, it had enough evidence to get a court order for the information it was searching for.” In other words, the NSA already knew who its targets were, already had enough evidence to get a warrant, and was simply using the program to bypass the inconvenience of having to write one up and get it approved.

While Wyden’s observations may be accurate, they miss the heart of the shift in the mentality that guides new database-driven forms of surveillance. In the era of big data and predictive analytics, the standard logic of surveillance is reversed. You don’t first identify a target and then unleash the full force of the surveillance apparatus. You start with the population (and some priorities and preconceptions) and mine data about it in order to generate leads and suspects. This may not have been the approach taken by the investigations Wyden described, and it may not even have had any demonstrated successes (otherwise, we would likely have heard intimations of them during the defense of NSA surveillance mounted by the Obama administration in the wake of the Snowden revelations), but it is the speculative model upon which the intelligence apparatus is building its case for mass data collection. 

The model is not an unfamiliar one: it is borrowed from marketing strategies that use data patterns to identify potential targets, as the CIA’s Chief Technology Officer, Gus Hunt, has enthusiastically noted: “We have these astounding commercial capabilities that have emerged in the market space that allow us to do things with information we’ve never been able to do before.” The paradigm shift he learned from Google et alia is based on what he describes as the importance of, “moving away from search as a paradigm to pre-correlating data in advance to tell us what’s going on.” This is a somewhat opaque way of referring to data mining generally and predictive analytics in particular: the goal is not to find out more information about a particular target, but to learn from the data who should be a target in the first place.

In this context, it is not quite right to say that just because everyone is monitored, everyone is being treated as a suspect (although no one is ruled out in advance). Rather it is to understand that the vast majority of data will necessarily be collected about non-suspects in order to provide the background against which the actual suspects emerge. For data mining purposes the target is the population: the entire population, the full range of data about it which can be collected for as long as possible. The "complete" picture is needed in order to allow the clearest patterns to emerge over time. As the CIA’s Gus Hunt put it, “The value of any piece of information is only known when you can connect it with something else that arrives at a future point in time...Since you can't connect dots you don't have, it drives us into a mode of, we fundamentally try to collect everything and hang on to it forever.” 
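The population-first logic Hunt describes can be sketched in a few lines: no prior suspect is needed, because the baseline against which a "lead" emerges just is the data collected about everyone else. A minimal illustrative sketch in Python (the names, call volumes, and cutoff are invented):

```python
# Illustrative sketch of population-first targeting: no prior suspect,
# just a baseline computed over everyone, with "leads" generated as
# whoever deviates from it. All names and numbers are invented.

import statistics

def generate_leads(call_counts, z_cutoff=2.0):
    """Flag anyone whose activity sits more than z_cutoff standard
    deviations from the population mean. The whole population must be
    collected first: the baseline *is* everyone else's data."""
    values = list(call_counts.values())
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    return [
        person for person, count in call_counts.items()
        if stdev and abs(count - mean) / stdev > z_cutoff
    ]

# Hypothetical weekly call volumes for a (tiny) population.
population = {
    "alice": 21, "bob": 19, "carol": 23, "dan": 20,
    "erin": 22, "frank": 95,  # frank deviates from the baseline
}
print(generate_leads(population))  # frank emerges as a "lead"
```

Remove the rest of the population from the dataset and no baseline exists: flagging frank depends on having already collected alice, bob, and everyone else.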

These are the watchwords of the new data surveillance era: “collect everything about everyone forever” – or at least within the limits of the current sensing apparatus and storage capabilities.

The fact that the target is the population means that critiques based on particular individual targets (“since you already knew so-and-so was a ‘person of interest,’ you could have gotten a warrant”) are unlikely to have much purchase upon a system that has already committed itself (without much in the way of concrete, publicly available evidence, as of yet) to the possibility that data mining might help generate new targets and pre-empt threats in advance rather than simply providing evidence to act against existing ones. The reversal of the relationship between targeting and surveillance means we are unlikely to see the surveillance sector back away from the programs revealed by Snowden (and others). Rather, the pressure will go the other way: toward flexible access to an ever-growing range of data about everyone.