Archive for the blogging Category

Self-Criticism

Posted in blogging on September 20, 2010 by traxus4420

This is in part a response to commenters here, though the assumptions informing what I said there are elaborated upon in the following post.

On further reflection, I think the argument in these two posts is flawed, and those flaws come from a reactionary chain of reasoning. They take some not-bad impulses, and instead of analyzing them, use them as the basis for far-reaching generalizations. They’re blog posts, in other words, but it seemed a good idea to make a few points about how they went wrong.

The impulses are these: it is really easy and thus really common for anyone left of Karl Rove to have a knee-jerk loathing of the American populist right, i.e. the Tea Party. It is really easy to feel superior by bashing them in public relying on the same condescending assumptions that have been reserved for any large group of people the speaker doesn’t like since time immemorial: they don’t accept institutionally approved forms of knowledge, they’re the brainwashed tools of elites, they’re full of irrational hatreds and prejudices, they don’t know how good they have it, they’re greedy little vermin who just want more, more, more. It’s really easy to let these insults take the place of analysis, or even of considering them as people  — that’s what they’re there for, after all. So the first impulse is to, at the very least, reach out to the target of all this vitriol, if only to understand exactly why they should be excommunicated. Why is the enemy of my enemy not my friend?

The second impulse has to do with ‘the spectacle.’ The best theories of how whatever this is works are dinosaurs. The most sophisticated are mostly instrumental — how to collect more accurate market data, neurological responses to visual and other stimuli, vaguely Freudian rules of thumb about who likes what, the politics of content regulation, intellectual property, etc. If they tell you anything useful, they don’t tell you how to gain power (‘cultural influence’) without strengthening the institutions and conventions that make the media what it is, i.e. they have no room for serious criticism. The Marxist theories tend to treat the spectacle (aka the media, aka Big Media) as a Borg-like mass, or an inchoate alternate universe full of vague opportunities for ‘revolution.’ Opposition to the great powers tends to concentrate into boycotts (the spectacle can’t help, only hurt; its misinformation should be resisted by facts and community organizing) or appropriation (insert made-up ‘hacking’ jargon here), both usually poorly thought out.

But mostly it is the uncanny effect of being compelled (via social pressures of all kinds) to make one set of arguments in one direction: anti-elitist, pro-populist critiques of dominant institutions against liberals who, it is increasingly obvious, are too invested in them to enact even modest reforms, and a contradictory set of arguments in the opposite direction: basically Enlightenment-type debunking of irrational bad faith skepticism against a right that’s continually rewarded for not thinking.

And the problem with following these impulses is that the ‘objective situation’ of commodity culture is one of universal ignorance — no matter how distorted or superficial media images get, how misleading their implications, or how interested their uses, if they’re useful they can’t help but matter. Racism, for example, has to be denounced in whatever form, no matter how impossible it seems for anyone to accept its legitimacy. If you’re lucky, you might have time to sneak in a comment about how anti-racism of some low-cost kind can be instrumentalized to distract White People from worse racism, but only if you’re lucky. There is a very real fog of war in play that can only be passed through with intense, disciplined effort. It’s no good to refute a lazy generalization about the Tea Party with another lazy generalization that cheaply points out the hypocrisy of the first. In war, everyone is a hypocrite. And it can be easy, on a minor little blog — that cheapest of soapboxes — to forget the facts of war.

Read This Instead

Posted in blogging, Cultural Theory on August 18, 2010 by traxus4420

If anyone still checks this thing, you should read my acquaintance Christian Thorne’s blog instead — he’s doing what I was doing a while ago (and may once again), except better.

(Ideal) Self-Recognition

Posted in Art, blogging, Cultural Theory, The Internet on October 29, 2009 by traxus4420

Hoardings
In recent years, some people have adopted the list form only to strip it to its foundation, yielding ultra-simple pages consisting of sequences of images cobbled together with little or no explanation, each image radically different from its neighbors, each likely to confound, amuse, or disquiet. These web pages are often “personal” pages belonging to artists or groups of artists. Text is relegated to minimal captions in these Internet wunderkammern, and sometimes abolished entirely.

Let’s call such a page a hoarding. The word can refer to a stash of collected goods, but can also mean a billboard, or the temporary wall thrown up around a construction site. The look of the hoarding is similar to that of a particular type of artist’s book that has flourished in the last 15 years or so, featuring page after page of heterogeneous images, a jumble of magazine scans, amateur snapshots, downloaded jpegs, swipes from pop culture and art history alike, some small, some full-bleed, none with explication. The similarity is not coincidental, for “the last 15 years or so” defines the Internet age as we know it, with its ubiquitous, colorful mosaics, evidently a powerful influence on publishing of all kinds.

What can we say about the experience of scrolling through a hoarding, trying to understand the procession of pictures? As in traditional fashion magazines, we find excitement and confusion in equal measure, with one catalyzing the other. Beyond that, it often seems that any information or knowledge in these pages is glimpsed only through a slight fog of uncertainty. Has an image been spirited out of the military defense community, or is it journalism; is it medical imaging, or pornography; an optical-illusion, or a graph; is it hilarious, disturbing, boring; is it doctored, tweaked, hue-saturated, multiplied, divided; is it a ghost or a vampire? In any event, the ultimate effect is: “What the fuck am I looking at?” Something that hovers in your peripheral vision.

One might ask, how does this depart from the queasily ambivalent celebration of the image that has characterized the last fifty years of pop culture, possibly the last century and a half of mass media? It could be the muteness of the offering, the lack of justification or context. But the observation that modern media divorce phenomena from context is a commonplace, and usually an invitation to reflect on the increasingly fragmented nature of experience. A hoarding is notable because while it is a public representation of a performed, elective identity, it is demonstrated through what appears to be blankness, or at least the generically blank frenzy of media.

This may be a response to the embarrassing and stupid demands of interactivity itself, which foists an infantilizing rationality on all “Internet art,” and possibly Internet use generally, by prioritizing the logic of the connection, thereby endorsing smooth functioning and well-greased transit. Recourse to the almost mystically inscrutable may be understood as a block to the common sensical insistence on the opposition of information to noise, and as a form of ritualized unknowing.

It could also be a dismissal of the ethos of self-consciously generous transparency that characterizes “web 2.0”: the freely offered opinions, the jokey self-effacement, the lapses into folksiness in the name of a desire to forge reasoned agreement and common experience among strangers. It is wise to mistrust this earnest ethos, which is inevitably accompanied by sudden and furious policing of breaches in supposedly normative behavior. This is not to argue that such consensus building is disingenuous, rather that it is simply politics, in the sense that politics is at heart concerned with separating out friends from enemies. In this view, the hard-fought equilibrium of an orderly on-line discussion is indistinguishable from its scourge, the flame war: reasonably or violently, both aim at resolution and a kind of confirmation of established precepts. Might a hoarding—a public billboard that declines to offer a coherent position, a temporary wall that blocks reasoned discourse—escape the duty to engage ratio and mores and resolution, in a kind of negative utopian critique? No, it probably cannot. But the perversity of its arrangement of pictures speaks for itself, and what it speaks of is manipulation.

Seth Price

One cannot simply set aside the pro forma Schmittian (to give it a proper name) logic of this piece, but it is a rather elegant illustration. A ready-made image for someone else’s ‘hoard,’ and my first revision would be to replace that 18th-century insult with a coinage from one of blogdom’s dearly departed, an Arcades Blog. Which is itself another reference, which is the whole point. Why does a series of captionless images have to be irrational or perverse? One can imagine future art historians concluding that the age of mass marketing’s greatest achievement lay in convincing the world’s consumers that images (and through the backdoor, ambiguity) are a priori the language of unreason. Certainly images can be used to think. More pernicious is the idea that images which are ‘simply’ affect manipulators (that is, have ‘nonsense’ as their manifest content) are for that reason lacking in logical sequence.

Immaturity. Escape. Vertigo. The cynical romance of commodities.

Though I have made frequent use of the photo montage on this blog, a more concentrated experiment can be found here. Even something like this, an image or two posted every now and then, sometimes with words, sometimes without, all apparently fitting the idea of the ‘hoard,’ is neither without pattern nor immune to meaning. If the wunderkammern were overdetermined by the excessive display of strange and uncommon objects, the image blog (here’s one of my favorites; here’s another) is a collection of moments of an all-too-familiar process of circulation, captured, and in that moment of capture recirculated as something novel, their significance altered. ‘Defamiliarized,’ even. Even when their authorial anchor is just an arbitrary sign: traxus4420.

My naive intent for the tumble blog is the same as with this one: for each post to be useful as part of a new process of thought. Failing that, it is also made to be ignored. Is a challenge to ‘common sense’ possible with these things at all? If so, it can only be by demanding different kinds of attention and different kinds of thinking. Because the facade of irrationality that merely prompts us to “reflect on the increasingly fragmented nature of experience” is advertising. Though it might be that all that separates one from the other is the presence or absence of a product.

Blogs, Form and Sense: A Compendium

Posted in blogging, Cultural Theory, Lacan, Media, Parody, Political Theory, The Internet with tags , on August 21, 2009 by traxus4420

Maybe in the early days of blogging the medium seemed poised to open new dimensions of creative expression, where all sorts of people could express anything from themselves to other stuff. In reality, human creativity is rarely marketable as such beyond the scope of individuals and small groups. It probably has to do with being a human myself, but from the proverbial bird’s-eye view people and their actions look less like unique liberated snowflakes and more like snow.

Now we know there is a finite number of genres available to the entry-level blogger. What is less often acknowledged is that, just like corporate news, each of these genres carries with it its own structural logic of representation, which manifests as its own built-in ‘slant.’

To stay objective, we’ll avoid immediate issues (like health care) and pick some old news. Here’s a topical AP piece from last month:

UNITED NATIONS — Out of genocides past and Africa’s tumult a controversial but seldom-used diplomatic tool is emerging: The concept that the world has a “responsibility to protect” civilians against their own brutal governments.

At the U.N. General Assembly, Secretary-General Ban Ki-moon pushed Tuesday for more intervention for the sake of protection.

“The question before us is not whether, but how,” Ban told the assembly, recalling two visits since 2006 to Kigali, Rwanda. The genocide memorial he saw there marks 100 days of horror in which more than half a million members of the Tutsi ethnic minority and moderates from the Hutu majority were slaughtered.

“It is high time to turn the promise of the ‘responsibility to protect’ into practice,” Ban said.

How does the blogosphere respond? I limit myself to blogs of the ‘left-of-center’ persuasion — whatever differences in ideology they may have are also differences in style. That, at least, is my working hypothesis.

The linkblog:

Unhappy Monday links:

– Think we’re out of the recession? Doug Henwood says think again.

– ‘Expert warns against advent of ‘Terminator’-style military robots.’ If you’re unemployed, don’t sell your Playstation — there may be hope for you yet:

The US currently has 200 Predators and 30 Reapers and next year alone will be spending US$5.5bn (€3.84bn) on unmanned combat vehicles.

At present these weapons are still operated remotely by humans sitting in front of computer screens. RAF pilots on secondment were among the more experienced controllers used by the US military, while others only had six weeks training, said Prof Sharkey. “If you’re good at computer games, you’re in,” he added.

Ender’s Game, here we come.

– In foreign policy news, the “responsibility to protect” doctrine has been getting more and more airtime. According to President Obama, there are “exceptional circumstances in which I think the need for international intervention becomes a moral imperative, the most obvious example being in a situation like Rwanda where genocide has occurred.”

As an on-again off-again pacifist, I’m deeply skeptical about any use of military force (particularly U.S.-led), but must confess not knowing nearly enough about the situation in Rwanda to make a sound judgment on that score.

– To compensate for your worries of U.N.-backed robot takeover, say hello to TOFU, “the ponderously eyebrowed robot fuzz owl with OLED eyes and some seriously rhythmic body jams.” Via (who else?) BoingBoing Gadgets.

The libblog:

I know we tend to stick to domestic politics around here, but if the Afghanistan/Iraq debacles have taught us anything, it’s that in this country we can’t afford to treat foreign and domestic policy as completely separate issues. The corporate media try to make it easy by chronically underreporting anything they can get away with, but this conditioned state of ignorance is unsustainable. The state of one affects the state of the other.

In the field of international relations, the question of sovereignty and of other nations’ right to intervene is highly vexed. How do we legitimate ‘good’ uses of force, like Kosovo and Haiti, while preventing ‘bad’ ones, like Iraq? How do we reliably prevent acts of genocide, as in Rwanda or (arguably) Darfur, without risking the misuse of the same rhetoric for neo-imperialist purposes?

An increasingly important potential solution is emerging, known as ‘responsibility to protect,’ or R2P.

[here follow about 1,500 words of analysis of policy documents with links to the original pdfs]

In conclusion, a renewed liberal international order is our only hope. There is a real difference between a liberal, internationalist hegemony and an imperial, nationalist one; in fact it’s all the difference in the world. And we have the power to push our nation’s policy and culture toward the one and away from the other. Not just the power, I would argue, but the duty: religious fundamentalism and the Bush White House’s excessive response to it have shown us that universalism without tolerance is a recipe for global catastrophe.

I know I can’t speak for all of you on this one. It’s something that as liberals we need to discuss, and I urge you to get the ball rolling in the comments below. Keep it respectful, y’all.

The professional ‘expert’ as editorialist, or someone who blogs under the assumption that their (usually well-respected) professional specialty gives them unique insight into events that often have little or nothing to do with that specialty:

…in my book, Twitsturbation Nation: How the Internet Generates Community, I made the argument that traditionalist notions of autocratic sovereignty would be the first major casualty of the Internet’s production of society from below, one narcissistic avatar at a time. Today, even the biggest figures in international leadership are keenly aware that Web access is changing the way politics works at all levels, from policy to advocacy, from elections to revolution. “You cannot have Rwanda again,” Gordon Brown said last month, “because information would come out far more quickly about what is actually going on and the public opinion would grow to the point where action would need to be taken. Foreign policy can no longer be the province of just a few elites.”

I don’t say this simply to brag about my foresight, but to make an important point about how attitudes change. Not long ago, the U.N. held a conference on the ‘responsibility to protect,’ a new doctrine that would set new standards for humanitarian intervention. In the past, even to attempt such a thing would have been immediately (and wrongheadedly) denounced as ‘imperialism’ by most liberals, and, post-Somalia, as sheer folly by realists. But we live in a different age. Life on the Internet is changing the way we think about the responsibility we have to one another, regardless of race, nationality, gender, or religious differences. How else could a stolen election in Iran generate such spontaneous support among the youth of its national enemy, the U.S.? It’s true that many suffering people don’t have access to the Internet, much less platforms like Twitter. But our imaginations have expanded to include them, and aid programs are not far behind. If this talk of responsibility sounds terribly old-fashioned, perhaps one should draw comfort from another ancient adage: the more things change, the more they stay the same.

The ‘literary’ editorialist, or the blogger who, motivated by a frustrated ambition to be a novelist (successful novelists don’t have time for ‘real’ blogging, see below), attempts a form of online commentary that is literature in its own right:

“May you live in interesting times.” So goes the ancient Chinese proverb which is not a blessing, but a curse. And yet, even after the amused Western reader recognizes this, that ‘interesting’ retains its double edge. For we must admit that most suffering is not interesting whatsoever, even to the sufferers themselves. Suffering is common. Suffering is boring.

So it is almost surprising to the typical U.S.-ian solipsist (yours truly) to read about occasions like this, when serious policy thinkers debate in serious policy language the future of ‘humanitarian intervention,’ justifying the refocusing of the war machine with shocked, shocked descriptions of brutal, nay, genocidal violence still going on in darkest Africa. As if its persistence were in violation of some cosmic ordinance and not just the willfully impoverished cant of Empire, the Beast that rapes the already pillaged; as if the history of suffering had not already been printed in history books, academic journals, even (cough!) newspapers.

Though this is perhaps not so surprising: because politics is boring too.

And I, I struggle once again for inspiration, and the nerve (the blessed, unholy nerve) to write once again the already written.

The propagandist:

Another day, another insult to sanity:

[Ban Ki-Moon] advised limiting U.N. action under the ‘responsibility to protect’ concept to safeguarding civilians against genocide, war crimes, ethnic cleansing and crimes against humanity. He acknowledged the possibility of some nations “misusing these principles” as excuses to intervene unnecessarily, but said the challenge before the U.N. is to show that “sovereignty and responsibility are mutually reinforcing principles.”

This is the same old messianic language of imperial violence, rephrased to appeal to latte-sipping Hardt-Negrian shills. All states are on the verge of ‘failure,’ and can only be evaluated by external criteria. Never mind the totally negligible and contingent fact that some states are ‘too big to fail.’ People are suffering, dammit!

Far from a universal degradation of sovereignty, what this amounts to is the invisible justification of a few ueber-powerful states, based on two mutually defining concepts of ‘failure.’ Under this proposed division of governmental labor, a country like the U.S. has a ‘responsibility’ (entirely unrelated to its ‘excesses’) to ‘supply’ military force to nations that, whether because sanctioned by the U.S. or on the wrong end of the international ‘free’ market, are unable or unwilling to prevent human rights abuses. ‘Success’ means either a) all nations magically achieve the status of liberal capitalist states with their militaries outsourced to the U.S./U.N. or b) the U.S./U.N. ‘intervenes’ and punishes the evildoers.

Disgusting.

I’d go into more depth, but Lenin’s Tomb has beaten me to the punch — make sure you check out these two typically awesome and well-researched posts.

One last thing: good to see folks getting disillusioned with Obama’s domestic politics, but his ideological misreading of Rwanda and tacit support for ‘R2P’ once again reinforces the obvious: that he’s just as firm a supporter of imperialist intervention as Bush, despite his pragmatic reservations.

Let’s keep fighting, y’all.

The Critical Theorist:

The following video clip illustrates a salient point I want to make about ‘the call’ to humanitarian intervention (periodically resurrected in mainstream political discourse despite frequent criticisms; for an example see the increasing popularity in policy circles of the odious ‘new’ doctrine of ‘responsibility to protect’) as a standard ideological gesture, in Jameson’s terminology an ideologeme, “the smallest intelligible unit of the essentially antagonistic collective discourses of social classes.”

The sublime moment comes when we are told that “This is Reality” precisely because “We Have The Power To Change It.” The shift involved is purely anamorphotic, a shift internal to our own perspective: we cease to be the moral subject negatively threatened with a loss of reality in relation to the significance of its categorical claims and become this subject of transformative Power in relation to this (subsequent) representation of (the True) Reality alongside essentially Sublime objects. But this sudden shift in the phenomenological value of the image content becomes one of utopian positivity when and since it prominently features an anonymous, well-funded team of caring ‘peace core types.’ To follow Zizek’s reading of ideologically sublime objects, the paradox is hence that “pure difference” between the form of our moral subjectivity with its impossible categorical mandates and the real conditions of objective violence which underlie it as revealed by the ad’s negative perception, become the point of their greatest Truth, of their sublime Identity with ‘Reality’ through the supplement-object as the representation of the other ‘subject supposed to care.’ The starving African children become the Real of moral tragedy not because of the descriptive content their images are supposed to represent (that they are in fact in Africa which is in fact plagued by readily observable and structurally necessary economic and political instability) but precisely because they signify the point at which our categorical moral claims become meaningless, because Africa is the place in which ‘inalienable human rights’ become inapplicable in the face of the objective historical necessity of incomplete ‘development,’ and the ostensibly ‘supplementary’ fiction of the ‘NGOther’ becomes essential.

This is the paradoxical moment of the Kantian sublime, in which “a[n enjoyable] representation arises where we would least expect it” of the Truth beneath our avowed categorical moral claims, at the point of the very impossibility of fully realizing our formal categorical moral subjectivization within the symbolic order. Following Lacan’s famous reading of Kant via Sade, we can say that this reveals the truly Sadean dimension of the Kantian moral law. Just as the Sadean fantasy imposes upon the subject the impossible pathological injunction to enjoy his victim’s sublime body without any regard for the limitations imposed upon it by real mortality, the Kantian categorical moral law is “the Real of an unconditional imperative which takes no regard for the limitations imposed upon us by reality—it is [a formally equivalent] impossible injunction.” Hence the subject is ‘freed’ from burden of its impossible demands through the presentation of this very impossibility, by submitting to the ad’s ‘irrational’ categorical imperative, and thus it only fully assumes this identity in a disavowed, ‘properly distanced’ manner, through the moral object supposed to care, the transcendentally ‘free’ subject of transformative Power whose ‘gear’ begins to fill the screen.  In this sense the sublime experience is, following Zizek, strictly one of false inter-activity: as our traumatic kernel of real-life impotence/passivity is transcended by the little other(qua imaginary subject supposed to care)’s enacted desire, our real-life activity becomes structurally equivocal with the enactment of this desire in the gaze of an impersonal, unconsciously assumed big Other.

The act of donation is hence properly a phenomenon of surplus jouissance, literally the enjoyment of sense, of the (material and hence significant) making of sense: ‘joui-sense’ is precisely this sublime experience of a signification whose meaning is only truly known by the Other object-supplement (its imaginary referent) but is formally assumed by the subject as its ‘efficient’ cause. But from the very beginning of the ad we are already ‘sublimely’ subject to the obscene injunction to enjoy ‘our’ own subjective position precisely as a barred subject, as the contingent content of the enunciation of a categorical ‘You’ that perhaps also enjoys what has now come to be the simulacral myth of Michael Jackson-type innocence: one that survives despite being foreclosed from the formal Law as such. The realized injunction to donate is hence not only a ‘truly sincere’ investiture in the sublime meaning produced, but the assumption of this impossible-real objective presentation as a subjectively necessary condition for this ‘meaning’ to exist as the retroactively attributed Truth to ‘Your’ ‘real’ activity. Is this not the perfect analogue to injunctions of ‘international law’ and their justifications? The point is to realize that both ‘support’ for any given ersatz ‘law’ devised in the interests of global capitalism’s elite oligarchs and individual donations to humanitarian NGOs are made effectively real for the subject only by passively making what is, in fact, a ‘purely symbolic’ gesture for the gaze of an assumed big Other, and that the sublime enjoyment we gather from our fundamentally passive ‘participation’ is that of producing a signification of this Other’s desire, of assuming the subjective role of an object-cause for this Other’s active enjoyment.

The hipster editorialist:

Yall. Starting to get annoyed seeing sOO many blogs and ‘articles’ about celebrities trying 2 ‘make a difference’ by applying their personal brands to ‘3rd world shitholes’ (i.e. Hotel Rwanda). I feel like ‘activism’ oriented vaycays have prior brand identity as what MSTRMers and meaningfulcore bros do in college over the summer to ‘find themselves.’ Feels ‘unfair’ for celebrities with private jets to make 10x of a difference in 1 weekend than u and me ever could in our entire lives.

It’s kinda weird how ur supposed to go somewhere where ppl are ‘less fortunate than u’ at some point in ur life. Whether it is Africa, New Orleans, Detroit, or rural Missouri, there are people who are less fortunate than ‘us’ every where. Just want to appreciate my family + personal social networks on the internet more than ever when I see people who are ’suffering’, ‘uneducated’, ‘hungry’, and ‘0% self-aware.’

Sort of feel bad that I dont ‘get’ ‘what the big deal is’ about Africa. Not sure why I’m supposed to care about ‘millions’ of ‘lil negroes’ who don’t add value to my lifestyle/product lines. It’s hard 2 integrate ‘giving a shit about the world’ with my post-chillwave personal brand. But there comes a time when every entity with a ‘public voice’ has to use their voice 4 good. I don’t know what cause I’m going to rally around, but it will probably be something tangible/meaningful in my ‘personal life.’

How bout yall?
Do yall feel like Africa should be ‘first on the agenda’ for 2k10?
Does ‘the West’ (via Barry Obama) have a ‘responsibility 2 protect’ ‘troubled regions’?
Any ideas 4 how 2 spread chill values like ‘human rights’ and access to sweet social networks to places where ‘folks can’t read’ and/or vote?
Should ppl just ‘mind their own damn business’?

The self-promoter – all blogs are fundamentally tools for self-promotion. But some bloggers are of such elite status that they don’t have time for anything else. This status can’t be gained purely through blogging, only by taking advantage of the blog’s effect on one’s career. The struggling novelist publishes with Harper Collins. The professional editorialist begins to appear on TV. Etc. While sometimes difficult to tell apart from the linkblog, the promoter’s slightly higher ratio of self-disclosure (treating the blog literally as an ‘online journal’) is one sign that they are in fact of two distinct species — celebrities and normals. It is at any rate the final stage of evolution for all blogs:

This morning I woke up to this outside my window. Ah, Brazil. How I loathe to leave thee.

– Launch party for the new book next Friday at the Hive. Open bar after my reading. I’ll see you there.

– Interview up at DesignBlog.

– My good friends Ted Brand and Sylvie are performing tonight at the Pinhook. Mp3s available here. I (obviously) can’t make it, but that doesn’t mean you shouldn’t. Show some love!

– Thanks to Kamau for bringing this link to my attention: some interesting debates going on in the U.N. about international responsibility post-Rwanda. Speaking of which, donate money to this site.

God bless.

Internet Generation

Posted in blogging, Media, The Internet on March 2, 2009 by traxus4420

At a talk, a familiar thread: the younger generation and their computers; they don’t have a connection to materiality, they don’t think about medium, or place, or tradition, or history. This time it was poets; the last time architects. Their products are equally ungrounded, and to someone with even a modicum of ‘local’ historical knowledge exude an unavoidable sense of pointlessness. One of the speakers argues that their stylistic concerns are probably alien to an Internet generation confronted instead with the ability to “let it all go,” a kind of terrible freedom where one can throw out margins, typeface, privacy, manners, the whole deal. I try to identify. Sure, when I travel I sometimes have a hard time telling the location apart from the facebook album, though the same has been said about photography in general. I can’t really remember anything, I don’t really care where I live, nor do I really understand how to get worked up anymore over matters of taste — having been able to get any sort of music imaginable since college has taught me I can ‘like’ almost anything with minimal effort — but I can’t say my experience of any of these things comes with a greater sense of ‘freedom.’

Is there any mode of writing more constricting than Internet writing? I mean in terms of form, of course, not (unless one is dealing with censorship) content. Another helpful analogy can be drawn to taste. It’s often claimed that the Internet offers its users unprecedented possibilities for self-fashioning, by opening an ever-expanding archive of culture to  sampling, editing, remixing, reproduction, etc. The problem is how to filter all this information in interesting and/or useful ways; essentially how to theorize it.

I find this perspective superficial. For one, it assumes that variety automatically equals freedom. Even if we go along with its implicit restriction of our view to the field of consumption, we have to acknowledge that all the Internet does is reduce the distance between advertisement and product and expand its potential reach, thus accelerating the cycle for each individual product. In order to follow a scene, I’m immediately obligated to become conversant in whatever it is the second it shows up on the blogs — nothing is hard to find anymore, I have no excuse not to know it and have little time to develop a personal taste that is any more than irrelevant dilettantism; if something is encountered by ‘chance’ it has no time to sink in, only to immediately become part of a scene or disappear. There is an expanding universe of tastes, but they are not individual. The work of constructing and participating in one or more scenes is increasingly the point, leaving far behind the old humanist ideal of self-knowledge through a deep personal experience of art. The Internet is another terrain for the capture of subjectivities, is able to do so more quickly and in some ways more comprehensively than print, cinema, or television, and leaves even less room for personal ‘freedom.’ If the avant-gardes were split between l’art pour l’art and its destruction through collision with everyday life, today we could say the principle of motion for art in the Internet Age is scenes for their own sake. An anemic conception to be sure, but poised on a powder keg, or, depending on your interests, a big pile of money.

Which brings me to the next faulty assumption: that all this variety should lead to increased creativity. If we mean creativity in the kind of general market-friendly sense that every Flickr photo is creative, then obviously it does. However, the Internet is a giant parody of the idea that novelty, as the engine of cultural development, is produced by recombining previous material into new forms. Though it seems nice and rational, its assumption that everything is always already translatable (that everything can potentially be ‘recombined’) inevitably leads to the romantic notion of original ideas as mysterious, uncaused, etc. We can call this the entropic-messianic theory of cultural production, wherein all means of establishing sense are assumed to have collapsed into an equilibrium state and we’re all just waiting around for Godot. Or we could just call it postmodernism. But significant art — the language is so outmoded — is generated through struggle with tradition or with something else, not a full shopping cart. The components have to mean something before they can be used for anything besides derivatives.  Comparatively information-deprived regional cultures and their unpredictable relations have produced most of humanity’s stock of ‘masterpieces.’ As an aside, maybe it’s time to look beyond (or before) novelty as the ultimate standard for culture.

Writing on the Internet immediately threatens ‘authors’ with their ‘audience’ — the moment one stops thinking of oneself as an isolated performer on stage is when conversation can begin, but doing this requires the abandonment of all concern for developing one’s ‘craft.’ When language fully enters a sphere of universal equivalence as text (and can be translated, quantified, plugged into search algorithms), then communicability and transparency of meaning are finally God; one retires from the divine the more readily one can define one’s addressees against the universal, ‘common reader’ (which is not to say that there is such a thing). I should add that putting writing on the Internet is not quite the same thing as Internet writing. The former is frequently the object, but never the subject of the latter, and can always be skipped.

But we are (as ever) rapidly approaching a redefinition of the term ‘art,’ and at the moment I’m forced to beat a hasty retreat.

Language as Concealment

Posted in attention, Bergson, blogging, Ethics, structuralism, The Internet, U.S. Politics on October 11, 2007 by traxus4420

“You might think your anonymous online rants are oh-so-clever. But they’ll give you away, too. A federally-funded artificial intelligence lab is figuring out how to track people over the Internet, based on how they write. The University of Arizona’s ultra-ambitious “Dark Web” project “aims to systematically collect and analyze all terrorist-generated content on the Web,” the National Science Foundation notes. And that analysis, according to the Arizona Star, includes a program which “identif[ies] and track[s] individual authors by their writing styles.”

That component, called Writeprint, helps combat the Web’s anonymity by studying thousands of lingual, structural and semantic features in online postings. With 95 percent certainty, it can attribute multiple postings to a single author.
From there, Dark Web has the ability to track a single person over time as his views become radicalized. The project analyzes which types of individuals might be more susceptible to recruitment by extremist groups, and which messages or rhetoric are more effective in radicalizing people.”

Do You Write Like A Terrorist?

(there are some great links here, especially the link to the Google ‘archive’ at the bottom)
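Since the article only gestures at how this kind of attribution actually works, here is a minimal sketch of the general stylometric technique it describes: character-level ‘writeprint’ features fed to a simple classifier. To be clear, this is not the University of Arizona’s Writeprint system; the library (scikit-learn), the toy postings, and the author labels are all assumptions made up for illustration.

```python
# Minimal stylometric attribution sketch: character n-gram "writeprints"
# plus a linear classifier. NOT the actual Writeprint system; the postings
# and author labels below are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: postings whose authors are already known.
posts = [
    "the current administration is liable to say anything",
    "as we all know by now, the administration will say anything at all",
    "yall ever notice how personal brands are basically just vibes",
    "sort of feel like every blog is just a brand being a brand, yall",
]
authors = ["blogger_a", "blogger_a", "blogger_b", "blogger_b"]

# Character n-grams capture habits of punctuation, spelling, and function
# words (features of style rather than topic); tf-idf weighting emphasizes
# the rarer, more distinctive habits.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(posts, authors)

# Attribute an anonymous posting to the most stylistically similar author.
anonymous = ["yall these blogs r just personal brands"]
print(model.predict(anonymous))        # best-guess author
print(model.predict_proba(anonymous))  # confidence per candidate author
```

Even at toy scale the point the program makes is visible: the model never looks at what a posting argues, only at how it is typed, which is exactly the reversal of content and form taken up below.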

*

One of the charms of blogging is the thought of writing without support. Anonymity, or throwing off the symbolic scaffolding of the proper name, is an important part of this, but so is self-publishing, so is the offhand, ‘spontaneous’ nature of the writing that blogging lends itself to, the lack of responsibility to any particular genre or audience. Lurking in the background, I confess, is the thought of finally being able to approach the old modernist ideal of “pure writing.” As Mark Kaplan so aptly put all of this a while ago, there is almost an ethical obligation to accept blogging’s many potentials: “The words themselves must speak, can circulate freely.” It hadn’t occurred to me in the early going why any writer whose name or profession did not in itself grant them privilege would be anxious or have a problem with such an experiment, though naturally some readers would object to the masquerade, would not be willing to play along.

But in blogging these two roles are more than ever played by the same actor, and here is where the problem arises. Writing is not a world apart, and in a single, uniform world, concealment of any kind is at best pretension, at worst terror. My compromise, since the old chat room days, has been to write under a pseudonym (though ‘handle’ is a better term for a label that does not even try to pass as a name) while not making any special effort to hide my offline identity. This makes anonymity less ‘political’; less interpretable as an affront and more of a fun game. It allows me to continue the experiment in unreasonable, irresponsible writing without even the support of the secret.

It’s occurred to me lately though that the experiment itself — the effects on the blogger in the process of blogging — is much more dangerous than its potential effects on ‘others’ or whether it obeys certain standards of propriety and accuracy, the concerns of writing that retains ties to conventional authority as legitimated by some institution or other. Aside from the question of professional decorum, which hasn’t been a concern so far, there is the problem of style. This blog has no subject, there is no ‘appropriate’ style. But the problem remains. Whose style do I write in? Though there is no need for a consistent ‘I’ for the ‘content’ and ‘bias’ of the writing as a whole to be beholden to, each word and sentence is still beholden to the ones preceding it, even as it prepares for the line’s steady advance across the screen. Perhaps this question is easier, or even nonexistent, for those who came to blogging ‘already developed.’ I, on the other hand, am embarrassed by my earliest posts, only a couple of years old, to the point of wanting to delete them all. But has my writing really ‘developed’ since then? If so, in what sense? As I write, and I almost always write while ‘browsing,’ I sense my writing taking on new forms, appropriating new stylistic tics to match the influx of new thought patterns. My writing sometimes feels like the aggregate sum of these and other, previous micro-imitations. I know from experience that such a lazy approach leads to cliches, the repetition of others’ ideas, and worst of all, ‘faking it.’ And yet I rarely bother to make the ‘corrections’ that would make the writing uniquely ‘mine.’

Isn’t this the form of Internet writing in general? On the blog, and increasingly in my other writing, I’ve fallen into the habit of considering the simple marking of a reference — a ‘link’ — the equal of integration into the argument or general movement of the work (‘the’ work — another term that becomes obsolete when taken online). It’s as if I expect the links to do the writing for me. There is a grammar to the placement of a link, a way of reinforcing a point even if the reader does not bother to click on it (for example: “…but as we all know by now, the current administration is liable to say anything” — lenin by the way is really a master of this). The link replaces the citation as the mark of legitimacy. Many quite effective blogs consist of no more than a series of links and excerpts with the minimal commentary just there to frame the link in the sense I mentioned above. Isn’t the term ‘writing,’ signaling private subjective expression, a nostalgic affectation anyway? It’s used in the way ‘speaking’ is still used in letters; we’re typing, not writing. There is a difference.

*

“Pure writing” disintegrates in the context of the link, the remediation of text by the digital image, and the exponential growth of a self-publishing audience, its very conditions of possibility. It is replaced by free association (in the broadest possible sense), directed by emergent necessity. No stand-alone expression of personal truth can be made by an individual typer, however impassioned or polemical, without decaying into self-parody — or vanishing altogether, buried beneath immaterial mountains of ‘further reading.’

*

[image: bergson.jpg]

Bergson defines consciousness as the process of discernment, of subtraction and revision around a center of representation, which he relates to the body and its practical, material needs (the “necessary poverty” of conscious perception). This spatial language is itself a metaphorical representation, product of the need to locate individual objects according to their useful properties.

“Questions relating to subject and object, to their distinction and their union, should be put in terms of time rather than of space.” Bergson, the first great theorist of cinema, privileged the non-representational image over the symbolic-referential word, for what he saw as written language’s tendency to fix time in space, to dogmatize the distinction between the symbolic and the real. Still, consciousness is compelled by necessity to divide the real, to conceive of it as divisible: “Consequently we must throw beneath the continuity of sensible qualities, that is to say, beneath concrete extensity, a network, of which the meshes may be altered to any shape whatsoever and become as small as we please: this substratum which is merely conceived, this wholly ideal diagram of arbitrary and infinite divisibility, is homogeneous space.” The synchronic division of time into individual moments, logically connected, with nothing relevant left out — the production of space, in another language — is “the diagrammatic design of our eventual action upon matter.”

It is the play of ‘pure perception’ (procedural memory) and ‘pure memory’ (episodic memory) that synchronizes our discontinuous impressions:

“Theoretically, we said, the part played by consciousness in external perception would be to join together, by the continuous thread of memory, instantaneous visions of the real. But, in fact, there is nothing for us that is instantaneous. In all that goes by that name there is already some work of our memory, and consequently, of our consciousness, which prolongs into each other, so as to grasp them in one relatively simple intuition, an endless number of moments of an endlessly divisible time.

“The qualitative heterogeneity of our successive perceptions of the universe results from the fact that each, in itself, extends over a certain depth of duration and that memory condenses in each an enormous multiplicity of vibrations which appear to us all at once, although they are successive. If we were only to divide, ideally, this undivided depth of time, to distinguish in it the necessary multiplicity of moments, in a word, to eliminate all memory, we should pass thereby from perception to matter, from the subject to the object. Then matter, becoming more and more homogeneous as our extended sensations spread themselves over a greater number of moments, would tend more and more toward that system of homogeneous vibrations of which realism tells us, although it would never coincide entirely with them. There would be no need to assume, on the one hand, space with unperceived movements, and, on the other, consciousness with unextended sensations. Subject and object would unite in an extended perception, the subjective side of perception being the contraction effected by memory, and the objective reality of matter fusing with the multitudinous and successive vibrations into which this perception can be internally broken up.”

This ‘extended perception,’ or the apprehension of the heterogeneous mixture of space and time, rather than only being able to think in terms of one by forgetting the other, has been attempted through cinema, but we can see now where the cinema was lacking. The moving image can only communicate space intuitively, temporally, sensually, within a frame that, as Deleuze writes, “ensures a deterritorialization of the image…gives a common standard of measurement to things which do not have one.” Where writing and the diagram erred on the side of space, fixing empirical data and verbal imagery alike into idealized symbols, cinema errs on the side of time, only capable of conveying information by imprinting itself on the senses, each new shock erasing the last.

Have Bergson’s speculations been actualized in the Internet? The metaphors we use to describe the Web — Website, MySpace, cyberspace, etc. — are all spatial, even though what they are trying to describe has nothing to do with space. Indeed, what all these words ‘signify’ is exactly what they subsume and delete from consciousness — Web Space is the negation of space. The Internet presents itself, interface by interface, as a set of multimedia images which presuppose an infinite set of potentially accessible, interconnected multimedia images, existing simultaneously, virtually. The passage of time is the result of conscious action: though new content is ‘added’ every second, for you nothing changes, time does not advance, until you click a link, until you ‘travel’ to a new ‘site.’ Not only does this imply countless temporalities, but user activity both adds to (develops) the Internet and causes it to exist, to be thinkable as a coherent entity. The intrusion of memory as ‘content’, which is at the same time an addition and/or alteration to a virtual infinity beyond time (represented by the metaphor ‘database archive’), the replacement of unconscious or spontaneous ‘recollection’ (the madeleine episode in Proust), is made possible only by the pressure of real fingers on real plastic. In Bergson’s language, the present has attained maximum intensity: it is capable of retaining as much of the past as has been recorded (we are assuming an ideal situation without data loss) and of contracting any number of external moments into its duration, which is determined entirely by conscious, material action. Virtually speaking, every click is an act of universal renewal. It is also one of absolute risk.

*

Why do people keep clicking? The most general, useless answer would be human motive, or the messy, heterogeneous flow of freedom and necessity. As more and more interests vie for attention, it becomes difficult for the single user to individuate itself, or to ‘manifest presence.’ Without the assurance of a body, the user only appears as its activity, which, once uploaded, is open to modification and ‘misreading.’ The user is infinitely corruptible, teetering on the edge of becoming whatever other users wish it to be. What if this increased confusion means Bergson’s ‘center of representation’ no longer must rest on an individual body? He himself argues that this relationship is not intrinsic; action, conditioned by necessity, not ‘subjectivity,’ produces the location of a central, cohesive motivating force — the outcome of subtraction from the ‘periphery’ rather than expansion from a center.

As Jodi Dean has been noting at length (just look through the Sept./Oct. posts for recent examples), the Internet is full of users trying to manifest a unique presence wherever they can. Services designed to support/condition these desires — you know, Facebook, MySpace, Blogspot, blah blah blah — are tremendously profitable and rising in power (even against Google!). A cursory glance at these sites — and I assure you I have taken much more than a glance — makes it obvious that, like Windows and Macs, they were designed for those who are used to being the stars of their own television shows, produced and directed by themselves, and used to relating to others by sharing these very habits as ‘culture.’ The ‘distributed subject’ of new media theory is the self-as-brand identity, marketing itself through skillful associations and aggressive expansion.

What if these games of revelation and concealment — all of them, despite differences — privacy vs. confession, the taking of positions only to abandon them, flaming vs. earnest moralizing, insisting on the propriety of secrets in the midst of obscene self-exposure — the techniques of seduction necessary for maintaining the brand’s allure — consist in just the reterritorialization of the impersonal potential of the Internet back onto the subject, the reflex action of those who depend the most on such a formation in their day-to-day lives? The specter of pure homogeneity, usually with Orwellian overtones, is raised in defense of this behavior, demanding that the individual user differentiate itself as much as possible from the herd, but legitimately, gainfully, through integration into some sort of pre-arranged, ‘democratic’ network — the various identity shopping carts of the network society: iTunes, Facebook, etc., or the information profiles more and more sites require in exchange for content and services — the means by which the online ‘masses’ are constituted (Herd 2.0).

The fantasies generated by this contradictory imperative toward ever greater self-differentiation reach a kind of climax in the terrorist chic of The Matrix, where identity within the system, revealed to be nothing but illusion, is countered by the pure artifice of self-aware individuals who draw their authenticity and collective purpose from the real reality beneath the digital world. Unlike the ‘normals,’ their selves are constituted not through the arbitrary rearrangement of predetermined traits but through struggle, and especially (indiscriminate) violence. Their knowledge that reality is a secret is one of their greatest powers, and perhaps also one of their greatest pleasures.

[image: screenshot_207.jpg]

Besides the sinister homogeneity of pure digital illusion (hyped by Baudrillard and exploited by The Matrix), there is another type of postmodern hell — the universe where everyone knows your name. This is the nightmare of a reality that has been perfectly mapped, ultimately just as threatening as one of pure artifice, and a trope that stories about escalating surveillance often rely on for effect. Common to both kinds of ‘bad homogeneity’ is a lack of secrets, of mystery, of fundamental ambiguity. Mystery and the hidden are in turn tied to freedom, a shadowy network which links in morality and the possibility of an ethical life.

[image: surveillance-cameras-400.jpg]

Any resistance to the imagined totality of The Same must then involve secrecy and privacy, the guarantors of morality that can at any point come under attack as a source of terror. Every traditional repository of secrets — the Soul, Consciousness, Woman, or simply the Unknown — has its rogues’ gallery — Evil, Madness, the Maenads (and their descendants), Fear. Secrecy can of course be actively used as a weapon or a tool of seduction. But so can ‘transparency.’ Resistance to the terror of secrets is often, as should be obvious by now, also terror.

Language, which reveals and conceals in equal measure, is a fitting target for the Dark Web program, which began with the insight that to demolish an opponent one can ignore the content of the utterance and focus entirely on form, not as an end in itself but as a means to unveiling that entity so often derided as fictional by literary criticism: the author. What Derrida identified as the trace is targeted by Dark Web analysts as the unmistakable signature of identity. Meaning is irrelevant here; the author is judged as a user, by the conditions of its appearance. Where do its avatars appear? In what context? What is the aggregate history, what are the statistical trends of these appearances? Do they lead us to the subject’s vital information? Signifiers are treated not as the bearers or even producers of meaning, but as acts, with consequences retroactively determined by the dominating structure. Only allies debate; enemies attack, and from this perspective the anonymous author can only be treated as a potential enemy. The subject, just because it is a theoretical/legal construct, is the only target that is certain. Consider the hint of a threat behind this paean to normalcy, where an unfortunate blogger is informed that true communication will not be possible until after he/she has created an identity that is readable as a subject (even despite the fact that his/her handle just is a professional identity).

This is because freedom is not really legitimate until it is subjectified, or until it is the possession of a recognizable human subject. The ‘inalienable’ rights and ‘self-evident’ truths of the Declaration of Independence are the criteria by which their bearer is to be conceived, a subject who holds and is responsible for rights collected under the name ‘freedom.’ Like the right to own land, freedom, along with privacy, mystique, and uniqueness (the shadow government of the Spirit of Liberty), can be revoked if the bearer fails to meet the standards of ‘responsibility,’ or morality, decency, consistency — transparency.

[image: shadowy_figure.jpg]

The problem is not that some or all do not possess ‘real’ freedom, or that we are losing our privacy, or that the mystery of human consciousness is becoming less and less of a mystery. It is that we are conceived only as possessors of things, and that this is, or must be, the essence of what we are, the seat of our motivations and their root cause. After the poststructuralist critique, the nature of the subject comes down to two capacities: to have and to observe. It ‘is’ free — after a fashion — only because it (retroactively or otherwise) claims its freedom, which, in lived reality, takes the form of specific rights and ritual behaviors. Echoing existentialism, the subject (free because ethical) is precisely its claim of responsibility over its ‘own’ actions.

The Internet has changed things, though these changes have not yet been fully processed. The user, for example, is not immediately a subject. If it will not identify itself in ‘subjective’ terms, as a thinking and feeling being in full control of its constitutive, contentless ‘secret’ (its consciousness, its ethical core), it must be forced into this role. This is just the symbolic order in action, terror countering terror. But with the Internet, an old possibility has become newly obvious. Subjectivity, a social function no different from authorship, does indeed involve a choice. It is forced, but only in most cases, not intrinsically. This means it can be otherwise. Isn’t it now much easier to recollect intention and causation around shared rather than so-called original purpose? Hasn’t this always been an option, however difficult? What is a subject other than an ID tag, the name of a network, its hub? New Media/Facebook’s distributed subject is a retrenchment, a move as old as history. No need to live without a center of representation, no need to ‘invent’ new ones — only to recognize ‘subject’ as the form of bad faith, beyond Sartre, that it always was, to accept that mystery, morality, and freedom are active only within communication and action and inextricable from them, for they are life, and are always motivated by the needs of the world of life in all their boundlessness — to know that nothing whatsoever is given.