Archive for the The Internet Category

(Ideal) Self-Recognition

Posted in Art, blogging, Cultural Theory, The Internet on October 29, 2009 by traxus4420

In recent years, some people have adopted the list form only to strip it to its foundation, yielding ultra-simple pages consisting of sequences of images cobbled together with little or no explanation, each image radically different from its neighbors, each likely to confound, amuse, or disquiet. These web pages are often “personal” pages belonging to artists or groups of artists. Text is relegated to minimal captions in these Internet wunderkammern, and sometimes abolished entirely.

Let’s call such a page a hoarding. The word can refer to a stash of collected goods, but can also mean a billboard, or the temporary wall thrown up around a construction site. The look of the hoarding is similar to that of a particular type of artist’s book that has flourished in the last 15 years or so, featuring page after page of heterogeneous images, a jumble of magazine scans, amateur snapshots, downloaded jpegs, swipes from pop culture and art history alike, some small, some full-bleed, none with explication. The similarity is not coincidental, for “the last 15 years or so” defines the Internet age as we know it, with its ubiquitous, colorful mosaics, evidently a powerful influence on publishing of all kinds.

What can we say about the experience of scrolling through a hoarding, trying to understand the procession of pictures? As in traditional fashion magazines, we find excitement and confusion in equal measure, with one catalyzing the other. Beyond that, it often seems that any information or knowledge in these pages is glimpsed only through a slight fog of uncertainty. Has an image been spirited out of the military defense community, or is it journalism; is it medical imaging, or pornography; an optical illusion, or a graph; is it hilarious, disturbing, boring; is it doctored, tweaked, hue-saturated, multiplied, divided; is it a ghost or a vampire? In any event, the ultimate effect is: “What the fuck am I looking at?” Something that hovers in your peripheral vision.

One might ask, how does this depart from the queasily ambivalent celebration of the image that has characterized the last fifty years of pop culture, possibly the last century and a half of mass media? It could be the muteness of the offering, the lack of justification or context. But the observation that modern media divorce phenomena from context is a commonplace, and usually an invitation to reflect on the increasingly fragmented nature of experience. A hoarding is notable because while it is a public representation of a performed, elective identity, it is demonstrated through what appears to be blankness, or at least the generically blank frenzy of media.

This may be a response to the embarrassing and stupid demands of interactivity itself, which foists an infantilizing rationality on all “Internet art,” and possibly Internet use generally, by prioritizing the logic of the connection, thereby endorsing smooth functioning and well-greased transit. Recourse to the almost mystically inscrutable may be understood as a block to the commonsensical insistence on the opposition of information to noise, and as a form of ritualized unknowing.

It could also be a dismissal of the ethos of self-consciously generous transparency that characterizes “web 2.0”: the freely offered opinions, the jokey self-effacement, the lapses into folksiness in the name of a desire to forge reasoned agreement and common experience among strangers. It is wise to mistrust this earnest ethos, which is inevitably accompanied by sudden and furious policing of breaches in supposedly normative behavior. This is not to argue that such consensus building is disingenuous, rather that it is simply politics, in the sense that politics is at heart concerned with separating out friends from enemies. In this view, the hard-fought equilibrium of an orderly on-line discussion is indistinguishable from its scourge, the flame war: reasonably or violently, both aim at resolution and a kind of confirmation of established precepts. Might a hoarding—a public billboard that declines to offer a coherent position, a temporary wall that blocks reasoned discourse—escape the duty to engage ratio and mores and resolution, in a kind of negative utopian critique? No, it probably cannot. But the perversity of its arrangement of pictures speaks for itself, and what it speaks of is manipulation.

Seth Price

One cannot simply set aside the pro forma Schmittian (to give it a proper name) logic of this piece, but it is a rather elegant illustration. A ready-made image for someone else’s ‘hoard,’ and my first revision would be to replace that 18th-century insult with a coinage from one of blogdom’s dearly departed, an Arcades Blog. Which is itself another reference, which is the whole point. Why does a series of captionless images have to be irrational or perverse? One can imagine future art historians concluding that the age of mass marketing’s greatest achievement lay in convincing the world’s consumers that images (and through the backdoor, ambiguity) are a priori the language of unreason. Certainly images can be used to think. More pernicious is the idea that images which are ‘simply’ affect manipulators (that is, have ‘nonsense’ as their manifest content) are for that reason lacking in logical sequence.

Immaturity. Escape. Vertigo. The cynical romance of commodities.

Though I have made frequent use of the photo montage on this blog, a more concentrated experiment can be found here. Even something like this, an image or two posted every now and then, sometimes with words, sometimes without, all apparently fitting the idea of the ‘hoard,’ is not without pattern or immune to meaning. If the wunderkammern were overdetermined by the excessive display of strange and uncommon objects, the image blog (here’s one of my favorites; here’s another) is a collection of moments of an all-too-familiar process of circulation, captured, and in that moment of capture recirculated as something novel, their significance altered. ‘Defamiliarized,’ even. Even when their authorial anchor is just an arbitrary sign: traxus4420.

My naive intent for the tumble blog is the same as with this one: for each post to be useful as part of a new process of thought. Failing that, it is also made to be ignored. Is a challenge to ‘common sense’ possible with these things at all? If so, it can only be by demanding different kinds of attention and different kinds of thinking. Because the facade of irrationality that merely prompts us to “reflect on the increasingly fragmented nature of experience” is advertising. Though it might be that all that separates one from the other is the presence or absence of a product.

Speculative Activism

Posted in Activism, Cultural Theory, current events, The Internet, U.S. Politics, Utopia on September 5, 2009 by traxus4420

This post is in response to a funny thing that happened a couple days ago on Facebook. Gerry Canavan comments on it here: throughout the day, “thousands” of Facebook users posted a pro-health-care-reform message as their ‘status update’ in a sudden outbreak of ‘viral activism.’ The whole thing peaked when Obama himself joined in. Watch the virus spread here.

As Gerry puts it:

We saw the same phenomenon early in the summer during Iran’s so-called Twitter Revolution, which had two overlapping and sometimes conflicting modes: the use of Twitter by people within Iran as an organizing and news-distributing tool and the use by people *outside* Iran for the purposes of vicarious participation in political struggle. Then, as now, the important thing is to signal you’re on the right side of a fight in which you are otherwise just a spectator — then by tinting your Twitter avatar green and now by posting a shared slogan as your status update and then leaving it altered for the rest of the day. We could go back to the 2008 and 2004 elections, or to any number of other charged moments, and find similar memes at play.

The question posed by this sort of thing is clear enough: should it count as ‘real’ activism, or is it just a mass twitch in the general direction of utopia, a show put on for the official media and for ourselves?

Any answer has to start by considering it as quite literally a form of consumerism. It’s a full step further in that direction than the email activism of organizations like MoveOn, which rely on the recipient to take some sort of minimal action (making a phone call, writing a protest email, signing a petition) that MoveOn transfers directly to its prearranged target, usually a professional decision maker. These older forms are carried onto Facebook as well, but they’re weaker on this platform, easier to ignore, and require different techniques to get them to work. A ‘status update’ or a ‘tweet’ can superficially seem more democratic — after all, no institution is telling the user what to do. But in practice this ‘act’ is identical to the ‘choice’ of the market.

That our very existences on social networking sites are commodities is an often overlooked fact. Given an existence wholly circumscribed by a virtual marketplace, everything we do, everything we post, is potentially a commodity by virtue of its link to ‘us.’ In ‘viral activism,’ by reproducing a more or less homogeneous message (a ‘meme,’ one of the few instances where the word actually refers to something), a population makes itself available as a single commodity for use by others in exchange for individual use of the same message as a ‘status update’: an advertisement that promotes a certain identity to their ‘friends’ (and to themselves). The only difference between this and any other Facebook content is that this ‘mega-meme’ is produced ‘from the ground up.’

These are not simply semantic distinctions — they have consequences. Virtual activists do not organize themselves in the way real activists do, i.e. form permanent or temporary political units such as parties, mobs, parades, whatever, directed toward a specific set of goals. Even when activists remain law-abiding their actions are intended to stage a confrontation, to disrespect boundaries that may not be acknowledged by the law. A social division is made, exchange relations dependent on certain forms of equivalence are foreclosed (i.e. politeness, personal space, a traffic intersection, etc.). As long as it’s part of a larger strategy from the beginning, this is true even of petition-signing. Virtual activists on the other hand are always responding to/initiating various types of interpellation from within an institutional setting (the site’s apparatus) that automatically neutralizes all it touches, like ‘interactive’ television. A Facebook group is just a passive ‘tag,’ another identity accessory for the individual user and a commodity that passively awaits outside use (a social ad). As long as their virtual existence is immanent with that institution (they remain members), all actions are wholly included within it, with zero remainder.

What are social ads good for? By aggregating the many status updates into a single product, they provide something for the bigger blogs and journalists to ‘report’ on (really just an outgrowth of tagging), and from which a political meaning can be derived or invented. First and foremost they generate conversation, and since most of it will refer to Facebook if not occur on its platform they also indirectly generate more Facebook use and more prestige, a ‘status update’ for Facebook itself. Whether or not any of this can ‘make a difference’ is dependent upon how these commodities are employed by others.

The effects of this latest capture of the social reflect how our tiny plots of spectacular real estate turn us into micro-celebrities, where even to contemplate ‘action’ forces us into a narcissistic obsession with our public image, no matter how inconsequential it may be. Celebreality shows and the higher profile of porn stars in recent years show us that has-beens and nobodies fighting for table scraps will play the game of recognition even more ferociously than Hollywood royalty. The public face of this private complex appears when celebrities, politicians-as-celebrities, or now you-as-celebrity endorse certain causes: ultimately all responsibility rests on YOU to act, even as the possibilities for action of the relatively elite YOU being addressed (the YOU who can be expected to take Them seriously) are increasingly observed, micromanaged, routed into narrower and more regulated pathways.


A final comparison to opinion polling is helpful in getting at the ideological function of social activertising. Unlike polls, the opinions of users don’t appear as already existing truths, dependent on the work of experts on ‘real’ demographics, but as those truths actively expressed. Where a poll is employed in speculation — what x group ‘really thinks’ at a given moment is valuable as evidence for what actions they might take in the future — a wave of status updates or green-tinted Twitter profiles appears to assert itself as political fact. No research or fact-checking need be done to evaluate truth claims when the phenomenon is just the free and unsolicited manifestation of truth, like votes or sales figures. These ‘actions’ thus merge the legitimacy of a poll with the immediacy of activism. Virtual activism is more real than statistics (which are ‘always’ rigged), more legitimate than protests (which are ‘always’ dangerous).

Jonathan Singer (see link above):

While the vast majority of the political organizing I see on Facebook tends to come from the same names — friends working in politics on a full time basis — what is remarkable here is that these status updates containing a strong and clear message in favor of healthcare reform are coming not only from the political community but also from those whose lives are not immersed in these fights. These are regular young people, all around the country, speaking out in favor of reform. This movement is impressive and surprising, and, at least from this vantage, quite newsworthy.

This is what everyone said about Iran, the rhetoric directing us to understand these movements as made up of “everyday” people, free of the supposed dangers and ‘biases’ of ‘professional’ activists. Of course there is a selection process, determining which ideas can ‘filter up’ from the social network ‘netroots’ and what kinds of users can do what, that tends not to be acknowledged. This selection process is, broadly speaking, class-based.

Here is a great article on one example of how class manifests online, the great divide between Myspace and Facebook, with some very illuminating (and horrifying) quotes from teenagers. Facebook has clearly won the PR battle, easy to do when the New York Times’ reporting staff and most of its readership are made up of Facebook users. Facebook is the appropriate platform for politics, just as Myspace is the appropriate platform for your ex’s rock band and various sex offenders. This doesn’t even count the selection process for who gets to be on the Internet to begin with. And yet, through the magic of social networking, it is the Facebook community which is quickly establishing itself in the 24-hour image universe as the new legal-utopian definition of ‘the people.’ The obvious impossibility of this fantasy doesn’t mean it won’t have certain effects.

For a demonstration, let’s put on some ruling class spectacles and look at some pictures. Isn’t this:


infinitely preferable to this?


See? You didn’t even have to think about it.

Blogs, Form and Sense: A Compendium

Posted in blogging, Cultural Theory, Lacan, Media, Parody, Political Theory, The Internet with tags , on August 21, 2009 by traxus4420

Maybe in the early days of blogging the medium seemed poised to open new dimensions of creative expression, where all sorts of people could express anything from themselves to other stuff. In reality, human creativity is rarely marketable as such beyond the scope of individuals and small groups. It probably has to do with being a human myself, but from the proverbial bird’s-eye view people and their actions look less like unique liberated snowflakes and more like snow.

Now we know there is a finite number of genres available to the entry-level blogger. What is less often acknowledged is that, just like corporate news, each of these genres carries with it its own structural logic of representation, which manifests as its own built-in ‘slant.’

To stay objective, we’ll avoid immediate issues (like health care) and pick some old news. Here‘s a topical AP piece from last month:

UNITED NATIONS — Out of genocides past and Africa’s tumult a controversial but seldom-used diplomatic tool is emerging: The concept that the world has a “responsibility to protect” civilians against their own brutal governments.

At the U.N. General Assembly, Secretary-General Ban Ki-moon pushed Tuesday for more intervention for the sake of protection.

“The question before us is not whether, but how,” Ban told the assembly, recalling two visits since 2006 to Kigali, Rwanda. The genocide memorial he saw there marks 100 days of horror in which more than half a million members of the Tutsi ethnic minority and moderates from the Hutu majority were slaughtered.

“It is high time to turn the promise of the ‘responsibility to protect’ into practice,” Ban said.

How does the blogosphere respond? I limit myself to blogs of the ‘left-of-center’ persuasion — whatever differences in ideology they may have are also differences in style. That, at least, is my working hypothesis.

The linkblog:

Unhappy Monday links:

– Think we’re out of the recession? Doug Henwood says think again.

‘Expert warns against advent of ‘Terminator’-style military robots.’ If you’re unemployed, don’t sell your Playstation — there may be hope for you yet:

The US currently has 200 Predators and 30 Reapers and next year alone will be spending US$5.5bn (€3.84bn) on unmanned combat vehicles.

At present these weapons are still operated remotely by humans sitting in front of computer screens. RAF pilots on secondment were among the more experienced controllers used by the US military, while others only had six weeks training, said Prof Sharkey. “If you’re good at computer games, you’re in,” he added.

Ender’s Game, here we come.

– In foreign policy news, the “responsibility to protect” doctrine has been getting more and more airtime. According to President Obama, there are “exceptional circumstances in which I think the need for international intervention becomes a moral imperative, the most obvious example being in a situation like Rwanda where genocide has occurred.”

As an on-again off-again pacifist, I’m deeply skeptical about any use of military force (particularly U.S.-led), but must confess not knowing nearly enough about the situation in Rwanda to make a sound judgment on that score.

– To compensate for your worries of U.N.-backed robot takeover, say hello to TOFU, “the ponderously eyebrowed robot fuzz owl with OLED eyes and some seriously rhythmic body jams.” Via (who else?) BoingBoing Gadgets.

The libblog:

I know we tend to stick to domestic politics around here, but if the Afghanistan/Iraq debacles have taught us anything, it’s that in this country we can’t afford to treat foreign and domestic policy as completely separate issues. The corporate media try to make it easy by chronically underreporting anything they can get away with, but this conditioned state of ignorance is unsustainable. The state of one affects the state of the other.

In the field of international relations, the issues of sovereignty and the right of other nations to intervene is a highly vexed issue. How do we legitimate ‘good’ uses of force, like Kosovo and Haiti, while preventing ‘bad’ ones, like Iraq? How do we reliably prevent acts of genocide, as in Rwanda or (arguably) Darfur, without risking the misuse of the same rhetoric for neo-imperialist purposes?

An increasingly important potential solution is emerging, known as ‘responsibility to protect,’ or R2P.

[here follow about 1,500 words of analysis of policy documents with links to the original pdfs]

In conclusion, a renewed liberal international order is our only hope. There is a real difference between a liberal, internationalist hegemony and an imperial, nationalist one; in fact it’s all the difference in the world. And we have the power to push our nation’s policy and culture toward the one and away from the other. Not just the power, I would argue, but the duty: religious fundamentalism and the Bush White House’s excessive response to it have shown us that universalism without tolerance is a recipe for global catastrophe.

I know I can’t speak for all of you on this one. It’s something that as liberals we need to discuss, and I urge you to get the ball rolling in the comments below. Keep it respectful, y’all.

The professional ‘expert’ as editorialist, or someone who blogs under the assumption that their (usually well-respected) professional specialty gives them unique insight into events that often have little or nothing to do with that specialty:

…in my book, Twitsturbation Nation: How the Internet Generates Community, I made the argument that traditionalist notions of autocratic sovereignty would be the first major casualty of the Internet’s production of society from below, one narcissistic avatar at a time. Today, even the biggest figures in international leadership are keenly aware that Web access is changing the way politics works at all levels, from policy to advocacy, from elections to revolution. “You cannot have Rwanda again,” Gordon Brown said last month, “because information would come out far more quickly about what is actually going on and the public opinion would grow to the point where action would need to be taken. Foreign policy can no longer be the province of just a few elites.”

I don’t say this simply to brag about my foresight, but to make an important point about how attitudes change. Not long ago, the U.N. held a conference on the ‘responsibility to protect,’ a new doctrine that would set new standards for humanitarian intervention. In the past, even to attempt such a thing would have been immediately (and wrongheadedly) denounced as ‘imperialism’ by most liberals, and, post-Somalia, as sheer folly by realists. But we live in a different age. Life on the Internet is changing the way we think about the responsibility we have to one another, regardless of race, nationality, gender, or religious differences. How else could a stolen election in Iran generate such spontaneous support among the youth of its national enemy, the U.S.? It’s true that many suffering people don’t have access to the Internet, much less platforms like Twitter. But our imaginations have expanded to include them, and aid programs are not far behind. If this talk of responsibility sounds terribly old-fashioned, perhaps one should draw comfort from another ancient adage: the more things change, the more they stay the same.

The ‘literary’ editorialist, or the blogger who, motivated by a frustrated ambition to be a novelist (successful novelists don’t have time for ‘real’ blogging, see below), attempts a form of online commentary that is literature in its own right:

“May you live in interesting times.” So goes the ancient Chinese proverb which is not a blessing, but a curse. And yet, even after the amused Western reader recognizes this, that ‘interesting’ retains its double edge. For we must admit that most suffering is not interesting whatsoever, even to the sufferers themselves. Suffering is common. Suffering is boring.

So it is almost surprising to the typical U.S.-ian solipsist (yours truly) to read about occasions like this, when serious policy thinkers debate in serious policy language the future of ‘humanitarian intervention,’ justifying the refocusing of the war machine with shocked, shocked descriptions of brutal, nay, genocidal violence still going on in darkest Africa. As if its persistence were in violation of some cosmic ordinance and not just the willfully impoverished cant of Empire, the Beast that rapes the already pillaged; as if the history of suffering had not already been printed in history books, academic journals, even (cough!) newspapers.

Though this is perhaps not so surprising: because politics is boring too.

And I, I struggle once again for inspiration, and the nerve (the blessed, unholy nerve) to write once again the already written.

The propagandist:

Another day, another insult to sanity:

[Ban Ki-Moon] advised limiting U.N. action under the ‘responsibility to protect’ concept to safeguarding civilians against genocide, war crimes, ethnic cleansing and crimes against humanity. He acknowledged the possibility of some nations “misusing these principles” as excuses to intervene unnecessarily, but said the challenge before the U.N. is to show that “sovereignty and responsibility are mutually reinforcing principles.”

This is the same old messianic language of imperial violence, rephrased to appeal to latte-sipping Hardt-Negrian shills. All states are on the verge of ‘failure,’ and can only be evaluated by external criteria. Never mind the totally negligible and contingent fact that some states are ‘too big to fail.’ People are suffering, dammit!

Far from a universal degradation of sovereignty, what this amounts to is the invisible justification of a few ueber-powerful states, based on two mutually defining concepts of ‘failure.’ Under this proposed division of governmental labor, a country like the U.S. has a ‘responsibility’ (entirely unrelated to its ‘excesses’) to ‘supply’ military force to nations that, whether because sanctioned by the U.S. or on the wrong end of the international ‘free’ market, are unable or unwilling to prevent human rights abuses. ‘Success’ means either a) all nations magically achieve the status of liberal capitalist states with their militaries outsourced to the U.S./U.N. or b) the U.S./U.N. ‘intervenes’ and punishes the evildoers.


I’d go into more depth, but Lenin’s Tomb has beaten me to the punch — make sure you check out these two typically awesome and well-researched posts.

One last thing: good to see folks getting disillusioned with Obama’s domestic politics, but his ideological misreading of Rwanda and tacit support for ‘R2P’ once again reinforces the obvious: that he’s just as firm a supporter of imperialist intervention as Bush, despite his pragmatic reservations.

Let’s keep fighting, y’all.

The Critical Theorist:

The following video clip illustrates a salient point I want to make about ‘the call’ to humanitarian intervention (periodically resurrected in mainstream political discourse despite frequent criticisms; for an example see the increasing popularity in policy circles of the odious ‘new’ doctrine of ‘responsibility to protect’) as a standard ideological gesture, in Jameson’s terminology an ideologeme, “the smallest intelligible unit of the essentially antagonistic collective discourses of social classes.”

The sublime moment comes when we are told that “This is Reality” precisely because “We Have The Power To Change It.” The shift involved is purely anamorphotic, a shift internal to our own perspective: we cease to be the moral subject negatively threatened with a loss of reality in relation to the significance of its categorical claims and become this subject of transformative Power in relation to this (subsequent) representation of (the True) Reality alongside essentially Sublime objects. But this sudden shift in the phenomenological value of the image content becomes one of utopian positivity when and since it prominently features an anonymous, well-funded team of caring ‘peace core types.’ To follow Zizek’s reading of ideologically sublime objects, the paradox is hence that “pure difference” between the form of our moral subjectivity with its impossible categorical mandates and the real conditions of objective violence which underlie it as revealed by the ad’s negative perception, become the point of their greatest Truth, of their sublime Identity with ‘Reality’ through the supplement-object as the representation of the other ‘subject supposed to care.’ The starving African children become the Real of moral tragedy not because of the descriptive content their images are supposed to represent (that they are in fact in Africa which is in fact plagued by readily observable and structurally necessary economic and political instability) but precisely because they signify the point at which our categorical moral claims become meaningless, because Africa is the place in which ‘inalienable human rights’ become inapplicable in the face of the objective historical necessity of incomplete ‘development,’ and the ostensibly ‘supplementary’ fiction of the ‘NGOther’ becomes essential.

This is the paradoxical moment of the Kantian sublime, in which “a[n enjoyable] representation arises where we would least expect it” of the Truth beneath our avowed categorical moral claims, at the point of the very impossibility of fully realizing our formal categorical moral subjectivization within the symbolic order. Following Lacan’s famous reading of Kant via Sade, we can say that this reveals the truly Sadean dimension of the Kantian moral law. Just as the Sadean fantasy imposes upon the subject the impossible pathological injunction to enjoy his victim’s sublime body without any regard for the limitations imposed upon it by real mortality, the Kantian categorical moral law is “the Real of an unconditional imperative which takes no regard for the limitations imposed upon us by reality—it is [a formally equivalent] impossible injunction.” Hence the subject is ‘freed’ from the burden of its impossible demands through the presentation of this very impossibility, by submitting to the ad’s ‘irrational’ categorical imperative, and thus it only fully assumes this identity in a disavowed, ‘properly distanced’ manner, through the moral object supposed to care, the transcendentally ‘free’ subject of transformative Power whose ‘gear’ begins to fill the screen. In this sense the sublime experience is, following Zizek, strictly one of false inter-activity: as our traumatic kernel of real-life impotence/passivity is transcended by the little other (qua imaginary subject supposed to care)’s enacted desire, our real-life activity becomes structurally equivocal with the enactment of this desire in the gaze of an impersonal, unconsciously assumed big Other.

The act of donation is hence properly a phenomenon of surplus jouissance, literally the enjoyment of sense, of the (material and hence significant) making of sense: ‘joui-sense’ is precisely this sublime experience of a signification whose meaning is only truly known by the Other object-supplement (its imaginary referent) but is formally assumed by the subject as its ‘efficient’ cause. But from the very beginning of the ad we are already ‘sublimely’ subject to the obscene injunction to enjoy ‘our’ own subjective position precisely as a barred subject, as the contingent content of the enunciation of a categorical ‘You’ that perhaps also enjoys what has now come to be the simulacral myth of Michael Jackson-type innocence: one that survives despite being foreclosed from the formal Law as such. The realized injunction to donate is hence not only a ‘truly sincere’ investiture in the sublime meaning produced, but the assumption of this impossible-real objective presentation as a subjectively necessary condition for this ‘meaning’ to exist as the retroactively attributed Truth to ‘Your’ ‘real’ activity. Is this not the perfect analogue to injunctions of ‘international law’ and their justifications? The point is to realize that both ‘support’ for any given ersatz ‘law’ devised in the interests of global capitalism’s elite oligarchs and individual donations to humanitarian NGOs are made effectively real for the subject only by passively making what is, in fact, a ‘purely symbolic’ gesture for the gaze of an assumed big Other, and that the sublime enjoyment we gather from our fundamentally passive ‘participation’ is that of producing a signification of this Other’s desire, of assuming the subjective role of an object-cause for this Other’s active enjoyment.

The hipster editorialist:

Yall. Starting to get annoyed seeing sOO many blogs and ‘articles’ about celebrities trying 2 ‘make a difference’ by applying their personal brands to ‘3rd world shitholes’ (i.e. Hotel Rwanda). I feel like ‘activism’ oriented vaycays have prior brand identity as what MSTRMers and meaningfulcore bros do in college over the summer to ‘find themselves.’ Feels ‘unfair’ for celebrities with private jets to make 10x of a difference in 1 weekend than u and me ever could in our entire lives.

It’s kinda weird how ur supposed to go somewhere where ppl are ‘less fortunate than u’ at some point in ur life. Whether it is Africa, New Orleans, Detroit, or rural Missouri, there are people who are less fortunate than ‘us’ every where. Just want to appreciate my family + personal social networks on the internet more than ever when I see people who are ‘suffering’, ‘uneducated’, ‘hungry’, and ‘0% self-aware.’

Sort of feel bad that I dont ‘get’ ‘what the big deal is’ about Africa. Not sure why I’m supposed to care about ‘millions’ of ‘lil negroes’ who don’t add value to my lifestyle/product lines. It’s hard 2 integrate ‘giving a shit about the world’ with my post-chillwave personal brand. But there comes a time when every entity with a ‘public voice’ has to use their voice 4 good. I don’t know what cause I’m going to rally around, but it will probably be something tangible/meaningful in my ‘personal life.’

How bout yall?
Do yall feel like Africa should be ‘first on the agenda’ for 2k10?
Does ‘the West’ (via Barry Obama) have a ‘responsibility 2 protect’ ‘troubled regions’?
Any ideas 4 how 2 spread chill values like ‘human rights’ and access to sweet social networks to places where ‘folks can’t read’ and/or vote?
Should ppl just ‘mind their own damn business’?

The self-promoter – all blogs are fundamentally tools for self-promotion. But some bloggers are of such elite status that they don’t have time for anything else. This status can’t be gained purely through blogging, only by taking advantage of the blog’s effect on one’s career. The struggling novelist publishes with HarperCollins. The professional editorialist begins to appear on TV. Etc. While sometimes difficult to tell apart from the linkblog, the promoter’s slightly higher ratio of self-disclosure (treating the blog literally as an ‘online journal’) is one sign that they are in fact of two distinct species — celebrities and normals. It is at any rate the final stage of evolution for all blogs:

This morning I woke up to this outside my window. Ah, Brazil. How I loathe to leave thee.

– Launch party for the new book next Friday at the Hive. Open bar after my reading. I’ll see you there.

– Interview up at DesignBlog.

– My good friends Ted Brand and Sylvie are performing tonight at the Pinhook. Mp3s available here. I (obviously) can’t make it, but that doesn’t mean you shouldn’t. Show some love!

– Thanks to Kamau for bringing this link to my attention: some interesting debates going on in the U.N. about international responsibility post-Rwanda. Speaking of which, donate money to this site.

God bless.

Responsibilities of a pundit

Posted in Activism, Cultural Theory, Media, The Internet on June 14, 2009 by traxus4420

Struggling to keep up with events in Iran yesterday occasioned some good discussions with friends, which in turn generated a few thoughts on responsibility. I’ll try to keep them brief.

The idea that allegiance to one side or another is a universal responsibility is usually taken to be constitutive of politics. Much like the injunction to get a job, this demand is usually preceded by an acknowledgment that one really wants something else: a pure utopia of some kind, or just to be lazy, ‘absolutely’ free, to give the finger to someone in authority, etc.

That is, responsible politics tends to be articulated from a position of more or less tragic realism.

An example. Mainstream commentators on the Iranian post-election protests think the election was obviously rigged in favor of a politician they were already contemptuous of, Ahmadinejad. The people on the street are therefore ‘good’ rioters who just want freedom from tyranny, like the CIA-backed ‘popular struggle’ against Chavez. Insofar as they support Mousavi, the opposition reform candidate who favors economic liberalization, and veer no further left, they will remain good. Liberal reform is “realistic,” “college-educated,” “urban,” “tech-savvy,” and “at least it’s better than Islamofascism.”

On the other side, a number of left commentators are willing to at least water down critique of Ahmadinejad’s reactionary views and repressive policies in order to resist this sort of propagandizing appropriation by the western press. I’ve even seen it argued in the past that it’s “every socialist’s responsibility” to “support” the Islamic state, along with the Taliban, Hezbollah, etc. But generally with Iran, and conservative or radical Islamic political actors overall, there is a good deal of confusion over what side leftists should take.

It’s still of course too early to tell exactly in what direction things are going, if the election really was rigged, what the strength of the anti-Ahmadinejad protests is, who is involved, or to what extent they’re being irresponsibly inflated (probably a lot).

UPDATE: Then again, perception is reality, etc. (2nd link via Canavan)

UPDATE2: Some election results (via arabawy)  — check everywhere else for criticism.

But the point is there’s a relationship between wanting freedom for others and claiming freedom for oneself. Especially for anyone who considers themself a radical egalitarian, in this world siding with a national party should always be the option of last resort. I see no reason to voluntarily submit to the stupidity of bad against worse in another country when most of us are already pressured to do so in our own. It’s not ‘strategic’ for an actor in the spectacle (a blogger, say) to compromise his or her political or moral views to vicariously ‘participate’ in other peoples’ struggles. Defending Hamas or Hezbollah’s resistance (an extreme example) to Israeli aggression makes the defender neither a subject nor an official ally. On the contrary, protest is necessary when your country is vicariously participating in other peoples’ struggles. Solidarity is with people. Not their states or their twitter profiles. I find it a pretty warped idea of politics that refusal to make a show of obedience to someone else’s party line, especially when there are no material consequences for oneself either way, should be looked on as weakness, incoherence, dilettantism, or ‘bourgeois’ vanity. The opposite is closer to the truth — it is after all the MSM’s favorite propaganda tool to associate its critics with fictional cabals, while affirming the “true desire for freedom and democracy” of “the people.” The mark of the informed-but-still-ignorant pundit is to think of everyone else as the conscious or unconscious minion of a higher power, and of himself as a ghost.

To make an even more general point, I don’t pretend to know what’s best for Iranians, autoworkers, women, or illegal immigrants in their capacity as Iranians, autoworkers, women, or illegal immigrants. Being a media consumer of other peoples’ problems is a privilege. It’s a privilege to be informed free of direct involvement, not to be forced to take a side contrary to one’s real interests and desires. Which is why I am automatically suspicious of any attempt to convince me to give it up in the name of some greater responsibility that has little or nothing to do with my material existence. The ‘irresponsible’ fantasies and inner urges presumed by tragic realism (utopias, lands of Cockaygne, ‘savagery’) are figments of its own foreclosed imagination. As a blogger/pundit (an even greater privilege), my only ‘job’ — which in all but the most exceptional cases can only carry hobby status — is to listen, transmit what I hear, and attack lies told at the expense of those struggling to defend themselves.

This is all potentially useful, and I accept no guilt for voyeurism as such. But I can’t “identify” with the televised other, or “see the world through their eyes.” No revelation of exploitative supply chains, no tearjerking column in the New York Times by an ‘authentic’ refugee, no Oscar-winning independent documentary, and no Facebook group, however informative or compelling, can permit me to be them. The media’s most powerful feature requires so little discernible effort by users as to qualify as its ‘unconscious’ effect, what makes both its truths and lies maximally productive. The power to make your problems look like those of other people, and other peoples’ problems look like yours.


Culture = So ’90s

Posted in Apocalypse Porn, Cultural Theory, The Internet, The Singularity on May 3, 2009 by traxus4420

The ’90s were culture’s last hurrah. I won’t bore you with the list of ‘cultural innovations’ that originated in the ’90s (ok here’s a few: rave, mashup, jungle, reality TV, Onion/Daily Show joke news, MMORPGs, indie rock, Britpop, crunk, grunge, hipsters). In a conversation with some friends we agreed on 9/11/2001 as the proper cutoff date of the ’90s (as 11/11-12/1989 was its proper beginning). The dominant form of radical politics in the West — anti-capitalist, direct action oriented anarchism — had its climax in the 1999 action in Seattle, and has been drawing on more or less the same operational principles ever since. K-punk has been complaining about this for a while; see here for his latest:

Are cultural resources running out in the same way as natural resources are?

Those of us who grew up in the decades between the 1960s and the 1990s became accustomed to rapid changes in popular culture. Theorists of future shock such as Alvin Toffler and Marshall McLuhan plausibly claimed that our nervous systems were themselves sped up by these developments, which were driven by the development and proliferation of technologies. Popular artefacts were marked with a technological signature that dated them quite precisely: new technology was clearly audible and visible, so that it would be practically impossible, say, to confuse a film or a record from the early 1960s with one from even half a decade later.

The current decade, however, has been characterised by an abrupt sense of deceleration. A thought experiment makes the point. Imagine going back 15 years in time to play records from the latest dance genres – dubstep, or funky, for example – to a fan of jungle. One can only conclude that they would have been stunned – not by how much things had changed, but by how little things have moved on. Something like jungle was scarcely imaginable in 1989, but dubstep or funky, while by no means pastiches, sound like extrapolations from the matrix of sounds established a decade and a half ago.

Needless to say, it is not that technology has ceased developing. What has happened, however, is that technology has been decalibrated from cultural form.

And just to drive the point home, here’s something awful that just came out:

Just execrable, really. All the latest tricks of pop music from the last two years: the beefed-up 808 beats, the vocoder/auto-tune gliss, the sci-fi aesthetic, the marketable female vocalist teamed with inoffensive hip-hoppers (all of which really date back to the ’70s or ’80s and electro), wrapped into a single iTunes-ready package. The function of a product like this is to fill up the club, announce to hipsters that the latest phase is ‘dead,’ and prepare the way for Kanye to introduce the next mild remix of the pop culture of the last 50 years. I found this video here (a ‘culturally relevant’ blog), as part of a funny bit of consumer advocacy/tastemaking. The blogger, inspired by the brazen cynicism of the above, fantasizes about being naive enough to straightforwardly enjoy it:

Sort of just want to be ‘a stupid mainstreamer’ who gets pumped up when I heard a song like Gnarls Barkley’s “Crazy” or the song “Let’s Get it Started.”

I want to be a white person
who ‘gets jacked up’
whenever a black person in ‘cool clothes’
comes out, waves their arms
rhymes into a mic
and tells the crowd to ‘get on their feet.’

I will go home
buy this popular song on iTunes
listen to it when I ‘start my workout’
to ‘get fucking pumped up’
then choose a slow song 4 my ‘cool down’

What this ‘poem’ recognizes in its very act of critique is that there is no such thing as ‘straightforward enjoyment’ of this song, which is objectively tired — either you admit it or you don’t, or you don’t listen to pop music enough to care either way. Setting aside fan loyalty to the band (theoretically possible), it’s not a song that inspires partisanship; even to those who profess to like it, it can be nothing more than a mass product; enjoyment of it is strictly culinary. As per the norm of modernist criticism, the loss mourned by both k-punk and ‘carles’ of hipsterrunoff is the attitude that would deny this ‘general’ or ‘common’ complacency toward culture a position of normative authority.

What has already been noted is that this narrative of cultural vacancy is overwhelmingly white, (probably) male, not a little techno-centric, and elitist in a way that feels quixotic (all registered ironically by the latter commenter and not at all by the very earnest k-punk). ‘Cultural form,’ when used as the scoring rubric according to which the ’00s are found lacking, should not be confused with a mere empirical analytic. Built into this notion are the aesthetic criteria for determining excellence, the potential for the emergence of masterpieces, and the legitimacy of criticism, criteria which only make sense within the history of European bourgeois aesthetics (see Francis Mulhern on the authority granted to culture here).

There is a left version of this among critical modernists like Fredric Jameson, where a culture is expected to go beyond the ideology of the New and produce the means of criticizing, or at least ‘mapping’ (though the visual metaphor is not really apt) its context; the equivalent of a masterpiece here would be anything from which a critic could derive knowledge beyond the fact(s) of the object itself, and could come from virtually any cultural sector, from popular entertainment to the avant-garde. A Jamesonian masterpiece is didactic, albeit in a special, often ‘unconscious’ sense. A masterpiece within postmodernism — a general situation where masterpieces are impossible — would have to be some sort of throwback, somehow outside of its own time. Which, or so I like to fantasize, is why Jameson became a theorist and not a novelist.

“Yet once this initial disjunction between the present and the New is granted, the inevitable stages of a decline, the progressive decadence of an inauthentic modernism, follow logically enough. For the New, and the break it stages with tradition, now quickly unmasks itself as a commitment, not to the present but to the future. It thereby generates spurious narratives about the development of art in general, in which the discredited bourgeois value of progress is secretly or not so secretly installed in the aesthetic realm.” (Jameson, “Transformations of the Image”)

He goes on to attack the anti-theoretical pseudo-aestheticism that tends to replace the rejection of (pseudo) modernism, and which feeds into the “nostalgia film” and the spurious, neo-Romantic ‘return’ to beauty, religion, and folk culture. The point is that there is no simple alternative direction being offered; it’s an impasse, its causes more or less identified. Which is why k-punk’s use of Jamesonian motifs has always seemed to me to fall within the latter’s critique of postmodern pastiche. It is only possible to assert Jameson himself as an arbiter of taste if the untimely irony of his style is erased, if the Adornian dialectics are dropped out and he is turned into a kind of Spinozist. This isn’t necessarily a bad thing, but would necessitate the abandonment of any pretension to critical theory and the very notion of cultural politics as a quasi-autonomous sphere of activity.

The avant-garde modernism of forms is not an ‘alternative;’ it is the historical background of our current situation. In a properly Spinozist universe, if the public intellectual can’t or won’t engage directly in left politics he must become an adman.


But let’s return to the argument about the disjunct between technological and cultural development. YouTube, blogs, and the rest of the Web 2.0 apparatus are mere technical platforms according to this logic: the means of production of cultural form and not forms themselves. This entails a rather strange relation between culture and technology, where changes in the latter are supposed to catalyze changes in the former without  determining them, and the depth of cultural change is to be evaluated solely from within the inherited cultural discourses: those of art, music, and pop culture criticism. A new technology is not a new cultural form. Culture’s “decalibration” from technology, then, indicates the failure of new ‘technological’ products to meet ‘cultural’ criteria. By this I mean criticism cannot read any of these new objects as even potentially masterpieces without straining credibility.

Jameson’s theory of postmodernism names this cultural failure and connects it to developments in capitalist political economy since WWII and the rise of consumer society. What I want to suggest is that the ideological discourse of novelty and innovation should be understood to include the actual development of new technologies. For the past 30 years at least, technological innovation has been tied to the fluctuating demands of financialized consumer capitalism. Like the television, the iPod is both a technical and an ideological product. On what basis could theories of cultural innovation even theoretically be divorced from those of technical innovation, or financial innovation for that matter? Culture’s calibration with technology is postmodern ideology.


The shift over the past 20 years or so is in how we are now trained to experience both technological and cultural development, and how the ‘innovations’ themselves are designed to look, feel, and function: not as a series of revolutionary shocks a la Toffler and McLuhan, but as a numb, predictable wave of tweaks and ‘updates.’ Cable to wireless; iPod to iPhone; MySpace to Facebook; Blogspot to Tumblr. As velocity increases, change of every kind is normalized, routine, invisible. Ray Kurzweil, the Toffler of our day, gives us the fantasized radical telos for all this high-speed incrementalism: “what will the Singularity look like to people who want to remain biological? The answer is that they really won’t notice it, except for the fact that machine intelligence will appear to biological humanity to be their transcendent servants….there’s a lot that, in fact, biological humanity won’t actually notice.” It’s capitalist common sense that the promise of  ‘authentic’ change on the modernist model is how various competing interests perpetuate themselves. Every attempt to read it otherwise is just another ad.

We aren’t living through the breakdown of culture’s dependence on technology but its culmination, the near-fusion of content and form and the omnipresence of culture as novelty. It’s therefore pointless to expect new ‘cultural forms’ to emerge from new media, where it’s all about the platform. With Web 2.0, McLuhan’s “medium is the message” slogan is now true on the most banal level. We may have, gradually, inched our way past the context in which ideas like ‘cognitive mapping’ and ‘new cultural forms’ had purchase as critical aesthetics, centered as they were around distinct works, around individual, intellectual comprehension and consumption. Mapping now is distributed among multiple ‘works’ — instances of participation — each of which is unthinkable on its own. This new connectivity has a variety of possible uses as well as risks, most of which are illegible to 20th century aesthetic theories incapable of acknowledging anything not a celebrity masterpiece, war, or revolution.

Traditional cultural (and political) practice of course continues at all levels, continuing to demonstrate, despite what the hipsters say, that the trajectory of mass culture is not destiny. And this whole chain of reflections was inspired by the apocalyptic troika of economic, ecological, and energy crisis, the awareness that the bourgeois progress narrative in all its various generic forms is a destructive and suicidal fantasy, a junky dream. Seeing these imminent disasters as a challenge for ‘culture’ to regain ‘symbolic efficacy’ is to remain enlisted in its reproduction. The full-color, hi-def imagination of real alternatives cannot precede the behaviors that actually discover them. That’s because the other worlds, the ones outside whatever features one might hate about capitalist popular culture and its various ghettoes of self-righteous self-loathing, have always already been here.

Moretti and the Humanities; Or, More Meta-Blogging

Posted in Education, Literary Criticism, Science, The Internet on April 9, 2009 by traxus4420

Franco Moretti’s proposal for a sociology of literature has been coming up in more and more (offline) conversations. More than that, some friends (enemies, rivals, etc.) have been trying to put his ideas into practice. Certainly anyone who takes his ideas seriously must either do the same or reject them outright. As I see janedark pointed out a month ago, Moretti is not shy about insisting on the complete reconstruction of the field of literary study in the image of social science.

Thinking about Moretti’s project focuses attention on the most basic fact about literary criticism: that beyond ‘mere’ scholarship, what defines it as a distinct activity is interpretation (I was going to add the objects of lit crit as a distinguishing factor — but these days historians and sociologists read novels and poems too). To that end it has developed several specialized techniques for interpretation. Though I don’t do much of it on this blog, for a while I’ve thought that much of what even scholarly critics do is, in theory at least, better accomplished through this platform than through the usual institutional vehicles (journals, conferences, etc.), and not just because of all the obnoxious barriers to entry. To the extent that criticism is individual reflection on consumption, its ability to produce knowledge is limited by a whole host of factors: sample size, ‘bias,’ an incapacity to reliably tell when any given judgment is the exercise of taste, which is itself a more important object of study than any individual text. Traditionally the gaps and excesses of individual readings are filled in or marked out by other readings, though without the capacity to say much that is reasonably conclusive. This is not to say that interpretation is meaningless (a funny thing to say, to be sure), that social scientists don’t interpret, etc. Just that literary critics are only sporadically able to make claims that permit general evaluation.

I think these rather pedestrian, even naive observations take on a whole new significance in light of the possibility of large-scale institutional projects where entire genres and regions of literature can be ‘read’ simultaneously. Read like a computer reads – saving interpretation for the end, when the critic has a body of publicly available evidence to make meaningful (“years of analysis for a day of synthesis”). Then debates over irreconcilable differences in the always unique, personal, imaginary experience of reading stop looking like the necessary labor of a discipline groping its way to some semblance of enlightenment and begin to seem – common. That is, lacking methodology, being instead the kind of thing that ‘anyone’ could do, and does.

I would want to push Moretti’s provocation further, to include debates in politics, philosophy, ideology, and even history, where the central piece of evidence is the personal experience of a text or small set of texts. Of any genre. I’m inclined to think that these activities are common, ‘now more than ever’ since blogs connect and make public the conversations of anyone with Internet access. Such published, recorded dialogues, themselves potentially the objects of distant reading, better serve some form of statistical analysis for the same reason they enhance the experience of the participants: by being easy to access and widely inclusive.
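To make the idea concrete: Moretti prescribes no particular toolkit, but the minimal version of such a ‘distant reading’ is nothing more exotic than aggregate counting over a corpus, with interpretation deferred to the end. A purely illustrative sketch (the corpus, titles, and ‘genre markers’ below are all invented for the example, not anything Moretti or his practitioners actually use):

```python
from collections import Counter
import re

# A stand-in corpus: in practice this would be thousands of digitized texts
# pulled from a searchable database, not three one-line snippets.
corpus = {
    "novel_a": "the detective examined the body while the city slept",
    "novel_b": "her heart raced as the duke entered the moonlit garden",
    "novel_c": "the inspector found the letter hidden in the garden",
}

# Hypothetical genre markers: the critic supplies the categories,
# the machine does the counting.
genre_markers = {
    "crime": {"detective", "inspector", "body", "letter"},
    "romance": {"heart", "duke", "moonlit", "garden"},
}

def score_text(text, markers):
    """Count occurrences of each genre's marker words in one text."""
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    # Counter returns 0 for absent words, so missing markers cost nothing.
    return {genre: sum(words[w] for w in ws) for genre, ws in markers.items()}

# 'Read' the whole corpus at once, saving interpretation for the end:
# the critic then argues over the table of scores, not the individual texts.
scores = {title: score_text(text, genre_markers) for title, text in corpus.items()}
for title, s in sorted(scores.items()):
    print(title, s)
```

The point of the sketch is the division of labor it makes visible: the only irreducibly ‘critical’ moments are choosing the categories and making sense of the resulting table, both of which operate on publicly available evidence rather than on one person’s experience of reading.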

To cut to the chase: why should anyone be paid to argue about books? One usual reply is that professors are really paid to teach, and the research is a form of social capital. But even if we reaffirm the importance of a liberal arts education, why shouldn’t the social capital of research be eliminated? It is a more or less disavowed notion of expertise that keeps not only literary study, but the humanities in general, from admitting they are what everyone says they are: the subsidization of a largely inconsequential, largely unread pseudo-elite, who claim to deserve the privilege of a ‘life of the mind’ by proving to one another they’re the best at it. The truth of course is that most professors are not even remotely free from the day-to-day stresses of the ‘real world,’ though they must still pretend to be in order to legitimate their exploitation.

It’s telling that within the academy, the arguments against Moretti’s method tend to involve defenses of reading. This is necessarily an elitist argument. Not to mention idealist. There is nothing demonstrably superior or necessary about readings provided by academics as a professional class, not only on largely discredited humanistic grounds, but also in terms of politics. Just as academics perform culture, they also perform politics, ostensibly freeing others from the responsibility by pretending to deny them the independent capacity. Just as there are expert economists, there are expert idealists, expert critics, expert thinkers. We might agree that reading and discussing the best literature and philosophy, watching the best films, are just as vital and necessary as defenders say they are, and for that reason everyone should be able to take part in the discussion on equal grounds, or at least on the basis of status hierarchies produced through practice and not guaranteed by credentials. We might go so far as to say that the published, vetted interpretation of culture by an academic is always a gesture of institutional authority, and that without accessible evidence, this authority is always illegitimate.

Though imperfect and still dependent on expert commentary in its current formulation, the no-brainer combination of Moretti + searchable digital databases further points the way to a science of culture that actually makes practical use of institutional backing, with a collectivity dependent on shared resources rather than elite assertion. Behind the cheap façade of humanist intellectualism (which is in reality its enclosure) is yet another ruthlessly exploitative service industry, the edu-factory. And again, the same applies to politics. Academic radicalism, in the sense of one who is supposedly radical in his or her capacity as an academic and not an academic who is also an active radical, is to my mind a performative contradiction, and only significant for being permitted and financially supported. As yet another type of media work, it falls under any appropriately updated theory of the spectacle.

Internet Generation

Posted in blogging, Media, The Internet on March 2, 2009 by traxus4420

At a talk, a familiar thread: the younger generation and their computers; they don’t have a connection to materiality, they don’t think about medium, or place, or tradition, or history. This time it was poets; the last time architects. Their products are equally ungrounded, and to someone with even a modicum of ‘local’ historical knowledge exude an unavoidable sense of pointlessness. One of the speakers argues that their stylistic concerns are probably alien to an Internet generation confronted instead with the ability to “let it all go,” a kind of terrible freedom where one can throw out margins, typeface, privacy, manners, the whole deal. I try to identify. Sure, when I travel I sometimes have a hard time telling the location apart from the facebook album, though the same has been said about photography in general. I can’t really remember anything, I don’t really care where I live, nor do I really understand how to get worked up anymore over matters of taste — having been able to get any sort of music imaginable since college has taught me I can ‘like’ almost anything with minimal effort — but I can’t say my experience of any of these things comes with a greater sense of ‘freedom.’

Is there any mode of writing more constricting than Internet writing? I mean in terms of form, of course, not (unless one is dealing with censorship) content. Another helpful analogy can be drawn to taste. It’s often claimed that the Internet offers its users unprecedented possibilities for self-fashioning, by opening an ever-expanding archive of culture to  sampling, editing, remixing, reproduction, etc. The problem is how to filter all this information in interesting and/or useful ways; essentially how to theorize it.

I find this perspective superficial. For one, it assumes that variety automatically equals freedom. Even if we go along with its implicit restriction of our view to the field of consumption, we have to acknowledge that all the Internet does is reduce the distance between advertisement and product and expand its potential reach, thus accelerating the cycle for each individual product. In order to follow a scene, I’m immediately obligated to become conversant in whatever it is the second it shows up on the blogs — nothing is hard to find anymore, I have no excuse not to know it and have little time to develop a personal taste that is any more than irrelevant dilettantism; if something is encountered by ‘chance’ it has no time to sink in, only to immediately become part of a scene or disappear. There is an expanding universe of tastes, but they are not individual. The work of constructing and participating in one or more scenes is increasingly the point, leaving far behind the old humanist ideal of self-knowledge through a deep personal experience of art. The Internet is another terrain for the capture of subjectivities, is able to do so more quickly and in some ways more comprehensively than print, cinema, or television, and leaves even less room for personal ‘freedom.’ If the avant-gardes were split between l’art pour l’art and its destruction through collision with everyday life, today we could say the principle of motion for art in the Internet Age is scenes for their own sake. An anemic conception to be sure, but poised on a powder keg, or, depending on your interests, a big pile of money.

Which brings me to the next faulty assumption: that all this variety should lead to increased creativity. If we mean creativity in the kind of general market-friendly sense that every Flickr photo is creative, then obviously it does. However, the Internet is a giant parody of the idea that novelty, as the engine of cultural development, is produced by recombining previous material into new forms. Though it seems nice and rational, its assumption that everything is always already translatable (that everything can potentially be ‘recombined’) inevitably leads to the romantic notion of original ideas as mysterious, uncaused, etc. We can call this the entropic-messianic theory of cultural production, wherein all means of establishing sense are assumed to have collapsed into an equilibrium state and we’re all just waiting around for Godot. Or we could just call it postmodernism. But significant art — the language is so outmoded — is generated through struggle with tradition or with something else, not a full shopping cart. The components have to mean something before they can be used for anything besides derivatives.  Comparatively information-deprived regional cultures and their unpredictable relations have produced most of humanity’s stock of ‘masterpieces.’ As an aside, maybe it’s time to look beyond (or before) novelty as the ultimate standard for culture.

Writing on the Internet immediately threatens ‘authors’ with their ‘audience’ — the moment one stops thinking of oneself as an isolated performer on stage is when conversation can begin, but doing this requires the abandonment of all concern for developing one’s ‘craft.’ When language fully enters a sphere of universal equivalence as text (where it can be translated, quantified, plugged into search algorithms), then communicability and transparency of meaning are finally God; one retires from the divine the more readily one can define one’s addressees against the universal ‘common reader’ (which is not to say that there is such a thing). I should add that putting writing on the Internet is not quite the same thing as Internet writing. The former is frequently the object, but never the subject of the latter, and can always be skipped.

But we are (as ever) rapidly approaching a redefinition of the term ‘art,’ and at the moment I’m forced to beat a hasty retreat.