Thursday, January 31, 2013

couple of bros, chattin' about stuff

wat

The next time someone tells me there's too much irony in the world, I'm going to throw them in a river. Grantland's Brian Phillips:
After "Beautiful Day," U2 played "MLK" and "Where the Streets Have No Name" while a giant screen behind them showed the names of all the victims of September 11. Bono screamed "America!" and ran around the heart-shaped track while the crowd went insane and the names of the dead scrolled up into the sky. U2, as a band, was essentially born for this moment. Nobody's lip-syncing here. "Where the Streets Have No Name" is another song about escape, but it has a bigger canvas and a more equivocal story. It doesn't mean anything; it's a vehicle for producing chills. I'm guessing more than a billion people felt chills at the same time during this performance. It's not really even fair to compare it to any other Super Bowl halftime show. It took place on a different plane. When I watched this live I thought I was going to pass out. The moment when the screen comes tumbling down and Bono holds open his jacket to reveal an American-flag lining is probably the high point in the history of rock music's work in the service of public healing. I'm not sure how I feel about this, but it may be the moment that most encapsulates what America was like after September 11 — the intensity of raw emotion, the need for mass spectacle, the sense that something genuine and vulnerable was coexisting uneasily with cell-phone commercials and an air of vague corporatization. (As I said: U2 was born for this.) And yet it made you feel like the top of your head had been taken off. 
I need some Negativland to cleanse myself of this pomposity.

Wednesday, January 30, 2013

actually, Matt Lewis is full of poop

A few people have sent me this Matt Lewis piece about hating Twitter, expecting me to like it, as I have complained about Twitter myself. Kind of strange that they would think so; the piece is one of those calls for civility and decorum that get published every six months. Well, civility is bullshit and decorum is bullshit and comity is bullshit and going along to get along is bullshit and let's all be friends is bullshit. That's democracy, that's adulthood.

Actually, my problem has never been with Twitter itself; it's been with a particular attitude about its use-- which Lewis's piece exemplifies. Lewis talks about the ways in which Twitter has raised his profile and publicized his work. That's the good thing about a public medium! But when you say things in public, sometimes mean people say mean things about what you've said in public. That's the bad thing about a public medium! There are options you have to get the publicity; there are options you have to avoid the meanies. There are no options that give you both. Lewis completely undoes his own point when he mentions having a private, invite-only Twitter feed. Don't like mean people? There you go, Matt: make your tweets private. But he wants both the publicity and the protection, and you can't have that. Lewis's problem is not with Twitter. It's with that basic contradiction.

If you don't like what Lewis said about Twitter, your best path forward is to do what he refuses to do: choose to have a public Twitter, or choose to have a private Twitter, and act accordingly. The funny thing is that I think Lewis's biggest critics here are often guilty of the same thing, which is acting like a public Twitter is a private conversation between friends. It isn't. When you have a public Twitter feed, your tweets show up on a freely-accessible website. (I believe it's called "Twitter.com.") I can see how you might lose track of the fact that your Twitter conversations are public, but sorry, that's the nature of the beast. Ask Nir Rosen about it sometime. Sometimes people get on me when I hit back at people who have talked shit about me on Twitter. (In fact, I had a certain blogger darling send me a very unhappy email about it.) Sorry, you guys: you put something out there in public, it's public. If you come at me, I'll come right back. Don't want that? Save it for your fucking diary.

Public or private-- it's up to you. But you have to choose.

Update: Regular emailer Zed: "Blogger darling who complained has to be Sunkara, right?"

No way. Bhaskar is in my bloc.

the UBI and socializing finance

One of the aforementioned posts that I keep writing and mothballing is about a universal basic income: how it could not only provide material security for all people, in keeping with humanitarian and ethical needs, but also strengthen the position of those who remain in the conventional employment world. I remain worried, though, about disparities in power and the persistence of a cultural attachment to work (as in a conventional job) as the only valid method of human flourishing. I hope I can get it out in a form I like soon.

Until then, you should check out this conversation between the brilliant Seth Ackerman and Mike Konczal, one of my favorite liberals. It's a great discussion of our macroeconomy by a couple of informed and committed guys.


slippery things

Kevin Drum:
No matter what motivates you—realpolitik, humanitarianism, nationalism, whatever—interventionism doesn't make sense if it doesn't work. And the lesson of the past decade, at the very least, is that interventionism is really, really hard to do well, even if your bar for "well" is really, really low.
The first question for any kind of action in any sphere of human behavior is, will it work? If the answer is yes, then you can move on to arguments about when, whether, and what kind of action might be appropriate. But if the answer is no, all those arguments are moot. In the case of U.S. military interventions, the answer might not quite be an unqualified no, but it sure seems to be pretty damn close. This makes the rest of the argument futile.
This is related to commenter Charles's recent point: arguments for intervention always gravitate towards abstraction, because actual recent history is so resolutely discouraging about intervention.

I am never surprised that our past failed interventions haven't turned people, particularly liberals, into committed non-interventionists. I am always surprised that our past failed interventions haven't inculcated skepticism and prudence. Peter Beinart wrote a famous reevaluation of Iraq:
It begins with a painful realization about the United States: We can’t be the country those Iraqis wanted us to be. We lack the wisdom and the virtue to remake the world through preventive war. That’s why a liberal international order, like a liberal domestic one, restrains the use of force—because it assumes that no nation is governed by angels, including our own. And it's why liberals must be anti-utopian, because the United States cannot be a benign power and a messianic one at the same time.
I have not observed any injections of wisdom and virtue into the American character since then. We are still not that country, not the country which ends suffering thanks to its righteousness and its strength. Recent history suggests that we are instead the country that causes suffering thanks to its clumsy and oblivious nature. And that fact is the one I expect people to hold onto, not yet seven years after Beinart wrote this essay. The cruel joke of Beinart and his piece, of course, is that he could only arrive at it after he had said despicable things about those who had opposed the war, and only after untold thousands were dead. That's always the way of things: they rediscover the nature of American power only after the bodies are buried. The unintended consequences reverberate for decades.

I see people latching onto Mali-- yet another good war for people desperate to find one-- and I wonder how much time they spend thinking about just how wrong things can go, just how little it matters that their hearts are pure. They will be surprised to find that they live in the same old fallen world, home to the same old United States.

Tuesday, January 29, 2013

stuffed up

I am suffering not so much from writer's block as from the sudden conviction that something I've spent hours writing is of no use to anyone. I would estimate that I've written some 6,000 words in the last three weeks, intended for this blog or other publications, that I eventually discarded, wondering why I'd bothered. It always passes.

ah, consistency

Gizmodo's Gary Cutlack:

"The standard Windows 8 Surface tablets came in for some stick, thanks to the Windows files eating up 13GB of hard drive space. That's nothing compared to Windows 8 Pro, which requires an astonishing 45GB of the Surface Pro's disk space for its files. The numbers, obtained from Microsoft by Softpedia, would make an absolute mockery of the 64GB version of the Surface Pro tablet if replicated there, with the machine possibly only having 19GB of space for users to use if Windows 8 eats up a similar chunk of drive space on the more affordable Pro option."

About four posts ahead of that one, Gizmodo's Leslie Horn:

"Unlike the heady days of 2007, your music and movies and Don't Trust the B— downloads live in the cloud now, not on your device. That's where Apple and everyone else has been pushing people for years, precisely because gigundo-storage devices are expensive and absurd and absurdly expensive for the common man."

So in other words, local storage isn't important, unless it can be used as an excuse to engage in Gizmodo's typical anti-Microsoft axe-grinding.

Update:  In case this weren't clear, this is a topic on which I can reliably be expected to be irrational. So.

Monday, January 28, 2013

high school is nearly everybody

I tried very hard for several days to write at length about this New York Magazine piece by Jennifer Senior; I have failed, rather deeply, to produce something worthy of publishing. I was viscerally unhappy with the article when I first read it, and since then I've been working hard to generate a more sympathetic reading. Certainly, I can understand many of her points, and I do think that it's always worthwhile to talk about alienation and loneliness. But I simply don't believe that Senior's depiction of high school-- as a place of ceaseless torment and unhappiness for anyone who isn't at the pinnacle of the popularity pyramid-- is accurate. Senior reviews some of the social-scientific literature, and while she makes a compelling argument about the tendency of high school to exacerbate social problems, she does essentially no work to prove that high school is as bad for as many people as she claims. Again and again in the piece, she talks as if most everybody agrees that high school is necessarily a pit of sadness. And I think the truth is that, as most of us do, she's just extrapolating from her own unhappiness. I get that impulse; we all want to see in the masses a reflection of our own internal life. (See what I did there?) But the piece ends up seeming like an act of resentful axe-grinding due to Senior's exaggerations, and it's not aided by the headline (probably thought up by an editor). The "you" in "Why You Truly Never Leave High School" is about as presumptuous as such a thing can be.

(By the way: "It was a really small study. I wouldn’t necessarily read too much into it. But its results sum up the entire high-school experience." It was a really small study that you shouldn't read too much into, but it sums up a vast diversity of the human experience. There are not enough face palms.)

I also think that there is a deeper problem in her attitude. Because what the piece is really about is less the failings of high school and more the failings of other people. The dominant impression of Senior's essay is not that the structure of high school failed her but that the people around her did not meet her standards. More than anything, after reading the article I just wanted to say to her, "I'm sorry that the people around you have failed to meet your expectations. Perhaps you should look deeper to see if they're feeling pain similar to yours." As you can probably guess, this is a case of me finding particularly aggravating the flaws I identify in myself. I struggle constantly to maintain a necessary criticism of all of the fucked up bullshit without falling into a flat, useless misanthropy. (Not for anyone else, but for myself.) So feel free to find this hypocritical, or to take it as coming from someone who is working on it.

Ultimately, my deepest disagreement is with Senior's corrosive notion that the problem with high school is the way it exposes us to difference:
In fact, one of the reasons that high schools may produce such peculiar value systems is precisely because the people there have little in common, except their ages. “These are people in a large box without any clear, predetermined way of sorting out status,” says Robert Faris, a sociologist at UC Davis who’s spent a lot of time studying high-school aggression. “There’s no natural connection between them.” Such a situation, in his view, is likely to reward aggression. Absent established hierarchies and power structures (apart from the privileges that naturally accrue from being an upperclassman), kids create them on their own.
This isn't a problem with high school. It is the best thing about high school. Compelling people to spend time with others who are not like them is an essential function of schooling, one that the affluent frequently avoid by sending their kids to private school or home schooling them. And, not surprisingly, many kids who went to private school or were home schooled grow into the kind of adults with no sense of what the world is like outside of their social milieu-- which further dulls the sense of communal responsibility. Senior's notion that there is something wrong with being exposed to people across legitimate differences is truly corrosive to democracy, to egalitarianism, to society itself. We have already become such a siloed, segregated culture. So many of the products and services you can access online now are geared towards eliminating your interactions with people who are genuinely not like you. (At its worst, homeschooling takes this logic to its extreme, along with the typical arrogance of parenthood: my child is simply too precious to be exposed to the unworthy.) I simply don't believe that a civil society as diverse as ours can survive when we have walled off our lives from those who are not like us. And while I don't blame individuals for failing to seek out such exposure artificially, it's essential to the long-term health and fairness of society that it be a part of our education and socialization. Democracy has consequences, diversity has consequences, and while I'd never wish it on anyone, the reality of diversity is that sometimes our encounters across difference will be unhappy.

Finally, there's this: Senior is allowed to complain about the throng because she positions herself as punching up, because she tells us that she was unpopular in high school and is thus permitted, in that vague way, to cast her judgments. Certainly, that's the lesson of most high school movies: the unpopular people are the sensitive dreamers who are gifted with the right to tell the story, while the popular people are cruel and vain, and thus not eligible. I was pretty popular in high school, so I suppose I shouldn't be the one making this argument. But there's nothing inherently more accurate or perceptive about the observations of those on the bottom than those on the top. One thing that I realized long ago: the "losers" in high school are often not any more fair, open-minded, or generous than those at the top. They simply lack the power to do anything about it. Now, if we're talking about addressing problems of cruelty and abuse, certainly, my sympathy and support go to the people who are subjected to it. But when we're attempting what Senior is attempting, and trying to take a bird's eye view, we need to avoid the temptation to take the Hollywood path and assert the superior virtue of the more oppressed.

Senior's self-identified status as a high school loser animates the whole piece. It reminds me that there is a profound narcissism in those who constantly self-identify as social outcasts. Take the fleets of people who make videos declaring "I am a true geek." They claim to be responding to the perception that they are unworthy. I think instead they are simply saying: I am great, and I deserve to be recognized for it. The only reason the behavior is permitted is the preemptive self-branding as a geek or loser. Ask yourself: would New York ever have run the essay if it had been written from the perspective of one of the winners, complaining about the profound lack of character and low moral fiber of those below?

Saturday, January 26, 2013

I need to tame this wild tongue if I'm to touch these white streets

Having a blast of irrational optimism and a feeling, generally unknown to me, that this species might be able to get it together and organize itself in a way that provides for both the material security and the dignity of all humankind, a way that harnesses all of that wonderful productive capacity, that desire to create and to share what one has created. But I am saving these thoughts for a special occasion coming up soon. More info to come.

Thursday, January 24, 2013

stuff

  • This AV Club piece by Noel Murray-- which I think is well done-- lends more credence to my feeling that, if you spend long enough engaged with popular culture and the media that covers it, you eventually lose your ability to just react to stuff.
  • A couple of years ago I asked if more form factors were inevitable, given the dictates of the tech world economy. Today, Jesus Diaz, an inveterate Apple fanboy, seems to say so. "Now, to keep the behemoth breathing and killing, Apple needs to create a new market. Virgin territory, a new gold rush. They can no longer profit at the same pace in saturated markets. They need to create new product categories." I don't question Diaz's claim that new products are necessary for Apple's bottom line. But does this not seem a little bit nuts? How many doodads are going to be in our arsenals? You've already got crazy redundancy between laptops and tablets and desktops and e-readers and smartphones and smart TVs.... Especially at a time when more and more of the gains of our economy are going to a tiny slice of our people, it doesn't seem sustainable.
  • Lifehacker is a kind of ridiculous site, and I'm generally opposed to the notion of life as a perfectible enterprise. Plus, a ton of their posts seem to boil down to "many containers can hold things other than their original contents!" A friend of mine and I used to joke that we'd start a parody site called Lifeheckler. I do find some projects and tips worthwhile, though. I found this post interesting, mostly because I am already engaged in just about the only career (or "career," if you're feeling uncharitable) I've ever wanted, but they have already ruled that out as "overcommitting." The thing to bear in mind is that, for pretty much any life you'd really want to chase after, they're gonna make fun of you for it, so fuck 'em. It can be hard being a grad student, given the way that many people act like that's just inherently ridiculous. But the alternative is to spend your time in some horrible cubicle somewhere... if you're lucky. The trying always exposes you to ridicule, always.
  • I am digging this new look, but the columns are too thin and can't be adjusted, and several people have called for the return of the header picture. I'm gonna poke around and see what I can come up with this weekend. If only I had basic HTML skills. (A rough sketch of the kind of CSS tweak involved follows this list.)
  • I love Film Crit Hulk's writing, although I disagree with him fairly often. I thought it was cool of him to start this discussion. I was bummed out, though, to see a majority of respondents essentially saying, no, you're right to have a double standard, for X, Y, and Z reasons. This is a hobby horse of mine. But I just wish more people who avow progressive social principles would be more likely to say, nope, not gonna accept any excuses, whether it's about Odd Future or Louis CK or whoever.
  • Everybody always plays the acoustic version, but I like this version a great deal more.
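
For what it's worth, the column-width problem is usually a CSS fix rather than an HTML one. A minimal sketch, assuming one of Blogger's stock Designer templates; the selectors below are the ones those templates typically use, yours may differ, and the 1100px figure is just an example:

    /* Paste under Template > Customize > Advanced > Add CSS.
       Overrides the template's hard-coded column widths. */
    .content-outer, .content-fauxcolumn-outer, .region-inner {
      min-width: 0;       /* drop the fixed minimum width */
      max-width: 1100px;  /* hypothetical target; adjust to taste */
    }
    .main-inner .column-center-inner {
      width: auto;        /* let the post column fill the freed space */
    }

If that does nothing, the template probably isn't a Designer template, and the widths live elsewhere in its HTML.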

norms of control

I very rarely use terms like "imperialist" when I discuss American foreign policy, in large part because that type of language causes people to shut down completely and stop listening. But there are important ideas hidden within such terms, and I think this Foreign Policy essay does a really good (if largely inadvertent) job of demonstrating what I mean: the almost entirely unvoiced assumption that the United States has the license to do whatever it wants militarily, wherever it wants.

Observe:
During the four-day siege of the In Amenas gas field, which culminated in an opaque takeover by the Algerian military that reportedly killed dozens, several pundits and journalists asked why the U.S. military did not send drones or special operations forces to free the hostages or kill the Islamist militants holding them. One CNN anchor asked Mike Rogers, who chairs the House Permanent Select Committee on Intelligence, "I'm curious as to your perceptions whether the U.S. is taking too much of a back seat." The following day, another CNN anchor seemed puzzled as to why Algeria would only permit the United States to fly unarmed drones over its territory, to which Pentagon correspondent Barbara Starr noted: "The U.S. view is that the Algerians would have to grant permission for U.S. troops, U.S. military force, to go in there."
CNN should not have been surprised. Neither the Bush nor Obama administrations received blanket permission to transit Algerian airspace with surveillance planes or drones; instead, they received authorization only on a case-by-case basis and with advance notice. According to journalist Craig Whitlock, the U.S. military relies on a fleet of civilian-looking unarmed aircraft to spy on suspected Islamist groups in North Africa, because they are less conspicuous -- and therefore less politically sensitive for host nations -- than drones. Moreover, even if the United States received flyover rights for armed drones, it has been unable to secure a base in southern Europe or northern Africa from which it would be permitted to conduct drone strikes.
Now, Micah Zenko's point is largely to remind people that the United States is constrained in principle (if not in fact) by international law and rules of engagement, so I'm not blaming him. I'm just depressed that he needs to write a piece like this in the first place. Can anyone imagine a piece in Foreign Policy that features a spokesman from the Mexican military saying, "The Mexican view is that the Americans would have to grant permission for Mexican troops, Mexican military force, to go in there"? And would permission ever be granted in any circumstance, even on a case-by-case basis? Would we find anything odd about Ecuador failing to secure a base in southern Europe or northern Africa?
The White House can choose to act -- in Algeria or elsewhere -- without a state's permission, and deal with the political consequences and likely reduction in diplomatic and intelligence cooperation if U.S. involvement is exposed. Given that 94 percent of the Earth's land mass is not U.S. territory, the sovereign right to say "no" is one that advocates of using military force must keep in mind. 
To live in a country where people must be reminded of such things is to understand the meaning of the word hegemony-- even if the use of that term gets you laughed at. (Especially because the use of the term gets you laughed at.) These vague impressions of the normalcy of limitless force projection are as dangerous as any coherent philosophy of militarism. That's the American perspective, today: not that America is extraordinary for how often it breaks the rules, but rather that it's extraordinary that America ever follows them at all.

Wednesday, January 23, 2013

the quiet insistence of the real

As happens more often than I'd care to admit, a commenter recently expressed a point better than I could. In a recent post, commenter Charles quoted another commenter saying

"I'd only like to add that I think a position that allows one to act, or abstain, or fall anywhere along that continuum will always be a more effective and rational approach to world affairs than one that forecloses any possibility of intervention from the outset."

And responded:

"Yeah, it sounds perfectly rational when put in the most abstract terms possible.

But actually there are two real, not-abstract options here. 1) A major power half the world away, that doesn't ultimately give a good goddamn about what happens to most of the people in the region being invaded, which has consistently misunderstood the political and social conditions in the places it has invaded over a period of fifty years, can engage in destructive military intervention and almost certainly do more harm than good, or 2) Uhm, not that.

The abstract version doesn't matter. It's pure fantasy. It has nothing to do with anything real. We aren't talking about a position "that allows one to act" or any such nonsense. We're talking about the most powerful military in the world having a consistent track record of reliably fucking up intervention. The only things you can reliably predict with regards to American intervention is that lots of people will die, that most of them will be civilians, and that things won't go the way you hope they will. You can take that to the bank."

As a pacifist, I am subject to a constant drip of hypotheticals, counterfactuals, and fantasy scenarios that are designed to test the limits of the commitment to nonviolence. ("YOU'VE GOT HITLER IN THE CROSSHAIRS FREDDIE WHADDAYAGONNADO") I can reliably be bullied into answering them, and usually that means providing the answer that allows the questioner to return to the assumption of my unseriousness. But it is very telling that so much of the philosophical architecture of "liberal intervention" is based upon theoretical situations and elaborate setups, or pure theory divested from the history of American foreign policy, Western militarism, or past interventions. It's like arguing with Descartes-- all theory, no history. Beware the intellect that lives in abstraction. And this tendency only exacerbates the profoundly limited time frames in which liberal interventionists work, declaring victory months or weeks into complex and shifting situations. We are still living with the consequences of deposing Mohammed Mossadegh; that happened 60 years ago. I'm sure arming the mujahideen seemed like a great success in 1992.

I don't blame people for dreaming big. I do blame them for letting those dreams overwhelm their critical capacity. I hear liberal interventionists wax idealistic about all the good the good guys could do, and I just want to shake them-- America is not that country, violence is not that instrument, this is not that world.

a reason to care about those high definition screens

As something of a technoskeptic, I want to be careful to give credit where due. I do think that we're reaching a point where wider broadband access, more powerful processing, and the development of new protocols and languages are producing really beautiful web design. For a long while I was bummed that the Internet was becoming more functional but no more beautiful. I think people are recognizing that bringing outstanding visuals to their sites and services can distinguish them in a crowded industry. Flipboard, which recently got ported to tablets, is a good example. I mentioned the other day that the new Myspace (however inherently silly you find Myspace) has truly well-done web design. That's particularly funny, given Myspace's former status as an ugly, cluttered mess. People are finally grasping that what looks good is not a ton of small images and a million buttons but large, high-definition images and a few elegantly laid out choices.
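
Mechanically, the "large, high-definition images" part is simple enough. A minimal sketch of the current retina-image technique, with a hypothetical class name and file paths:

    /* Full-bleed hero image; swap in a double-resolution asset
       on high-DPI ("retina") screens. */
    .hero {
      width: 100%;
      height: 480px;
      background: url("hero-960.jpg") center center no-repeat;
      background-size: cover;
    }
    @media (-webkit-min-device-pixel-ratio: 2), (min-resolution: 192dpi) {
      .hero {
        background-image: url("hero-1920.jpg"); /* same layout box, sharper source */
      }
    }

The elegance comes from restraint, not from the code: one big image, a few choices, nothing else competing for attention.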

Check out this interactive preview of a recent book on my field. Click the photo of the book to expand and manipulate. Beautiful! A bunch of people in my department study visual rhetoric, the ways in which arguments are embedded within visual presentations and how they are (or aren't) effective in persuasion. A lot of really interesting things are happening in the visual presentation of information and design languages. It's an exciting time.

you probably don't have an opinion on poetry

Following the second inauguration and the poem read by Richard Blanco, there has been a lot of joke-making and forced levity from the usual circles about poetry. (It says a lot about our times that people so often feel a burning need to be clever on command.) As you'd expect, making fun of poetry was a bipartisan activity. Personally, I can't separate my aesthetic take on that particular poem from my take on the day's politics, nor would it be intelligent to try, so I won't offer an appraisal. But it seems that poetry is being discussed in large part as a stand-in for a certain kind of culture, and that's interesting.

Now, I could easily write a critical take on all this. I read a poem that is better than 2,000 years old the other day and enjoyed it; you can make your own judgment about the longevity of the Downton Abbey recap you just wrote. But really, it's not necessary. Poetry-- like long-form fiction, orchestral music, abstract visual art, and difficult movies-- needs no defense. Poetry is to be enjoyed. If you don't enjoy it, don't read it. I read hand-wringing pieces about whether the kids these days are reading great novels, and I'm always perplexed. Who cares? I read them. I don't care about whether other people are interested, nor do I associate reading novels with virtue. It affects whether I'm interested in someone romantically and impacts whether I find someone interesting, sure, but outside of the aggregate impulses of friendship, why does it matter?

Where things become dicey is in the enforcement of dislike. Many people experience anxiety about media and culture they have a vague notion they're supposed to like. I am sorry that they feel that anxiety. They shouldn't! I think young people should be exposed to lots of different kinds of art and media, but once you're past the age of taking a curriculum, you should enjoy what you enjoy and not worry about the rest. But that doesn't excuse efforts to undermine other people's enjoyment. Depending on how charitably you read this piece, you can call this either the Dan Kois effect or the reverse Dan Kois effect: I consider myself a sophisticate, yet I don't like a certain kind of artwork, therefore I must undermine the very possibility of someone else liking it. This is uncharitable, and bound up with the endless signalling and personality assembly that underlie our digital media. Because I'm a masochist, I did a Twitter search for "poetry sucks" and "poetry is bad" and a few similar terms after the inauguration. The results were as you might expect.

(I would suggest that if, as an adult, you say something indistinguishable from an eighth grader complaining about his English teacher, you might reconsider.)

Now, I can't manage to occupy the headspace that would dismiss an entire medium; it's like saying "movies are bad." I haven't seen every movie, so I can't say what all movies are like. You can certainly say "I don't like movies," and while I would suggest that you try a few more, it's your prerogative. And you can say "most movies don't really rise to the level of being good art." I've said the same about video games, in the process of defending them as an art form. But even with those more limited statements, you'd probably want to be vaguely familiar with the landscape. I mean, I wouldn't want to say that most movies are bad if I hadn't seen more than a dozen in the past year.

So my suggestion is that if you can't name five contemporary poets, if you haven't read a collection published in the last five years, if there was no poem that could have been read that would have made you happy, if you don't have favorite poets-- you probably should just stick to "I didn't like that poem" or "I don't enjoy poetry." In that case, jam on; it takes all kinds. It's okay not to like things. The universalizing impulse is a problem, though, and it is likely generated not by an honest reaction but by the desire to be seen in a certain way by your peers. I would argue that if you know nothing about poetry and say that it's all bad, you don't have an opinion, a reaction, or a feeling. You have a signal.

Tuesday, January 22, 2013

oh savage hearts

Here is a post from Rod Dreher in its entirety:


Truly, I tremble at the thought of living in a country that is not above poverty or the death penalty.

As a Christian, I'm sure Dreher is familiar with Matthew 7:3.

I am at your disposal

So I got a (rather breathless) email just now about some argument that's going on in Twitterland that involves something I wrote. The emailer asked for a defense, but I'm afraid I can't provide one because I'm not on Twitter. Which doesn't, of course, mean that people can't complain about me; that's their prerogative. Part of the appeal of Twitter is that it offers a gated community, one in which you are free to make public address but also to utilize the follower system to listen only to people you're interested in talking to. There's nothing sinister about that. Some people use Twitter to engage with all comers, some only with friends, and that's fine. But it does make explicit a certain asymmetry of access.

If anyone wants to actually engage with my ideas, that's what blogs are for. But you have to grant me the laurel of being important enough to talk about in the first place!

Monday, January 21, 2013

MLK and Stonewall are the rejection of gradualism

A brief point, and one that is both exceedingly obvious and routinely ignored.

Today is the day we commemorate Martin Luther King. It also happens to be the day of Obama's second inauguration. (Why we have inaugurations for sitting presidents is beyond me.) During Obama's speech, for which he is receiving the typically polar response, he name-checked the Stonewall riots, Selma, and Seneca Falls. What's worth saying, not so much in regard to Obama but to the liberals who zealously defend him, is that Martin Luther King was adamantly opposed to gradualism, and as Ned Resnikoff pointed out, those events of resistance represent the rejection of the political process due to the urgency of profound oppression. Gradualism has become the cudgel with which liberal Democrats beat left-wing critics, and the partisan political process is advanced not merely as the most important route to change but as the only valid route to change. To ask for change in the face of injustice and suffering is to be called naive and sanctimonious; to advocate resistance that transcends voting once every four years is to be called a traitor. Yet the man whom we celebrate today, and the events referenced by the very president who is defended in those terms, speak to the profound poverty of conscience that resides in the doctrine of the lesser evil.

King once wrote, "Cowardice asks the question, is it safe? Expediency asks the question, is it politic? Vanity asks the question, is it popular? But conscience asks the question, is it right? There comes a time when one must take the position that is neither safe nor politic nor popular, but he must do it because conscience tells him it is right." That was not an idle position. It wasn't a rhetorical flourish. It wasn't some bit of starry idealism mixed in there for the crowds. It was an absolute linchpin of his moral philosophy, an existential attachment, a first principle. It appears in his published and public work again and again, for those who bother to actually read him, rather than interpreting him as some vague symbol of gravitas, robbed of his anger and his particularity. There can be so much disrespect hidden inside reverence. Well, many who insist on the doctrine of the lesser evil in all times and against all conscience are no doubt celebrating King today, just as they celebrate the memory of Stonewall, a wild and uncompromising expression of righteous anger and the rejection of "just a little bit better." How do these proud gradualists defend this tension, of taking as inspiration the people and movements who explicitly and angrily rejected the slow compromise the gradualists prefer? They don't. They never raise the people or the ideas to the level of critical discrimination, instead treating them as empty placeholders, as effigies to vague ideals. The right to never settle the contradictions within your own worldview-- such is the privilege of the status quo.

I'm told that Obama gave a liberal's speech. I now want him to govern like one. To get there, we are required to criticize him, as the most basic principles of democracy insist we do. With no elections left to win, will the pro-Obama liberals participate in that critical engagement, or will they continue to deny the validity of conscience, in defiance of the principles which have been celebrated today?

and now it's time to pay these guys

I mentioned something similar recently, but it's a point I think needs to be made again. If you'll forgive me, it gladdens the heart to see Jacobin and its staff getting love from the national press. It's hard to express how unlikely it would have seemed back in, say, 2004-- or 2008-- that genuinely leftist publications could get this kind of attention. I am particularly glad that the Times piece includes so many of the worthy people working at Jacobin, and the name-checking of Doug Henwood makes me a happy boy indeed.

But, look-- attention is not enough. It's time for all of this press heat to result in some funding. New York is a brutally expensive city, and you can't eat buzz. So it's time for people who admire Jacobin to kick in financially. Neither a Jacobin nor a New Inquiry subscription is expensive. (I'm a TNI subscriber myself, and I plan to add Jacobin next paycheck.) Left-wing political media is always going to be a hand-to-mouth existence, and that's okay. But the rent has to get paid.

So please: subscribe here.

(ps: you are allowed to make the "Marxists asking for money herp de derp!" complaint if and when we have the redistributive programs necessary to give them shelter, food, clothing, transportation, and medical care. Until then, shhhhhhhhhh.)

Sunday, January 20, 2013

I just fucked up in trying to prove that I didn't fuck up

So I got an email a minute ago from Alexis Madrigal, accusing me of deleting a comment he put up. I never, ever do this. Ever. You can ask anybody who has read my blog for a long time. And there's been some vicious shit written about me in the comments here.

So I dove into the post in question and his comment was there. I'm assuming he was just looking at the wrong post, as there were two about him back to back. Here are the comments that were there (click to embiggen):


That's what I saw.

Now, since I take this kind of accusation very seriously-- because I'm a big boy, I like to fight, and I believe really strongly in linking to criticism of me-- I went into the comments interface after I emailed him back and tried to monkey with a few things to make sure that he didn't have any comments in the spam folder. In the process, I just deleted the top 50 comments in the folder-- which included his comment, and a bunch of other real comments, and now I can't get them back. Which, I admit, does not look good. I will say that the comments deleted included a couple of other ones that were in fact critical of Madrigal. Luckily I took the screen cap of his, so here it is. But the ones on the post before it, and any others written in the last day or two, are gone.

So, yeah. I didn't delete his comment in the first place, but now I've accidentally deleted it in making sure that I didn't exclude anybody's comments, and on Blogger there's no way to fix it. I have a pretty absolute policy with this stuff: unless it's spam or out-and-out bigotry-- there's an anti-Semitic troll who has been popping up who has been a pain in the ass-- I don't delete anything. And I usually link to criticism of me. In conclusion, I don't know how to use the interface of the blogging platform I use.

Update: Madrigal reports that he actually left two comments. It's very weird that one of them was lost, and I am sorry for it, and for the commenters who have lost a couple now.

narrative delusions

I opened the Sunday New York Times magazine this morning and read this from Luke Mogelson:
A truck pulled up, and Lt. Col. Mohammad Daowood, the battalion commander, stepped out. Everyone waited to see what he would do. Daowood is a man alive to his environment and adept at adjusting his behavior by severe or subtle degrees. He can transform, instantaneously, from empathetic ally to vicious disciplinarian. To be with him is to be in constant suspense over the direction of his mood. At the same time, there is a calculation to his temper. You feel it is always deliberately, never capriciously, employed. This only adds to his authority and makes it impossible to imagine him in a situation of which he is not the master. A flicker of recognition in the deranged man’s eyes suggested that he intuited this. He approached Daowood almost bashfully; only as he closed within striking range did he seem to regain his lunatic energy, emitting a low, threatening moan. We waited for Daowood to hit him. Instead, Daowood began to clap and sing. Instantly, the man’s face reorganized itself. Tearful indignation became pure, childish joy. He started to dance.

This continued for a surprisingly long time. The commander clapping and singing. The deranged man lost in a kind of ecstatic, whirling performance, waving his prayer cap in the air, stamping his feet. When at last Daowood stopped, the man was his. He stood there — breathless and obsequious — waiting for what came next. 
 Tell me: what portion of this is actually observable? What percentage of what is written here is something that the journalist in question could prove if he were forced to? Very, very little. Please tell me how you would verify that someone is "alive to his environment," or when he has "transform[ed], instantaneously, from empathetic ally to vicious disciplinarian." I can imagine how that might be expressed in behavior. But you'd have to actually tell me. How you could tell that there is a calculation to someone's temper from the outside, I'll never know. "You feel it is always deliberately, never capriciously, employed. This only adds to his authority and makes it impossible to imagine him in a situation of which he is not the master." Who is "you," here? Is it really impossible to imagine that for everyone?

"A flicker of recognition in the deranged man’s eyes suggested that he intuited this." Bullshit. Bullshit. Even if I thought that a "flicker of recognition" was something that someone could actually observe, it would never in a million years suggest that someone has intuited that another man is calculating in his temper. Never. "The man was his." Ah. And you know that... how? Because he pointed to where the Taliban were, across the valley in some vague sense? How do you know that the colonel here wasn't getting played? That this wasn't some elaborate performance? Because of flickers in the eye and vague, totally unsubstantiated projections? The whole piece is constructed of similarly contrived accounts of feelings and conjecture, delivered in the language of journalistic authority. That's without even getting into the aesthetic horror of saying that someone's face reorganized itself.

This story has all of the hallmarks of good investigative journalism. It's published in one of our most high-profile and respected publications. It involves a journalist gathering facts on the ground in a war-torn and dangerous place. It must have taken hundreds of hours to research, organize, write, and edit. The reporter involved must have risked a great deal in writing the story. But this is shitty, shitty journalism, and indicative of a broad problem within our press corps, the popular tendency to portray what others are thinking or feeling, to hang whole pieces on vague notions of someone's character, aura, or ethos, and to generally express as fact ideas that could not be verified under the best circumstances. And given that this is the Times, I can only imagine that a small army of editors and fact checkers read it all and let it stand.

Easily the most dangerous example of this is the New Yorker piece on the killing of Osama bin Laden by Nicholas Schmidle, which is absolutely chock-full of the impressions, emotions, and thoughts of people whom Schmidle never spoke to. I say dangerous because, of course, we have no official accounting of that hugely important event, only the censored and controlled accounts from the Obama administration and the deeply compromised Zero Dark Thirty. Journalists build our history, and they have a profound duty to deliver the facts in a way that is subject to verification, substantiation, and review. I understand the merits of New Journalism and creative nonfiction and the like. But I also understand that depending so heavily on the perception of another person's feelings, or on references to character that have no expression in particular observable behaviors, is profoundly dangerous. And I'm afraid far too many readers will simply swallow it whole.

Update: To be clear, when I call it shitty journalism, I'm not referring to the whole story. I'm speaking specifically about how often the reporter makes claims that couldn't possibly be verified.

Saturday, January 19, 2013

in trouble again

Well, I'm told I was too mean to Alexis Madrigal, and it confused my point, and it looks like that's right. I'm sorry for that. I do think that my basic point holds: despite Madrigal's considerable derision for the idea of Internet brands, the magazine he works for is one of the clearest examples of the branded Internet I can think of. Not merely because of The Atlantic's financial success, and not merely because their job listings suggest a company that is interested in creating brand identities, but because they have worked so hard to sell these Ideas Festivals (urge to flame... rising) and Food Summits and such as an extension of the character of the magazine. Part of the problem with the constant drip of "What is the internet now?" pieces is that they tend to work at more of an abstract remove than they need to. The popular idea that everyone is getting their content from Facebook and Twitter links, and not from consistently (or "mindlessly," as Madrigal says) going to the same websites, just does not seem right to me. I know many people who check Slate and The Atlantic and Salon, etc. every day. Yes, Facebook and Twitter supply necessary links to the outer internet, but people still seem loyal to particular aggregators and content generators. You can define a certain ethos for any of those websites I just named. That's branding.

Here on my own blog, I have a small core of consistent readers, and then big spikes in traffic when I get links or discussion from big sites or aggregators. I think that's how the Internet functions for a lot of people-- the daily checklist of sites, and then a lot of links from places that you'll only read when someone links to them for you.

Also: I really like Harper's, and I think gratuitous swipes at it from people at a publication far more willing to accept the petty corruptions that monetizing for the Internet entails are gross and unnecessary.

That dolphin is actually cool, though, dammit.

Friday, January 18, 2013

Alexis Madrigal is peddling bullshit once again (and other wisdom from the unbranded Internet)

Writing for a professional publication, particularly a self-fellating august organization like The Atlantic, has both benefits and drawbacks. The benefit is that you can be paid for rehashing the conventional wisdom and calling it contrarian, trolling feminism, making embarrassingly false claims on issues of national security, and, if you're Alexis Madrigal, alternating between talking about how great life will be (for you and your whitebread upper class cohort, natch) when the kewl technologiez get here and whining about how technologiez are letting you down. You can also get paid to write intelligent and useful commentary. But we were talking about Alexis Madrigal. (zing!)


The downside to writing for this kind of magazine is that you're meant to take yourself very, very seriously. (In general, there is an inverse relationship between the quality of a given writer for The Atlantic and how often they write about how important the magazine is.) That means that you can't just go full flame war on people even when you really want to. So check out this piece from Madrigal on Rick MacArthur of Harper's. He dresses it up with some airy talk about search and the Internet-- as in all of his pieces, Madrigal explicitly tells you that he's smarter than someone else, an important activity when your writing itself is incapable of getting that job done-- but let's get real here: what Madrigal wants is to write a hit piece. The problem is that he's in such a self-serious venue that he can't actually just go ahead and do it. Worse for Madrigal is that the club he wants to beat MacArthur with is the use of branding on the Internet, saying "For just about every person, the Internet is not content brands that they return to mindlessly day after day."

Coming from The Atlantic, and particularly from someone who won't stop writing about how very important the magazine he writes for is, this is utterly shameless. The Atlantic is an institution that never stops branding! What does Madrigal think those "Ideas Festival" circle jerks are for? Does he think that people would overpay so wildly to attend a "Food Summit" if it wasn't tied to the carefully stage-managed gravitas that the publication endlessly pushes? On their website you're never far from someone telling you about how fortunate you are to be reading it. Madrigal's complaint is like saying that elf labor is inefficient when you work at Santa's workshop.

But don't take my word for it. How does The Atlantic actually feel about its own branding and the Internet? Let's check their job listings!


Now, I'm neither a Thought Leader nor part of the new breed of global business executive that mistakes terms like "start-up mentality" for actual content, rather than the kind of language stupid people use to sound smart. But it sure looks to this poor country boy like The Atlantic is peddling exactly the kind of managed, branded, controlled, top-down, aristocratic, commerce-first vision of the Internet that Madrigal mocks MacArthur for pursuing. The company has been very successful-- I know because they never stop saying so-- and that's because of its branding. I can only conclude that he is taking up this line out of petty resentment at another magazine, one that values depth over hype, the arduous work of gaining knowledge over the facile assembly of glib narratives, the responsibility of adult discrimination over the cheap thrills of fanboy fawning and reckless optimism.

Madrigal is not wrong to celebrate the people-powered, brand-free side of the Internet. It's just that The Atlantic is truly the furthest thing possible from that. The unbranded Internet looks like this blog, not like The Atlantic, and as I will inevitably be scolded for writing a post as unserious as this one, you can see why it might be falling out of fashion. Madrigal is celebrating an Internet that his own company is slowly choking to death. He certainly is engaged in that process himself. I mean, I certainly hope that Madrigal is pursuing some sort of editorial directive when he posts about "The Coolest Looking Dolphin." (Otherwise, I might start to suspect that he's a shallow person.) And while Facebook and Google might not be content providers, they are most certainly brands, and the notion that they represent some sort of untamed vehicle of "human intelligence" is techno-utopian woowoo bullshit. But then, that's Madrigal's brand. Isn't it?

some links and such

  • I was very happy to see a very incisive take on my post about singular "their" over at the Economist's language blog Johnson (as in Samuel Johnson), by R.L.G. For the prosecution, we have Jen Doll at The Atlantic Wire, with an entertaining argument for the other side.
  • Gawker demonstrates again why, despite the constant frustrations and obsession with juvenile nonsense, I can't write it off. Too much good media criticism, like this piece from Mobutu putting the Scientology "advertorial" and Manti Te'o hoax into the broader context of our media.
  • Bhaskar Sunkara has a typically intelligent take on the rise of deracinated technocracy on the left (or the "left"). I wrote about the folly of trying to get outside of politics and into "pure empiricism" a couple months ago.
  • This Atlantic piece by Lindsay Abrams on my hometown, and two events that have shaped it within my lifetime, was perceptive, well-written, and fair.
  • While Myspace has become synonymous with failure and obsolescence, I have to say, the new format is truly beautiful web design.

Thursday, January 17, 2013

they seem to know where they are going, the ones who walk away from Omelas


what are the rights of the disfavored?

Suppose, for a moment, that you're a supporter of the side in a civil conflict that is disfavored by the American foreign policy apparatus. Suppose, in other words, that you are a Malian who supports the insurgency, or a Syrian who supports Assad. Do you have rights? Do you have a right to provide political support for the side that you feel best represents the interests of your country, despite the preferences of the American government? I'm not talking about people actually involved in fighting, members of resistance movements or governments. I'm talking about people who have a preference that cuts against the dictates of America and NATO. Should you have the right to feel that way?

This is not an idle concern. Despite what you have heard in the American media, many millions of Syrians support the Assad regime. Their ranks include Alawites, Christians, secularists of many stripes, and a good chunk of Syrians who simply prefer the status quo. Do I understand supporting the Assad regime? I confess that I don't. But then, I'm not a Syrian, and the fundamental principle of democracy is that such supporters have a right to that stance irrespective of my disbelief. Western intervention robs them completely of their right to advocate for their political preference. Worse, it exposes them to the threat of violence for holding the political views they do, as every intervention inevitably results in reprisals against those who backed the wrong side-- it happened in Kosovo, in Iraq, in Libya. As Iraq proves, the self-same Western powers that can remove or defend an establishment government can't prevent mass murder of the losing side.

Last year, Vice interviewed a few Syrian skeptics of the insurgency. They rightly question the absurdly distorted Western media narrative, one which has left the average reader completely unaware that opposition to the rebels exists outside of the Syrian government at all. Says a man named Wafa-- who would not consent to having his picture published, on the sensible logic that he would be killed in retaliation for his views-- "Those 'rebels' killed six members of my family and we're not allowed to be mad at them. We're not denying the fact that Syria is a dictatorship or that the regime is far from democratic, but we don’t think that the rebels will ensure a better future for Syria." Given the influence of the Muslim Brotherhood in the Syrian uprising, and the various sectarian struggles that are hidden within it, well. Such caution is understandable, wouldn't you say?

What interventionists in America believe is that what this person says is not merely unconvincing. They believe that he must be written out of the process entirely, that his voice must be totally removed from the future of Syria. In its place stands... us. The benevolent Americans must dictate terms once again to the rest of the world. I said before that I can't understand supporting the Assad regime. But I also know that the opinion of an American in Indiana is totally irrelevant to the question, on any theory of democracy whatsoever. And yet perversely, in due time the opinions of an American voter like me may make more of a difference for the future of Syria than the opinion of Syrians. That supporters of intervention don't see the profound failure of such a situation speaks volumes.

I often think of what it must be like to live in a part of the world where your future is dictated by the whims of the American government; the imagination fails. That the US supports the government in Mali and the rebels in Syria is a trick of history. It could easily be reversed, and in that reversal hangs the balance of countless lives. To oppose a repressive government is to risk death; to support it is to risk death when the insurgency comes. Those are the stakes for people like Wafa: literal life and death. If you are, say, Anne-Marie Slaughter, you believe not just in your own wisdom and benevolence, but that they are so pure that they allow you to dictate who has a voice and who doesn't, who lives and who dies. If you support intervention, you are obligated to answer the question: what rights do such people have? And what principle permits you to curtail those rights with force?

Oh, and-- when disfavored groups actually win elections, well, we know how that goes.

"liberal interventionists" care about establishment governments except when they don't

A typically charming Twitter exchange:



You see, the requests of status quo governments matter when they are fighting a civil war against an Islamic insurgency-- unless those status quo governments happen to be those of Syria or Libya. Then, all that matters is freedooooooooooooooooooooooom.

This Malian-Libyan dichotomy is like something cooked up in a lab to demonstrate the pure hypocrisy and utter lack of consistency of the liberal interventionist worldview. The Libyan and Syrian oppositions both have plainly Islamic characters, and if you're under the impression that the current Malian government is some beacon of Western values in the Saharan world... you should probably reconsider. Not that any of that reconsideration would get you to the truth, exactly; I'm sure your average Malian can't possibly possess the knowledge necessary to really understand the complex and shifting factions, motives, and dynamics of a fluid conflict in their own country. But I'm sure it's possible for Western "policy analysts" who don't speak the language and have never lived in the country to understand what's happening and know what will happen next-- like they did with Iraq, or with reinstalling the Shah, or supporting the Suharto government, or arming the Mujahideen. They always know everything.

So why are we duty-bound to intervene on behalf of the Malian government and also duty-bound to intervene on behalf of the Syrian opposition, just as we were duty-bound to intervene on behalf of the Libyan opposition? The same reason we do anything in foreign policy: the influence of opportunistic ideologues, war profiteers and resource extractors, manipulable idealists, and career militarists, mixing together into a poisonous brew. So we'll contribute to the breaking of Mali like we helped to break Libya, and once it's broken we'll take responsibility for nothing, and then Mali's neighbors will be left to deal with the consequences of the brokenness, just as Mali is dealing with the consequences of a broken Libya now. But those are the wages of a child's vision of goodies and baddies, applied without consistency or self-criticism through the deployment of ordnance.

But, you know, Greenwald and Friedersdorf are shrill, and they don't meet at the press club for cocktails or tell funny jokes about Game of Thrones on their Twitter feeds, so feel free to ignore them.

Wednesday, January 16, 2013

I hate to play to my image, but...

I take a lot of flak for being cranky and judgmental. This certainly won't help, but: if you're seriously saying that a prominent college football linebacker having an imaginary girlfriend is the "story of the year" on the same day it has become clear that we're getting drawn into another vicious conflict in the Muslim world-- one that suggests we'll be pulled deeper and deeper into African geopolitics, and one that is directly influenced by our prior intervention in another civil war-- you're a fucking imbecile and the reason our country is broken.

Pardon me for being a scold. I just take the projection of American military force seriously, you know?

Tuesday, January 15, 2013

a handy guide to the use of "we"

Writing is complicated! In the scrum of trying to express your inner thoughts in print, it's easy to lose sight of details like "am I one person or more than one person?" These days, I often see writers using "we" or "us" to refer to themselves, which is problematic, on account of there is only one of them and all. To prevent this all-too-understandable error, I've prepared a handy guide for when you can use "we" to refer to yourself.

1. Is more than one of you writing the sentence in question? Is it a collaborative work, authored by more than one person? Are you in some sense speaking for a group which was involved in the production of that text, or the project which that text describes?
2. Is your text a narrative or story, written in first person, in which events occurred to both the narrator and others?
3. Are you a monarch, regent, or similar figure of aristocratic leadership in a tradition where the ruler is understood to embody the state?
4. Are you an insufferable twat?

If the answer to any of the above is "yes," congratulations! You may refer to yourself with "we." If the answer is no, stick to "I." Alright, che?

Monday, January 14, 2013

due credit

Scott Lemieux of Lawyers Guns & Money has published an important and troubling consideration of the Brennan nomination and its larger consequences for the Obama administration. It's exactly the kind of critical take on the Obama White House from liberal Democrats that we need to see more of.

Sunday, January 13, 2013

singular "their" and the grammar wars

English is, in many ways, a deeply strange language. One of its more overtly unusual features is its profound lack of inflection compared to other languages; that is, English lacks many of the morphological features that convey information about person, gender, and number. So our verbs, for example, inflect for number only in the third person present tense-- I walk, you walk, they walk, but he or she walks. Contrast this with a language like Italian; part of the reason an Italian poet like Dante could write a three-volume epic poem in terza rima is that Italian morphology adds suffixes with similar endings far more often than English does. There are languages with even less morphology than English; most Chinese languages and dialects are almost entirely morphologically inert. At the other extreme is a language like Latin, which has long terrorized high school students with its absurd number of cases, declensions, moods....

English's lack of inflectional morphology has consequences. One of them is the attendant need for a fairly rigid syntax-- the structural positioning of morphemes within a sentence-- to convey the information needed to decode the sentence. If you're writing a poem in Latin, you have remarkable freedom to move words around within a sentence, as (usually) the inflections carry the information necessary to determine the relationships among the words. Case markings, for example, denote subject-object relationships. But in English, we only have case markings for pronouns (which here I will use to refer to personal pronouns)-- he/him, she/her, I/me. (In fact, English once had a broader case system, but it was lost in the transition from Old English to Middle English; a vestigial example is found in the slowly dying "whom.") We also have gender only in singular pronouns, and not in our verbs or in nouns that aren't semantically assigned to one gender or sex. (By semantic gender I mean gender that refers to actual gender or sex in the real world, as opposed to purely linguistic gender, as in the assignment of gender to inanimate objects in languages like Spanish.)

Plural pronouns lack gender inflection, given that any group referred to might include people of both sexes. (Although note that there's no reason we couldn't have a plural pronoun that referred only to a specific sex or gender.) You/your, we/our, they/their-- each is plural and gender indeterminate. Gendered possessives are found only in the singular-- he/his, she/her. We can therefore say "every girl threw her ball" or "every boy baked his cake" without trouble. But what about possessives that must cover both genders? What do we make of a sentence like "Every child lost their marbles"? It's a question I considered for my semester project in a class on generative grammars and minimalist syntax last semester. I'm not a linguist myself, though I am in a related field, so please take this post with the requisite grains of salt. Here's what I found.

Saturday, January 12, 2013

Reactionary Minds in antiquity

I've been reading a lot about the Sophists lately, part of the rich tradition of ancient Greek thought that is unhelpfully and unfortunately lumped into the term "pre-Socratics." What's remarkable is how well some of the dynamics of ancient Athens fit with the Corey Robin thesis. Although we are more likely to identify them with their particular intellectual tradition, the Sophists were, generally speaking, wandering teachers-- itinerants trained in the arts of rhetoric, poetics, and logic who would instruct whoever could pay their fee. In traditional Athenian society, education was provided for young men across social classes, but advanced education was reserved for the upper classes, who had the resources and connections necessary to pair their young men off individually with older mentors who would guide them through their education. The Sophists democratized education, and for this reason they were feared by the Athenian ruling class, who at times acted to forbid Sophist teaching.

It wasn't merely the fact of teaching for those outside the aristocracy but also the content and purpose of such teaching that undermined the established order. As I.F. Stone puts it in The Trial of Socrates,
There is a strong element of class prejudice in the Socratic animosity towards the Sophists. They were teachers who found their markets in democratic cities like Athens among a rising middle class.... They wanted to be able to challenge the old landed aristocracy for leadership by learning the arts of rhetoric and logic so they could speak effectively in the assembly.... [H]igher education remained the monopoly of the aristocracy until the Sophists came along. They provoked upper-class antagonism by teaching the arts of rhetoric-- for an ability to speak well in public was the open door to middle-class political participation in the debates of the assembly and the higher offices of the city.
The Sophists also threatened the established order through the radicalism of their teachings. Although ancient Athenian religion was never as prescriptive or rigid as the monotheistic religions tend to be, the profound agnosticism found among major Sophistic thinkers like Protagoras must have rankled Athenian society. The Sophists also posed radical critiques of the meaning of knowledge and the existence of truth, topics which provoke reactionary responses even today. Alcidamas of Elaea expressed the earliest known condemnation of slavery in human history, in a time when slavery was an essential part of the economy and the social structure. In every sense, the Sophists represented an intellectual tradition that challenged the status quo.

It's therefore no surprise that they have been shrouded in mistrust and dismissal by history. For centuries, the Sophists were dismissed as deceitful and illogical. Even today, "sophistry" is a term that refers to weaselly, conniving discourse. Rhetoric itself labors under an assumed worldview that distrusts it, on the inherently reactionary theory that the truth can be articulated plainly without the need for ornamentation. It's no accident that this long tradition of disrespect stands in contrast to the reverence for Socrates and Plato-- two explicitly anti-egalitarian thinkers who expressed a longing for authoritarian power and a distrust of the common man. Those in power write the history, and it's up to us to recognize, even 2,500 years later, those whose arguments were belittled and misconstrued because they challenged the power structure of their day. And it behooves us to recognize one of the constants of human history: the power and persistence of class struggle.

so strange

Timothy B. Lee has written a remembrance of Aaron Swartz that contains a constant drip of bizarre, out-of-context American chauvinism, suggesting that Swartz's activism and curiosity are less indicative of his character and upbringing and more a consequence of his country of origin. It's a shame, because Lee's post hides a far more humane, far better point about genuine disobedience, one that's lost in the useless consideration of the fiction that is Americanness. It's like he can't help himself. To be fair, Lee is playing off of Paul Graham, but the whole enterprise is just profoundly strange. Lee equates Americanness with unruly experimentation, apparently having never heard of Alan Turing or Niels Bohr or Yoshihiro Yakamatsu, or noticed that cultures like those of France or the Netherlands have a far deeper history of tolerance for actual disruption-- as opposed to the kind that only serves capitalism-- than the United States does. He is also apparently unaware that what Silicon Valley is most interested in producing is new ways to send people photos of your junk and social networks for cats.

Worse than the ugliness of the American chauvinism is the simple failure to make the argument he wants to make. I see absolutely no reason to think that Swartz would not have been the person he was had he been born in another country; no reason to believe that he would have been less likely to have been prosecuted in an earlier decade; and no reason to believe that this whole awful scenario says anything about some special Americanness and its lack among those in other countries. The whole piece uses the pitched emotions inspired by Swartz's death to obscure the fundamental failures of Lee's argument. Lee would, no doubt, admit that there are brilliant, creative, and unruly thinkers the world over. So what provokes this sort of thing? He's writing this at a time of profound American failure-- military, economic, political. At such times, arguments inevitably sprout up to assert American superiority. I think it's a straightforward example of the patriotism of anxiety.

I cannot imagine occupying the mindset that responds to this tragedy by looking for an excuse to engage in nationalism. It's particularly unfortunate because so much of the ideology of digital activism and freedom of information that Swartz represented is explicitly, defiantly international, with no use for patriotism or borders. Indeed, few things could be less patriotic than disobedience; invocations of country are, after all, a mechanism of control.

academics want their work to be available

This is a minor point in the face of a tragedy, but I think it's worth making.

Aaron Swartz, the young information rights activist who was indicted for downloading millions of academic journal articles from MIT's network, has committed suicide. The charges against him were, frankly, insane. Yes, it was a mistake to break into a networking closet and access MIT's network illegally-- the kind of mistake that should get you, say, a fine, community service, and probation. The feds were apparently out to make an example of Swartz at a time when they are under great pressure from media companies to enforce IP laws. It's impossible to say if his suicide was a direct result of his prosecution; mental health is enormously complicated and does not operate on a simplistic system of cause and effect. But it would be absurd not to assume that the prospect of years and years in prison weighed on his mental well-being at the time of his suicide.

Here's the point I want to make about journal archive access: I don't know a single academic who is opposed to open and free access to their work. And I know more than my fair share of academics. My father was a professor and his father was a professor, my family's social circle growing up was full of academics, I am a grad student who maintains friendships and connections with other grad students and professors at many universities, and I spend an awful lot of time talking about the university and its culture. And I have brought up the question of gated journal access constantly, because it's a subject of considerable interest to me. I know that this isn't a very rigorous standard of evidence, but my own experience is all I've got. I have never talked to anyone-- arts, professional schools, humanities, social sciences, or STEM-- who was opposed in theory to the idea of free access. You've got to do something to rebuild the revenue streams of the academic journals, many of which operate at a loss already. But as a principle, giving people free access to journal articles is as close to a universal stance as I can think of among academics. Why wouldn't it be? Researchers believe that their research has value, that it matters, and they want it to be read.

It's important to say: I am very far from a piracy apologist or advocate for totally free media. "Information wants to be free" is an entirely empty statement, an attempt to use a profound-sounding aphorism in the place of actual intellectual work. In my experience, those who advocate the freedom to pirate simply want whatever they want, whenever they want it, at no cost. That's not an adult stance, and to date I have never heard a piracy advocate articulate a system that would achieve that universal free access while still making the media we love practically possible, to say nothing of compensating creators for their talent and their work. I am not someone inclined to an "anything goes" attitude towards intellectual property. But this prosecution was ridiculous, this outcome tragic, and this restriction on the free dissemination of academic articles an affront not just to the ideals of scholarship but to the actual desires of most academics. Who was the government protecting in this prosecution? Who was it for?

against critical shorthand



I'm sure you're aware of the Manic Pixie Dream Girl, a trope identified and skewered by Nathan Rabin of the AV Club. I say that because it has since sprouted like mushrooms across the Internet. I think Rabin was identifying a real phenomenon with real problems, but I also think it was applicable to a limited number of movies, and the term long ago outlived its usefulness. Now, when I see it, it's typically employed as a kind of meaningless and mindless dismissal, a piece of vague snark relying on borrowed cleverness. When I saw someone refer to Annie Hall as an MPDG, I wanted to throw something through a window. Any term that can equate the empty shell that is Natalie Portman's character in Garden State with Diane Keaton's in Annie Hall is useless.

The problem, I think, is critical shorthand-- when terms or tropes are used as a way to avoid doing the work of careful, specific criticism. Effective and fair criticism always proceeds from making as sympathetic an interpretation as possible before recounting flaws. Such an accounting can't happen if criticism is expressed in a predigested idiom, especially if it's one that is explicitly mocking and reductive. Being a critic entails, to me, a lot of responsibility, a lot of integrity. And since every piece of art is unique, each deserves the respect of a unique appraisal.

I bring this up because of a discussion in Slate's annual Movie Club, the increasingly obnoxious roundtable of movie reviewers that the website runs each January. (This year, it's Dana Stevens, Stephanie Zacharek, Wesley Morris, and Keith Phipps.) The reviewers, in seeking to dismiss Beasts of the Southern Wild, repeatedly refer to it as "like a graduate thesis" or similar. And let me say: this means nothing. It contains no content. It's simply a way for the reviewers to pose as superior to what they're discussing. It's a perfect example of the preference for saying something that sounds clever rather than something revealing or insightful. What does that actually mean, "like a graduate thesis"? I have some idea, but some idea is not enough. It is the business of writers to express themselves, to make their intentions plain, to fill in those gaps. Ezra Pound said that writers should go in fear of abstraction. You can take that as advice, but you can also take it as an admonishment. This is what I mean by critical shorthand: a critique that has become a mere device.

I believe, very strongly, in the power and value of film criticism, including negative criticism. That's true when movies make arguments that are wrong, or when they insult or condescend to their audience. It's also true when negative criticism helps us understand movies in a deeper way. I wanted very much to love Super 8, and I didn't, and I struggled to understand why. Between the two of them, Devin Faraci and Film Crit Hulk of Badass Digest explained to me why I didn't like that movie more, and that has helped me appreciate movies I do love in a deeper way. It's not a question of severity, but a question of rigor, of critical integrity. As someone who thinks that art has a moral purpose and should be taken seriously in every sense, I have no problem at all with harsh criticism. What I have a problem with is lazy criticism, cheap criticism.

Look, I'm not going to pretend neutrality here: I think Beasts is a beautiful movie, wonderfully alive, wise, and unafraid. And what it's unafraid of is exactly the dismissal of critics like these. American movie criticism is, to my mind, deeply unhealthy, and that unhealthiness stems from a profoundly defensive stance, a deep fear on the part of critics that they are somehow being tricked, that they are being hoodwinked. I saw the movie at a special screening at Wesleyan, and while I loved it, I said to myself, "this is a movie that will certainly provoke a backlash." And that backlash has come because the movie actually tries something different, something that takes risks. Critics are endlessly knowing about Oscar bait, but terribly naive about critic bait, about how they themselves are manipulable. Manohla Dargis once wrote about the safest way to ensure critical respect:
American cinema is in the grip of a kind of moribund academicism, which helps explain why a fastidiously polished film like “No Country for Old Men” can receive such gushing praise from critics. “Southland Tales” isn’t as smooth and tightly tuned as “No Country,” a film I admire with few reservations. Even so, I would rather watch a young filmmaker like Mr. Kelly reach beyond the obvious, push past his and the audience’s comfort zones, than follow the example of the Coens and elegantly art-direct yet one more murder for your viewing pleasure and mine. Certainly “Southland Tales” has more ideas, visual and intellectual, in a single scene than most American independent films have in their entirety, though that perhaps goes without saying.
That was written more than five years ago, and it remains discouragingly true today. Filmmakers can combine minimalist direction with remorseless sociopaths and abstract scores and endless "artistic" violence, and receive plaudit after plaudit. But true credit belongs to people who stretch not just the easily-parodied dictates of the Oscars but the more subtle myopia of the critical community. It's hard to imagine a movie that better represents the opposite of what Dargis complained about than Beasts of the Southern Wild, which is wild and teeming and loud and passionate. What makes it all so much worse is that Beasts is what critics say they want: something vivid, something daring, something new. If that's what graduate theses are made of, give me many more.

Friday, January 11, 2013

house cleaning

Hey guys, I'm planning on taking the blog in a slightly different direction in the new year, so I'll be playing around with some stuff. I'm updating the template as we speak. I may just end up going back to the same old thing, which has been pretty much inert since 2008 or so. Give me feedback, please! Also, I'm afraid I'm going to have to set up a comment filter/login system if this spam persists. (Or if that weird pro-Zionist/anti-Semite troll persists.) I hate to do it, because I know it's been buggy in the past, and I really believe in the right of people to maintain anonymity online, but every post now seems to have a commenter telling us about this one weird trick they learned to make money playing online Yahtzee.

Anyway, change is afoot, and hopefully in a positive direction. Expect me to be playing around/changing stuff for most of this weekend.

Update: I'm digging this current template, so let me know what you think. The license of the template requires that I maintain the attribution in the footer, which is perfectly fine by me. But does anyone with a little HTML knowledge know how I can get rid of that Twitter and RSS button duo up at the top right? The HTML is available here. Any help would be appreciated.
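If it helps anyone with the same template: the usual low-effort fix is to hide the buttons with CSS rather than dig the widget markup out of the template. Here's a minimal sketch, assuming the buttons sit in a container with a class like social-icons-- a hypothetical name; use your browser's element inspector to find the actual class or id in the template:

/* Hypothetical selector: replace .social-icons with the real class
   or id of the Twitter/RSS button container, then hide it. */
.social-icons {
  display: none !important;
}

Dropping a rule like that into the template's stylesheet (or a custom CSS field, if the platform provides one) should make the buttons disappear while leaving the footer attribution intact. The !important flag is just insurance against the template's own styles winning the specificity fight.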

Update II: Thanks very much to Roger Burgess for the help.

Thursday, January 10, 2013

In the greatest travesty of the 21st century, a pretty white lady is denied a golden trophy

I'm glad the world has people like Scott Mendelson, to tell us who the real victims of the post-9/11 world are: millionaire Hollywood insiders, currently jetting around the world to receive praise, trophies, and fabulous giftbags from other millionaires.

In a completely unrelated story, the United States has stepped up its campaign of aerial terror against the impoverished, powerless people of Afghanistan and Pakistan.

Update: Mendelson has provided a link to a piece he wrote about drones, which is important for context and fairness.

Wednesday, January 9, 2013

more reporting, less generalism, more beats, less objectivity

I thought this post at Naked Capitalism did a good job of pushing back against the kind of blog pessimism that I myself regularly partake in. It shows how the platinum coin idea-- on which I am agnostic, other than to point out that some of the outrage is funny, given that all human perceptions of value are made up and socially conceived-- began on blogs and spread into the national media consciousness over time. Very cool data presentation and tools in that post. For all of my pessimism I still think that there's a lot of room for smart political journalism.

How to do it, if you're a young writer on the come up? I don't have any advice for how to make money, but I do have a few ideas about what would be valuable.

1. More reporting. I know this gets trotted out a lot, but it happens to be true. There are a ton of people (like me) offering opinions, analysis, commentary, etc. What I think can really distinguish someone coming up is actual reporting-- interviewing, going to the scene of events, etc. Obviously, this is a lot of work, easier said than done, and it still requires contact with old media for practical purposes. There are a lot of places you can't get access to and people who won't bother to talk to you without a media affiliation. But if you want to make a direct impact on the stories that matter, that's the most direct path. Also, reporting doesn't mean giving up on commentary. I would point to Adam Serwer and Dave Weigel as two guys who, from different political backgrounds, do a good job of synthesizing reporting and commentary. Which brings us to point 2.

2. Abandon objectivity, maintain fairness. Again, this is a well-worn idea, but still an important one. Looking at Serwer and Weigel, you're talking about two people who don't make many bones about their ideological commitments. That's not a failure to be appropriately professional as journalists; it's an acknowledgement that they are human beings with human biases and beliefs. As others (such as Jay Rosen) have pointed out repeatedly, the illusion that any reporter can maintain perfect objectivity actually makes bias more dangerous, because it creates a false sense of what the news is and compels journalists to ignore their own natural inclinations in one direction or another. What's important to remember is that you can be evenhanded without being objective; that is, you can do your best to represent the arguments of each side without pretending that you don't have an opinion or that both sides are equally accurate. If you're a journalist, you're gathering facts. If you have a functioning intelligence, you're going to form an opinion about those facts. Don't pretend otherwise; just make sure that you understand and present the strongest possible arguments for the other side.

3. We don't need more generalists, so have a beat. This one does strike me as good advice for employment and professionalization, actually. There are plenty of generalists out there writing about whatever strikes their fancy. (Like me!) To distinguish yourself in a crowded landscape, it helps a lot to have a specific focus and to know what you're talking about. Make sure you have a strong grasp of the basic contours of your beat. Going to school helps a lot, because school (it's true) is still the best way to learn things, eduhacking nonsense aside. Develop contacts within that field. Find fresh angles. If you want to remark on a larger controversy, relate it to your field. Good examples of people who have developed a particular niche and work it consistently are Dana Goldstein, who writes brilliantly on education, and Dave Roberts of Grist, who writes about environmentalism and climate change.

Taken altogether, I guess I'm saying "be like Mike Elk." Elk is a great labor journalist whose work appears in In These Times and The Nation. I don't agree with everything Elk believes, but I am in great agreement with his primary focus, which is unions and labor. Elk is an honest-to-goodness old-school reporter, someone who gets interviews and follows leads and pursues neglected stories. He does it all without hiding his pro-labor sympathies and with an interest in getting the facts right. There aren't nearly enough people on the left doing the kind of work he does; there are lots of opportunities to be had there, although probably not a lot of money.

I am 100% willing to cop to this post being a lot of boilerplate. But I think it's important boilerplate. (And read Mike Elk.)