30 July 2010

29 July 2010

Number One for 29 July 2010

The devil in my pocket turned to gold
Bitter:Sweet, "The Mating Game"

28 July 2010

Number One for 28 July 2010

Wait in line / til your time:
Zero 7, "Waiting Line"

27 July 2010


Thinking about it more, I realize that the problem with Inception is just that it wasn't imaginative enough. More precisely, the nature of the plot prevented the director from fully realizing the unique logic of the dreamscape. We experience dreams as variations on reality, in which the laws of physics and of narrative can be suspended, sometimes at will and sometimes because of exogenous shocks. Inception allowed for only a tiny fraction of that variation to be used. The whole point was that the plot within each level of the dreaming had to make sense, had to be designed in order to make sense, and had to proceed from one starting point to one end point. There was no room for intra-dream switching. The ice world could never shift to become a beach; the city DiCaprio and Cotillard had designed could not become a cottage; and the narrative structure within each dream allowed for no alternate solutions. By structuring the adventure as dependent on an architect's creation of a maze, the plot therefore foreclosed what would have been much more interesting: namely, the experience of goal-driven actors in a completely stochastic environment.

And, of course, it meant that the dream could never become a nightmare.

Enough already

I am only a Level 2 Mac fanboy; the Higher Mysteries of the RDF have not been revealed to me. Like a lot of people, I switched because of the Apple design concept, and like many people I was introduced to its pleasures by the iPod. (This is the model that I thought was so brilliant, back in 2004.)

It strikes me that the essence of the iDesign philosophy is metadictatorship. Steve Jobs does not control what I do to my iDevices, but he does control how I can use or change them. Thus, I can customize, but only to a certain point. This chafes people who believe they are better computer programmers and designers than those in Apple's employ, including even a few people who actually are better computer programmers. For the rest of us, complaining that our phones aren't jailbroken is like complaining that we can't replace the engine in our Prius.

All of this is just a prelude to say that I have been spoiled by this benevolent dictatorship. I noticed this morning when I went to the New York Times, as I have been doing now for fourteen years, that the site is awful. Beyond awful. There are 1903 words on the home page.

Nineteen hundred and three words.

I copied and pasted the page into Word. It was 16 pages long.

Some of that is page formatting. But most of it is cruft. Look at your NYT app on your iPhone and then look at NYTimes.com. Design matters.

Creative Commons image by Incase.

Number One for 27 July 2010

Got a counterfeit dollar in his hand:

Stevie Wonder, "Misstra Know-It-All"

26 July 2010

The Idea of a Midwestern University

I am spending the summer in Ann Arbor. This may be the best, or at least the second-best, summer of my adult life. I am taking classes, but will not be graded. I am reading, but at my own direction. I am cloistered, but part of a community. I am, in short, having the experience I thought grad school proper would be: private, intense, and liberating.

One reason for my contentment is the setting. It has been seven years since I spent time in one of the large Midwestern universities, and I realize now how much I had missed the environment. The urban campus I attend now has its brief moments of beauty, but they are pockets amidst a jumbled campus whose architectural incoherence is testament to the poor financial planning of previous generations of administrators. It took a lot for universities to miss out on WPA funds for new construction, but somehow the old priests managed the trick. Their successors during the Cold War failed to acquire the American talent for wealth creation but learned architecture from the Soviets. At least they had the good fortune to have inherited a stately nineteenth-century quadrangle; in Dublin, at John Henry Newman's university, the new campus in the suburbs was built from the ground up by Brutalists.

Number One for 26 July 2010

In life revised you never went away:
The Gregory Brothers, "Summertime"

24 July 2010

Number One for 24-25 July 2010

Another summer's passing by:
Belle and Sebastian, "Asleep on a Sunbeam"

23 July 2010

Cyclicality, again

Yesterday's post didn't end where I thought it would. It got a bit philosophical and mopey. What I'd meant to write was a much more practical piece about how the expectation of cycles constrains and conditions planning in organizational life.

If you live and work in a world with strong cycles, then you have to account for those cycles when planning new activities. Periods of high organizational stress, or periods when high organizational performance is needed, are bad times to focus on secondary matters. That rules out changes to standard operating procedure. Budget bureaus shouldn't undertake sweeping new initiatives at the beginning of a fiscal year, any more than it's a good idea to try out a new quarterback in the postseason.

In academia, the cycles are even faster. There are at least three: the two semesters and the summer. These are layered within the broader cycle of the school year. The separate nature of these cycles combines to make innovation peculiarly difficult in an atmosphere that already makes changes difficult.

I rule out summer, because I address faculty and grad students, not administrators. My hunch is that summer is the right time for redoing administrative procedures, since it is their relatively quiet season. But coordinating academics over the summer adds total impossibility to extreme difficulty.

But the semesters are hardly easier. The first and last weeks of the semester are no good, as is the middle of the semester. High-intensity projects would simply compete with more important responsibilities--and lose. That leaves four windows a year when there is even the possibility of adding new activities.

I have been thinking about this because, obviously, I'm involved with a new group (a workshop on advanced methods). There are many debates involved in founding a new institution, from questions of group behavioral norms (which can be established easily at the beginning, but which are tough to change later) to expectations about individual members' involvement to administrative worries. This last category deserves a post of its own. Drafting constitutions, sorting out financial controls, and settling issues of executive competence versus board oversight are tough, even when the group is relatively small and straightforward. One factor that has to be overcome is that academics usually privilege discussion over experimentation and deliberation over decision. Isonomy is an ideal, but it's a harsh mistress.

The more immediate question we face now is how to keep the group going. There's loads of enthusiasm and the first semester went well, but having a vision for a group means understanding the factors that can sap those traits and lead to a gradual deflation of the popular will that sustains a collectivity and leads to the reproduction of its values and practices. In particular, I wonder if there's a good argument that this group should explicitly take into account the cycles of the semester and academic year in setting its schedule: having exciting but relatively low-work sessions to begin and end the year, while having the most difficult and labor-intensive sessions in November and January. (November, because it's a time when people want to procrastinate during the doldrums between midterms and finals; January, because the midpoint of the year finds most everyone in midseason form.)

Lowering ambitions a bit deflates expectations at the beginning. Adopting a more conservative attitude makes it more likely that the group can achieve the goals it wants to. The greater danger, though, is in allowing enthusiasm to outstrip capabilities and creating a gap between what is achievable and what is expected. Cyclicality encourages conservatism.


Number One for 23 July 2010

It's a godawful small affair:
David Bowie, "Life on Mars?"

22 July 2010

Cyclical time and the academy

Well, here it is, another summer and I'm back in school. There is something odd about being more excited to go to class than going to the beach, but thankfully the adult world is structured so that people who share enthusiasms can congregate.

I wonder sometimes if Americans don't have different connections to the seasons than do other cultures. I wonder this not because I want to posit some uniquely American relationship with fall or with winter, but largely because from age 5 to 18, at least, Americans experience summer as a long, unbroken string of endless days. (There's an entire, and astonishingly subversive, Disney cartoon about this phenomenon.) Other countries generally have a shorter summer break; Americans experience summer as a nice preview of life itself. The summer begins full of promise, ripens even as it sours, and ends in a haze of boredom and anticipation. The metaphor breaks down at that point, though, because the coming of fall heralds both the beginning of a new cycle and a promotion within a nicely hierarchical system. Whereas you were once a lowly second-grader, now you may know the mysteries of Third Grade.

Most people outgrow this cycle and graduate into the Real World. I think, in fact, that the linear nature of the Real World is what people have in mind when they discuss this mythical place. (That, and money.) After all, the stages of adult life are strictly sequential, and I suspect that the cumulative nature of outside relationships begins to overwhelm even the seasonality of jobs like those in retail, fashion, and tax accounting. By contrast, academics repeat the cycle until death or denial of tenure, in increasing order of terror. Each year brings a new crop of students, who are there to be taught, nurtured, tolerated, and finally cast out into the world. We grow older, and they stay the same age.

Cyclicality is probably the calendrical equivalent of folk physics. There's probably a good reason why religions structure themselves around cycles. From one perspective, human life is just the rehearsal of roles defined by forces beyond our comprehension and before our understanding. We think there is something natural and inevitable about cycles that are plainly both artificial and recent. Consider the concepts of childhood, adolescence, and young adulthood, none of which existed in recognizable forms two hundred years ago, and which, until a few decades ago, existed for only a very few people. (I like to look at historical statistics, and I'm always stunned at how recently it was customary to leave school at 13 or 14 and begin working in what were essentially adult occupations.) The persistence of such notions in the face of obvious counter-evidence, and despite changes across roles between generations, is a good sign that we are slotting our observations about life into a preconceived template.

In fact, I can think of only one other tribe of adults who live by as cyclical a calendar as academics (into which category I will admit, for one night only, teachers): politicians. The electoral cycle is slower now than it was in the 19th century, when one- and two-year terms were the norm, but it must feel more hectic than it used to. The principal difference between the electoral cycle and the academic cycle is stark: the participants in only one of them are all but assured that they will be in the same jobs in the next revolution.


Number One for 22 July 2010

Let's all get up and dance to a song:

21 July 2010

Number One for 21 July 2010

Windows rolled down with the heat on high:
  • Dear Pinot Noir: It's not me, it's you. [The Gray Market Report]
  • Bill Murray thought Garfield was a Coen brothers movie [Vulture]
  • Yes: Let's end the American aristocracy. But I'm tired of these weak, Cass Sunstein "nudge"-style policy proposals. How about our progressives propose some real, sanguinary, Bolshevist proposals? [Ta-Nehisi]
  • Suck it, Aaron Friedberg: America didn't become a garrison state because we're too corporate [Who is IOZ, via ZW]
  • Drastic oversimplification: Do Confucians believe in sex? [IPE @ UNC]
  • Jim Vreeland gets an uncredited guest blog [The Guest Blog]
Carrie Underwood, "Get Out of This Town". No, these links aren't designed to prove I have good taste ...

20 July 2010

Quote of the day

From an anonymous commenter on the PoliSciJobRumors Web site:
Stata 11 is of course going to feature the often demanded "figure this shit out" or ftso command. Simply type the command: ftso 'depvar' and it will give you the results you need in order to answer your research question! If you have time-series cross-sectional data, or if you have no clue what kind of data you have, but want it to look more sophisticated anyways, you should use xtftso.


Auto-Tune the Chart 2

Nobody but me cares, but this is fun...


Number One for 20 July 2010

Guess I'll try to go despise a blog by someone else:
  • IRV gains a new supporter. Too bad he only supports it because he lost. [Yglesias]
  • Dan Drezner gives two cheers for redundancy. He should have called the post "Department of Redundancy Department". [Drezner]
  • Bellesiles didn't fabricate, but he didn't fact-check [Chronicle of Higher Ed.]
  • Kathryn Lopez fawns over Mel Gibson [NRO, via reader AT]
  • Science is becoming exponentially more difficult. [Boston Globe, via Monkey Cage]
MC Frontalot, "I hate your blog"

19 July 2010

Number One for 19 July 2010

I think that we make a pretty good team:

  • How Conficker defeated the smartest guys in the world. [Atlantic]
  • My guess is Stochastic Democracy will eat 538's shorts. [Stochastic Democracy]
  • Today is upgrade day. I hope Stata releases aren't like the Star Trek films, where only the even-numbered ones are good. [Stata]
  • It's also the first day of classes: [1], [2], [3] [ICPSR]
  • Calibrating your gaydar. (Can you draw a ROC curve for that?) [Gelman Blog]
  • Straight talk from Tom Friedman [New York Times]
  • David Blackwell, game theory and Bayesian pioneer, died. More information here. [New York Times, Mathematicians of the African Diaspora]
  • Taiwanese news portrays Steve Jobs as Darth Vader. NB: "Apple" is "pingguo" in Mandarin; "problem" is "wenti". Count how many times you hear those words! [Via Daring Fireball]
Obi Best, "Nothing Can Come Between Us"

18 July 2010


The critics speak.

For what it's worth, I didn't see much in the film that hadn't been done better by Dark City, The Matrix, or Total Recall. The only hint we have that anyone besides Fischer is a real person is the kiss between Ellen Page and Joseph Gordon-Levitt; it's the only actual moment of human feeling in the entire piece. Marion Cotillard is radiant and rises above her lines (the incantation about the train sounds dumb when we find out what it is), but imagine if Page had been a rival to her charms. The plot "twists" were all heavily telegraphed and familiar to anyone who's read Dick, Borges, the better Bradburys, or Poe. Would it have hurt to have made Saito Chinese instead and had a reference to Zhuangzi?

I think the final scene makes the whole thing obvious (remember: we don't know how DiCaprio washed up at the beach at the beginning of the film, which is a dead giveaway). That is a big disappointment, especially compared to Total Recall. Clearly, Nolan is brilliant--the film is gorgeous and visually inventive--but his talents are better deployed in adaptation than invention. In particular, The Dark Knight displayed a better understanding of ethical challenges and moral questions than Inception, which has none.


Humor Department, Bureau of "Your Matriarch" Jokes

From a loyal reader:

1: We were so poor growing up.
2: How poor were you?
1: We had to shop at the Quarter Foods store.


Number One for 18 July 2010

Modern minds can come up with three questions:
  • Don't fill much-needed holes in the literature, says Erik Voeten. [The Monkey Cage] (See also James Stimson)
  • What is a "computer"? Paging Dr. Wittgenstein. [Charlie Stross]
  • Losing $9.2bn is the result of a non-material deficiency. I'd hate to see a material one. [FT Alphaville]
  • Incidentally, FT is right that EDGAR is teh suxx0r. In fact, most federal databases are awful. Please: make documents available as txt and pdf, make all searches Boolean, tag all documents consistently, present tabular data as csv, and mathematics as TeX. Never again should I have to read a document like this one or use a database as terrible as this one.
  • McChrystal, F*** YEAH. [Atlantic]
  • Robin Hanson is beginning to understand the alienation of labor. [Overcoming Bias, via ZW]
Mr. Show presents "The Limits of Science":

Periodizing U.S.--Soviet Conflict

As my study partner and I re-read the political science literature on U.S. foreign policy, we have wondered at the number of times the United States has been proclaimed the world's only superpower, which is exceeded only by the number of times the IR community has proclaimed the era of U.S. unipolarity finished. Offhand, I can find citations that would bolster the claim that American unipolarity began in the 1940s and ended in the 1950s, in the 1960s, in the 1970s, and in the 1980s, as well as arguments that American hyperpuissance began in the 1980s and ended in the 1990s, began in the 1990s and ended in the 2000s, began in the 1990s and will end in the 2010s or 2020s, and never began or ended at all. Judging by Bear Braumoeller's working paper on U.S. isolationism, I could probably also make a good argument that American unipolarity was at least a possibility in the 1920s. And what else can we take away from Kindleberger but that the United States failed to exercise the global leadership to which it was so plainly entitled?

If you think that dating the potential of American hegemony to before the Second World War is hyperbole, consider the criteria by which Spain, the Netherlands, and the United Kingdom were all retrospectively crowned hegemon; certainly the United States of the 1920s exceeded in relative power the Great Britain of the later Victorian years, when London could not contemplate maintaining Canada and its position in the Western Hemisphere without significant rapprochement with Washington. Had the United States bothered to maintain a significant land army or invested more in its air force, either of which it could easily have afforded in the 1920s or the 1930s, its military power, coupled with its economic influence and de facto imperial hold on the Latin American countries, would certainly have given it a relative power position surpassing that of Athens at its Periclean height. (I suspect that American influence in the Western Hemisphere peaked about 1940, which is when the FBI--the FBI!--ran U.S. intelligence operations throughout the region and external penetration of regimes was at its minimum.)

If periodizing U.S. unipolarity is a problem, determining when the Cold War began and ended is no less difficult. The high school history textbook answer is 1946 to 1991, but over the past decade I have come to the radical position that everything we learn in high school is probably wrong. (Even the Pythagorean theorem.) A very informal survey of the IR literature leads me to conclude that the Cold War as understood at the time actually ended about 1971, +/- four years (in other words, within the period between Glassboro and Helsinki). The renewed pattern of hostile interactions between the invasion of Afghanistan and Reagan's second inauguration was widely seen by everyone except the editors of Human Events as a throwback, a reignition of a dormant conflict. Moreover, this Cold War ended at least three times: with the conclusion of major arms limitation talks in Europe, with the fall of the Berlin Wall and the dissolution of the Soviet Eastern European empire, and with the collapse of the U.S.S.R. itself in 1991. (For extra credit, pinpoint the dissolution of the U.S.S.R.: was it the August coup, the signing of the C.I.S. treaty, or the resignation of Mikhail Gorbachev?)

Politics ain't beanbag, and political science ain't physics. There is no shame in our having multiple definitions of the inauguration and conclusion of different eras. The different periods may be useful for different purposes. (I think it is as clear as can be that 1973 marked the end of American economic hegemony and the beginning of meaningful multilateral governance of aspects of the international--read first world--economic system.) Yet the proliferation of periodizations should nevertheless prompt some epistemic humility among contemporary IR scholars, as well as a re-evaluation of the way we present the "stylized facts" of 20th-century history to undergraduates. In particular, we should reject the notion that the high school narrative of the Cold War as a monolithic event serves a useful analytical purpose, and instead present the years between Roosevelt's death and Clinton's boxers as a series of more discrete and more analytically defined periods. I suggest the following:
  • The Cold War, 1947 to 1962. The Truman Doctrine and the Cuban Missile Crisis bookend the height of the Cold War. The Truman Doctrine symbolizes U.S. resolve to engage the Soviet Union and neatly outlines the doctrine of containment; the Cuban Missile Crisis symbolizes both the rise of Soviet power and the need of the United States to adapt to a world in which its strategic supremacy was no longer a given.
  • The Soviet-American condominium, 1963-1979. The signal fact about the 1960s and the 1970s was the strategic stability of the global order, as assured destruction and concomitant strategic talks between Moscow and Washington imposed an order on bilateral relations. The "opening" to China---a far more complex event than normally portrayed---was as much a way for the United States to maintain the global order as it was for Washington to seek an advantage versus the U.S.S.R. (In particular, a Sino-Soviet war, as seemed possible in 1968 and 1969, could have had incalculable consequences for global order generally.) The Kissingerian mantra of a "structure" of global peace fits the period well, in which the drumbeat of nuclear tests had been replaced by a numbing succession of test-ban treaties and SALT talks.
  • Strategic supremacy, 1979 to ?. Washington's response to the Soviet invasion of Afghanistan and the buildup of American military budgets, combined with the increasingly unsustainable Soviet economic and political structure, produced a situation in which the domestically-determined collapse of the U.S.S.R. unfolded to maximum American advantage. It was Washington, not any multipolar arrangement, that dictated the fundamentals of the post-Soviet era: a unified Germany in NATO, the deference to the use of American military power in the Gulf and later in the Balkans, and the ability of the United States to project power throughout the world.
This is obviously a rough schematization of the period, and its essential elements are not original, but given that undergraduates have little historical sense, and that much of what they do know seems to have been imbibed from presidential hagiographers, it is probably a good idea to begin pushing back.

16 July 2010

The secrets of the Mona Lisa

A great day for SCIENCE!!!
Some verities are eternal. Others are ephemeral, but last a pretty long time anyway. I am not sure into which category the idea "Science can't explain art" falls, but I am confident that its predicted lifespan exceeds my own.

Some scientists have used Technology to unlock the secrets of the Mona Lisa. On closer inspection, the "secrets" that the scientists deduced (using Technology!) turn out to be some interesting but trivial details about Leonardo's brush technique and color-mixing. Interesting, sure, but not exactly the Secrets! that I was promised. Nor does it seem as if this could be a line of inquiry as promising, or at least as provocative, as the Hockney-Falco thesis. The scientists seem interested only in making some very specific measurements of Leonardo's skills, which are far less interesting than a generalizable theory about, say, Florentine painters generally. Nor does there seem to be any pressing reason to engage in the dispute, as with the shroud of Turin controversy. (Note, by the way, that debunkings of the shroud are almost as common as devotionals to it.)

In fact, for polemical purposes, I am almost disappointed that the scientists were French, not American. If they'd been Yankees, then I could have turned in a paint-by-numbers screed about how Americans are uniquely susceptible to the fallacy that Art can be subsumed by Science, and chide my fellow countrymen for their failure to appreciate both noble truths and noble lies. So I'll rewrite the critique, only substituting "modern" for "American"--a more useful turn, anyhow, albeit one that leaves itself open to criticism on its flank, as the scientists could then advance the line that un-modern thinking has a hard time appreciating science in any of its flavors, from proper Lakatosian thought to Dexter's Laboratory.

The essential point holds: although there may not be separate magisteria for religion (in its vernacular sense) and science, that claim negates only a specific manifestation of a more general proposition about the relative autonomy of spheres of human endeavor and understanding. As it is impossible to understand the weather based on quantum dynamics, so too is it fruitless to try to understand intentional behaviors, like the production of art, by the tools of science alone. I don't claim that this argument is original--how could I?--but it is good to be reminded of these arguments from time to time.


The central question of American foreign policy for the coming generation

Should we greet China's rise with hugs or missiles?


09 July 2010

New frontiers in LaTeX

Friend of the blog JW sends a link to "An Option Value Problem from Seinfeld," by Avinash Dixit (Princeton), which provides a model for precisely estimating a man's spongeworthiness.


07 July 2010

Before they were famous!

Rather astonishingly, Sagan (1994) cites David H. Petraeus, "The American Military and the Lessons of Vietnam: A Study of Military Influence and the Use of Force in the Post-Vietnam Era", (Ph.D. diss., Princeton University, 1987).

Article available here.

Postscript: On the next page, Sagan writes:
[T]he military, like most organizations, tends to plan incrementally, leading it to focus on immediate plans for war and not the subsequent problems of managing the postwar world. Moreover, since managing the postwar world is the diplomats' job, not part of military officers' operational responsibility, the professional military is likely to be short-sighted, not examining the long-term political and diplomatic consequences of preventive war.


Number One for 7 July 2010

There'll be no one left to blame us:
Randy Newman, "Political Science":

06 July 2010

The foma that make you brave and kind and happy and healthy

Via reader JS, Patrick Deneen discusses Vonnegut.

It's a good essay, and explores an aspect of Vonnegut that I haven't really thought all that much about--the notion that the alienation of modern life is an alienation from (in Vonnegutian terms) our karass. This is a bit of a bowdlerized Vonnegut, though: Vonnegut's ardent atheism and misanthropy were not incidental to his worldview, as Deneen suggests, but essential to it. It is telling that Deneen selects mainly from Vonnegut's earlier works, from the Nineteen Fifties, in which Vonnegut was still convinced of the possibility of individual freedom and of its being snuffed out by social action. From the Nineteen Sixties forward, I think it is clear that Vonnegut had decided that freedom was illusory. How else are we to understand the Tralfamadorians or the chronosynclastic infundibulum?

Two additional quick notes that arise from this discussion of Vonnegut. The first is that I have found a Patrick Deneen essay with which I entirely, wholeheartedly, and enthusiastically agree in all respects: this one, lamenting technology in classrooms. It is always and everywhere done poorly and for administrative convenience, and Georgetown seems to be worse about this than elsewhere. The logical outcome of this trend was precisely and presciently predicted in Forster's The Machine Stops, one of only two Forsters I have read, and the only one I understand.

The second is that I have an excuse to reprint one of my favorite passages from Vonnegut, from the opening chapter of Sirens of Titan, describing the closest thing the novel has to a female lead:
Her face, like the face of Malachi Constant, was a one-of-a-kind, a surprising variation on a familiar theme—a variation that made observers think, Yes—that would be another very nice way for people to look. What Beatrice had done with her face, actually, was what any plain girl could do. She had overlaid it with dignity, suffering, intelligence, and a piquant dash of bitchiness.
May we all someday be able to draft a phrase like "piquant dash of bitchiness."


05 July 2010

Holy Cow

Google's style guide for R code.


Mitch Daniels' Five Books, and Mine

Mitch Daniels discusses the five books that he says have influenced him the most. Self-reporting is always open to question (has anyone ever said the book that really influenced them to get into public life was Richard III? And yet how else do we explain Richard Nixon?), but Daniels' choices ring true, or at least plausible. It is refreshing to see an elected official who can discuss Mancur Olson; most politicians probably can't spell "Mancur." (Link via The League of Ordinary Gentlemen.)

Lists always provoke a response. Here, then, are the five books that I would list as my greatest influences. Shockingly, given that I don't read fiction, some fiction books make the list, but I note with some regret that they are both books I'd read before I turned 18.

  • Mother Night, by Kurt Vonnegut. Vonnegut is underrated and this is his least-appreciated book. I think it is stronger than Slaughterhouse-Five: there is no question about the reality of the plot within the confines of the novel, for instance, and the agency of the main character is the crux of the book instead of being assumed away. In other words, in Mother Night, unlike many of Vonnegut's other books, the protagonist is a villain as well as a victim. (To be fair, many of Vonnegut's characters have some degree of choice, but the overriding theme from The Sirens of Titan forward is that we are helpless before the sea of troubles. God Bless You, Mr. Rosewater, the finest book about Indiana, was a close call for this slot.)
  • A Canticle for Leibowitz, by Walter Miller. I find the notion of original sin compelling. It seems to fit the individual-level evidence rather better than the alternative hypothesis. Consequently, Miller's meditation on politics and faith squares with my firm skepticism about the extent of civilizational progress. Miller's book is part of a small but almost uniformly excellent tradition of Eisenhower-era Catholic science fiction; James Blish's A Case of Conscience springs immediately to mind as a challenging novel in this category. I might also mention the (hardly Catholic) 334, by Thomas Disch, as well as any number of Dick novels, but the nearest competitor to this choice was On the Beach, which is the Calvinist version of Miller.
  • The Power Broker, by Robert Caro. What is power, who uses it, and how does it transform both the objects of its action and the wielders of its force? In an almost novelistic chronicle of the career of Robert Moses and the development of the modern bureaucratic state, Caro addresses questions fundamental to our notions of what is acceptable and what is intolerable in a democratic polity. I say "novelistic" both because of the nature of the book's writing and because more recent accounts have stressed the elements that Caro left out of his story; this is a common knock on Caro, especially by those who have troubled to look up more balanced views of Coke Stevenson, LBJ's opponent in 1948. But if we consider this as an argument and not a strictly factual account (and it is factual enough, anyhow), then Caro's deep concern about whether government can be made accountable to anyone but itself becomes all the more evident.
  • Development as Freedom, by Amartya Sen. Like many people, I enjoyed a long flirtation with the classics of libertarian thought throughout my teens, but Sen and others (especially, and obviously, including Nussbaum) finally put paid to those dalliances by demonstrating the inextricable link between political and economic freedom and satisfaction. Although I think it would be premature at the very least to say that Sen offers a fully developed way forward, he and others in the capabilities framework offer much more plausible solutions to a realm of problems that most Western (and certainly most right-of-center) thinkers do not take seriously, from acute famine to routine deprivation. Sen's discussion of the relation of political freedom to freedom from famine in the Bengali context (although not new to this book) is a challenge to other, less inclusive definitions of freedom.
  • In Praise of Shadows, by Junichiro Tanizaki. This is certainly the least expected book on my list, and it stands in for many others. Tanizaki's explanation of the cultural and aesthetic assault of Westernization and modernization on his experience of Japaneseness forced me out of any sort of complacent, Galbraith-style optimism about the inevitability and the desirability of modernization as a totalizing force. Tanizaki explores what happens when modernization comes not as a natural outgrowth of existing social structures (although, as Polanyi and others have noted, that can be plenty disruptive enough) but as a foreign-led imposition on a way of life that has its own internal logic. Reading this, and many, many other books, it becomes possible for even the densest, stubbornest reader to comprehend why colonialism and other forms of paternalism (including those, like Japan's, undertaken by indigenous elites) were so alienating.

03 July 2010

Gene Roddenberry Didn't Understand Star Trek

For many reasons, I watched "Q Who?" yesterday evening. I hadn't seen it in a long time, maybe fifteen years. As ever when I watch a Next Generation episode, I was vaguely disappointed by how much worse the episode was than I remembered. TNG episodes were never quite the right length. There was always too much or not enough plot to fill the time. There's almost always too much Wesley. Plot points are hinted at, but never developed at all. "Q Who?" has two: Ensign Sonya Gomez, who could have filled the naive-but-competent niche much more believably than Wesley, and Guinan, who apparently has magical powers of which Q is afraid--watch her put her hands up when she faces off with Q in Ten-Forward. But of course there were many such loose ends throughout the series, perhaps most glaringly the aliens in "Conspiracy", who never showed up again.

For all that, the episode is still pretty good. And I think I finally understand why. It is the episode in which the series finally buried the influence of Gene Roddenberry.

As a former Trekker, I realize that questioning the greatness of Roddenberry will strike some as nearly blasphemous, but let's not kid ourselves. Gene Roddenberry nearly killed Star Trek, and the Star Trek Gene Roddenberry liked was pretty awful. Star Trek: The Motion Picture is as near a perfect expression of Roddenberry's idealism as you can find, and it is perfectly incoherent. The plot makes no sense, not even from a superficial narrative point of view. V'Ger is nearly omniscient, but unaware of its malevolence; Will Decker is supposedly competent and qualified to command, but has almost no loyalty from his (Kirk's) senior crew; Ilia is from a species famed for its sexuality, but sworn to celibacy (which is a deeply ambivalent statement about the way in which Roddenberry conceived female sexuality); Spock is supposed to have achieved self-mastery, but deceives Kirk and everyone else to draw nearer to V'Ger; and Kirk is supposedly competent and ready to take on yet another alien threat, but doesn't even know how the new Enterprise works. As for the plot, it's almost unspeakable. The first two acts are essentially "getting the band back together"; it's telling, by the way, that only Kirk, Spock, and McCoy are shown as having the ability to even try to lead a life without the Enterprise. V'Ger itself is totally nonthreatening: its only real act of aggression against the crew is to kill Ilia, who is thus the baldest redshirt ever. And the final act culminates with Decker, fake Ilia, and V'Ger melding together and ascending into a higher plane, leaving Kirk and crew behind. The only characters to have any sort of real development, choice, or growth in the entire film are a) the son of a man Kirk destroyed, b) his robot clone lover, and c) space junk.

Think of it this way: Will Decker, a guest star about whom we know nothing, is the protagonist of TMP. No wonder no one cares about the movie! He doesn't even have a particularly interesting choice. His options are (a) do nothing, and see Earth be destroyed or (b) become a potentially omnipotent being. This is a much less interesting choice than even the Silver Surfer faced. Norrin Radd gained a portion of the power cosmic, but in order to save his world he had to give his soul to Galactus and take part in the destruction of millions of other civilizations. Will Decker, by contrast, gave up ... nothing.

And Kirk was a bystander. Re-run the entire movie without Kirk. Does anything change? Kirk is awkwardly shoved into the entire plot, with nothing to do. Even McCoy gets to show off his deep reluctance to rejoin the service. Absolutely nothing in this analysis is weakened when you read the novelization, which Roddenberry himself wrote. Instead, the problems with the plot and the characters become even more galling, because the woman who dies on the transporter pad in the beginning of the film is Kirk's girlfriend, Lori Ciana. Even in the novel, Kirk is hardly distressed by her death.

Yet TMP is the realization of Roddenberry's vision for the series. It is superficially idealistic, its central dispute is resolved when We Understand Each Other Better, and the moral is something hazy about the human spirit (or, to take the movie poster, the "human adventure").

And that's the key to understanding why Star Trek fans were so astonished by the Borg. The Borg represent an invasion of Roddenberry's narrative by the science fiction of the 1970s and 1980s. They are, as Q explains, an enemy so remorseless, relentless, and completely opposed to everything about Roddenberry's Star Trek that there is simply no way for Picard and his crew to respond within the practiced and easy repertoire of the original series or the first season of the Next Generation. The Borg can't be bargained with, reasoned with, or outfought. Instead, in their original, pure form (roughly lasting through the end of the sixth season of TNG), they are simply the Adversary.

Nothing in Star Trek before "Q Who?" compares with the Borg except for Armus, and Armus could be safely contained, like the Talosians. Remember that Roddenberry had already decided that there would be no real enemy for the Federation in the new series. Worf had joined the bridge crew, the Romulans had retreated back behind the Neutral Zone, and the Ferengi were the closest thing Starfleet had to an enemy. The Ferengi, of course, were evil because they were greedy. Unsurprisingly, none of this worked dramatically. The writers quickly decided that Worf had to struggle with his Klingon heritage, the Romulans were re-introduced as soon as possible, and the Ferengi evolved into the shrewd and useful race of DS9. To give the Federation a worthier adversary, the post-Roddenberry writers even invented the Cardassians, more threatening and sanguinary than the Romulans but also more human in their essential motivations.

The pivot for all of this re-invention was the Borg. By introducing a race with which the Federation could not be reconciled, and against which Starfleet was pitifully powerless, the writers redefined Star Trek once and for all against the episodic nature of the original series' conflicts while serving notice that the easy, New Frontier idealism of the first show was no longer operative. And the show was better for it.

Clouds, clocks, and graduate curricula

From Almond and Genco (World Politics, 1977):
The stress on reductionist explanation, quantification, and formalization has also led to an overloading of graduate curricula. If a political scientist must be a statistician, psychologist, and sociologist, then some of the traditional curriculum has to be set aside in order to make room for these newer disciplines and techniques. Anyone who has taught in a major graduate department of political science in the last twenty years will recall this inexorable process of narrowing and technicizing of the curriculum; the foreign-language requirements have been reduced, the field examination requirements have dropped from five to four to three, perhaps even to two.
Whig historiography is always noxious, but so too is its opposite--what I suppose we might as well call jeremiad historiography. Clearly, the adherents of a waxing paradigm will always be Whigs and the partisans of the status quo--or, especially, of the just-recently-displaced status quo--will always be Jeremiahs.

But without meaning any offense to my colleagues in political theory and comparative politics, the thought of having to prepare myself for comprehensive examinations in those fields in addition to IR and American government fills me with despair. I am sure they would be equally dismayed by the idea that they would have to master the minutiae of bureaucratic politics, the presidency literature, or the democratic peace in addition to subjects they more naturally care about.

Five comprehensive exams! The literature was much smaller, but the essays must have been much shorter or the standards much lower.

Inouye Most Senior Asian-American in American History

Now that Daniel Inouye is president pro tempore of the Senate, is it too early to point out that--almost unnoticed--he has become the first non-white to hold that post? And as far as I can tell, only one Web site has noticed that technically Inouye is the highest-ranking elected Asian-American in U.S. history.
