14 December 2010
08 December 2010
07 December 2010
- Realism in the comics pages -- Dilbert
- The greatest trick the devil ever played was convincing the world that advertising exists -- Kevin Drum
- "The 25-Year 'Foreclosure from Hell'" -- WSJ
- "Republicans are the Keyser Soze of Politics" -- Enik Rising
- Markets in everything: China copies, upgrades, sells Soviet-era fighters and other tech -- WSJ
- Elected assessors placate voters -- Enik Rising
- West Virginia's Darth Vader -- Rolling Stone
06 December 2010
- Why we need a National Discard Day -- Scott Adams
- "Thanks for your service Jayson, but I think the best and most exciting days of your career are behind you." -- The Long-Distance Phillies Fan
- Great video from RC airplane buzzing Statue of Liberty, Brooklyn Bridge -- Ars Technica
- U.S. shakes up embassy staff, anticipates waves of PNGs -- Independent
- My university badly lacks Laptopistan -- New York Times
- Julian Assange is like the Iranian students, but more successful at disseminating his purloined letters -- E.J. Epstein
- The Twitter Revolution was a fraud -- London Review of Books
04 December 2010
When I was an undergraduate, the answer was straightforward: "Doctor [lastname]" or "Professor [lastname]." These days, I generally call them "firstname," except (a) sometimes when emailing (I write several registers more formally than I speak, especially when asking for favors!) or (b) when dealing with someone particularly eminent or senior. And even (b) is a diminishing category.
But it's never crossed my mind to demand that someone in an informal or presumptively collegial setting call me "Mr." Much less have I ever dreamt about being called "Professor" on a routine basis. (I still think the first time should be pretty fun.)
A recent contretemps has shown that apparently some holdouts on formality remain, something that is deeply ironic considering the presumably egalitarian norms of the professor in quo. The comments section was, as often happens, livelier than the post, particularly FLG's question "Would you accept Dark Lord Alpheus? Or Darth Alpheus?"
So now I know what I will insist my students call me once I've received my doctorate: Darth Pedantus.
- The elves of Indiana's Legislative Service Agency are toiling -- Masson's Blog
- What I'm asking for next Christmas: A retrofit MacBook with >1 TB storage -- Remiel.Info
- Wikileaks is a legal innovation, not a technological one -- Economist DiA
- Eichengreen mostly recants on euro -- Kevin Drum
- Everyone else is linking to the promise of Kalamazoo -- Washington Post
03 December 2010
02 December 2010
Drezner's Foreign Policy article covers the essentials of his presentation, although if you can you should see it live. (So to speak.) The discussion was equal parts nerd-fest and theoretical disputation; Dan Nexon, the chief critic of Drezner's IR zombies approach, pointed out that Drezner suffered from a "vitalist" perspective that blinded him to the post-zombie world's problematization of the hitherto binary life/death category. (He also noted that the COW dataset will have a problem in coding armed conflicts between humans and zombies, since it requires 1,000 battle deaths.)
Drezner and Nexon have staked out the ground for productive debates. But there's still more to be said about IR and zombies. For instance:
- Buck passing. There are two ways in which states could choose to buck-pass. The first is that small powers could simply choose to let great powers shoulder the burden of resisting the menace. This, as Drezner points out, could lead to great-power intervention--but the strain of regime change in addition to zombie-nuking might be too great. (In fact, compellence might instead take the form of warning potential buffer states that they are to be turned into cordons sanitaires.) The second is that great powers may choose to postpone their own interventions until the hegemon or some other k-group chooses to act in order to safeguard their own interests.
- Zombie gerrymandering. Why not selectively exclude troublesome populations (or voting blocs) from counter-zombie efforts? A Republican president, for instance, might choose to draw the defensive lines at red-state borders instead of the national borders. (Democrats would be too wussy to do this.) Of course, the U.S. constitution is silent on how to handle the congressional seats that would be elected by voters in undead-held lands. But a dictator (a Stalin or a Kim) might be quite happy to see rebellious provinces subjugated by the zombie menace, which would allow them a twofer of both eliminating domestic opposition and having the U.S. foot the bill for cleanup.
- Religious reformations. There would be millions of undead wandering the earth. Certainly, if religion matters in IR, this would be an event that would make it salient.
- Post-bailout fatigue. Contra Drezner, zombie protection is not a public good; it is a classic private good. (Want to exclude someone? Just don't send the Marines.) So why should we expect to see states with varying preferences choosing to protect each other? Sure, there may be some constructivists who believe the "West" will stick together, but recent experiences in the much more thickly constructed European Union demonstrate that even "good international citizens" like Germany are unwilling to provide bailouts for, say, Portugal. Would a GOP House vote for a zombie bailout of Ghana?
- The end of IR. How can any meaningfully constituted "international society" survive a zombie apocalypse? Both Nexon and Drezner assume that international relations will survive in some fashion after the flesh-eating undead hordes attack. But why should we continue to see anything resembling a Westphalian anarchic world--or any other definition that could reasonably be construed as "international"--after such an existential threat that would lead to the end of international trade and migration, as well as the immediate extinction of homo sapiens in entire countries and continents? Nexon posits a return to empire, but I suspect that the true end state is that of a 1984--style totalitarian government. (Twelve Monkeys suggests how this could play out.) Consequently, even for the survivors, the zombie uprising will lead to the end of anything that a Western liberal would regard as "human" in anything but the biological sense.
- Niamh Hardiman of University College Dublin (home of the Fighting Cardinals!) explains the Irish collapse -- Crooked Timber
- Just because you like a TV show doesn't make it yours -- Katherine Welsh @ Alyssa Rosenberg
- I've always loved the house in the photo -- Yglesias
- I'm also a proud Philistine -- Overcoming Bias
- Hans Rosling puts form before substance -- Talking Points Memo
- An old article but I always love pieces that praise negative campaigning -- Washington City Paper
01 December 2010
- Every guide to college success says that students should go to office hours.
- Hardly anyone ever goes to office hours.
And a lot of them should have. It's remarkable that students who have received a C or worse (and at my institution, a B- is probably pretty bad news) refuse to come to office hours, even when I've told them to. It's trite but true that a student who gets a B+ is much more likely to pay me a visit in the hopes of angling for an A-.
(I suppose it's the same principle by which stout burghers are more likely to get a new sidewalk installed by the city government than their slum-dwelling co-urbanites. Even though the sidewalk is probably much more important to someone who has to walk everywhere, or at least to the bus, than to the homeowner who only visits the sidewalk when it rains, nevertheless it's the latter who knows that city hall will, in fact, take care of these things.)
I don't mind that students have almost no idea what office hours are supposed to be like. I never did. I went either because I was intensely interested in the subject or because I was drowning--and also, of course, to negotiate grades. Now, though, I actively don't want people who are merely interested in the subject to come and talk to me; if you really care, come by and I can point you to some things to read. I much prefer talking to people who are earnestly and sincerely grappling with the material, because in plain fact they often understand it better than people who are really passionate about international relations, comparative government, and other do-gooding endeavors.
It took me about a semester to realize what TAing was supposed to be, and since that first term I've watched, amused, as succeeding cohorts have repeated my early mistakes. Here are the classic rookie errors:
- Offering to hold multiple scheduled office hours per week. This is the biggest mistake. It's dumb. If people won't come to one office hour session, they won't come to two. And you'll be chained to some stupid table in a coffee shop playing Solitaire two hours a week, waiting for undergrads to drop in.
- Believing that section is a lecture. It's not. It's not about you; it's about them. Don't prepare extensive notes or handouts for them; listen to them instead, and find out what they don't understand. Generally, you should be trying to simplify, not complexify.
- Thinking that people will do the reading. They won't. You didn't--and if you did, you should know right off the bat that you're weird. Part of your responsibility is to help them know what they have to read. Another part is to make them read what they should be reading. Reading quizzes are great for this.
- Being disappointed if they don't take the course seriously. Chances are, this is an intro class, which means that at most a third of the students actually care about the course. And why should they? Frankly, partying is both more fun and likely more important to their future happiness than your course. Better a pig than Socrates.
- Thinking that you're paid to be a TA. You're not. You're paid to do research. If you enjoy teaching, then great--but anything over the de minimis effort should come out of the time you budget for recreation, not research.
From now on, though, I'll be providing a much less personal experience. Students who want to visit me will be able to check out my Tungle page and schedule 20-minute appointments with me. But no more fixed office hours por nada.
- Chocolate City Brewery to open this spring; vanilla suburbs rejoice -- Conor Williams
- God fails football -- Daily News
- Which shipping company will punt your fragile shipment? -- Popular Mechanics
- "[I]t is still striking how many pies the United States has its fingers in, and how others keep expecting us to supply the ingredients, do most of the baking, and clean up the kitchen afterwards." -- Foreign Policy
- Do elected officials perform better than appointed ones? -- Enik Rising
- Why Euro defaults are necessary -- Felix Salmon
30 November 2010
Recent developments in Europe have brought this lesson home. For people of my generation, who remember the period between the fall of the Soviet empire and the fall of the Twin Towers with crystal clarity, the Washington consensus attained something of the status of divine writ. Good governance may have been difficult to establish, but it was at once straightforward and persistent. All a government had to do was let the rule of law take hold, establish some private property rights, and hold some elections and, presto, you've reached the end of history.
In the past decade, each of these recommendations and a host of corollaries has been shown to be either vastly more difficult than assumed or actively counterproductive. (To take just one example, international relations theorists now argue that although democratic countries are less likely to go to war among themselves, democratizing countries are much more bellicose than the run-of-the-mill authoritarian state. Think for a second what that implies for a democratizing PRC.) One measure of this shift is what I like to call the Friedman Index, which is a measure of the fraction of Serious People who take Tom Friedman seriously. By all indications, the Friedman Index is a trailing measure that peaked sometime around 2005. Among NYT columnists, it is Krugman, not Friedman, who now speaks for the professional elite of the East Coast, with the prominent exception of the current occupant of the Oval Office. And Krugman, for obvious reasons, is hardly in thrall to the quantitative mystique in which economists have clad themselves over the past generation.
Which draws us back to Ireland. Conventional political economists have sought to explain the Irish example by looking at past IMF interventions and focusing on the narrow (but important) questions of how such international institutions allow states to credibly bind themselves to necessary but draconian actions they would otherwise be unable to sustain. (See, for instance, The Monkey Cage.) But this analysis is badly misguided, since it ignores the rather obvious point that the states now threatened by financial contagion have options that the Asian currency crisis countries did not. European states are massively wealthier and more influential than Thailand or Indonesia; unlike Jakarta and Bangkok, Paris and Berlin can simply rewrite the rules to benefit the bulk of their citizens who are not international creditors.
Would they? I expect they would. An editorial from the Irish Times suggests why:
IT MAY seem strange to some that The Irish Times would ask whether this is what the men of 1916 died for: a bailout from the German chancellor with a few shillings of sympathy from the British chancellor on the side. There is the shame of it all. Having obtained our political independence from Britain to be the masters of our own affairs, we have now surrendered our sovereignty to the European Commission, the European Central Bank, and the International Monetary Fund. ... The Irish people do not need to be told that, especially for small nations, there is no such thing as absolute sovereignty. We know very well that we have made our independence more meaningful by sharing it with our European neighbours. We are not naive enough to think that this State ever can, or ever could, take large decisions in isolation from the rest of the world. What we do expect, however, is that those decisions will still be our own. A nation's independence is defined by the choices it can make for itself.

I would not be surprised if the Irish government at some point in the next six months breaks its most recent commitments to Brussels and M. Strauss-Kahn. I fully expect that whatever "punishment" is meted out to Ireland for that transgression will be forgotten within a decade. Equally, I expect the eurozone to do what everyone believed was impossible as recently as seven or eight months ago: to shrink to the point where it is an actual optimal currency area, which is to say that it will probably be Germany, France, Benelux, and some German quasi-possessions on its borders (Austria, the Czech Republic, and Slovakia).
A social science that proceeds inductively from a limited stock of experience will miss the really important shifts. Generalizing from the international arrangements that governed the world economy over the past sixty years to the actions we can expect influential actors to take in the next decade is wrong-headed. It is time, instead, to begin preparing both scholars and policymakers to be ready for a world in which defaults are policy tools, states reassert their primacy over economics, and protectionism is put back on the table.
- Richard Lugar, liberal maverick -- NYT
- Grand strategy is a crock -- Drezner
- Grand strategy is a crock -- Krasner
- Humans are getting fatter. But so are animals that live near humans -- Stross
- Markets say there will be a Portugal default. Didn't markets also once say that U.S. housing values could never decline? -- Salmon
- Good teacher, bad teacher: It doesn't matter. The kids will forget what teachers say. "Why make the poor kids suffer if they won't retain what they learn anyway?" -- Caplan
- Why measures of central tendency give a distorted view of legal earnings -- Empirical Legal Studies
- The tragic death of this blog -- Technologizer
- Alan Moore's never-written DC saga: "one of the things that prevents superhero stories from ever attaining the status of true modern myths or legends is that they are open ended." Four-Color Heroes
21 August 2010
- Why pick on nerds? [Overcoming Bias]
- Why nerds are unpopular [Paul Graham]
- Obama and gay marriage: A case study in ambition [The New Republic]
- What's more valuable: a college or a stereo? [The New Republic]
- When did the infovore evolve? [Mother Jones]
- Eric Schmidt, creeper [Daring Fireball]
- How to become an expert [Dan Drezner]
20 August 2010
Consider the Second World War. Everyone knows the Allies won. But judging from the representation of the war in British memory you would find at least as many instances in which it is not altogether obvious that London was among the victors. 1984, famously, projected the material deprivations of London 1948 into the future; it was as much a work of observation as of speculation. And the weird nature of the postwar settlement made it hard for ordinary Britons to gloat about their role in winning the "good war":
By "winning" we can mean two things. The first is a question of contributions: Without X strategy or Y materiel, would the war have been won? The counterfactual then takes the form of "Had the U.S.S.R. not entered the war, would the U.K. and the U.S.A. still have won?" or "Had the Western Allies made a separate peace in 1940, would the U.S.S.R. have defeated Nazi Germany?" These questions are impossible to answer for certain, but we can make a plausible case. In this instance, it is more plausible to argue for a solo Soviet victory, and even more so for a solo Soviet near-loss, than for a joint Anglo-American victory or for a solo Anglo victory or near-loss. In this sense, then, it is meaningful to talk of the Soviet Union "winning" the war.
The more important sense is that of outcomes. Who in the international system benefited the most from the international environment at the war's end compared to their position at the war's beginning? This focus on relative power accords with most theorizing about the international system. A player who goes from having 10 percent of the distribution of power to 50 percent is strictly better off, regardless of whether the size of the pie has shrunk.
An alternative definition of winning, of course, would simply assert that the top-ranked player at the end of the period won. That is inadequate. It implies that a hegemon could engage in a costly, pointless battle, lose every engagement and waste scads of money, and yet still "win" simply because they didn't lose enough to become the second-ranked player.
As a corollary, I should note that it is much easier to identify the losers of a war: anyone who moves down the relative or ordinal league tables. The greatest loser, however, need not be the player who fell the farthest; here, losing the top spot and falling to second place may be much worse than falling from 25th place to 100th, given a flat distribution of powers (the familiar "long tail" effect).
There are three ways to measure changes in relative power. The first is strictly relative: Who arithmetically gained the most? A player that increases from 20 percent to 40 percent has probably gained more than anyone else in the system. This leads to the second definition: the moment at which changes in relative power lead to qualitative changes in the organization of the international system. The United States entered the Second World War as the largest power in a multipolar world; it exited the conflict as the hegemon in a uni- or bipolar system. A system shift is a major consequence, and a system shift in your favor surely counts as a different category of "win."
I argue there is a third kind of victory: the relative proportion of relative gains. That is to say, in the universe where player A goes from a 20 to a 40 percent share as a consequence of war, it has doubled its share of power in the system; but if player B goes from a 2 percent share to a 10 percent share, B has quintupled its own. Assuming that the structure of the system has not changed (that is, that A is not a unipole), I contend that player B is also a winner, and in some ways even more of a winner than A, since it has produced gains more efficiently.
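For clarity, the arithmetic behind the first and third measures can be put in a quick sketch. The shares below are the hypothetical ones from the text (20 to 40 percent for A, 2 to 10 percent for B), not data about any real war:

```python
def absolute_gain(before, after):
    """Arithmetic change in a player's share of system power (first measure)."""
    return after - before

def proportional_gain(before, after):
    """How many times over a player has multiplied its share (third measure)."""
    return after / before

# Hypothetical shares from the text: player A goes from 20% to 40%,
# player B from 2% to 10%.
a_gain = absolute_gain(0.20, 0.40)       # largest arithmetic gain in the system
b_gain = absolute_gain(0.02, 0.10)

a_ratio = proportional_gain(0.20, 0.40)  # A doubled its share
b_ratio = proportional_gain(0.02, 0.10)  # B roughly quintupled its share

# By the first measure, A "won" more; by the third, B did.
assert a_gain > b_gain
assert b_ratio > a_ratio
```

The point of the sketch is simply that the two measures can rank the same outcome differently, which is why the third definition is worth distinguishing at all.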
Consider the Korean War. Arguably, the United States won; definitely, the North Koreans lost a lot and the South Koreans lost a little bit (in the short run). But I contend the real winner was the Mao regime, who was left with a firmer position at home and abroad.
Here are some other Startling Propositions which, as Dean Acheson would say, are clearer than the truth:
- Japan was the real winner of World War I. It managed to eliminate its major security threat (Russia), marginalize its principal offshore competitor (the UK), and make major gains in China at low cost to itself.
- Southern poor whites were the real losers of World War II. The real winners? Northern industrialists.
- The real winner of the Seven Years' War was the United States of America. The real loser was Britain.
- Actually, the United States really did win the Spanish-American War.
- The winners of the Mexican War were Southern plantation owners. This was recognized at the time but bears repeating.
- The follies of the RIAA [Penny Arcade]
- Let's all remember that Ground Zero drew on Islamic influences [Slate]
- I normally don't approve of the misuse of evolutionary theory to explain individual-level outcomes, but this definitely deserves a Darwin Award [People]
- More people should read Mancur Olson. And his discontents. [Matthew Yglesias]
- This House endorses Mitch Daniels for President, and it looks like the Economist will too [The Economist]
- I fully agree that undergrads have no idea what grad school is like, and furthermore that success in undergrad has little direct influence on success in graduate school [Historiann]
- Time to get the Watergate exhibit out the door [Episconixonian]
18 August 2010
- If this interview with Wolfram is accurate, nobody can say Macs aren't for techies [The Setup]
- As a political scientist, I disagree with a lot of what he writes, but Todd Purdum has produced a great article about modern Washington [Vanity Fair]
- I'm a big believer in using evidence and institutions to smooth over the irregularities of opaque norm-based group dynamics. This looks like a great way to do just that. [Lifehacker]
- I don't fully understand why, but I was really disappointed in Alan Blinder as a person because of this FDIC-backed quasi-fraud. [Felix Salmon]
- Yglesias is wrong. There are consequences to Senate obstructionism. It's just that they are perverse. [Matthew Yglesias]
- Special bonus musical link: dj BC's Glassbreaks, featuring Philip Glass vs. hip-hop. [WFMU]
17 August 2010
|Even William of Orange had to apply for jobs.|
Applying for things is hateful. Nearly everyone hates being a salesman, and even practiced salesmen hate selling themselves. Yet the nature of modern life is that salesmanship is an essential skill. At the moment, I am collecting emails from people interested in taking a room in the house I share, and it is clear that they are treating the process for what it really is: an application-and-interview process no less irritating or important than a job search.
Why do we hate talking about ourselves? Even my most successful friends loathe writing cover letters or resumes. More to the point, even my friends who enjoy talking about themselves in any other context will postpone writing applications until the last possible moment, and when they do produce one it will be listless and ashamed.
There are three strategies to cope with applications. The first is to be a narcissistic sociopath:
But that path has some drawbacks.
The second is to have someone else write your application. (I know at least one professor who took this route; it worked not least because the spouse who wrote the tenure packet was an academic as well.) When this works, it works great, but it requires the other person to know you well and also to know how to handle applications--which is a rare combination.
The third is to grit your teeth and do it. Grit your teeth, not gnash them, because there is no use raging, raging against the writing of the C.V. It is trivially easy to spend more time complaining about the unfairness of a universe that requires such a noisy filter for matching people with jobs, roommates, and Match.com hookups than actually writing an application.
The good news is that it gets easier to write these applications with time, if for no other reason than that you have a well-stocked "Application Hell" folder of your own. The bad news is that it never feels any easier.
- Sorry to miss posting yesterday. Playing with the puppy took up a bit more free time than I'd expected.
- There are many ways to waste time if you use Gmail. [Lifehacker, Simple Productivity, AlphaGeek, Best of the Web]
- The grad school version mentions coping with neuroses [EHow]
- Charlie Stross asks: What's the next bubble? My answer: AI. [Charles Stross]
- Drezner says enough about Cordoba House as a threat to America; enough about saying critics of Cordoba House are threats to America [Dan Drezner]
- Kevin Drum says not so fast [Calpundit]
14 August 2010
13 August 2010
- Great tips. [Might.Net]
- More great tips. [Might.Net]
- Lebensraum on the subway. [Unsuck DC Metro]
- The right way to wind your Mac power cord [Lifehacker]
- Lifehacker doesn't share my top Macbook productivity tip: Buy a spare power cord that you only use for traveling, so you never have to worry if you have one with you.
- Better than the reviews [Beer Advocate]
|CC image by Cliff1066|
A friend's Facebook update reminded me about its existence. I visited once, last year, and it is fair to say that I was just shy of enthralled. First, of course, I wanted to see what they made of the topic. For those of us who have been in the museum business, no matter how slightly, there is always some professional interest in seeing how a well-funded museum spends its funds. (I once spent ten minutes at the Georgia O'Keeffe museum examining how the curators had lit their galleries. For a few minutes, at least, that was more illuminating than the paintings themselves.)
In this case, what I really wanted to know was how you could bring to life the story of something that is inherently undramatic. The answer, it turns out, was lots of props--and a few good computer simulations. As a way of conveying history, this has the advantage of being tactile with the disadvantage of being a little misleading. Okay, so the Postal Service used to sort mail on trains, and so you have a caboose. But what does this tell us about how Rural Free Delivery reshaped rural life? And what do we learn about how the Post Office Department was a major source of political patronage, helping to develop the American party system on a national basis?
This leads neatly to the second reason I wanted to visit, and that is that I genuinely find the history of the Postal Service interesting. And this isn't the only boring institution I care about--far from it. Of course, I'm using "boring" to refer to how most people, most of the time, see such institutions. And me, too. I'm no anorak or trainspotter. I'm not interested in learning the details of rate cards or the breed of pony used in the Pony Express.
WHAT INTERESTS ME INSTEAD is how transformations in such institutions shape everyday life. The existence of international postal reply coupons was the cause of the original Ponzi scheme; the existence of the Sears Catalog made the Midwest habitable. And at the core of all such institutions is politics. True, economic and technical considerations matter, but their resolution is often guided by considerations of sheer power.
The Postal Service, like all other boring institutions, is in its way a product of the fundamental arrangements of power in society. And that makes it something worth paying attention to, at least a little bit, in a way that pop culture and high culture--which are intrinsically more interesting--can't compete with.
12 August 2010
- Kirk or Picard? Kirk. But Sir Patrick Stewart is obviously a better actor.
- Favorite science fiction TV show? Tough to say...but it is probably Firefly.
- Favorite comic book series? Avengers
- Favorite comic book issue? Fantastic Four 262, "The Trial of Reed Richards"
- Favorite DC character? Batman.
- Favorite Marvel character? Victor Von Doom.
- Desired superpower? Phoenix's.
- Favorite Cylon? Eight.
- Favorite sf novel? The Man in the High Castle. Very, very close second: A Canticle for Leibowitz.
- Favorite sf short story? Actually, "The Rocket Man" (Ray Bradbury, R is for Rocket). At least five of my top 10 would be Bradbury. And three of them would be from The Martian Chronicles.
- Favorite SF movie (non-series)? Blade Runner
- Favorite SF movie (series)? Star Trek II: The Wrath of Khan.
- Luke or Han? Han.
|Probably not exactly fair use.|
I think that this is an insult whose day has come again.
The concept of the poseur is superficially related to the concept of authenticity, but without any of the troublesome ideological overtones. "Authenticity," after all, is a purely artificial concept. (As Ernest Gellner's readers know, the "authentic" Ruritanian is the Megalomanian who behaves the way actual Ruritanians would act if only they were enlightened.) But a poseur, by contrast, is a fake. He knows just enough to skate by in casual conversation or the everyday presentation of self, but he has no real claim to the status he asserts for himself. Accordingly, anyone can be a poseur in any dimension. You can pose as a comic book fan, and even pass as one to most people, but five minutes of conversation with someone who knows what they're talking about will reveal you as a poseur.
The greatest benefit of resurrecting the "poseur" tag will not be adding a new insult to our quiver, but deterring behavior that smacks of poseurdom. All of us feel the temptation from time to time to pose. But if we're more afraid of being caught out than we are tempted to put on a false front, then we may be able to overcome the Dunning-Kruger effect.
So let's all start acting like middle schoolers again. Sometimes, kids can be just cruel enough.
31 July 2010
30 July 2010
29 July 2010
28 July 2010
- "I usually like Ellen Page, but she is the Mary Sue of Basil Expositions in this film." [Edge of the American West]
- "This is the country we've made. This is the country we deserve." [Ta-Nehisi Coates]
- PMQT uses state-of-the-art opinion technology. [The Smart Set]
- Analogy FAIL [Washington Post]
- Analogy FTW [The Atlantic]
- Add it to your Google Reader (or, if your tech-savvy is antediluvian, your bookmarks): Snob Lessons [Snob Lessons]
- Was Alger Hiss the victim of an Army Counter Intelligence coup? [University of Michigan]
- When is it time to die? Atul Gawande asks. [The New Yorker]
27 July 2010
And, of course, it meant that the dream could never become a nightmare.
It strikes me that the essence of the iDesign philosophy is metadictatorship. Steve Jobs does not control what I do with my iDevices, but he does control how I can use or change them. Thus, I can customize, but only to a certain point. This chafes people who believe they are better computer programmers and designers than those in Apple's employ, including even a few people who actually are better computer programmers. For the rest of us, complaining that our phones aren't jailbroken is like complaining that we can't replace the engine in our Prius.
All of this is just a prelude to say that I have been spoiled by this benevolent dictatorship. I noticed this morning when I went to the New York Times, as I have been doing now for fourteen years, that the site is awful. Beyond awful. There are 1903 words on the home page.
Nineteen hundred and three words.
I copied and pasted the page into Word. It was 16 pages long.
Some of that is page formatting. But most of it is cruft. Look at your NYT app on your iPhone and then look at NYTimes.com. Design matters.
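For the skeptical, the count is easy to reproduce. Here's a minimal sketch (Python standard library only, nothing fancy) that tallies the visible words in a saved copy of any page; run it against NYTimes.com's HTML and your number will drift from 1903 as the homepage churns.

```python
from html.parser import HTMLParser

class VisibleTextCounter(HTMLParser):
    """Count words in the visible text of an HTML document,
    ignoring the contents of <script> and <style> tags."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0  # >0 while inside a skipped tag
        self.words = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth > 0:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._skip_depth == 0:
            self.words += len(data.split())

def count_words(html: str) -> int:
    """Return the number of whitespace-separated visible words."""
    counter = VisibleTextCounter()
    counter.feed(html)
    return counter.words
```

Feed it `open("nytimes.html").read()` and compare for yourself.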
Creative Commons image by Incase.
- Everything you need to know in order to be snobbish about high-alcohol wine [Snob Lessons]
- Overheard in DC. Sounds like a lot of Hoyas were eavesdropped upon. [DCist]
- Julia Stiles finds her balance. [Alyssa Rosenberg]
- I once dismissed Pierre Bourdieu as an airy French philosophe. I am increasingly convinced that he's the only recent sociologist worth studying. [Wikipedia]
- That doesn't mean that he's easy to read. [Marxists.Org]
- "Every once in a while someone from security studies tells me that international political economy is really, really boring and that they can't understand how I could find it interesting. I think today is one of those days in which I would tell them the same thing." [Daniel Drezner]
- A brilliant survey of classical music [City Journal]
- The guest "Automata" strips at Penny Arcade: exuberant! [Penny Arcade]
26 July 2010
One reason for my contentment is the setting. It has been seven years since I spent time in one of the large Midwestern universities, and I realize now how much I had missed the environment. The urban campus I attend now has its brief moments of beauty, but they are pockets amidst a jumbled campus whose architectural incoherence is testament to the poor financial planning of previous generations of administrators. It took a lot for universities to miss out on WPA funds for new construction, but somehow the old priests managed the trick. Their successors during the Cold War failed to acquire the American talent for wealth creation but learned architecture from the Soviets. At least they had the good fortune to have inherited a stately nineteenth-century quadrangle; in Dublin, at John Newman's university, the new campus in the suburbs was built from the ground up by Brutalists.
- I can't believe there are two "Shaft" parodies starring William Howard Taft. And one of them is actually okay. [Histeria, College Humor]
- Why isn't federal highway design regarded as a failure of central planning? [The League of Ordinary Gentlemen]
- Most trilogies do suck [Flowing Data]
- GapMinder: the home edition [Flowing Data]
- Now I'm grateful that Charles Hill wasn't a professor at my institution [The League of Ordinary Gentlemen]
- I'm deeply skeptical of the notion that political scientists should engage the public, but I'm downright contemptuous of the notion we should aspire to be think-tankers [The Monkey Cage]
- A copy editor complains that everyone thinks "douchebag" is spelled without a space. Heads up, copy editors: when everyone agrees on something, it is time to update your stylebook. [The Awl]
- Following on the above: I prefer Urban Dictionary to Merriam-Webster for slang. [Urban Dictionary]
- Piling on copy editors: Andrew Gelman compares copy editors to pinch-hitters. "If the pinch hitter were really good, he'd be a starter." [Gelman Blog]
24 July 2010
- Tell your advisor you're not procrastinating: You're solving the unsolvable [Bobulate]
- Jorge Cham inadvertently reminds me of my preference for Jean Grey [Ph.D. Comics]
- How many people does it take to run civilization? [Charlie Stross]
- Mark Taylor is a menace 2 academic society. [Brian Leiter]
- I look forward to the new Avengers movie attracting a small but loyal audience, failing commercially, and ending midway through the second act. [Gizmodo]
- Gorgeous reimagined movie posters [Escape Into Life]
- Oh, for the days when mathematicians were heroes. It's more likely our next technical heroes will be roboticists. [Barnes & Noble Review]
- "Math is perhaps the ultimate in acquired tastes" [David Foster Wallace, Science]
23 July 2010
If you live and work in a world with strong cycles, then you have to account for those when planning new activities. Periods of high organizational stress, or periods when high organizational performance is needed, are bad times to focus on secondary matters. That rules out changes to standard operating procedure. Budget bureaus shouldn't undertake sweeping new initiatives at the beginning of a fiscal year, any more than it's a good idea to try out a new quarterback in the postseason.
In academia, the cycles are even faster. There are at least three: the two semesters and the summer, layered within the broader cycle of the school year. These separate cycles combine to make innovation peculiarly difficult in an atmosphere that already resists change.
I rule out summer, because I address faculty and grad students, not administrators. My hunch is that summer is the right time for redoing administrative procedures, since it is their relatively quiet season. But coordinating academics over the summer adds total impossibility to extreme difficulty.
But the semesters are hardly easier. The first and last weeks of the semester are no good, and neither is the middle. High-intensity projects would simply compete with more important responsibilities--and lose. That leaves four windows a year when there is even the possibility of adding new activities.
I have been thinking about this because, obviously, I'm involved with a new group (a workshop on advanced methods). There are many debates involved in founding a new institution, from questions of group behavioral norms (which can be established easily at the beginning, but which are tough to change later) to expectations about individual members' involvement to administrative worries. This last category deserves a post of its own. Drafting constitutions, sorting out financial controls, and settling issues of executive competence versus board oversight are tough, even when the group is relatively small and straightforward. One factor that has to be overcome is that academics usually privilege discussion over experimentation and deliberation over decision. Isonomy is an ideal, but it's a harsh mistress.
The more immediate question we face now is how to keep the group going. There's loads of enthusiasm and the first semester went well, but having a vision for a group means understanding the factors that can sap those traits and gradually deflate the popular will that sustains a collectivity and reproduces its values and practices. In particular, I wonder if there's a good argument that this group should explicitly take into account the cycles of the semester and academic year in setting its schedule: having exciting but relatively low-work sessions to begin and end the year, while scheduling the most difficult and labor-intensive sessions in November and January. (November, because it's a time when people want to procrastinate during the doldrums between midterms and finals; January, because the midpoint of the year finds most everyone in midseason form.)
Lowering ambitions a bit deflates expectations at the beginning. Adopting a more conservative attitude makes it more likely that the group can achieve the goals it wants to. The greater danger, though, is in allowing enthusiasm to outstrip capabilities and creating a gap between what is achievable and what is expected. Cyclicality encourages conservatism.
- If this were a pilot, it would air on Showtime [I Find Your Lack of Faith Disturbing]
- David Edelstein guest blogs @ Steve Walt's [Foreign Policy]
- Civil disobedience wasn't always pretty. That's why it's called disobedience [Yglesias]
- Fixing California: a case study in institutional failure. Thanks a lot, Hiram Johnson. [Kevin Drum]
22 July 2010
I wonder sometimes if Americans don't have different connections to the seasons than do other cultures. I wonder this not because I want to posit some uniquely American relationship with fall or with winter, but largely because from age 5 to 18, at least, Americans experience summer as a long, unbroken string of endless days. (There's an entire, and astonishingly subversive, Disney cartoon about this phenomenon.) Other countries generally have a shorter summer break; Americans experience summer as a nice preview of life itself. The summer begins full of promise, ripens even as it sours, and ends in a haze of boredom and anticipation. The metaphor breaks down at that point, though, because the coming of fall heralds both the beginning of a new cycle and a promotion within a nicely hierarchical system. Whereas you were once a lowly second-grader, now you may know the mysteries of Third Grade.
Most people outgrow this cycle and graduate into the Real World. I think, in fact, that the linear nature of the Real World is what people have in mind when they discuss this mythical place. (That, and money.) After all, the stages of adult life are strictly sequential, and I suspect that the cumulative nature of outside relationships begins to overwhelm even the seasonality of jobs like those in retail, fashion, and tax accounting. By contrast, academics repeat the cycle until death or denial of tenure, in increasing order of terror. Each year brings a new crop of students, who are there to be taught, nurtured, tolerated, and finally cast out into the world. We grow older, and they stay the same age.
Cyclicality is probably the calendrical equivalent of folk physics. There's probably a good reason why religions structure themselves around cycles. From one perspective, human life is just the rehearsal of roles defined by forces beyond our comprehension and before our understanding. We think there is something natural and inevitable about cycles that are plainly both artificial and recent. Consider the concepts of childhood, adolescence, and young adulthood, none of which existed in recognizable form two hundred years ago, and which applied to only a very few people even a few decades ago. (I like to look at historical statistics, and I'm always stunned at how recently it was customary to leave school at 13 or 14 and begin working in what were essentially adult occupations.) The persistence of such notions in the face of obvious counter-evidence, and despite changes in roles between generations, is a good sign that we are slotting our observations about life into a preconceived template.
In fact, I can think of only one other tribe of adults who live by as cyclical a calendar as academics (into which category I will admit, for one night only, teachers): politicians. The electoral cycle is slower now than it was in the 19th century, when one- and two-year terms were the norm, but it must feel more hectic than ever. The principal difference between the electoral cycle and the academic cycle is stark: the participants in one cycle are all but assured that they will be in the same jobs in the next revolution.
21 July 2010
- Dear Pinot Noir: It's not me, it's you. [The Gray Market Report]
- Bill Murray thought Garfield was a Coen brothers movie [Vulture]
- Yes: Let's end the American aristocracy. But I'm tired of these weak, Cass Sunstein "nudge"-style policy proposals. How about our progressives propose some real, sanguinary, Bolshevist proposals? [Ta-Nehisi]
- Suck it, Aaron Friedberg: America didn't become a garrison state because we're too corporate [Who is IOZ, via ZW]
- Drastic oversimplification: Do Confucians believe in sex? [IPE @ UNC]
- Jim Vreeland gets an uncredited guest blog [The Guest Blog]
20 July 2010
Stata 11 is of course going to feature the often-demanded "figure this shit out" or ftso command. Simply type the command ftso 'depvar' and it will give you the results you need in order to answer your research question! If you have time-series cross-sectional data, or if you have no clue what kind of data you have but want it to look more sophisticated anyway, you should use xtftso.
- IRV gains a new supporter. Too bad he only supports it because he lost. [Yglesias]
- Dan Drezner gives two cheers for redundancy. He should have called the post "Department of Redundancy Department". [Drezner]
- Bellisles didn't fabricate, but he didn't fact-check [Chronicle of Higher Ed.]
- Kathryn Lopez fawns over Mel Gibson [NRO, via reader AT]
- Science is becoming exponentially more difficult. [Boston Globe, via Monkey Cage]
19 July 2010
- How Conficker defeated the smartest guys in the world. [Atlantic]
- My guess is Stochastic Democracy will eat 538's shorts. [Stochastic Democracy]
- Today is upgrade day. I hope Stata releases aren't like the Star Trek films, where only the even-numbered ones are good. [Stata]
- It's also the first day of classes [ICPSR]
- Calibrating your gaydar. (Can you draw a ROC curve for that?) [Gelman Blog]
- Straight talk from Tom Friedman [New York Times]
- David Blackwell, game theory and Bayesian pioneer, died. More information here. [New York Times, Mathematicians of the African Diaspora]
- Taiwanese news portrays Steve Jobs as Darth Vader. NB: "Apple" is "pingguo" in Mandarin; "problem" is "wenti". Count how many times you hear those words! [Via Daring Fireball]
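Gelman's ROC quip above is easy to make concrete. A ROC curve just sweeps a decision threshold over your confidence scores and plots false-positive rate against true-positive rate; here's an illustrative sketch in plain Python (no libraries assumed, and ties in scores are glossed over for simplicity).

```python
def roc_points(labels, scores):
    """Compute (FPR, TPR) points for a ROC curve.

    labels: 1 for an actual positive, 0 for an actual negative
    scores: higher score = more confident positive call
    """
    pairs = sorted(zip(scores, labels), reverse=True)
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for _, label in pairs:          # lower the threshold one case at a time
        if label == 1:
            tp += 1
        else:
            fp += 1
        points.append((fp / n_neg, tp / n_pos))
    return points

def auc(points):
    """Trapezoidal area under the ROC curve: 1.0 is a perfect
    classifier, 0.5 is coin-flipping."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2
    return area
```

A perfectly calibrated gaydar would score an AUC of 1.0; the rest of us hover somewhere above the coin-flip line.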
18 July 2010
For what it's worth, I didn't see much in the film that hadn't been done better by Dark City, The Matrix, or Total Recall. The only hint we have that anyone besides Fischer is a real person is the kiss between Ellen Page and Joseph Gordon-Levitt; it's the only actual moment of human feeling in the entire piece. Marion Cotillard is radiant and rises above her lines (the incantation about the train sounds dumb when we find out what it is), but imagine if Page had been a rival to her charms. The plot "twists" were all heavily telegraphed and familiar to anyone who's read Dick, Borges, the better Bradburys, or Poe. Would it have hurt to have made Saito Chinese instead and had a reference to Zhuangzi?
I think the final scene makes the whole thing obvious (remember: we don't know how DiCaprio washed up at the beach at the beginning of the film, which is a dead giveaway). That is a big disappointment, especially compared to Total Recall. Clearly, Nolan is brilliant--the film is gorgeous and visually inventive--but his talents are better deployed in adaptation than invention. In particular, Dark Knight showed a better grasp of ethical challenges and moral questions than Inception, which has none.
1: We were so poor growing up.
2: How poor were you?
1: We had to shop at the Quarter Foods store.
- Don't fill much-needed holes in the literature, says Erik Voeten. [The Monkey Cage] (See also James Stimson)
- What is a "computer"? Paging Dr. Wittgenstein. [Charlie Stross]
- Losing $9.2bn is the result of a non-material deficiency. I'd hate to see a material one. [FT Alphaville]
- Incidentally, FT is right that EDGAR is teh suxx0r. In fact, most federal databases are awful. Please: make documents available as txt and pdf, make all searches Boolean, tag all documents consistently, present tabular data as csv, and mathematics as TeX. Never again should I have to read a document like this one or use a database as terrible as this one.
- McChrystal, F*** YEAH. [Atlantic]
- Robin Hanson is beginning to understand the alienation of labor. [Overcoming Bias, via ZW]
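The "tabular data as csv" plea above is not a big ask. A minimal sketch of what it costs a database maintainer, using Python's standard csv module and some hypothetical, made-up filing records (EDGAR's actual schema is nothing like this):

```python
import csv
import io

# Hypothetical records, standing in for whatever a federal
# database actually stores internally.
filings = [
    {"cik": "0000320193", "form": "10-K", "filed": "2010-10-27"},
    {"cik": "0000789019", "form": "10-Q", "filed": "2010-10-28"},
]

def to_csv(records):
    """Serialize a list of uniform dicts as CSV text,
    header row first."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(records[0]))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()
```

That's the whole export path. If a weblog reader can write it, Treasury can.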
If you think that dating the potential of American hegemony to before the Second World War is hyperbole, consider the criteria by which Spain, the Netherlands, and the United Kingdom were all retrospectively crowned hegemon; certainly the United States of the 1920s exceeded in relative power the Great Britain of the later Victorian years, when London was unable to contemplate maintaining Canada and its position in the Western Hemisphere without significant rapprochement with Washington. Had the United States bothered to maintain a significant land army or invested in its air force to a greater degree, either of which it could have afforded without a problem in either the 1920s or the 1930s, its military power coupled with its economic influence and de facto imperial hold on the Latin American countries would have certainly made it surpass the relative power position of Athens at the Periclean height. (I suspect that American influence in the Western Hemisphere peaked about 1940, which is when the FBI--the FBI!--ran U.S. intelligence operations throughout the region and external penetration of regimes was at its minimum.)
If periodizing U.S. unipolarity is such a problem, it is no less difficult than determining when the Cold War began and ended. The high school history textbook answer is 1946 to 1991, but over the past decade I have come to the radical position that everything we learn in high school is probably wrong. (Even the Pythagorean theorem.) A very informal survey of the IR literature leads me to conclude that the Cold War as understood at the time actually ended about 1971, +/- four years (in other words, within the period between Glassboro and Helsinki). The renewed pattern of hostile interactions between the invasion of Afghanistan and Reagan's second inauguration was seen by everyone except the editors of Human Events as a throwback, a reignition of a dormant conflict. Moreover, this Cold War ended at least three times: with the conclusion of major arms limitation talks in Europe, with the fall of the Berlin Wall and the dissolution of the Soviet Eastern European empire, and with the collapse of the U.S.S.R. itself in 1991. (For extra credit, pinpoint the dissolution of the U.S.S.R.: was it the August coup, the signing of the C.I.S. treaty, or the resignation of Mikhail Gorbachev?)
Politics ain't beanbag, and political science ain't physics. There is no shame in our having multiple definitions of the inauguration and conclusion of different eras. The different periods may be useful for different purposes. (I think it is clear as can be that 1973 marked the end of American economic hegemony and the beginning of meaningful multilateral governance of aspects of the international--read first world--economic system.) Yet the proliferation of periodizations nevertheless should prompt some epistemic humility among contemporary IR scholars, and also a re-evaluation of the way we present the "stylized facts" of 20th century history to undergraduates. In particular, we should reject the high school narrative of the Cold War as a monolithic event that serves a useful analytical purpose and instead present the years between Roosevelt's death and Clinton's boxers as a series of more discrete and more analytically defined periods. I suggest the following:
- The Cold War, 1947 to 1962. The Truman Doctrine and the Cuban Missile Crisis bookend the height of the Cold War. The Truman Doctrine symbolizes U.S. resolution to engage the Soviet Union and neatly outlines the doctrine of containment; the Cuban Missile Crisis symbolizes both the rise of Soviet power and the need of the United States to adapt to a world in which its strategic supremacy was no longer a given.
- The Soviet-American condominium, 1963-1979. The signal fact about the 1960s and the 1970s was the strategic stability of the global order, as assured destruction and concomitant strategic talks between Moscow and Washington imposed an order on bilateral relations. The "opening" to China---a far more complex event than normally portrayed---was as much a way for the United States to maintain the global order as it was for Washington to seek an advantage versus the U.S.S.R. (In particular, a Sino-Soviet war, as seemed possible in 1968 and 1969, could have had incalculable consequences for global order generally.) The Kissingerian mantra of a "structure" of global peace fits the period well, in which the drumbeat of nuclear tests had been replaced by a numbing succession of test-ban treaties and SALT talks.
- Strategic supremacy, 1979 to ?. Washington's response to the Soviet invasion of Afghanistan and the buildup of American military budgets, combined with the increasingly unsustainable Soviet economic and political structure, produced a situation in which the domestically-determined collapse of the U.S.S.R. unfolded to maximum American advantage. It was Washington, not any multipolar arrangement, that dictated the fundamentals of the post-Soviet era: a unified Germany in NATO, the deference to the use of American military power in the Gulf and later in the Balkans, and the ability of the United States to project power throughout the world.