29 July 2012

Q. Are we not researchers? A. We are professors

I've been thinking about 80s music a lot lately.

What follows is a digressive and only partly thought-through exploration of a tough topic. I'm posting it as an excuse to work through some of my own thinking about the responsibility of academia to the public, the scholar to the student, and the discipline to all other parties. I hereby reserve the right to admit that I'm completely wrong about everything.

Why should the public pay taxes to support our research?

This used to be an academic question, but officeholders like Tom Coburn and Jeffrey Flake have put the question squarely before political scientists. Other disciplines have faced this moment before, as physics did in the late 1980s and early 1990s, and still more will face it in the future. My reasoning is plenty motivated, so I think that the case for funding basic academic research is pretty good. Like a lot of social scientists, I'm frustrated by the often self-inflicted lack of solidarity among academics ("Which social science should die?") and the professoriate's inability to sustain a campaign to make our arguments. Even the political science blogosphere's concerted arguments in favor of NSF funding probably did less to move the needle than one foolish New York Times op-ed.

The outlook for funding for the social sciences and academia more generally is a lot dimmer for my generation than it was for the two or three generations of academics before us. At worst, they had to look forward to long periods of stagnation (unless they were Sovietologists); at best, they could go from nothing--literal non-existence--to a seat at the grown-up table in the time it takes some disciplines' doctoral students to finish a dissertation. As the funding pool dries up, there's bound to be a lot more friction and fighting among academics than there was when the only question was how to divide up a growing pie.

The situation won't get better on its own. I'm unconvinced by the idea that state governments' spending will naturally pick back up as the economy does, since that relies both on the idea that the economy will pick up--green shoots!--and on the idea that politicians will refrain from directing the savings from their structural adjustments toward other priorities. And even if state spending does recover, higher education is under increasing pressure to justify its business model in every dimension:
  • Why should college be a physical location, and not an MMORPG-cum-TEDtalk?
  • Why should professors be hired and promoted based on their research skills--most without ever having had any practical or theoretical training in pedagogy--when the public's rationale for funding universities rests on their teaching productivity?
  • Why should there be an expectation that tuition should rise faster than inflation?
  • Why should an English degree take as long to complete and cost the student as much as a physics or an engineering degree?
All of these are different ways of asking the "why-pay-for-my-research" question. But historically we've only had to answer one or two of them at a time. And now we have to respond to all of them in a way that recognizes that answers to questions that were serviceable ten years ago--in 2002!--may now be unpersuasive or unrealistic.

At root, we largely rely on one of two arguments to answer all of these questions. The first is that the knowledge we produce is a public good, and that our institutions are organized to support knowledge production for the maximal good of society (or, at least, that reform would cost more than it would gain--that we are in a stable local maximum). The second is that our research leads to better learning outcomes for our students. Winning the future, then, requires academics to have maximum time for research, even if there are short-term tradeoffs for the students.

I think the first answer is better, since it does at least appeal to the innate beauty of knowledge, but it tends to obscure the third reason that we voice silently--that learning, like art, is a luxury good. We can't say that aloud, since it carries with it the implication that opposition to learning is opposition to civilization itself, and that the superior culture is the more refined culture. That is a line of argument that has never played well with the citizenry, and these days would not play well with the mythical one percent.

The second answer--better research, better students, better jobs--is, to put it bluntly, not overly persuasive. It is, in fact, close to being faith-based. We expect better evidence and stronger argumentation from our students when they turn in term papers than we ourselves produce in public justifications of our budgets. Why should we ask students who want to be lawyers to sit through four years of an undergrad degree and then three years of law school when the British and others manage to produce serviceable advocates after three years of law-focused undergraduate studies? And many of the other justifications for our teaching--our students learn critical thinking! statistics! data visualization! discipline!--only raise the question of why students should learn those skills by majoring in our subject instead of majoring in something more practical in which those skills are foregrounded.

There are answers, of course. One part is that the ivory-tower academics are not the ones who are out of touch with the real world. Rather, it's frequently our critics--the ones who deride us for not taking "real-world" problems seriously enough--who have no idea what the real world is actually like. This sounds counterintuitive, but sober reflection or residence in Washington (or Jefferson City, or Albany, or wherever legislators gather) will quickly remind you that the politicians and activists who make these claims frequently have no idea what academics, businesspeople, or bureaucrats actually do. The "soft skills" matter, and those are hard to impart at a distance or by paying professors less than a living wage. As Tim Burke has argued, the charge of know-nothingism applies with especial force to fact-free discussions about information technology and teaching.

A second part would be to forcefully stress how social science matters by discussing our research. This is easy in the abstract but harder concretely. Yes, individual scholars and individual projects make tangible contributions, but how many of us are willing to say there are scholarly consensuses about issues in social science as there are in comparably observational sciences like geology? Where we are most consensual, we are least interesting; where we are most interesting, we are least consensual--and often (as with the democracy-promotion agenda of the Bush administration, or the debate over whether there is a resource curse) most dangerous. So we're left with the unsatisfying choices of either "teaching the debate" or picking sides among competing findings where there is real and lasting disagreement. (One way of squaring the circle is to find and promote popularizers of our research--popularizers in the real Carl Sagan or even Stephen Jay Gould tradition--and let them handle the rhetorical aspects.)

But the third part is to recognize that we should probably systematically invest in and require better teaching and measurements of teaching effectiveness. We measure research productivity decently well--by no means as well as we should, or as well as we ought to given the importance of the measurement and the time we've spent on it--but we rely on ... student surveys to monitor teaching effectiveness. We lack a well-defined progression of introductory, intermediate, and advanced courses comparable to what other disciplines have. (What should students learn in Introduction to Comparative Government? We don't have an answer to this question that we could explain to a congressman in the same way that economists could say what Intro Micro is about.)

Achieving comparability, however, would require some conscious moves toward standardization. Such measures would reduce the autonomy of the individual instructor. And it would also prompt us to begin to assess what the discipline is and what it should become. In one sense, that means picking winners and losers, at least at the level of undergraduate instruction. And that likely would require the formalization of much political-science instruction (graduating students who don't know the median-voter theorem or the differences in free-trade voting preferences of workers and capitalists should be acutely embarrassing). It might even mean dropping one subfield--or, better, prompting that subfield to more clearly engage with the questions and methods the rest of the field has adopted (something that would benefit both sides of the debate). The relationship between international studies and political science would be redefined, perhaps drastically.

At the end of such a process, though, the relationship between our instruction and our research would be much clearer. The risk of intellectual monoculture would be blunted by the fact that different institutions would naturally choose to focus on different parts of the field. And our students would likely have a clearer idea both of what to expect in graduate school (an IR Ph.D. is not Foreign Affairs with more footnotes) and also of what the key insights of political science really are.
