You might think, from my last post, that I would condemn the way Jonathan Bennett does history of philosophy. But Bennett is an exception. Bennett’s critics often charge him with historical insensitivity and anachronism. To exaggerate a bit, they say something along the lines of “Bennett treats historical texts as if they were published yesterday. He ignores context, and applies all sorts of distinctions and methods known only to contemporary analytic philosophers.” Several philosophers said this sort of thing around the time I was in grad school, since for a while there (mid and late 90s) historians of philosophy got interested in methodological reflections.
But Bennett takes on this charge in the introduction to his 2-volume Learning from Six Philosophers, and to my mind he is successful. He calls his approach “the collegial approach,” since he is interested in learning from the Great Dead as colleagues. He doesn’t have much interest in getting the history straight for the sake of historical knowledge. His interest is in philosophy, and he has found that studying these historical figures is a great way to get clear about the philosophical issues involved. Even if the collegial approach leads him into dubious interpretations — and he doesn’t admit that it does, but even if it does — the philosophical payoff is a sufficient return. We might put it this way: some people do philosophy by studying recent journal articles and going to conferences, some do it by staring off into space, but Bennett does it by wrestling with old texts and pressing his own questions against them.
That’s a respectable thing to do, I think. But here’s the catch: you have to be a really good philosopher, like Bennett is. Otherwise you end up with dubious interpretations lacking any philosophical merit.
My own approach? Something else again — another post.
When I was in grad school, I remember there being some discussion among my professors about interpretive charity. The basic idea is simply good sense: try not to interpret what you hear or read as coming from a moron. The background assumption is that most people are not morons, and so if there are two or more possible interpretations of what someone says, and one of them is something only a moron would say, go for the other interpretation. Obviously, the strategy sometimes fails, since people are sometimes morons. But usually the strategy succeeds at getting at what people intend to say.
Okay, a basic example. I now say, “The greatest distance is that between two people in a crowd.” If you interpret what I say as some statement about spatial distance, you will quickly conclude I am a moron. But maybe what I mean has something to do with “distance” in familiarity, or sympathy, or love or something. Is it more likely that I — or anyone — is such a moron as the first interpretation implies, or that I was trying to be poetic?
But in grad school, I fell into the trap of using “interpretive charity” as an excuse for doing history of philosophy in a bad way. (And in this I think I was not alone.) The trap goes something like this. Great Dead Philosopher says X. But if we take him at his word, he will be open to the fatal objections that McPhee raised against Gumperson’s analysis of modal de re predications in that seminal 1973 article that turned contemporary metaphysics on its big ear. Now it would be terrible if GDP were a victim of such dire objections. But wait! Maybe there’s a way to save GDP. If we reinterpret what he says, and supply some extra quantifiers, then look! We have a plausible theory that’s immune to McPhee’s objections! Hence that must be what GDP intended. For he surely wasn’t a moron, was he?
But of course GDP can fail to be a moron while at the same time never having considered McPhee’s objections. I firmly believe that most of the GDPs would need at least 10 years of grad school before they would have any talent for doing the sort of very specialized academic philosophy done today. It’s not that we’re smarter individually. But, collectively, we’ve devoted many more man-hours of heavy thought to these philosophical questions than any single GDP ever could. So, yes, we have much more sophisticated and plausible theories.
So some limit has to be placed on interpretive charity. Maybe: “Try not to interpret someone so that they turn out to be either a moron or someone implausibly well-informed.” You have to be able to declare, at some point: “For crying out loud, he never would have thought of that!” But, unfortunately, the over-application of interpretive charity has turned out to be a useful device for scholars to show off their analytical skills alongside their knowledge of some GDP’s writings — just the kind of thing grad students can latch onto and excel at. So it probably won’t be disappearing anytime soon.
It is really a delight to be invited to a place where people have no choice but to listen to your ideas and give the impression of being interested. Everyone should have that experience from time to time, and if they are lucky the place will be Columbia, SC. The couple of days I spent there were absolutely beautiful, and everyone was friendly and engaging. The pictures above are of the campus, the state capitol building, and Matt Kisner, Spinoza scholar and my host. In addition to being wined and dined, I was allowed to bounce ideas off a history of philosophy reading group and a lecture audience. There were also several informal chats with grad students and faculty — really a fun time for me, though I felt completely “idea’d out” by the end. Which is as it should be.
I came back with several key ideas to wrestle with. The one distracting me the most is whether Nz really does encourage a “meta-perspective” sorting out of values and perspectives, or whether he rather just sinks into his own perspective and then battles it out with the other possible perspectives. In other words, does Nz presume to have found some key criteria by which we should sort out the good sets of values from the bad ones, or is it instead just that he thinks his values are best and is pushing forward aggressively with them?
Several other ideas and objections also came about, and I will be blogging about them later. Right now I’m just glad to be back, though the trip really was delightful.
Looking for comments. I will be presenting it at U South Carolina next week. [UPDATED: it has an end now.]
It is a little-known fact, but the phrase “small potatoes” actually stems from a practice among state universities in the middle ages. Every year, the medieval administrators would pass out awards to the faculty, and the award in each case was a very small potato, which was useful in its own way, and tasty, but at the same time an unmistakable message: “Don’t let this go to your head, you pathetic bookworm. It’s only a potato.” The award ceremony continues to this day, but potatoes have been abandoned in favor of calligraphed sheets of paper and (on occasion) nice amounts of cash. Unfortunately, the unmistakable message has gotten lost in the shuffle, and many people no longer regard the awards as truly small potatoes.
All of this by way of apology, in some sense, for having received some small potatoes (and a generous check). I know that many people have a healthy attitude toward awards: they think that it is a good thing when good work is recognized, and even better when that good work is their own. That makes excellent sense. But I was raised to think otherwise. My dad (who loved me very much, and if anything over-celebrated my small achievements) often found occasion to say “Eigenlobt shtinkt” — which was his own version of German amounting to the phrase, “Self-praise stinks.” I ended up with the belief that any sort of award reflecting well upon me was an item of self-praise, and ergo shtunk, and ergo was an invitation for God to soon find some way to make me look like an utter idiot.
So I have tended to shy away from all small potatoes. I didn’t this time — indeed, I shamelessly nominated myself for this award — because (a) I wanted some money, and (b) I wondered whether, when all was said and awarded, I would feel as if the money was worth it. So far the results are unclear. I haven’t got the money yet, and I am supposed to go to at least two ceremonies at which I will eat bland food, make small talk, and receive recognition. I know this is all very nice, and should be flattering, but my upbringing makes it seem about as welcome to me as showing up to work naked.
I should add that I do not see all awards as small potatoes. Things like the Pulitzer and the MacArthur and Nobel are very big potatoes. Anyone should feel proud and lucky to receive such tubers. (The Templeton is like a huge yam; in receiving it, you should be grateful, but also a little puzzled.) And I certainly do not condemn anyone who gives awards, or accepts them graciously with little evidence of mental anguish. I envy them all. But I am now living with the full expectation that, in months to come, I will look like an utter idiot, standing out there with a small potato in my hand.