Brandom’s inferentialism

In many works over many years, Robert Brandom has advocated a view called “inferentialism.” It’s a view about linguistic meaning, and it asserts specifically that the meaning of a claim is fixed by what role it plays in the economy of giving reasons and asking for them. So the meaning of “The earth goes around the sun” can only be seen in the way the claim is used in explanations and arguments. It isn’t typically used in an explanation of how zebras behave, so it doesn’t have anything to do with zebras, really. But it is used in figuring out calendars and astronomy and so on, and as we get into those topics, we can begin to see what the claim really means. The strength of this view is that it avoids the well-documented disaster of connecting meanings with ghostly entities in the mind (see Wittgenstein), and pays greater attention to communities of knowledge. Brandom has written several brilliant books and essays giving further details of this view, and using it as a kind of lens for understanding several of the “great dead” philosophers – especially Kant and Hegel.

It’s worth considering some implications of the view. One immediate consequence is: no community, no meaning. There cannot be an absolutely private language, one which captures meaning without there being even a small community around to share it. An utterance has meaning when it’s possible to say it at the right time as well as at the wrong time, and a community acts as a kind of police, disciplining its language learners into proper form by saying “Yes, that’s right” and “No, not quite.” Another consequence is that our entire world of meaning – all that we view to be true, or false, or possible, or impossible – has been shaped up over time by our communities telling us “Yes, that’s right” or “No, not quite” or, more commonly, a mix of both, coming from different sectors of our communities.

One might wonder if this is all there is to it. Imagine being introduced to a colony of noise-making entities and trying to learn their language. You tentatively try out this or that noise, and look expectantly at your new companions to see if you got it right. Little do you know that, in fact, these noise-making entities are randomly responding to you with affirmative and negative replies. It doesn’t really matter what you say, or when you say it. Could you ever get a sense of their language? No, of course not, because their language isn’t really a language: it’s just random noise, at least so far as you are concerned.

In order to turn this random noise into a brandom language (ha! couldn’t resist), there have to be some fixed patterns, some sort of system that determines when an affirmation is appropriate and when it isn’t. Once there is such a system, there’s linguistic meaning, according to inferentialism. A terrific example of this is the substance of Tom Stoppard’s play, Dogg’s Hamlet. Stoppard takes his cue from a scene familiar from Wittgenstein’s Philosophical Investigations in which some builders are constructing something by issuing orders to one another. In Stoppard’s play, though, a speaker of normal English is dropped into a community of people who speak Dogg’s English, which has the same sounds as English, but the conditions of affirmation and negation are quite different. The character, and the audience, have to figure out the structure behind Dogg’s English, and the figuring out leads to great hilarity. Just when we get the hang of Dogg’s English, the players put on three versions of Shakespeare’s Hamlet (a short version, a very short version, and a ridiculously short version), which makes about as much sense to the players as Dogg’s English did to us, at first. And now we can hear how crazy it sounds. Community is everything.

I’m left wondering, though, about the exact difference between randomness and brandomness. Obviously, it is a difference of structure. But structure of what? I guess it must be structures of behavior, tradition, practice. And surely it is impossible to think of these without thinking of meanings. But does this show that these practical structures are meanings? Or that they are made possible by meanings? If the first, Brandom is right; if the second, he’s described a symptom rather than the actual thing.

I would like to suppose that, at the very basis of a language, the world itself is helping to impart structure. That is to say, probably the first things said in any language have to do with immediate contacts with what’s going on in the world – what animals are present, where the pointy rock is, how far it is to the water, and so on. The world provides the most fundamental affirmations and negations. Once a basic language gets off the ground, so to speak, more complicated and abstract structures can develop. (This would also mean that the greater the distance between the ground and what’s being said, the greater likelihood that what’s being said may have no meaning – a skeptical result I rather appreciate!) This doesn’t seem to be far from what Quine claimed in Word and Object.

Yet, given Brandom’s abiding interest in Hegel, I wonder if he thinks there is more to the structure than can be given by its grass roots. I’m thinking of reason, of course. The “structure of inference dynamics” that constitutes linguistic meaning perhaps is infused with logic, or reason, in some deep fashion, much as grammar gives shape to syntax. This would mark off a position distinct from Quine’s, more rationalistic in this respect than naturalistic. It would mean that what determines our saying “Yes, that’s right” and “No, not quite” is not merely what our collisions with the world have conditioned us to say, but also what our minds have contributed: specifically, our capacity to reason logically about grammar, truth values, concepts, and possibility. As skeptical as I like to be, I have to admit that this seems a bit more plausible. It ain’t all conditioning. There must be some ghostly entities somewhere.


Reflections on Comics Movies

I just saw the trailer for the upcoming Dr. Strange film. Now I am thoroughly a long-time deep-dyed nerd – from Dark Star to Star Wars to Doctor Who (Tom Baker!) to LOTR to Primer to Skyrim to Firefly to Mass Effect and on and on and on. I was a nerd long before those enjoying nerdom as hip were even born. But I found myself wondering when, oh when, will we finally tire of this same old shit?

Here’s the movie: the world is going to be destroyed by some localizable evil agent. Our hero is a normal dude who suddenly gains magical powers. People plead with him to use his powers to save us. He reluctantly agrees. Vast mayhem ensues. He is on the verge of failure, but then cleverly manages to deal out a horrible death to the localizable enemy. Lock and reload for a sequel.

It’s a standard plot in the vast majority of comic books, and that’s what Hollywood is finding profitable these days. And, once you accept the plot as the only possible one, no one is doing a better job than Marvel. The films of the Marvelverse are clever and visually stunning, even if utterly predictable. But as I watched the fabulously talented Benedict Cumberbatch, armed with an American accent and a cape billowing out behind him, I thought: must I watch this again?

The problem may be that, nerdy as I am, I have never been into comic books. I tried, but I never got hooked in the way that keeps one buying issue after issue. (Actually, I once was hooked briefly on Mike Baron’s Nexus; whatever happened to that?) I think there might be a window in one’s life in which both comics and professional sports may get their hooks into you; if the window closes (by the time you’re 15 or so), then forget it, they never will. I admire the enthusiasm of Cubs fans and readers of comics (that is, I admire my friend Roger), but I just don’t have it in my bones.

But something else must be going on behind the insane success of the Marvelverse, since there just aren’t that many comics fans around. I think what’s happening is that the wonderful world of comics is being co-opted by another population. In my limited experience, comics fans all know the generic plot I described above. They are interested in what creative writers and artists can do with that plot line, in the special twists and innovations that only those in the know will recognize. (They’re like fans of blues music in this respect.) I think most real comics fans enjoy the Marvel films – how could they not? – but if you give them a moment they will tell you about twenty far more interesting and far less known comic story lines, such as (I’m making this up) Teutonic Heat Shield issues 127-143, or the adventures of Cerebus the Aardvark (no, I’m not making that up). They’ll see the Marvel movies as quite decent and enjoyable comics for the masses, but they can point you toward the really interesting stuff, if you’re interested.

But the bigger population co-opting the comic world, I think, isn’t so much into this connoisseurship. They are people who simply want to see spectacles, or violent solutions to malevolent problems. Everyone – no matter the politics, religion, economic class, culture, or language – can get behind the claim that there’s evil in the world, and that it sure would be nice to have a magic power to make it suffer and die. That’s a common denominator (and, come to think of it, the plot of every world religion). Now, as we become adults, we are supposed to recognize that the evil in the world is not localizable. It’s everywhere, in varying degrees, and we’re complicit in it. And there aren’t violent solutions; indeed, attempts at violent solutions only increase the evil (an idea Marvel Studios has begun to flirt with in Captain America: Civil War, to their credit). There is only, at best, complicated compromise, in the spirit of hope.

The most interesting SF films aren’t to be found among these blockbusters, to be sure. My family and I recently watched Frequently Asked Questions About Time Travel, which was refreshingly clever, funny, and campy in its visuals. Ex Machina, I think, is a deep and wonderfully troubling film, and did enjoy commercial success. Scott Pilgrim vs. the World, my daughter reminds me, brought a slice of the wide world of alternative comics to the screen, as have Kick-Ass and, to some extent, Deadpool. So, riding in on the current superhero supercraze are some genuinely smart comics and SF, and I shouldn’t grump. The challenge remains the same as it does in all matters: search for the signal amidst the noise.


Delusions and politics

Last night, in discussion with friends, I found myself defending my own skepticism. The topic was whether there is an objective human good, or even a genuine human nature that determines how humans should live if they want to have happy lives. I’m willing to admit that, as a matter of empirical fact, a great many people across time and culture find a certain set of things valuable: free time, friends and family, work, a sense of belonging to a larger purpose, and so on. And finding these things valuable probably has something to do with humans as a species: we’re not armadillos, after all, and evolution has made us into a certain kind of species that delights in some things over others.

I hesitate, though, in calling this a “human nature” because in my mind a nature brings with it a certain kind of necessity, which I reject. I think it is perfectly possible for there to be someone who is a member of our biological species and just doesn’t go for the usual things. Maybe they rather like sitting alone in dark rooms feeling spiteful, like Dostoyevsky’s underground man, or maybe (like a great many intelligent human beings) they can’t enjoy the simple pleasures of life without finding themselves feeling guilty, inauthentic, or idiotic. I don’t think there’s anything wrong, let alone “inhuman,” about such miserable folk. And I don’t think their attitudes are simply side-effects of a modern industrial landscape that alienates humans from their own nature. There have always been Kierkegaards and Kafkas, I suspect, or humans that just refuse to be happy with what pleases a great many of us.

I went a bit further in the discussion, maybe beyond where a skeptic should stop. The general kinds of things most humans like (free time, friends and family, etc.) feel like the makings of a “meaningful” life only when those humans manage to forget the boundary conditions of human existence. As I’ve said before, life is utterly meaningless. (I’m not much of a skeptic about this.) I think people can still feel happiness and take delight in many things – or refuse to take delight in them – but neither the delights nor the refusal is at all meaningful. Nothing is. If it does turn out to be a fact of our species that we can’t fully delight in anything unless we regard something as meaningful (and I don’t think it is a fact, but anyway) – then it is also a fact that human delight requires some form of delusion, or at least forgetfulness. Nothing new here; many philosophers have made this claim.

But the main topic of our conversation was politics, and I was asked what sort of political participation follows from my skepticism about there being any kind of genuine human nature or objective human good. I’m not sure what to say here that isn’t obvious. Obviously, some people like some things, and others like other things. Political participation is a struggle to bring about more of the things you happen to like. If what you like isn’t consistent with what other people like – so there’s no way to go separate ways and be happy about it – then you roll up your sleeves and get dirty. So, for example, I happen to like everyone having food, shelter, basic medical attention, etc. Other people don’t really care about it, or don’t see it as a problem government should address. I think government is probably the best way of making it happen (it’s at least an important component), so I need to wrestle with these people and try to have my way over them. It’s not that they’re mistaken about human nature or objective human goods. (What do I know about human nature and the objective human good?) It’s that I want something to happen and they are in my way. Obviously, they feel the same way. Thus politics.

Now in our political contests, they or I might make any number of appeals to human values, human nature, God, the ends of life, etc. That’s just effective rhetoric. In politics, if it works, it’s legit. I’ll even go so far as to say that, in all likelihood, the things I like are more likely to come about if people don’t think the way I do and instead believe the rhetoric and delude themselves with visions of meaningfulness.

Come to think of it, writing this little essay is not such a great idea. You know what? Forget it. I was wrong. There is a genuine human nature, and an objective human good, and it requires some form of democratic socialism and a radical redistribution of wealth. I’ll get you the details later.


Justin E. H. Smith, The Philosopher

Reflections on Justin E. H. Smith, The Philosopher: A History in Six Types (Princeton UP, 2016).

This is a timely, marvelous book that raises fruitful questions and criticisms, especially about the ways philosophy is conceived by its modern-day academic practitioners. Clearly, throughout human history, there have been all sorts of people who have wondered and theorized about ultimate things, morals, politics, gods, love, knowledge, and so on. Today, though, the term “philosopher” is often taken to refer to a university professor in a department called “Philosophy.” It’s not that what these professors do isn’t properly philosophical; but it might better be seen as one subculture, and not one that merits any special privilege or authority over philosophy as a whole. This is Smith’s overall point, I think, and through his many examples and asides he is encouraging academic philosophers to broaden their minds a bit and to take up the challenge of thinking more creatively about what they should be doing.

With chapters on “the Curiosa,” “the Sage,” “the Gadfly,” “the Ascetic,” “the Mandarin,” and “the Courtier,” one might expect a fixed typology of the various ways of engaging philosophy, with illuminating examples of each type. But this isn’t what Smith provides. Instead, he uses each species of philosopher as an occasion “to elucidate a particular opposition that has been brought into service by philosophers seeking to define what is and what is not philosophy” (18). That is, every type of philosopher is tied to certain efforts, past and present, to draw a line separating real philosophy from really not philosophy. In the end, though Smith is somewhat apologetic about never reaching a crisp definition of what counts as philosophy, it is clear that this is just what he thinks we all should avoid. There really is no need to establish an official policy about what counts and what doesn’t. Why not just let people “do philosophy” in whatever ways seem interesting and important? (And isn’t this, pretty much, what has happened and will happen anyway?)

Smith’s own temperament, it seems to me, pretty clearly falls within the Curiosa, or the ones interested in a thousand different things, who try to find intelligent patterns in the noise. History offers many philosophers of the type: Aristotle and Leibniz spring readily to mind. But Smith uses this occasion to draw our attention to several others who fit this type and yet are not commonly read today as philosophers. There is Laurent Lange, and his 1735 account of the natural philosophers trying to square Siberian mammoth remains with Job’s Behemoth; there’s D’Arcy Wentworth Thompson’s 1917 critique of Kant’s teleology in biology; there’s François Bernier’s record of being self-conscious of standing out as a materialist philosopher in 17th-century India. And there is all the stuff of Leibniz that typically is not read – really, it’s hard even to imagine something Leibniz didn’t explore – as well as the voluminous undertakings of Athanasius Kircher, whom Smith reserves for later discussion. In short: start pulling at this strand of philosopher-as-know-it-all, and there’s an avalanche of neglected figures in whom we should be taking greater interest.

The same theme emerges in every chapter, the lesson being that, no matter who you are, you should be reading more, and along less predictable lines. It’s not that Smith is trying to show himself off as one who has done it right; rather, it’s that he has discovered a wide array of surprising things in his own readings, and he can’t imagine why we shouldn’t want to do the same. One very interesting aside he offers is his own attempt to sort out his “philosophy” books, which belong at the office, from his “everything else” books, which can remain at home. Descartes is philosophy, as is the Cambridge Platonist Henry More. But what about More’s poetry? And Edmund Spenser’s The Faerie Queene?

Should I keep Jean Genet’s plays at home or in the office? Genet was an important influence on Derrida…. Could it be that somewhere in the work of Genet one might find a hint as to the real significance of some point of misunderstanding in the Derrida-Searle debate? I would not want to exclude the possibility out of hand. And similarly for virtually every other literary author who has worked, broadly speaking, on the plane of ideas, entering into contact with, bumping up against, the people we categorize as philosophers: Aristophanes, Cyrano de Bergerac, Whitman, Eliot, Beckett. I am not prepared to remove these thinkers to their own library, because to do so would be an impoverishment of my private philosophy shelves. (153)

In merging his libraries, Smith is advocating that academic philosophy find common cause with the other humanities. Indeed, he later mentions James Turner’s 2014 book, Philology, which (Smith claims) argues that all of the humanities, including philosophy, have philology as their common origin. But in fact Turner leaves philosophy out of his story; as he says in his epilogue, philosophers have consistently set themselves apart from the more bookish disciplines like history and literature. Of course, this depends on what we mean by “philosophy,” and Smith in his indirect way is providing a vision of philosophy as less celibate and more promiscuous than has commonly been the case. Here, perhaps, he is guilty of a fault he imputes to others, that of providing a “royal road to me” account of philosophy (10). But I’ll forgive him that, as it’s damned near unavoidable.

One would expect a book such as this to offer a great many examples of philosophy from around the world, conceived in ways alien to the European enlightenment. And so it does, with nontrivial accounts of Indian schools, Bantu-speaking traditions, and Chinese philosophy. But Smith also discusses a further source of philosophy that is also neglected, primarily because of economic snobbery: the philosophy of poor and rural peoples. Wonder is not just for an elite; anyone can do it, and people do, everywhere. Smith recounts his own mishandling of a conversation with a Mohawk man, who wanted to assure Smith that his people had philosophy, too, while Smith was keen on pulling the man into “real” philosophy, as it is done in the university. But Smith now sees the error of his ways:

I hear Thomas Nagel holding forth on whether death is or is not an objective misfortune, or Hannah Arendt on why it is troubling to see human viscera, or Daniel Dennett on which creatures may be killed with no moral qualms, and which may not be, and I think: why should I listen to you in particular? There is a whole world full of people out there, some on farms, some in rain forests, and some in slums, all charged up with beliefs of their own about these and many other things. My philosophy would be the one that would take the broadest possible measure of these beliefs, without concern for the institutional affiliations, the literacy, or the geographical niche of their holders. (80)

Smith is also ready to hear from the educated outsiders in our own culture. One recurring character – indeed, one to whom the book is dedicated – is Bud Korg who, real or not, sends Smith letters urging him to read his e-book, Quantum Truths for the 21st Century, which has been praised by Professor Tom Kumpe of Two Prairies Technical College as a bold “attempt to show the unity of human knowledge.” Smith, Korg writes, should take the time to read the book – that is, if he’s willing to take a break from enjoying his “free ride as a tenured so-called ‘philosopher’” (124). We all know the kind: the enthusiastic autodidact who thinks he’s come up with something all the trained professionals have missed. But wait – are we talking about Descartes here? Or Leibniz? With what sort of right do we brush aside the Bud Korgs of the world? And what makes us so infuriatingly condescending?

So: this is no typical survey. It is, at once, thoroughly European, but confronting both internal and external obstacles; it is learned, but also refreshingly creative and autobiographical; it shows great literacy, with equal interest in the non-literate. It reflects what philosophy has been and can be: an interesting and unpredictable ride through Wonderland.


Hegelian vs. Kuhnian idealism

[from an essay in progress on idealism]

What we have seen so far is that there is no observation of the world, and no understanding of it, without a theory. We have also met several idealists who believe, in varying ways and for different reasons, that theory is not just important, but really, really important: the most important knowledge we can have of our experience has more to do with theory than with anything else. Hegel, the greatest idealist of all, believed that human progress consists in further development of theory.

But, as thrilling as his vision is, it is hard for us to go the full distance with Hegel. It is true that our greatest triumphs of knowledge, in physics and engineering, in biology and medicine, in economics and public policy, are all made possible by advances in theory, and through the patient application of reason to experience. But the fact is that the foundation of Hegel’s vision – that in the end all our knowledge grows from a grand logic – seems to have been of no use to us whatsoever. The only scholars interested in fathoming the depths of Hegel’s most fundamental philosophy are historians trying to make some sense of it. It could be, of course, that Hegel’s mind was simply more penetrating than any other mind since, and his logic should have made more of a difference; or it could be that Hegel got some things powerfully right, but made the mistake of trying to base them on a logic that was in fact only a philosopher’s dream.

What would Hegel’s philosophy look like without his grand logic? In the area of science, we would see old theories being replaced abruptly by new, radically different theories that offer deeper insight into the forces and laws of nature. In the area of history, we would see conflicts and tensions being worked out through dialogue and politics and – when these failed – war. But there would be no overall theoretical structure that governed these advances, or told them which way to go, so to speak. There would be only those of us on the ground, working things out as best we can, armed only with our insights and prejudices, our biases and our hopes, our blind spots and our misgivings. There would be no guarantee that we were heading in some special direction: there would be only our desire to fix what is broken, and to make a better situation for ourselves. Hegel’s view, minus the grand logic, begins to look startlingly plausible.

The resulting vision might be drawn from what Thomas Kuhn offered in his 1962 work, The Structure of Scientific Revolutions. The work is famous for introducing the twin terms “paradigm” and “paradigm shift,” which can now be found in just about everything humans set their sights on. A paradigm is a framework, a theory, or a shared vision of a complex system. A paradigm shift – or a revolution – is when one paradigm is replaced by another. Earlier historians of scientific revolutions viewed these paradigm shifts as rational transitions in which one theory offering better predictive power replaces an older one that just could not compete. Kuhn’s innovation – itself a paradigm shift among historians – was to see these shifts as not wholly rational. At times a better theory is available, but it is not adopted because the old theory is so well entrenched in existing practices and institutions. Other times a new theory is adopted even though, from the perspective of the old paradigm, it really does not offer any advantages. Anyone who really pays attention to history cannot see scientific revolutions as purely rational transitions. Human politics, economics, and psychology play very active roles.

Kuhn’s celebrated example is the Copernican revolution. The textbook account, still prevalent in many quarters today, is that western Europeans limped along for centuries with a Ptolemaic model of the solar system, with the earth at the center and the planets orbiting it in curlicue fashion (due to “epicycles,” or little circular orbits around a point which itself orbits the earth). Along came Copernicus in 1543, daring to suggest that the earth and the planets orbit the sun. Suddenly, the traditional story went, all of our observations of the planets made sense and planetary positions in the night sky could be predicted without recourse to all those silly ad hoc epicycles. But this textbook account is wrong. It is true, of course, that the Ptolemaic model reigned for centuries, but for good reason: it offered accurate predictions of where we should see the planets each night, and so it was indispensable for navigators trying to cross oceans without sight of land. Ptolemy’s system, as presented in Johannes de Sacrobosco’s 13th-century treatise On the Spheres, was taught to would-be navigators well into the 17th century. Copernicus’s new model was not accepted or taught for generations – mainly because it was worse at offering predictions. But it was not rejected outright; mathematicians and astronomers kept tinkering with it until Kepler came along, replaced the circles with ellipses, and made it work.

If it was so bad initially, why on earth was Copernicus’s view not simply rejected? Basically, Kuhn’s explanation was that experts at the time decided that the overall “package” Copernicus offered – the new model, attended by new sorts of problems – was more interesting and more promising than the old theory, which over the centuries had basically gone about as far as it could go. The new theory offered interesting opportunities. It also was in line with broader revolutions sweeping Europe, which sought to dethrone Aristotelian authority in the churches and universities and support a radically new and independent view of the universe. The Copernican revolution should be seen as something more like a political revolution: a change that happens out of a variety of influences, tipping points, and new values. It took place, in the words used above, at the hands of people “on the ground, working things out as best we can, armed only with our insights and prejudices, our biases and our hopes, our blind spots and our misgivings.” Or, in Kuhn’s words:

Individual scientists embrace a new paradigm for all sorts of reasons and usually for several at once. Some of these reasons – for example, the sun worship that helped make Kepler a Copernican – lie outside the apparent sphere of science entirely. Others must depend upon idiosyncrasies of autobiography and personality. Even the nationality or the prior reputation of the innovator and his teachers can sometimes play a significant role. (Structure, pp. 152-3)

Historians have argued ever since about exactly what a paradigm is, or precisely when a shift can be said to occur – and with good reason, for revolutions are complicated, confusing things. But no one tries to see such revolutions in Hegelian terms, or in the terms of a grand logic that becomes more crisply focused over time. It is not that conceptual revolutions are wholly irrational. Individuals in the middle of revolutions are sorting things out as best they can, through their own reason, weighing a range of competing beliefs and values. But whereas Hegel would see the outcome as determined by reason’s own structure, we see more chance at play. There is no global plan. There is only improvement on what was before, in the eyes (or in the paradigm) of those who have come out the other end of the revolution. Again, Kuhn:

Can we not account for both science’s existence and its success in terms of evolution from the community’s state of knowledge at any given time? Does it really help to imagine that there is some one, full, objective, true account of nature and that the proper measure of scientific achievement is the extent to which it brings us closer to that ultimate goal? If we can learn to substitute evolution-from-what-we-do-know for evolution-toward-what-we-wish-to-know, a number of vexing problems may vanish in the process. (Structure, p. 171)

What Kuhn offered was a relativized idealism. The theories or paradigms are as important as ever: they tell us what things are, how things change, and what our experience is. But the paradigms are not “pregnant” with their successors. Old paradigms are discarded and new ones adopted as the result of many factors, some rational, and others less so. The fact that we discern progress over time, particularly in the case of science, tells us more about the current paradigm we inhabit than about the objective rationality of human discovery over time. Progress, that is to say, is relative to those making the judgment, and what they count as “progress.” Indeed: without some sort of Kantian or Hegelian ideal of pure reason, what else could it possibly be?

But Kuhn’s idealism is relativized in a second way as well. Kant, with his twelve categories of the understanding, and Hegel, with his grand logic, believed that the structures we place upon our experience are fixed (Kant), or at least that the way in which those structures evolve is fixed (Hegel). For Kuhn, the paradigms we invent are constructed from available materials, and the revolutionary thinker comes upon them in a flash of insight, much in the same way an artist suddenly sees a new way of combining given elements. The new paradigm is incommensurable with the old in the sense that the two paradigms are irreducibly different ways of seeing the phenomena. Usually, in a period of revolution, there are several thinkers devising very different paradigms, and many of them go nowhere or attract only a few followers; one of them, for various reasons, gains greater currency, and “wins.” The winner, though, does not have a deep logical connection with the old paradigm and does not emerge out of it in any meaningful way. Its new concepts are regarded as important and fundamental only from the perspective of the new paradigm – indeed, they make sense only in the new paradigm.

The transition from Hegelian to Kuhnian idealism might be seen as follows. In Hegel’s version, there is a single form of human thinking that allows for a single story to be told. Hegel knew very well that history has an impact on thinking, and different cultures offer different stories about the world. But he also believed they have a common core, described through a single logic, which serves to orient all our conceptual efforts (as well as our political ones) into a single direction. Kuhn, on the other hand, allows for incommensurably different forms of thinking and different stories, offered up in their varieties like just so many Darwinian species, competing for dominance in an ideosphere. At the end of a revolution, one view dominates – but that victory has more to do with contingencies of history than with that view being a better expression of some fundamental logic. For Hegel the basic script has been written; for Kuhn, we make up the story as we go along.


Krug’s pen

Wilhelm Traugott Krug (1770-1842) was the philosopher who succeeded Kant in the chair for logic and metaphysics at the University of Königsberg. Just before taking on that role, he had thrown down a challenge to Schelling’s idealist philosophy: could Schelling, or any idealist, pretend to offer any sort of explanation of why, from the Absolute, any particular thing – such as the pen Krug was using – should exist? How do we get from a stock of pure concepts to individual things we hold in our individual hands? In the end, Schelling admitted that it can’t be done. The mind can discover all the logical possibilities, but none of the actualities. For this there must be actual intuition, Vorstellung, or the representation of factual beings from outside the intellect. The mind yields negative philosophy, or the philosophy of possibilities; for positive philosophy, we must bump up against the world.

Schelling used this insight to point out the critical shortcoming of Hegel’s philosophy. Hegel’s logic, he claimed, yields only negative philosophy; but since Hegel knew that he somehow had to account for the reality of finite particulars, he fudged a bit. With one eye on the changeless Parmenidean world of logical Being, and the other eye on the pen in his hand, he came up with the concept of Becoming, which arises magically out of the dialectic between Being and its evil twin, Non-being. “Bad faith!” charged Schelling: Hegel was twisting logic to meet his own philosophical demands.

Stephen Houlgate has argued that Schelling himself was not being fair in this accusation. For Hegel, unlike both Krug and Schelling, did not sharply separate the land of pure logic from the land of pens and writing desks. When we experience particular things, we are already deep in the world of logic and concepts. We identify them, distinguish them, and make sense of them through a logic that permeates all being and thought. In Houlgate’s words, “Things are not given to us as existing by sensation (or by Vorstellung) as such, but have to be understood to exist by the very same thought and understanding that determines what they are” (1999: 119). Hegel never finds himself trapped in a prison of abstractions, looking for an escape. The world, as it were, is trapped there in the prison with him.

Hegel’s logic, then, is not meant as a rationalist philosopher’s replacement for Genesis. His claim is not that in the beginning there was the Idea, which thought Being, etc., and in the end out popped Krug’s pen. We might even reverse the order: in the beginning was Krug’s pen, and we came to think about what it was, and what it wasn’t, and before long we found in our world Being and Non-being, and Becoming, and for further details please consult the Logic. Hegel infuses the world with logic, in just the way our physicists imbue the world with invisible forces and conservation laws. Our task is to see the logical structure of our experience, and fathom its depths, until we see for ourselves that Anaxagoras was right, and all is indeed mind.

But one might further wonder whether there was more going on in this debate than accounting for the existence of pens. After Krug taught in Königsberg for a few years, he moved on to the University of Leipzig. In 1813, he took time off from teaching and served as a cavalry captain in the “War of Liberation” against Napoleon’s army as they retreated from Russia. The overwhelming allied forces chased Napoleon out of Leipzig to western lands and finally back to France.

There aren’t many philosophy professors ready to ride out into real battle, but this event is less surprising in the case of Krug. In a long list of works, he championed freedom of religion and speech and advocated many liberal causes, including the emancipation of the Jews. The mere thought of freedom was not enough for him; he sought action, change, and active resistance. Scholars are divided as to just how liberal Hegel was in his thinking, but it is undeniable that he saw advantages in constitutional monarchy and liked to see the World Spirit taking possession of singular, powerful individuals. While he could rationally accommodate any of the changes Krug fought for, his temperament was to smooth changes into continuous, inevitable transformations rather than to see them as sudden and contingent ruptures. Hegel saw the pen as part of a world that was meant to be; Krug saw it as a tool he could use to make a mere possibility actual.

 

______________

Stephen Houlgate, “Schelling’s Critique of Hegel’s ‘Science of Logic’,” Review of Metaphysics 53 (1999): 99–128.


Peter Adamson, and the gap problem

It’s wonderful to have Peter Adamson’s perspective on this perpetual problem in teaching the history of philosophy: whom do I cover, and whom do I leave out? Adamson, of course, is bravely executing “The History of Philosophy Without Any Gaps” podcast. He knows it’s impossible, but he’s doing what he can to give some basic treatment of philosophy from all times and places. I’ve heard a few of the podcasts, but have recently gone to the very beginning and am listening in order while I’m having to haul my body from one place to another. He’s endearingly nerdy and silly, and also absolutely genuine and responsible. He’s doing a good thing.

The basic tension is that teachers feel obligated to cover “the greats” – the people whose names students really must recognize, and will most likely encounter in other classes or books or conversations. But at the same time, these “greats” in the western tradition tend to be all men, precisely because women have for such a long time been forbidden or at least strongly discouraged from participating. Whether we mean to or not, we perpetuate the discrimination by not including women philosophers, since undergraduate women often come away with the sense that this is a game for boys. And even setting that important issue aside, there are loads of wonderful, intriguing philosophers from history who did not make the “A-list” for reasons having nothing to do with the intrinsic merits of their writing. Accidents of history, and all that.

But every school term is limited. Peter nails the tension head-on:

Are you really going to drop Aquinas from your medieval philosophy course to make room for Eriugena, or skip over Hume to accommodate Mary Wollstonecraft when teaching modern philosophy?

And his answer swiftly follows:

But what I’ve come to think is that we should give up on trying to cover “all the important things.” For this is impossible by a very large margin. You might tell yourself you have covered the important medieval philosophers if you’ve done Anselm, Abelard, Avicenna, Aquinas, Scotus, and Ockham. That’s an impressive line-up, no doubt. It’s a lot more medieval philosophy than most undergraduate students will ever read, and even gets in a thinker from the Islamic world. But do these big names really have a greater claim on our attention than Eriugena, Hildegard of Bingen, John Buridan, Meister Eckhart, and Fakhr al-Din al-Razi?

My answer would be no. The fact that such authors are not, or not yet, “canonical” has little to do with historical and philosophical merit and much to do with the historiographical priorities and limited perspectives of previous generations. These generations wrote our textbooks, designed the syllabi for courses we took as students, and decided what to edit, study, and translate—and in so doing, shaped our sense of what is too “important” to leave out. In reality, there are simply too many important thinkers in every period to be fit into any undergraduate historical course, in both the historical and philosophical sense of “important.” And that’s without even getting into “minor” figures like, say, Saadia Gaon, Yahya ibn ‘Adi, Alcuin of York, John of Salisbury, Hadewijch, Radulphus Brito, or Henry of Ghent, all of whom would be well worth teaching to undergraduate students. So when we’re exposing students to any period in the history of philosophy, we should not tell ourselves that we only have time to visit the highlights. In fact we should admit that we don’t even have time to do that.

Peter goes on to recommend that we not think of covering the “major” figures as our primary responsibility. We might think in terms of giving a taste of the kinds of problems of the time period, the styles of argument, the big concerns, and the seemingly endless variety of voices. Students who leave class with an informed sense of the complicated landscape of early modern philosophy – metaphysical, social, epistemological, political, religious – will be much better served than those who leave with the sense that there was Descartes, Spinoza, Leibniz, Locke, Berkeley, and Hume – and little else going on.

There is plenty of room for experiment and variation in these matters. One way to dispel the “boys only” sense is to include secondary articles by contemporary female historians of philosophy. One can devote a day to two or three lesser-known figures; or, turning that upside down, one can spend a day providing a thumbnail sketch of a “great,” and then spend two or three days going into greater detail of a lesser-known figure. One can assign an “orthodox narrative” (like Copleston’s) as homework reading, and then use class time to complicate and challenge that narrative.

In the end, I agree with Peter that we can stop thinking of our work in teaching the history of philosophy as something like screwing this or that part onto a chassis as it rolls down the line, thinking that we must make sure that these parts are included for the final product to be functional. A better view is that we are equipping students for much more variable tasks, and a more open-ended future.

 
