The sifting of human creations! —nothing less than this is what we ought to mean by the humanities. Essentially this means biography; what our colleges should teach is, therefore, biographical history, that not of politics merely, but of anything and everything so far as human efforts and conquests are factors that have played their part. Studying in this way, we learn what types of activity have stood the test of time; we acquire standards of the excellent and durable. All our arts and sciences and institutions are but so many quests of perfection on the part of men; and when we see how diverse the types of excellence may be, how various the tests, how flexible the adaptations, we gain a richer sense of what the terms “better” and “worse” may signify in general. Our critical sensibilities grow both more acute and less fanatical. We sympathize with men’s mistakes even in the act of penetrating them; we feel the pathos of lost causes and misguided epochs even while we applaud what overcame them.
Such words are vague and such ideas are inadequate, but their meaning is unmistakable. What the colleges—teaching humanities by examples which may be special, but which must be typical and pregnant—should at least try to give us, is a general sense of what, under various disguises, superiority has always signified and may still signify. The feeling for a good human job anywhere, the admiration of the really admirable, the disesteem of what is cheap and trashy and impermanent—this is what we call the critical sense, the sense for ideal values. It is the better part of what men know as wisdom. Some of us are wise in this way naturally and by genius; some of us never become so. But to have spent one’s youth at college, in contact with the choice and rare and precious, and yet still to be a blind prig or vulgarian, unable to scent out human excellence or to divine it amid its accidents, to know it only when ticketed and labeled and forced on us by others, this indeed should be accounted the very calamity and shipwreck of a higher education.
The sense for human superiority ought, then, to be considered our line, as boring subways is the engineer’s line and the surgeon’s is appendicitis. Our colleges ought to have lit up in us a lasting relish for the better kind of man, a loss of appetite for mediocrities, and a disgust for cheapjacks. We ought to smell, as it were, the difference of quality in men and their proposals when we enter the world of affairs about us. Expertness in this might well atone for some of our ignorance of dynamos. The best claim we can make for the higher education, the best single phrase in which we can tell what it ought to do for us, is then, exactly what I said: it should enable us to know a good man when we see him.
The rest of the lecture can be found here.
(an excerpt from How You Play the Game):
I have killed three dogs in Minecraft. The way to get a dog is to find a wolf, and then feed bones to the wolf until red Valentine’s hearts blossom forth from the wolf, and then it is your dog. It will do its best to follow you wherever you go, and (like a real dog) it will invariably get in your way when you are trying to build something. Apart from that, they are just fun to have around, and they will even help you fight monsters. If they become too much of a nuisance, you can click on them and they will sit and wait patiently forever until you click on them again.
I drowned my first two dogs. The first time, I was building a bridge over a lake, but a bridge that left no space between it and the water. The dog did its best to follow me around, but it soon found itself trapped beneath the water’s surface by my bridge. Not being smart enough to swim out from under the bridge, it let out a single plaintive yelp before dying and sinking. Exactly the same thing happened to my second dog, and it was this second episode that made this particular feature of dogs clear to me. I know now to make dogs sit if I’m building bridges. I’m not sure what happened to the third dog, but I think it fell into some lava. There was, again, the single yelp, followed by a sizzle. No more dog.
I felt bad each time, while of course fully realizing that only virtual entities were being killed. Surely some of the sorrow I felt was imported from the real world, where I am fond of dogs and do what I can to avoid drowning or burning them. I could not be said to have developed a meaningful relationship with my virtual dogs, but I was pleased to see them each time they caught up with me, and I was a little sad to realize they wouldn’t be getting in my way anymore. I think I was right to feel at least a little bit bad about killing them. But how can there be any place for emotional or even moral attachments to virtual characters? What could cause me to feel any kind of sympathy or concern for beings that don’t really exist?
The answer probably has something to do with the way humans are wired to form attachments to other beings generally. From the perspective of evolution, it’s obviously good that we form strong attachments to human infants. It’s also good that our ancestors formed strong attachments to pets, and not merely because pets can be trained to help us. Pets also share their germs with us, and those ancient people whose constitutions allowed them to survive their pets’ germs bestowed upon us, their descendants, stronger immune systems. So over time we have come to be wired to love furry little things. This general disposition to like animals probably spills over into our encounters with virtual animals – and so we come to feel attachments to them too.
But to explain our attachments to virtual animals in this way does not necessarily lessen those attachments. After all, similar explanations account for why we like babies and dogs – and even cats – and typically we take the obligations we feel toward them very seriously. So unless we have some very good reason to overrule the concerns we naturally feel for virtual animals, we should take those obligations seriously as well. But here is a very good reason to overrule these concerns we feel: unlike real dogs and cats, virtual dogs and cats don’t actually feel anything. They are not any more real than the dogs and cats in dreams or comic books. So while we might naturally feel some attachment to them, we do not have any real obligations to them. Thinking that we do have obligations would be like thinking we should worry about how Snoopy feels if we stop reading Peanuts. There isn’t any real being there to be an object for our concern.
We humans form attachments with the unlikeliest of objects. Anyone who doubts this might consider how they would feel if they were asked to throw away their teddy bear. It is just an object, of course, with no feelings or thoughts whatsoever. No real harm is done to anyone by holding it under water or tossing it into a pool of lava. Yet we would not do such things lightly, since we have formed attachments to our teddy bears. We have been through a lot with them – scary nights and lonely times – and having them in our arms has helped us to feel better. We end up feeling that we owe them something, out of gratitude and respect. When it comes time to part, we might consider passing them along to other children, so long as we can assure ourselves that the teddy bears are going to good homes. At one level this is all silly, of course. But that level is not nearly as important to us as the level on which it is not silly at all.
When it comes time to tear down the old school, we feel concern for the bricks and mortar. We never regard them as sentient, but we think of the role the building has had in our lives – perhaps we have passed through the halls, and so have our children. We have grown up there, and we cherish it. It is, we say, of “sentimental value”, but too often that label is preceded by the qualifier “merely”. There is nothing “mere” about such value. None of us would want to see sentimental value deprived of all significance, even if we realize that sometimes it is necessary to let go. At these moments, we register a loss, and the loss often seems to us immeasurable in the terms of any other calculus.
There is no reason not to form attachments to virtual places and characters as well, though this kind of object is relatively new to the scene, and typically these virtual objects aren’t around for very long even in the best of cases. Consider the story of Jerry the slime. A prominent Minecraft player (“CaptainSparklez”) found himself being followed around by a baby slime with a cute smile. He hesitated to kill it, and decided instead to build it a pen and name it “Jerry”. Jerry eventually de-spawned, as that is the fate of such creatures. CaptainSparklez and his followers were sufficiently distraught to build an enormous tree as a monument to Jerry. Now, to be sure, this whole episode was motivated in large part by the joy of pure silliness, and by players just wanting to have some fun with the idea of memorializing a slime. But, undeniably, there was a primitive attachment that served as a basis for the fun. It is not unthinkable for someone to feel that sort of attachment to a virtual slime – indeed, its possibility is what makes the joke possible in the first place. And even now I can hear CaptainSparklez protest in mock seriousness, “How dare you make fun of Jerry!” But the seriousness is not entirely mocking. (For whatever it’s worth, we may also note that Jerry has his own fans who have created special Jerry mods and a special Jerry game. He has his own Facebook page. Jerry has more friends than I do.)
The point is that we form attachments to things that may have no feelings or rights whatsoever, but by forming attachments to them, they gain some moral standing. If you really care about something, then I have at least some initial reason to be mindful of your concern. (Yes, lots of complications can come in here – “What if I really care for the fire that is now engulfing your home?” – but the basic point stands: there is some initial reason, though not necessarily a final or decisive one.) I had some attachment to my Minecraft dogs, which is why I felt sorry when they died. Had you come along in a multiplayer setting and chopped them to death for the sheer malicious pleasure of doing so, I could rightly claim that you did something wrong.
Moreover, we can also speak of attachments – even to virtual objects – that we should form, just as part of being good people. Imagine if I were to gain a Minecraft dog that accompanied me on many adventures. I even offer it rotten zombie flesh to eat on several occasions. But then one day I tire of it and chop it into nonexistence. I think most of us would be surprised: “Why did you do that? You had it a long time, and even took care of it. Didn’t you feel attached to it?” Suppose I say, “No, no attachment at all”. “Well, you should have”, we would mumble. It just doesn’t seem right not to have felt some attachment, even if it was overcome by some other concern. “Yes, I was attached to it, but it was getting in the way too much”, would have been at least more acceptable as a reply. (“Still, you didn’t have to kill it. You could have just clicked on it to sit forever….”)
The Minecraft book is available now (see right column). It was loads of fun to write, and it was even more fun exploring the game with my son. The whole process of working with Kindle Singles was fun, too. The editor I worked with was very helpful, insightful, and thorough.
There is no text more commonly read in philosophy courses than Descartes’s Meditations on First Philosophy. This is astonishing, given that the work was written well over three centuries ago. To some extent, to be sure, it is so commonly assigned simply because it is so commonly assigned; that is, it is hard to imagine an undergraduate escaping from a program in philosophy without having read the work at least once – because every program assigns it. This is a true perpetual motion machine.
But the book is a natural choice to assign to undergraduates because of its approach. Descartes pretends to cast aside all the crap he learned in high school and figure out for himself what he should believe. This is exactly how we would like to think of undergraduates, leaving the home environment and state-mandated education for the first time and venturing out on their own to discover the world, building their own minds in the process. Descartes asks his readers to doubt everything, and see what remains as indubitable; and he builds a new world upon a new foundation. And that is the ideal of university education – and yes, yes, it is an ideal rarely met, and one that is always under attack by those who would like to see the university as training for the workplace rather than an enterprise in soul-building. (Newsflash: it is going to be soul-building in any event, and our only choice is in deciding what sorts of souls are going to be built.)
But why did Descartes write this work? Was he trying to write a bestselling textbook for university students? Was he excited about his own efforts at soul-building, and intent upon sharing his success with the world? He had each of these motivations, in some sense. He certainly was excited and wanted to be influential. But his chief motivation was to offer to a broad, reading audience in his day a new structure for their beliefs. For any reader keeping score, not much has changed by the end of his work – he starts out believing in God, a soul, and the physical world – throws this all into doubt – and ends up believing once again in God, a soul, and the physical world. What has changed is the arrangement of these beliefs, and what they are based on: Descartes’s world has gained a different structure upon different foundations. His aim in the Meditations is to convince his readers that they can still believe in all the important things they want to believe in, even if they accept the radical revolution in physics and metaphysics that was brewing in the 17th century. It is just that they will have to re-arrange their beliefs a bit.
The view that was being overturned in the 17th century had its roots in Aristotle’s philosophy. According to this view, the main players in the world are substances, or bundles of matter that have certain natures, and behave in ways according to those natures. Every substance tries to go about its own natural business, but inevitably each ends up getting in another’s way, sometimes in ways we like and other times in ways we don’t like – and thus the world. Descartes and his comrades believed that the content-rich “natures” of substances could be replaced by more austere, geometrical entities. Basically, the new philosophers asked us to replace a blooming, buzzing botanical garden of metaphysical natures, forms, essences, modes, and qualities with a sculpture park designed by Mies van der Rohe. It must have looked like a very poor exchange indeed, giving up an extraordinarily rich set of explanatory powers for a set of meager promissory notes that did not encourage much confidence. The philosophers that Descartes was writing to worried about the cost of swapping out one operating system for another: how would the change affect our beliefs in God and in the soul, as well as our commonsensical ways of explaining nature?
So Descartes wrote the Meditations as way of saying, “See? You can still have strong arguments for the existence of God, and for believing in the existence of a soul; and you can have excellent new strategies for explaining why the physical world behaves as it does.” It was meant as a persuasive and reassuring work, a work that demonstrated that you could still do everything you wanted in the new operating system. (I am not sure whether to cast Descartes as Mac or PC; his system was slick, new, radical, and hard to use – so maybe Linux?) His arguments, as countless undergraduates have demonstrated, are not faultless. But that’s okay; they are good enough for Descartes’s primary purpose, which was to show that conversion to the new philosophy does not require giving up on the sorts of arguments valued by philosophers of the time. His overall rhetorical strategy is to demonstrate that from a completely unbiased starting point – one that is achieved through radical skepticism, doubting everything you think you know – it is perfectly possible to end up inhabiting the new system. It was not as foreign as it may have seemed.
Seen in this way, Descartes’s Meditations really is a work that is stuck in a particular historical context. It is safe to say that few readers today are seeking to be reassured that they can give up their Aristotelian metaphysics for a geometrical world view. This makes it all the more surprising that the Meditations is so frequently read today. But the work, of course, has been repurposed: once designed to serve one polemical purpose, it now serves another. Once meant to ease the transition from old to new philosophies, it now eases the transition from pre-philosophical to philosophical thought. And, as one would expect, it succeeds only partway in this new purpose. Students really do find themselves challenged by Descartes’s skeptical doubts – but they are uniformly unimpressed by Descartes’s own solutions. No one buys his arguments. The overall effect, I am afraid, is a general mistrust of philosophical arguments. Students come away thinking that philosophers are much better at raising troublesome, skeptical questions than they are at providing good solutions. Every argument is bound to fail. And this in turn engenders a measure of misology, or a distrust of reason, at least as it applies to philosophical matters. Philosophical theses come to be viewed as indemonstrable matters of taste.
This is unfortunate for our students – and also, by the way, quite unfair to Descartes. Imagine walking into a computer shop, exploring whether to change to a new system. You worry about the new system’s ability to generate spreadsheets. A helpful assistant demonstrates how to set up a short, tidy spreadsheet in the new system. But you reject the demonstration entirely, since you have no need for the spreadsheet example you have been shown. “No fair!” the assistant pleads. “I was just showing that you could do this sort of thing!” And this is basically what Descartes wants to tell the undergraduate who has just savaged his argument for God’s existence in the third meditation. The student has missed the central point that Descartes’s operating system supplies arguments for God’s existence just as well as Aristotle’s old operating system. But today’s student, of course, has little reason to be impressed by this feature. And that’s why the way that Descartes’s Meditations is usually taught – namely, as a non-polemical, disinterested research into what is known with certainty – ends up being quite unfair to him.
I do not mean to suggest that Descartes was only trying to show that philosophers could still have good arguments within his new operating system. He believed his arguments were truly compelling – and they are indeed much better than our undergraduates take them to be. But his main aim was to get philosophers talking about his arguments, while making use of his new system – that is, he wanted fellow philosophers to try working within his system and find out for themselves that abandoning Aristotle did not mean abandoning philosophy. It is for this reason that he sent out copies of the Meditations to several influential figures, of diverse backgrounds, and published their objections alongside his replies. The resulting publication was itself a demonstration that this new operating system was sufficient for fruitful and intense philosophical discussion – like getting an entity like UPS to use an open-source operating system.
But all this puts teachers of historical philosophical texts in a bit of a quandary. Taking proper measure of an historical work’s context might make it less gripping to modern students (unless they are blessed with geekiness over history). But re-purposing historical works is unfair to those works and often leads to unintended consequences. So should teachers simply leave history alone, and let the dead rest? Or should they plunge ahead anyway, believing that the good in confronting great texts outweighs any mistaken judgments that are encouraged along the way? What is the best way to read and teach this kind of book?
Though I do end up worrying over this question from time to time, in the end I think it is a false dilemma. None of the foregoing concerns should be out of place in an undergraduate curriculum. That is, we can imagine a fantastic class on Descartes’s Meditations in which we read the texts with our own concerns and questions; find problems in the text; introduce more historical circumstance to reorient our reading, and come to a better understanding of the text; and then see whether we have learned any global lessons about history, philosophy, writing, and reading as a result. In the case of the Meditations, we will probably discover that no conceptual revolution is without its costs; that even an antiquated system like scholastic metaphysics had some very real advantages; that revolutions in thinking aren’t “proven” by experiment, but involve a willingness to conceptualize our experience in new ways; and – perhaps most important of all – that there is always a deeper story to be told. Books aren’t repositories of truth, but bits of evidence in a crime scene, and it’s up to us to figure out whodunnit and why, and even: so what?
I have long believed that I should love opera. I’m a great fan of “classical” music (a fairly meaningless term, as it encompasses way too much), and view its existence as one of the primary pieces of evidence for believing life is not meaningless. One of the greatest experiences of my life was several years ago when I had the chance to participate in a special seminar on Beethoven’s string quartets. The class incorporated lecturers from various disciplines, and featured a visiting musicologist who knew everything there is to know about those quartets. Accompanying the class was our resident string quartet’s performance of the entire cycle. On successive evenings I was transported into a heavenly region of the soul, one that said everything that can be said about being an embodied mortal with intimations of eternity.
With all that passion, it seems I should love opera. My friends all told me so. But I have tried and tried, only to find it tiresome. The music, of course, is occasionally beautiful – but the drama is so slow, so uneventful, that I have felt the truth of what Mark Twain once recorded:
“I have attended operas, whenever I could not help it, for fourteen years now; I am sure I know of no agony comparable to the listening to an unfamiliar opera. I am enchanted with the airs of “Travatore” and other old operas which the hand-organ and music-box have made entirely familiar to my ear. I am carried away with delightful enthusiasm when they are sung at the opera. But, oh, how far between they are! And what long, arid, heartbreaking and headaching “between-times” of that sort of intense but incoherent noise which always so reminds me of the time the orphan asylum burned down.”
So I was prepared to mark down “opera” as one of those things I just was not engineered to appreciate.
But this week I have subjected myself to Wagner’s Ring cycle – and behold! A breakthrough! I have discovered the key I needed to unlock the secret. I’ll share it here now, for anyone who is in the same position. The key is this: don’t watch opera as drama, in the way you’d watch a play or a film; listen to it as music – with a side accompaniment of a story, which only aims to offer some general themes for reflection. Put the music first, and maintain only a dim awareness of the plot.
I’m not quite through the cycle – Götterdämmerung lies before me – but I am savoring each moment, anticipating what is to come, replaying themes and ideas in my mind. Paradoxically, by paying minimal attention to the story, I am now fascinated by it. But I think it was the music that opened the door for me. It’s too early to say how much this new-found enthusiasm will translate to other operas, but I’m expecting it will. But even if I’ve only learned to appreciate The Ring, I’m glad for it.
(For several months I have lost myself in the thoughts of Peter Sloterdijk, a contemporary German philosopher. I need to continue to read and absorb his works, but the following is a “status report” on what I have found so far.)
“When someone tries to ‘agitate’ me in an enlightened direction, my first reaction is a cynical one: The person concerned should get his or her own shit together. That is the nature of things. Admittedly, one should not injure good will without reason; but good will could easily be a little more clever and save me the embarrassment of saying: ‘I already know that.’ For I do not like being asked, ‘Then why don’t you do something?'” – Peter Sloterdijk
The world we face is not the same as the one Peter Sloterdijk faced in 1983. That world, and Sloterdijk’s Germany especially, was split down the middle, with a belligerent Soviet Union to the east and an exceptionally deadly alliance of nations to the west. Nuclear weapons were primed and ready and held in abeyance only by the mutual recognition that destroying all life on earth would be poor sportsmanship. Germany itself was divided physically by The Wall as well as being fractured throughout in multiple ways by having been the principal aggressor in two world wars, and the loser in both. German philosophers were trying to come to terms with how their glorious and magnificent culture – the home of Goethe, Schiller, Lessing, Beethoven, and Brahms, a culture which was itself a monument to the highest ideals of humanity – could have descended so completely and quickly into the deepest inhumanity. On the bicentennial of Kant’s Critique of Pure Reason, Sloterdijk joined others in wondering whether it was indeed still possible to have any faith at all in reason or in any celebrated ideal of humanity.
But it is also true that the world Sloterdijk faced has not changed much in the last 30 years. The Wall is gone, and the threat of a nuclear holocaust has been replaced by comparatively smaller (though still deadly) threats. But humanist intellectuals remain skeptical of the ideals of the Enlightenment. We see now that it never really was what it pretended to be. Its authors celebrated human freedom while providing a cover story for colonial oppression and subjugation. Scientists strove mightily for a knowledge of nature while industrialists destroyed nature through forge and furnace. Its politicians aspired to democracy while creating wage slaves. The Enlightenment, it seems now, was a theater of false promises at best and a crucible of cruel mendacity at worst.
Where then do we find ourselves in our post-Enlightenment times? In Sloterdijk’s 1983 estimation, we are in an age of cynicism, or enlightened false consciousness. We all know, when we stop to think about it, that the world is not as it should be, and that we are not living as we ought to live. But even as we know this, we numb ourselves to it, and even laugh over the disconnection between what we know and how we live. “We do our work and say to ourselves, it would be better to get really involved. We live from day to day, from vacation to vacation, from news show to news show, from problem to problem, from orgasm to orgasm, in private turbulences and medium-term affairs, tense, relaxed. With some things we feel dismay but with most things we really can’t give a damn” (98-9). Our cities “have been transformed into amorphous clumps where alienated streams of traffic transport people to the various scenes of their attempts and failures in life” (118). We realize all this, and acknowledge it – and then show up at work the next day, since we feel deep down that there really is nothing to be done about it. In this way we are conscious of our situation, and unhappy about it, but resigned to it: “To be intelligent and still perform one’s work, that is unhappy consciousness in its modernized form, afflicted with enlightenment. Such consciousness cannot be dumb and trust again; innocence cannot be regained” (7).
As I work through the recent works of Peter Sloterdijk (Spheres I: Bubbles, Spheres II: Globes), I am chiefly amazed and enthused by his ability to find deep symbolic and mythic connections throughout the history of philosophical thought, and to use that understanding to bring our culture into a startlingly fresh relief. His insights make me feel as if the rest of us are sleepwalking through time. As much as we like to pretend otherwise, we are still mythic beings, ordering our experience according to prehistoric passions and plots. Perhaps every age thinks it is doing something different, and is at last bringing clear and objective thought to what was once distorted through prejudice and magical thinking. But it seems to me that this pretense of objectivity is at an all-time high in our culture – for we do have, unquestionably, the most advanced sciences ever. We think now in terms of demythologizing – as if that is ever possible for human beings. We think we can carve our own psychologies out of our theory-making efforts and grasp things in themselves as they are in themselves, if only we make good use of careful experiments and double-blind refereeing processes. We even believe we can see ourselves as we are in ourselves, even when it is we who are doing the looking.
This perspective cuts us off from our own history. For we are instituting a new dateline – something like “Before Darwin vs. After Darwin” or more likely “Before WW2 vs. After WW2”, since that also marks the rise of the modern research university, before which (apparently) no trustworthy science was ever done. We tend to see our new theories and insights as qualitatively different from anything proposed earlier. This turns our history into an inchoate prelude to the present. If we do make the effort to cast a glance backward, we feel that we can sort out the thinking that was on the right track from the crazy or “insufficiently critical” thinking of the past – ignoring the fact that then, as now, everything is thoroughly entwined with everything else, and our current efforts at understanding are in fact not qualitatively different from historical efforts. We are always and forever “in a fine mess,” as Oliver Hardy would say, mixing up mythologies and metaphysics with science, literature, and politics.
When we think of the gradual shift among European cultures from paganism to Christianity, we are likely to see one wacked-out metaphysics being replaced by another: where people once saw divinities in nature, and magical forces permeating their lives, they now saw the Holy Ghost and the salvific powers of the Church alongside the forces of sin and grace. No real advance there: just the exchange of one lunacy for another. But the shift from “Before the Modern” to “the Modern” marks exactly the same sort of transformation: metaphysical lunacy abounds, though we are unlikely to see it because we are so closely bound up with it. We are likely to insist just as stubbornly as did the new Christian converts that there is genuine progress in the shift, using of course our new-found values to prove how genuine the progress is.
Our own metaphysical lunacy sees individuals as free-floating agents, interested first and foremost in extending their privileges. We are objects rattling around in an uncaring world that has no purpose or end. The power of art and religion is couched only in terms of psychology and sociology – their meaning lies in what they do for us or to us. We conflate reasons with causes. There can be no transcendent or categorical ends, as every end can be understood only as a provisional end adopted by an individual or group. We are content to understand the meaning of life as a matter of choice for individuals to set for themselves.
I am not about to try to call us back to the good old days. Rather, I am pointing out this metaphysics – our metaphysics – as one among many different possible ones. It is not that we have finally seen what is true; it is just that this is where we now find ourselves. Once we grasp this, we can try to understand our mythology as mythology, and then try to integrate it with the mythologies our species has adopted at other times and places. We can begin to see our continuity with our past as thinking, culture-making entities. And then we can begin to see how remarkably weird we are – that is, we can see how arbitrary our metaphysics is, and how strange we are when we take it for granted.
If we manage to attain this perspective, we will have found the sort of skepticism David Hume promoted. In a slogan, Hume’s skepticism was an attempt to recall ourselves to ourselves: that is, to bear in mind our own weaknesses and susceptibility to nonrational influence. He never doubted common sense, or the conclusions of “natural philosophy” or science. But he also knew better than to take them seriously as definitive, proven, or inescapable:
When we see, that we have arrived at the utmost extent of human reason, we sit down contented; tho’ we be perfectly satisfied in the main of our ignorance, and perceive that we can give no reason for our most general and most refined principles, beside our experience of their reality; which is the reason of the mere vulgar, and what it required no study at first to have discovered for the most particular and most extraordinary phenomena. And as this impossibility of making any further progress is enough to satisfy the reader, so the writer may derive a more delicate satisfaction from the free confession of his own ignorance, and from his prudence in avoiding that error, into which so many have fallen, of imposing their conjectures and hypotheses on the world for most certain principles. When this mutual contentment can be obtained betwixt the master and scholar, I know not what more we can require of our philosophy.
The chief difference in this regard between Sloterdijk and Hume is one of method. Sloterdijk shows us our provinciality through exhaustively exploring the other strange metaphysical routes we have taken in the past, along with another strange one – “spherological” – we might now adopt as a live possibility. Hume arrives at a contented agnosticism (his own live option) by deconstructing our “reason” from within.
Our Fall 2014 semester just wrapped up. I asked the students in my seminar to write a longer paper on our three philosophers – and then joined in the fun and wrote one myself.
Divine natures: a tale of three brothers
It may rightly be said that rumors of the death of God are exaggerated, but it is true nevertheless that the traditional monotheistic belief in a divine person transcending the natural world and using it as some kind of means toward a beneficent and glorious end has fallen upon hard times. Back in the days when so little of the natural world was understood, it was easy to see extraordinary events and coincidences as issuing from the hand of God. In the days when everyone looked to some superior lord in both gratitude and fear, it was natural to ask who was the Lord of All. And in the vaulting cathedrals and darkened temples one did more than marvel at great historical architecture. Belief in a supreme, divine person fit naturally into our social structure and natural ignorance, so much so that it may well have been the only live option for our ancestors.
My kids and their friends know the world of Japanese animation in the way my generation can sing the Looney Tunes libretto to “What’s Opera, Doc?” But their involvement in this world goes much further. They regularly convene with their fellow fans, in crazy costumes, and celebrate their common love for a world of warriors with huge swords and neon hairstyles, demons spawned from friendlier zones of hell, bikini-clad princesses with eight-foot-long sniper rifles, and a wide assortment of crazed scientists and military types.
Entering this fanzone is entering another world, a world far more accepting and supportive than any other produced thus far by our species. As we stand in a long line for tickets, enthusiasts click pictures of one another, always with expressions of admiration and love. It is a lot like a gay pride event, in which everyone supports one another’s celebration of individuality. A pick-up truck drives by, and a girl in the back shouts out “You all are wonderful!” and everyone cheerily waves back to her.
Once we get inside, we have our fake weapons checked for nonlethality and buy our tickets. I let the kids go off to find their own adventures. I wander the halls for a few minutes, feeling a bit like a weirdo since I am dressed for the mundane world. I am also self-conscious because there are a lot of half-naked young women running about, and I still don’t know where to put my eyes to strike the right balance, somewhere between “rude indifference” and “sexual objectification”.
After a bit, I wander off the premises. The convention hall is surrounded by land being developed into commercial structures, which exist as small islands in a sea of parking lots. I navigate myself over weedy mounds of excavated dirt before reaching the shores of the wine-dark asphalt. I spy in the distance strange characters making their way to the con, carrying flags and enormous scythes and grease-stained bags of food from Carl’s Jr. Jets soar loudly overhead because we’re only a few miles from an Air Force base. Would this world be explicable to anyone not in it? As I trudge along, I think about writing a new version of the Canterbury Tales in which a set of cosplayers set out from a convention across a wasteland to fetch lunch at Quizno’s. Nah – too real, too bleak, no point.
I finally shore up at Target and buy some juice and a snack, standing in the checkout line beside a young man in a white lab coat and wild white hair, with a huge bolt penetrating his head. I then set out again for the return voyage, one pilgrim among many trekking over the parking lots and mounds of dirt. I return to the car to take shelter in it and type up these reflections. Next to me is a small troop of half-naked ninjas. I smile at them, give a thumbs-up, and say, “You all look great!” They react with uncertain wariness and then ignore me, which is understandable. I want to tell them, “I’m a geek, too! I go crazy over 18th-century automata!” but I suspect they won’t see the similarity.
I rendezvous with the kids for lunch and they are in a sort of happy, over-stimulated daze. Their eyes scan the crowds, identifying characters and assessing the quality of the costumes. I suddenly realize how they must feel when mom and dad take them to foreign cities. I’m asking very basic questions about genres and cultures which they can’t fully answer because, like national and ethnic identities, they don’t make a lot of sense, and you really don’t need to know the details in order to find your way around. Eventually I stop asking.
Back to the con for them, and back to the car for me. Sitting in my car, typing away, makes me feel like a cosplay hermit, like some Obi-Wan in the deserts of Tatooine, waiting for Luke to mature – with the beard, no less, but minus any cool Jedi robe. (This at least is a genre I know well.) Soon enough it is time for me to cross the trackless wastes and scare off the sandpeople.
The three kids – two of them mine, plus a friend of theirs – are supposed to meet me at the Designated Place. #1 and #2 are there; #3 is not in sight. #2 goes off to attend something; #3 shows up; #2 goes off to find #1; #1 returns, but now #3 wanders off to another adventure; it’s just #1 and me; now #2 returns, and we are back to wondering where #3 is. But it takes only a half hour or so for all four of us to gather in the same spot, and they are weary with the intense visual processing of the day. We return slowly but happily to our trusty landspeeder, and away we go.
(Warning: here comes a rant)
I recently had the joy of meeting with colleagues from around the state, but unfortunately most of our meeting was focused on one of the least interesting topics academics can be made to engage with: outcome assessments, or essential learning outcomes, or “learning how to measure what we value”. Everything that can be said under this heading can easily be articulated within your own brain with a few minutes’ thought. Academic programs need to articulate what knowledge and skills they are imparting to their students; along with this, they need objective measurements of how successful they are at doing this; and the loop must be closed – meaning, programs need to use the results of these measurements to modify or reform what they are doing.
What, pray tell, are the skills and knowledge to be taught? Well, the principles, terms, and theories that are fundamental to the discipline, of course; and the skills are critical reading, thinking, writing, and the ability to work constructively with others. Even (!!!) in humanistic programs like philosophy or literature or languages, programs can boast that they are giving students the critical skills they will need to combat the difficult problems they are sure to face in their multiply-careered lives. (Another way of putting this point: even though what humanists teach is crap, the skills students gain by learning that crap are useful.) And on and on and on. All too predictable.
But in this eager race to the lowest (but easily measured) intellectual denominator, the “most essential” learning outcome of all is seldom noticed. What do students expect from college? What do professors like to see as they watch students proceed through the ranks? What do employers want to see emerging at the other end? Interesting minds. While it is said that students want college to give them jobs, I very much doubt that it is true. Students want to have their worlds rocked by ideas and insights, and they want to become intelligent and interesting. Professors love nothing more than to see the lost and naive freshman become a thinker alive to ideas and objections and concepts. Employers want college grads who are interesting (meaning: smart and creative) – people who can diagnose problems in fresh ways and brainstorm solutions into being. That is exactly the sort of thing you can’t reliably obtain from someone who hasn’t spent months or years grappling with treatises, fictions, heterodoxies, and paradoxes.
Indeed, most of the discussions regarding “assessment” are fine examples of exactly what we do not want to see college producing: vague and uniform truisms, hooked up with measures so meaningless as to guarantee that nothing will ever change. It is the deadened life of the bureaucratic mind. But imagine, as an alternative, academics charting the careers of students who have turned out to be really interesting, and trying to figure out what really happened, and to what extent their own courses or programs can take any credit for it. Undoubtedly, there never will be any sure-fire formula. But we might be able to collect a range of good practices, interesting ideas, experiments to try, as well as some solid critiques of what can stultify a college career.
There would be in-house benefits as well. It may turn out that each discipline needs something outside itself in order to improve the chances of its students gaining interesting minds. An accounting major might be lit afire by an art history course, as a philosophy major might develop new approaches by spending a semester in computer science. Professors, to the extent they wanted to make their students interesting, would have to get out in the wide world of the college campus and see what was on offer, just so that they could better advise their students. Who knows? In the end, they might end up having interesting minds as well.
- “All history is history of thought.”
- “Historical knowledge is the re-enactment in the historian’s mind of the thought whose history he is studying.”
- “Historical knowledge is the re-enactment of a past thought incapsulated in a context of present thoughts which, by contradicting it, confine it to a plane different from theirs.”
I must admit to knowing very little about Collingwood’s philosophy of history. A bright and motivated student and I are planning to read through his Idea of History after plunging through some Hegel this semester. So I hope to know more in a few months. Meanwhile, the principles above come from RGC’s Autobiography, which is very short, witty, and interesting. He traces his own course through schools, academic politics, intellectual interests, and world events.
I have been thinking about RGC’s principles with reference to the history of philosophy, of course. (It probably fits better there anyway; many friends of mine are professional historians and I think they would place severe qualifications upon the claim that history is first and foremost about ideas.) There has been a lot of discussion among historians of philosophy about the nature of what they do, and there has emerged something of a divide between the “philosophy” historians of philosophy and the “history” historians of philosophy. (Please, bear with me.) Those in the “philosophy” camp read historical philosophical texts as attempts to get at the philosophical truth, just as contemporary articles in metaphysics or ethics or epistemology are attempts to get at the truth. Those in the “history” camp set this concern aside, and are more interested in getting at what the historical author had in mind, given the author’s historical circumstance – more like what RGC says history is all about.
There is something to be said for each camp. Jonathan Bennett has been the most forceful proponent of the “philosophy” camp. He reported that wrestling with the texts of Leibniz, Descartes, Locke, and Spinoza was the best way for him to think through the underlying philosophical issues. He never spared the Great Dead from his extremely analytical mind, calling spades spades and flagging every lapse. He would lend whatever aid he could to help get a philosopher’s system working, and he found several of the old systems to be not only salvageable but powerfully right in some of their claims. He couldn’t see why any philosopher (qua philosopher!) would read old philosophical texts in any other way.
(I am writing about Bennett in the past tense, but only because he has retired from active academic publication. He is very much alive, though recently suffered the loss of his wife Gillian, whose death was itself a testament to her nobility, her integrity, and her concern for enlightened social policy. The story can be read here.)
The “history” camp has greater interest in placing philosophical texts in their historical context, though very often the “historical context” comes to mean other philosophy texts from the same period. Seldom is any attention paid to the wars being waged or the political disturbances and factions of the day, let alone the actual living and working circumstances of the authors. That’s just historical “noise.” The thinkers are treated as if they live in bubbles of cogitation, insulated from the contingencies of existence.
RGC would have us employ historical knowledge and imagination to draw closer to the thinkers as they think their thoughts. This would require more than just closely reading the works of Hobbes and Spinoza – even in Latin! – and paying close analytical attention to every jot and tittle in the texts. We would have to start thinking about the issues of their days, the audiences they were writing to, and perhaps even their own political ambitions for status and readership. They weren’t interested solely in philosophical truth – indeed, no one is. They were interested in controversies and careers, and many of them lived in times when a misplaced publication could have very dire consequences. “Re-enacting” the philosopher’s thoughts in our own mind requires knowing quite a bit more about the philosopher’s lived reality.
But why would a philosopher (qua philosopher!) be interested in this? I think the answer can emerge by thinking through RGC’s third principle. As we strive to understand historical thought, we can’t do so without underscoring the relevant differences between our circumstance and theirs. We “incapsulate” the thoughts within our own times, which just means: we know we are doing history, and not contemporary philosophy. We pay attention to the differences. But this also means paying attention to the sameness. We don’t live in those historical times, but some of the concerns those thinkers had should be familiar to us. We know death, sickness, and the comfort of friendship. We are familiar with the fear of mobs and the ideals of a well-functioning state. Many things change, but these do not. When we see how thoughtful human beings responded to their times, we see human beings responding to time; and that is relevant to us. RGC is right that, by doing history, we in a sense “contradict” the historical thoughts with present thoughts, and so put them on a different plane. But we find familiar human beings on every such plane. And from them, we learn.
A potted history:
I believe Peter Sloterdijk is right that the Enlightenment has been followed by philosophical cynicism, or an impressive array of natural knowledge unaccompanied by any faith in providence. The U.S., which became the dominant intellectual and cultural force in the course of the 20th century, was well-suited to put this cynicism to work: for America was built upon a pragmatic, “can do” attitude, and seemed ready to let expediency drive ideology. (There are probably interesting connections here to Protestantism and Holland of the 17th century.) And so there arose on American shores the fulfillment of the German idea of a research university, with its faculty as a specialized workforce and its students as Model-Ts rumbling down an assembly line on which three credits of this and three credits of that are bolted on to each chassis.
Each academic discipline became a guild or union, where membership is tightly controlled and guild members insist on their indispensability to the general curriculum. New disciplines created their own means of controlling membership and making cases for their newfound indispensability.
As unions generally lost power and new models of management were developed in the last third of the 20th century, the university also experienced a shift in authority from the faculty to the administration. In the names of efficiency and accountability, administrators deployed numerous measures for evaluating faculty “productivity”; and the nature of these measures encouraged faculty to entrench themselves more firmly in their respective guilds.
In the case of philosophy, this meant (1) more attention devoted to narrow problem-solving activity rather than efforts to deepen philosophical wonder; (2) increasingly narrow specialization and less general knowledge of the discipline itself and its history; (3) less engagement with anyone outside the professional guild; and (4) development of various cants and shibboleths to patrol membership in the guild.
What to do? (Provided, that is, that one is inclined to see these results as problems!)
Most academic philosophy departments see themselves primarily as housing a specialized academic discipline, and contributing only incidentally here or there to a university’s general education curriculum. The priority needs to be reversed. Frankly, there is little or no need for specialized academic philosophy; if it disappeared overnight, the only ones who would notice would be the practitioners themselves. But on the other hand, despite the occasional iconoclastic polemic saying otherwise, there is a widespread recognition that philosophy provides a valuable contribution to the mind of an educated person, even if the person is not working toward a degree in the field. Philosophy professors need to see their primary job as enriching the mental lives, values, and discourses of non-philosophers. For almost everyone, we should be a side dish rather than the main course. That is where our societal value lies.
Now it can be argued that in order to do this well, philosophers also need opportunities to continue to learn and grow: they too need the chance to “geek out” with fellow philosophers through publications and conferences. And, where there is both talent and motivation, some philosophers will manage to advance our very old and rich discipline. But genuine advances in philosophy will not happen with the frequency of advances in younger and more technological disciplines, like computer science and chemistry. Genuine advances in philosophy are as few and far between as are the geniuses of the 17th, 18th, and 19th centuries. For most of us most of the time, our primary job is to enlighten the masses.
If philosophy reconceived itself along these lines, graduate training in philosophy would look very different. Right now, the usual aim is to equip each student for intensely critical interaction with a vanishingly narrow band of specialists. (Typically, these PhDs are then hired to teach very broad undergraduate classes – an assignment for which, of course, they are wholly unprepared.) But if my proposal were adopted, these candidates would be trained to engage meaningfully, fruitfully, and philosophically with a wide range of people lacking expertise in philosophy. They would be required to write not dissertations, but books that could meaningfully inform the lives of their fellow citizens. This would be the norm rather than the now-celebrated exception. Philosophy would move out of the tower and back into the agora.
I can hear the complaint: “But there are many really smart people who are now attracted to philosophy’s narrow and difficult questions, and wouldn’t go into the discipline at all if they instead had to ‘dumb down’ their efforts for bigger audiences.” I grant the objection, and have three responses:
- First, it seems to me that these smart people might be able to find as much enjoyment working through equally difficult abstract problems in other fields – fields in which solving the problems would have more impact on more people. Smart problem-solvers are in demand all over the place.
- Second, there would still be room in the discipline for some really smart, narrow specialists, even if most of the room were given over to the broader task I’m recommending. Right now, of course, all of the room is reserved for narrow specialists – and that just doesn’t seem sensible, especially given the nature of the great majority of teaching jobs that exist.
- And third, I bet that for every person who is drawn into philosophy because of an inordinate enthusiasm for tight and narrow problems, there are ten really smart people who turn away from the discipline because there is no current opportunity for tackling broad and deep questions, and bringing them to the attention of wider audiences.
It would take some courage for philosophy as a discipline to make this move and “demean itself” by talking to broader audiences. It might seem like some sort of admission of defeat. But in reality, I think this move would be greeted very enthusiastically by a lot of educated people who have become increasingly disappointed in academic philosophers’ refusal to connect with people other than themselves. Moreover, it might encourage other disciplines in the humanities and social sciences to follow our lead, and recall their original purpose: to enlighten, deepen, enrich, and complicate the minds of human beings from all walks of life.
I went for a long bike ride yesterday. At the start I was just rolling along, letting my mind wander, and taking in the sights:
• kids selling lemonade,
• a well-kept garden,
• a Rat Patrol-style jeep with a Gatling gun perched on top and three guys wrapping it in plastic,
• an interesting older red pickup for sale, …
Wait….. whaaaaa? So I had to turn the bike around to investigate.
Sure enough, the jeep was painted drab army green, with faux motor pool numbers stenciled on the side. It was on a transport trailer, and looked to be in near-new condition, so it was probably being sent off to a customer somewhere. On the passenger side, just outside the vehicle, was a rifle holster, with a nasty black armament in the holster. And hard at work were three guys wrapping the jeep in plastic (I suppose to protect it against kicked up stones on the road).
I learned recently that mounting a machine gun atop a moving vehicle was the idea of George S. Patton, he of pearl-handled revolver fame. He first put the tactic to use in the Pancho Villa Expedition, strapping a gun atop a 1915 Dodge, racing up behind the enemy, and blasting away.
“So…,” I said to the guys, “what’s going on up top there?” I gestured toward the Gatling gun. It had six long round barrels, two pistol-grip handles, and a big red button.
They kept wrapping. Their leader eventually explained, “It’s for crowd control.”
“It’ll do that,” I agreed. “But is it real? I mean, it isn’t, right? It has a big red button on it, and nothing real has a big red button on it.” I was rapidly reaching the end of my knowledge of weaponry.
“It’s real. I added the big red button myself; it didn’t come with that. It delivers (x hundred? y thousand?) rounds per minute, something, something, something.” He added, “It shoots CO2 pellets.” So it was in fact a very badass BB gun. It wouldn’t kill, but it would definitely disperse a crowd. The other two guys kept wrapping and did not acknowledge my presence.
Now I’ve lived in the rural west long enough to know not to ask questions to which you don’t want to know the answers. So I said, “Well, it sure looks cool. You guys have a good day,” and pedaled off. The encounter gave me plenty of material for thinking over my ride. My guess is that this guy, out of his home, equips vehicles with weaponry, under contract with – well, with whom? Probably not with municipal police units, since the jeep was done up to look federal, and those agencies like to keep it clear who is doing the shooting. Possibly with paramilitary groups, of which there seem to be increasing numbers. Or possibly a low-GDP foreign government? Mexico? Puzzling, the things one sees from time to time. I’ve got to admit, though, that thing was cool.