On the mistaken view that there is something especially sciency about the so-called scientific method
I heard it again on the radio today as I was driving around. The story was about a new science curriculum to be introduced in public schools. The problem with the existing curriculum is that kids are getting the idea that being a scientist means reading lots of textbooks and memorizing stuff. “They are thinking that science is history,” the person being interviewed said.
Since the person who said this has been to school for a great many years, I’m guessing there is something wrong with the existing history curriculum as well. I don’t think there is any career, inside or outside of education, that requires reading lots of textbooks and memorizing stuff. (Well, okay, here’s the exception: professional Jeopardy contestant.) Textbooks are always a consequence of learning (fortunate or unfortunate) – not the business of it. When it comes to the business of learning, science, technology, and math are not really distinct from history, literature, or philosophy.
I tried to make this point once as a guest speaker in an anthropology class. The teacher had the quite laudable goal of wanting to show her students different approaches to knowledge, so she invited me, a philosopher, to come in and describe how we do things over on the humanities side of the woods. I tried to show that generating or discovering knowledge works pretty much the same way wherever you go. (I also tried to convince them that scientists live in social and cultural circumstances, and the theories they come up with sometimes have more to do with those influences than with strict allegiance to the scientific method, but that’s another story.)
So the “scientific method” runs something like this. You come across something that seems interesting, and you poke around in it for a bit. After a while you make a guess about what’s going on. Then you think about what else you should be seeing if your guess is right, and you go looking for it. (Alternatively, you figure out what should not be happening, and see if it is happening nonetheless.) If what should be happening isn’t happening, or if what shouldn’t be happening is happening, then – sorry! – your guess is wrong and you should try again. Repeat.
There’s nothing especially sciency about that. Suppose you are reading Moby Dick, and you start thinking that the fact that the whale is white might be significant somehow. So you make a guess: maybe it is supposed to represent God, or maybe the idea of white supremacy. So with that idea in mind, you go back into Moby Dick and look for other passages that either support or do not support your interpretation. You research what the earliest readers of the text said about it, or what Melville himself may have said, and you start working toward better and better guesses. You will surely find better and worse answers to the question, “What does the white whale represent?”, even if you don’t come across a definitive answer. You might have to revise one of your background assumptions that there should be a single thing being represented. But all of this is as much in accord with “the scientific method” as figuring out how photosynthesis works or what an enzyme does.
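To see just how un-sciency the procedure is, it can help to write the loop down as a loop. Here is a minimal sketch in Python – entirely illustrative, with made-up guesses, a made-up “world” of evidence, and helper names invented for the occasion:

```python
# A toy sketch of the guess-and-check loop described above.
# Everything here is illustrative: the guesses, the "world" of
# evidence, and the helper names are stand-ins for this example.

def predictions(hypothesis):
    """Derive what else we should be seeing if the guess is right."""
    return hypothesis["implies"]

def observed(prediction, world):
    """Go looking: does the evidence actually show this?"""
    return prediction in world

def inquire(hypotheses, world):
    """Keep only the guesses whose consequences hold up."""
    surviving = []
    for h in hypotheses:
        if all(observed(p, world) for p in predictions(h)):
            surviving.append(h)  # not yet refuted -- keep it, for now
        # otherwise: sorry! the guess is wrong; try again with another
    return surviving

# The same loop runs on literary evidence as well as lab evidence.
world = {"the whale is white", "Ahab is obsessed", "the ending is ambiguous"}
guesses = [
    {"name": "whale = God",
     "implies": {"the whale is white", "Ahab is obsessed"}},
    {"name": "whale = good fortune",
     "implies": {"the ending is happy"}},
]
print([g["name"] for g in inquire(guesses, world)])  # -> ['whale = God']
```

Nothing in the loop mentions beakers or telescopes; swap in a different pile of evidence and the very same procedure runs on Melville as well as on photosynthesis.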
From what I read and hear, the push for more STEM education misses this central point, that intellectual discovery works pretty much the same way no matter what you are studying. Indeed, the privileging of STEM over other disciplines tends to perpetuate the view that science is reading and memorizing textbooks – for on that view science differs from the humanities only in its domains: inclined planes vs. caste systems, planetary revolutions vs. printing presses, mixing stuff in beakers vs. studying original documents. Those differences in domain amount only to different piles of facts that have to be learned and kept in mind: once the data are in, the way of thinking about them and interpreting them works the same way: guess, derive other consequences, and look for them. The “scientific” method is just the method of thinking.
As the Republic of Letters expanded in the late 17th/early 18th centuries, gentlemen began to assume titles which were, let’s say, a bit generous. Johann Burkhard Mencke’s Charlatanerie des savans (1715) brings such persons up short –
Since the beginning of the Restoration of the Sciences, has not this fury for Titles, & if I dare to speak so, this Titlemania, been carried as far as giving a simple Jurist the title Invincible monarch of the Empire of Letters. Do not expect that I speak to you here, Messieurs, of these Scholastic Doctors, Doctors Angelic, Seraphic, Illuminated, Subtle, Admirable, Universal, well-founded, very-resolute; nor of this Visionary, who according to the report of several people worthy of belief, had his Portrait engraved on a steel plate underneath a Crucifix, to which he inquired laconically, Lord Jesus, do you love me? And the Saviour responded to him emphatically, Yes, very-illustrious, very-excellent, & very-learned Lord Segerus, Crowned poet of his Imperial Majesty, & very-worthy Rector of the University of Wittenberg, Yes, I love you.
(From Anne Goldgar’s Impolite Learning)
My academic specialty is known among philosophers as “early modern” philosophy, and by that is typically meant a string of canonical figures extending from Descartes to Kant. Before Descartes, philosophy is all medieval (the story goes); after Kant, it is an assortment of heady idealism, existentialism, utilitarianism, and nascent naturalism. (Philosophers mostly disregard the Renaissance and the term “Enlightenment,” as it is pretty difficult to tease out arguments from philosophers falling under those headings without having to study a lot of other stuff, like literature and history and politics.) For the most part, philosophers sort out historical periods on the basis of when certain figures they happen to find useful happened to live.
Of course, this is silly, and we all know it is, but it is convenient for things like organizing sessions at conferences and classifying both jobs and job-seekers, so it is retained. But, setting aside the fact that the practice won’t change anytime soon, I’d like to explore a more meaningful way to divide philosophy’s comparatively recent history.
In my mind, the modern period begins with Gutenberg’s invention, around 1450. There is a long and complicated story to be told here, involving both the Renaissance and the Reformation, but it is the advent of printed books that forcefully changed the nature of the learned European world. Scholars were rapidly inundated with such a variety of authors and ideas that it became imperative to construct new orderly systems. The grand cathedral of monotheistic Aristotelianism was blown apart into many cottages built by individuals trying to find new ways to make sense of it all.
We can identify this period of individual system-builders as the “early modern” period. It’s not that these individuals acted in complete independence, of course – the Republic of Letters arose inevitably as an attempt for everyone to keep track of what everyone else was up to. But there were very nearly as many systems as there were system-builders, a flourishing of philosophical species, while early science began to develop as a sort of strainer to separate winners from losers.
In my mind, the “early” segment of the modern period came to an end in roughly 1750, with the rise of truly encyclopedic thinking. The French Encyclopédie was put forward as a sum total of knowledge, not written by a single person, but written by a team of philosophes actively constructing a new cathedral to house the Enlightenment. The overall shift, seen from this high altitude, was from books representing individual projects to books identified under themes of large and broad movements. Philosophical efforts were less individualistic and more communal (excepting the early existentialists, who bucked the ruling trend). This to my mind is the true modern period, which begins with the Enlightenment. The very question, “What is Enlightenment?”, is a peculiarly modern question.
When did the modern period end, and the post-modern period begin? Though of course there is a genre of philosophical literature called “post-modern,” I don’t think there really has been any post-modern period. (It sort of fizzled, didn’t it?) So long as there are large banners of thought, and people enlisting themselves within schools of ideas, we are in a modern period. There may come a time when the question “what kind of thinker are you?” is met with a blank stare, and schools within disciplines and even disciplinary thinking itself become extinct – but it hasn’t happened yet.
I recently came across a 1685 English translation of Comenius’s “World of Pictures,” which was a primer aimed at helping children to learn Latin. (Comenius’s original was for German children, but this book was translated by Charles Hoole.) The idea was to give this book to kids and just let them enjoy the pictures and figure out the text for themselves (once they learned their ABCs). Each page offers a picture of a topic or event, and then offers side-by-side English and Latin descriptions, with references to parts of the picture.
Here, for instance, is the first page, inviting the young scholar to the master/pupil relation –
And here is a sea battle (“when huge Ships, like Castles, run upon one another with their Beaks, or shatter one another with their Ordnance, and so being bored thorow, they drink in their own destruction, and are Sunk”) –
Thanks, Early English Books Online!
I recently had the joy of discussing perfect and invented language on Utah Public Radio with USU Folklorist Lynne McNeill, who, as it turns out, speaks some Klingon. If you are interested, the interview is available online.
The sifting of human creations! —nothing less than this is what we ought to mean by the humanities. Essentially this means biography; what our colleges should teach is, therefore, biographical history, that not of politics merely, but of anything and everything so far as human efforts and conquests are factors that have played their part. Studying in this way, we learn what types of activity have stood the test of time; we acquire standards of the excellent and durable. All our arts and sciences and institutions are but so many quests of perfection on the part of men; and when we see how diverse the types of excellence may be, how various the tests, how flexible the adaptations, we gain a richer sense of what the terms “better” and “worse” may signify in general. Our critical sensibilities grow both more acute and less fanatical. We sympathize with men’s mistakes even in the act of penetrating them; we feel the pathos of lost causes and misguided epochs even while we applaud what overcame them.
Such words are vague and such ideas are inadequate, but their meaning is unmistakable. What the colleges—teaching humanities by examples which may be special, but which must be typical and pregnant—should at least try to give us, is a general sense of what, under various disguises, superiority has always signified and may still signify. The feeling for a good human job anywhere, the admiration of the really admirable, the disesteem of what is cheap and trashy and impermanent—this is what we call the critical sense, the sense for ideal values. It is the better part of what men know as wisdom. Some of us are wise in this way naturally and by genius; some of us never become so. But to have spent one’s youth at college, in contact with the choice and rare and precious, and yet still to be a blind prig or vulgarian, unable to scent out human excellence or to divine it amid its accidents, to know it only when ticketed and labeled and forced on us by others, this indeed should be accounted the very calamity and shipwreck of a higher education.
The sense for human superiority ought, then, to be considered our line, as boring subways is the engineer’s line and the surgeon’s is appendicitis. Our colleges ought to have lit up in us a lasting relish for the better kind of man, a loss of appetite for mediocrities, and a disgust for cheapjacks. We ought to smell, as it were, the difference of quality in men and their proposals when we enter the world of affairs about us. Expertness in this might well atone for some of our ignorance of dynamos. The best claim we can make for the higher education, the best single phrase in which we can tell what it ought to do for us, is then, exactly what I said: it should enable us to know a good man when we see him.
The rest of the lecture can be found here.
(an excerpt from How You Play the Game):
I have killed three dogs in Minecraft. The way to get a dog is to find a wolf, and then feed bones to the wolf until red Valentine’s hearts blossom forth from the wolf, and then it is your dog. It will do its best to follow you wherever you go, and (like a real dog) it will invariably get in your way when you are trying to build something. Apart from that, they are just fun to have around, and they will even help you fight monsters. If they become too much of a nuisance, you can click on them and they will sit and wait patiently forever until you click on them again.
I drowned my first two dogs. The first time, I was building a bridge over a lake, but a bridge that left no space between it and the water. The dog did its best to follow me around, but it soon found itself trapped beneath the water’s surface by my bridge. Not being smart enough to swim out from under the bridge, it let out a single plaintive yelp before dying and sinking. Exactly the same thing happened to my second dog, and it was this second episode that made this particular feature of dogs clear to me. I know now to make dogs sit if I’m building bridges. I’m not sure what happened to the third dog, but I think it fell into some lava. There was, again, the single yelp, followed by a sizzle. No more dog.
I felt bad each time, while of course fully realizing that only virtual entities were being killed. Surely some of the sorrow I felt was imported from the real world, where I am fond of dogs and do what I can to avoid drowning or burning them. I could not be said to have developed a meaningful relationship with my virtual dogs, but I was pleased to see them each time they caught up with me, and I was a little sad to realize they wouldn’t be getting in my way anymore. I think I was right to feel at least a little bit bad about killing them. But how can there be any place for emotional or even moral attachments to virtual characters? What could cause me to feel any kind of sympathy or concern for beings that don’t really exist?
The answer probably has something to do with the way humans are wired to form attachments to other beings generally. From the perspective of evolution, it’s obviously good that we form strong attachments to human infants. It’s also good that our ancestors formed strong attachments to pets, and not merely because pets can be trained to help us. Pets also share their germs with us, and those ancient people whose constitutions allowed them to survive their pets’ germs bestowed upon us, their descendants, stronger immune systems. So over time we have come to be wired to love furry little things. This general disposition to like animals probably spills over into our encounters with virtual animals – and so we come to feel attachments to them too.
But to explain our attachments to virtual animals in this way does not necessarily lessen those attachments. After all, similar explanations account for why we like babies and dogs – and even cats – and typically we take those felt obligations very seriously. So unless we have some very good reason to overrule the concerns we naturally feel for virtual animals, we should take those obligations seriously as well. But here is a very good reason to overrule these concerns we feel: unlike real dogs and cats, virtual dogs and cats don’t actually feel anything. They are not any more real than the dogs and cats in dreams or comic books. So while we might naturally feel some attachment to them, we do not have any real obligations to them. Thinking that we do have obligations would be like thinking we should worry about how Snoopy feels if we stop reading Peanuts. There isn’t any real being there to be an object for our concern.
We humans form attachments with the unlikeliest of objects. Anyone who doubts this might consider how they would feel if they were asked to throw away their teddy bear. It is just an object, of course, with no feelings or thoughts whatsoever. No real harm is done to anyone by holding it under water or tossing it into a pool of lava. Yet we would not do such things lightly, since we have formed attachments to our teddy bears. We have been through a lot with them – scary nights and lonely times – and having them in our arms has helped us to feel better. We end up feeling that we owe them something, out of gratitude and respect. When it comes time to part, we might consider passing them along to other children, so long as we can assure ourselves that the teddy bears are going to good homes. At one level this is all silly, of course. But that level is not nearly as important to us as the level on which it is not silly at all.
When it comes time to tear down the old school, we feel concern for the bricks and mortar. We never regard them as sentient, but we think of the role the building has had in our lives – perhaps we have passed through the halls, and so have our children. We have grown up there, and we cherish it. It is, we say, of “sentimental value”, but too often that label is preceded by the qualifier “merely”. There is nothing “mere” about such value. None of us would want to see sentimental value deprived of all significance, even if we realize that sometimes it is necessary to let go. At these moments, we register a loss, and the loss often seems to us immeasurable in the terms of any other calculus.
There is no reason not to form attachments to virtual places and characters as well, though this kind of object is relatively new to the scene, and typically these virtual objects aren’t around for very long even in the best of cases. Consider the story of Jerry the slime. A prominent Minecraft player (“CaptainSparklez”) found himself being followed around by a baby slime with a cute smile. He hesitated to kill it, and decided instead to build it a pen and name it “Jerry”. Jerry eventually de-spawned, as that is the fate of such creatures. CaptainSparklez and his followers were sufficiently distraught to build an enormous tree as a monument to Jerry. Now, to be sure, this whole episode was motivated in large part by the joy of pure silliness, and by players just wanting to have some fun with the idea of memorializing a slime. But, undeniably, there was a primitive attachment that served as a basis for the fun. It is not unthinkable for someone to feel that sort of attachment to a virtual slime – indeed, its possibility is what makes the joke possible in the first place. And even now I can hear CaptainSparklez protest in mock seriousness, “How dare you make fun of Jerry!” But the seriousness is not entirely mocking. (For whatever it’s worth, we may also note that Jerry has his own fans who have created special Jerry mods and a special Jerry game. He has his own Facebook page. Jerry has more friends than I do.)
The point is that we form attachments to things that may have no feelings or rights whatsoever, but by forming attachments to them, they gain some moral standing. If you really care about something, then I have at least some initial reason to be mindful of your concern. (Yes, lots of complications can come in here – “What if I really care for the fire that is now engulfing your home?” – but the basic point stands: there is some initial reason, though not necessarily a final or decisive one.) I had some attachment to my Minecraft dogs, which is why I felt sorry when they died. Had you come along in a multiplayer setting and chopped them to death for the sheer malicious pleasure of doing so, I could rightly claim that you did something wrong.
Moreover, we can also speak of attachments – even to virtual objects – that we should form, just as part of being good people. Imagine that I gain a Minecraft dog that accompanies me on many adventures. I even offer it rotten zombie flesh to eat on several occasions. But then one day I tire of it and chop it into nonexistence. I think most of us would be surprised: “Why did you do that? You had it a long time, and even took care of it. Didn’t you feel attached to it?” Suppose I say, “No, no attachment at all”. “Well, you should have”, we would mumble. It just doesn’t seem right not to have felt some attachment, even if it was overcome by some other concern. “Yes, I was attached to it, but it was getting in the way too much”, would have been at least more acceptable as a reply. (“Still, you didn’t have to kill it. You could have just clicked on it to sit forever….”)
The Minecraft book is available now (see right column). It was loads of fun to write, and it was even more fun exploring the game with my son. The whole process of working with Kindle Singles was fun, too. The editor I worked with was very helpful, insightful, and thorough.
There is no text more commonly read in philosophy courses than Descartes’s Meditations on First Philosophy. This is astonishing, given that the work was written well over three centuries ago. To some extent, to be sure, it is so commonly assigned simply because it is so commonly assigned; that is, it is hard to imagine an undergraduate escaping from a program in philosophy without having read the work at least once – because every program assigns it. This is a true perpetual motion machine.
But the book is a natural choice to assign to undergraduates because of its approach. Descartes pretends to cast aside all the crap he learned in high school and figure out for himself what he should believe. This is exactly how we would like to think of undergraduates, leaving the home environment and state-mandated education for the first time and venturing out on their own to discover the world, building their own minds in the process. Descartes asks his readers to doubt everything, and see what remains as indubitable; and he builds a new world upon a new foundation. And that is the ideal of university education – and yes, yes, it is an ideal rarely met, and one that is always under attack by those who would like to see the university as training for the workplace rather than an enterprise in soul-building. (Newsflash: it is going to be soul-building in any event, and our only choice is in deciding what sorts of souls are going to be built.)
But why did Descartes write this work? Was he trying to write a bestselling textbook for university students? Was he excited about his own efforts at soul-building, and intent upon sharing his success with the world? He had each of these motivations, in some sense. He certainly was excited and wanted to be influential. But his chief motivation was to offer to a broad, reading audience in his day a new structure for their beliefs. For any reader keeping score, not much has changed by the end of his work – he starts out believing in God, a soul, and the physical world – throws this all into doubt – and ends up believing once again in God, a soul, and the physical world. What has changed is the arrangement of these beliefs, and what they are based on: Descartes’s world has gained a different structure upon different foundations. His aim in the Meditations is to convince his readers that they can still believe in all the important things they want to believe in, even if they accept the radical revolution in physics and metaphysics that was brewing in the 17th century. It is just that they will have to re-arrange their beliefs a bit.
The view that was being overturned in the 17th century had its roots in Aristotle’s philosophy. According to this view, the main players in the world are substances, or bundles of matter that have certain natures, and behave in ways according to those natures. Every substance tries to go about its own natural business, but inevitably each ends up getting in another’s way, sometimes in ways we like and other times in ways we don’t like – and thus the world. Descartes and his comrades believed that the content-rich “natures” of substances could be replaced by more austere, geometrical entities. Basically, the new philosophers asked us to replace a blooming, buzzing botanical garden of metaphysical natures, forms, essences, modes, and qualities with a sculpture park designed by Mies van der Rohe. It must have looked like a very poor exchange indeed, giving up an extraordinarily rich set of explanatory powers for a set of meager promissory notes that did not encourage much confidence. The philosophers that Descartes was writing to worried about the cost of swapping out one operating system for another: how would the change affect our beliefs in God and in the soul, as well as our commonsensical ways of explaining nature?
So Descartes wrote the Meditations as a way of saying, “See? You can still have strong arguments for the existence of God, and for believing in the existence of a soul; and you can have excellent new strategies for explaining why the physical world behaves as it does.” It was meant as a persuasive and reassuring work, a work that demonstrated that you could still do everything you wanted in the new operating system. (I am not sure whether to cast Descartes as Mac or PC; his system was slick, new, radical, and hard to use – so maybe Linux?) His arguments, as countless undergraduates have demonstrated, are not faultless. But that’s okay; they are good enough for Descartes’s primary purpose, which was to show that conversion to the new philosophy does not require giving up on the sorts of arguments valued by philosophers of the time. His overall rhetorical strategy is to demonstrate that from a completely unbiased starting point – one that is achieved through radical skepticism, doubting everything you think you know – it is perfectly possible to end up inhabiting the new system. It was not as foreign as it may have seemed.
Seen in this way, Descartes’s Meditations really is a work that is stuck in a particular historical context. It is safe to say that few readers today are seeking to be reassured that they can give up their Aristotelian metaphysics for a geometrical world view. This makes it all the more surprising that the Meditations is so frequently read today. But the work, of course, has been repurposed: once designed to serve one polemical purpose, it now serves another. Once meant to ease the transition from old to new philosophies, it now eases the transition from pre-philosophical to philosophical thought. And, as one would expect, it succeeds only partway in this new purpose. Students really do find themselves challenged by Descartes’s skeptical doubts – but they are uniformly unimpressed by Descartes’s own solutions. No one buys his arguments. The overall effect, I am afraid, is a general mistrust of philosophical arguments. Students come away thinking that philosophers are much better at raising troublesome, skeptical questions than they are at providing good solutions. Every argument is bound to fail. And this in turn engenders a measure of misology, or a distrust of reason, at least as it applies to philosophical matters. Philosophical theses come to be viewed as indemonstrable matters of taste.
This is unfortunate for our students – and also, by the way, quite unfair to Descartes. Imagine walking into a computer shop, exploring whether to change to a new system. You worry about the new system’s ability to generate spreadsheets. A helpful assistant demonstrates how to set up a short, tidy spreadsheet in the new system. But you reject the demonstration entirely, since you have no need for the spreadsheet example you have been shown. “No fair!” the assistant pleads. “I was just showing that you could do this sort of thing!” And this is basically what Descartes wants to tell the undergraduate who has just savaged his argument for God’s existence in the third meditation. The student has missed the central point that Descartes’s operating system supplies arguments for God’s existence just as well as Aristotle’s old operating system. But today’s student, of course, has little reason to be impressed by this feature. And that’s why the way that Descartes’s Meditations is usually taught – namely, as a non-polemical, disinterested inquiry into what is known with certainty – ends up being quite unfair to him.
I do not mean to suggest that Descartes was only trying to show that philosophers could still have good arguments within his new operating system. He believed his arguments were truly compelling – and they are indeed much better than our undergraduates take them to be. But his main aim was to get philosophers talking about his arguments, while making use of his new system – that is, he wanted fellow philosophers to try working within his system and find out for themselves that abandoning Aristotle did not mean abandoning philosophy. It is for this reason that he sent out copies of the Meditations to several influential figures, of diverse backgrounds, and published their objections alongside his replies. The resulting publication was itself a demonstration that this new operating system was sufficient for fruitful and intense philosophical discussion – like getting an entity like UPS to use an open-source operating system.
But all this puts teachers of historical philosophical texts in a bit of a quandary. Taking proper measure of an historical work’s context might make it less gripping to modern students (unless they are blessed with geekiness over history). But re-purposing historical works is unfair to those works and often leads to unintended consequences. So should teachers simply leave history alone, and let the dead rest? Or should they plunge ahead anyway, believing that the good in confronting great texts outweighs any mistaken judgments that are encouraged along the way? What is the best way to read/teach this kind of book?
Though I do end up worrying over this question from time to time, in the end I think it is a false dilemma. None of the foregoing concerns should be out of place in an undergraduate curriculum. That is, we can imagine a fantastic class on Descartes’s Meditations in which we read the texts with our own concerns and questions; find problems in the text; introduce more historical circumstance to reorient our reading, and come to a better understanding of the text; and then see whether we have learned any global lessons about history, philosophy, writing, and reading as a result. In the case of the Meditations, we will probably discover that no conceptual revolution is without its costs; that even an antiquated system like scholastic metaphysics had some very real advantages; that revolutions in thinking aren’t “proven” by experiment, but involve a willingness to conceptualize our experience in new ways; and – perhaps most important of all – that there is always a deeper story to be told. Books aren’t repositories of truth, but bits of evidence in a crime scene, and it’s up to us to figure out whodunnit and why, and even: so what?
I have long believed that I should love opera. I’m a great fan of “classical” music (a fairly meaningless term, as it encompasses way too much), and view its existence as one of the primary pieces of evidence for believing life is not meaningless. One of the greatest experiences of my life was several years ago when I had the chance to participate in a special seminar on Beethoven’s string quartets. The class incorporated lecturers from various disciplines, and featured a visiting musicologist who knew everything there is to know about those quartets. Accompanying the class was our resident string quartet’s performance of the entire cycle. On successive evenings I was transported into a heavenly region of the soul, one that said everything that can be said about being an embodied mortal with intimations of eternity.
With all that passion, it seems I should love opera. My friends all told me so. But I have tried and tried, only to find it tiresome. The music, of course, is occasionally beautiful – but the drama is so slow, so uneventful, that I have felt the truth of what Mark Twain once recorded:
“I have attended operas, whenever I could not help it, for fourteen years now; I am sure I know of no agony comparable to the listening to an unfamiliar opera. I am enchanted with the airs of “Trovatore” and other old operas which the hand-organ and music-box have made entirely familiar to my ear. I am carried away with delightful enthusiasm when they are sung at the opera. But, oh, how far between they are! And what long, arid, heartbreaking and headaching “between-times” of that sort of intense but incoherent noise which always so reminds me of the time the orphan asylum burned down.”
So I was prepared to mark down “opera” as one of those things I just was not engineered to appreciate.
But this week I have subjected myself to Wagner’s Ring cycle – and behold! A breakthrough! I have discovered the key I needed to unlock the secret. I’ll share it here now, for anyone who is in the same position. The key is this: don’t watch opera as drama, in the way you’d watch a play or a film; listen to it as music – with a side accompaniment of a story, which only aims to offer some general themes for reflection. Put the music first, and maintain only a dim awareness of the plot.
I’m not quite through the cycle – Götterdämmerung lies before me – but I am savoring each moment, anticipating what is to come, replaying themes and ideas in my mind. Paradoxically, by paying minimal attention to the story, I am now fascinated by it. But I think it was the music that opened the door for me. It’s too early to say how much this new-found enthusiasm will translate to other operas, but I’m expecting it will. But even if I’ve only learned to appreciate The Ring, I’m glad for it.
(For several months I have lost myself in the thoughts of Peter Sloterdijk, a contemporary German philosopher. I need to continue to read and absorb his works, but the following is a “status report” on what I have found so far.)
“When someone tries to ‘agitate’ me in an enlightened direction, my first reaction is a cynical one: The person concerned should get his or her own shit together. That is the nature of things. Admittedly, one should not injure good will without reason; but good will could easily be a little more clever and save me the embarrassment of saying: ‘I already know that.’ For I do not like being asked, ‘Then why don’t you do something?'” – Peter Sloterdijk
The world we face is not the same as the one Peter Sloterdijk faced in 1983. That world, and Sloterdijk’s Germany especially, was split down the middle, with a belligerent Soviet Union to the east and an exceptionally deadly alliance of nations to the west. Nuclear weapons were primed and ready and held in abeyance only by the mutual recognition that destroying all life on earth would be poor sportsmanship. Germany itself was divided physically by The Wall as well as being fractured throughout in multiple ways by having been the principal aggressor in two world wars, and the loser of both. German philosophers were trying to come to terms with how their glorious and magnificent culture – the home of Goethe, Schiller, Lessing, Beethoven, and Brahms, a culture which was itself a monument to the highest ideals of humanity – could have descended so completely and quickly into the deepest inhumanity. On the bicentennial of Kant’s Critique of Pure Reason, Sloterdijk joined others in wondering whether it was indeed still possible to have any faith at all in reason or in any celebrated ideal of humanity.
But it is also true that the world Sloterdijk faced has not changed much in the last 30 years. The Wall is gone, and the threat of a nuclear holocaust has been replaced by comparatively smaller (though still deadly) threats. But humanist intellectuals remain skeptical of the ideals of the Enlightenment. We see now that it never really was what it pretended to be. Its authors celebrated human freedom while providing a cover story for colonial oppression and subjugation. Its scientists strove mightily for a knowledge of nature while industrialists destroyed nature through forge and furnace. Its politicians aspired to democracy while creating wage slaves. The Enlightenment, it seems now, was a theater of false promises at best and a crucible of cruel mendacity at worst.
Where then do we find ourselves in our post-Enlightenment times? In Sloterdijk’s 1983 estimation, we are in an age of cynicism, or enlightened false consciousness. We all know, when we stop to think about it, that the world is not as it should be, and that we are not living as we ought to live. But even as we know this, we numb ourselves to it, and even laugh over the disconnection between what we know and how we live. “We do our work and say to ourselves, it would be better to get really involved. We live from day to day, from vacation to vacation, from news show to news show, from problem to problem, from orgasm to orgasm, in private turbulences and medium-term affairs, tense, relaxed. With some things we feel dismay but with most things we really can’t give a damn” (98-9). Our cities “have been transformed into amorphous clumps where alienated streams of traffic transport people to the various scenes of their attempts and failures in life” (118). We realize all this, and acknowledge it – and then show up at work the next day, since we feel deep down that there really is nothing to be done about it. In this way we are conscious of our situation, and unhappy about it, but resigned to it: “To be intelligent and still perform one’s work, that is unhappy consciousness in its modernized form, afflicted with enlightenment. Such consciousness cannot be dumb and trust again; innocence cannot be regained” (7).
As I work through the recent works of Peter Sloterdijk (Spheres I: Bubbles, Spheres II: Globes), I am chiefly amazed and enthused by his ability to find deep symbolic and mythic connections throughout the history of philosophical thought, and to use that understanding to bring our culture into startlingly fresh relief. His insights make me feel as if the rest of us are sleepwalking through time. As much as we like to pretend otherwise, we are still mythic beings, ordering our experience according to prehistoric passions and plots. Perhaps every age thinks it is doing something different, and is at last bringing clear and objective thought to what was once distorted through prejudice and magical thinking. But it seems to me that this pretense of objectivity is at an all-time high in our culture – for we do have, unquestionably, the most advanced sciences ever. We think now in terms of demythologizing – as if that is ever possible for human beings. We think we can carve our own psychologies out of our theory-making efforts and grasp things in themselves as they are in themselves, if only we make good use of careful experiments and double-blind refereeing processes. We even believe we can see ourselves as we are in ourselves, even when it is we who are doing the looking.
This perspective cuts us off from our own history. For we are instituting a new dateline – something like “Before Darwin vs. After Darwin” or more likely “Before WW2 vs. After WW2”, since that also marks the rise of the modern research university, before which (apparently) no trustworthy science was ever done. We tend to see our new theories and insights as qualitatively different from anything proposed earlier. This turns our history into an inchoate prelude to the present. If we do make the effort to cast a glance backward, we feel that we can sort out the thinking that was on the right track from the crazy or “insufficiently critical” thinking of the past – ignoring the fact that then, as now, everything is thoroughly entwined with everything else, and our current efforts at understanding are in fact not qualitatively different than historical efforts. We are always and forever “in a fine mess,” as Oliver Hardy would say, mixing up mythologies and metaphysics with science, literature, and politics.
When we think of the gradual shift among European cultures from paganism to Christianity, we are likely to see one whacked-out metaphysics being replaced by another: where people once saw divinities in nature, and magical forces permeating their lives, they now saw the Holy Ghost and the salvific powers of the Church alongside the forces of sin and grace. No real advance there: just the exchange of one lunacy for another. But the shift from “Before the Modern” to “the Modern” marks exactly the same sort of transformation: metaphysical lunacy abounds, though we are unlikely to see it because we are so closely bound up with it. We are likely to insist just as stubbornly as did the new Christian converts that there is genuine progress in the shift, using of course our newfound values to prove how genuine the progress is.
Our own metaphysical lunacy sees individuals as free-floating agents, interested first and foremost in extending their privileges. We are objects rattling around in an uncaring world that has no purpose or end. The power of art and religion is couched only in terms of psychology and sociology – their meaning lies in what they do for us or to us. We conflate reasons with causes. There can be no transcendent or categorical ends, as every end can be understood only as a provisional end adopted by an individual or group. We are content to understand the meaning of life as a matter of choice for individuals to set for themselves.
I am not about to try to call us back to the good old days. Rather, I am pointing out this metaphysics – our metaphysics – as one among many different possible ones. It is not that we have finally seen what is true; it is just that this is where we now find ourselves. Once we grasp this, we can try to understand our mythology as mythology, and then try to integrate it with the mythologies our species has adopted at other times and places. We can begin to see our continuity with our past as thinking, culture-making entities. And then we can begin to see how remarkably weird we are – that is, we can see how arbitrary our metaphysics is, and how strange we are when we take it for granted.
If we manage to attain this perspective, we will have found the sort of skepticism David Hume promoted. In a slogan, Hume’s skepticism was an attempt to recall ourselves to ourselves: that is, to bear in mind our own weaknesses and susceptibility to nonrational influence. He never doubted common sense, or the conclusions of “natural philosophy” or science. But he also knew better than to take them seriously as definitive, proven, or inescapable:
When we see, that we have arrived at the utmost extent of human reason, we sit down contented; tho’ we be perfectly satisfied in the main of our ignorance, and perceive that we can give no reason for our most general and most refined principles, beside our experience of their reality; which is the reason of the mere vulgar, and what it required no study at first to have discovered for the most particular and most extraordinary phenomena. And as this impossibility of making any further progress is enough to satisfy the reader, so the writer may derive a more delicate satisfaction from the free confession of his own ignorance, and from his prudence in avoiding that error, into which so many have fallen, of imposing their conjectures and hypotheses on the world for most certain principles. When this mutual contentment can be obtained betwixt the master and scholar, I know not what more we can require of our philosophy.
The chief difference in this regard between Sloterdijk and Hume is one of method. Sloterdijk shows us our provinciality through exhaustively exploring the other strange metaphysical routes we have taken in the past, along with another strange one – “spherological” – we might now adopt as a live possibility. Hume arrives at a contented agnosticism (his own live option) by deconstructing our “reason” from within.
Our Fall 2014 semester just wrapped up. I asked the students in my seminar to write a longer paper on our three philosophers – and then joined in the fun and wrote one myself.
Divine natures: a tale of three brothers
It is rightly said that the rumors of the death of God are exaggerated, but it is true nevertheless that the traditional monotheistic belief in a divine person transcending the natural world and using it as some kind of means toward a beneficent and glorious end has fallen upon hard times. Back in the days when so little of the natural world was understood, it was easy to see extraordinary events and coincidences as issuing from the hand of God. In the days when everyone looked to some superior lord in both gratitude and fear, it was natural to ask who was the Lord of All. And in the vaulting cathedrals and darkened temples one did more than marvel at great historical architecture. Belief in a supreme, divine person fit naturally in our social structure and natural ignorance, so much so that it may well have been the only live option for our ancestors.