Natural and agreeable fools

Methinks I am like a man, who having struck on many shoals, and having narrowly escaped shipwreck in passing a small frith, has yet the temerity to put out to sea in the same leaky weather-beaten vessel, and even carries his ambition so far as to think of compassing the globe under these disadvantageous circumstances. My memory of past errors and perplexities, makes me diffident for the future. The wretched condition, weakness, and disorder of the faculties, I must employ in my enquiries, encrease my apprehensions. And the impossibility of amending or correcting these faculties, reduces me almost to despair, and makes me resolve to perish on the barren rock, on which I am at present, rather than venture myself upon that boundless ocean, which runs out into immensity. This sudden view of my danger strikes me with melancholy; and as it is usual for that passion, above all others, to indulge itself; I cannot forbear feeding my despair, with all those desponding reflections, which the present subject furnishes me with in such abundance….

I call upon others to join me, in order to make a company apart; but no one will hearken to me. Every one keeps at a distance, and dreads that storm, which beats upon me from every side. I have exposed myself to the enmity of all metaphysicians, logicians, mathematicians, and even theologians; and can I wonder at the insults I must suffer? I have declared my disapprobation of their systems; and can I be surprized, if they should express a hatred of mine and of my person? When I look abroad, I foresee on every side, dispute, contradiction, anger, calumny and detraction. When I turn my eye inward, I find nothing but doubt and ignorance. All the world conspires to oppose and contradict me; though such is my weakness, that I feel all my opinions loosen and fall of themselves, when unsupported by the approbation of others. Every step I take is with hesitation, and every new reflection makes me dread an error and absurdity in my reasoning….

For my part, …. I can only observe what is commonly done; which is, that this difficulty is seldom or never thought of; and even where it has once been present to the mind, is quickly forgot, and leaves but a small impression behind it. Very refined reflections have little or no influence upon us; and yet we do not, and cannot establish it for a rule, that they ought not to have any influence; which implies a manifest contradiction.

But what have I here said, that reflections very refined and metaphysical have little or no influence upon us? This opinion I can scarce forbear retracting, and condemning from my present feeling and experience. The intense view of these manifold contradictions and imperfections in human reason has so wrought upon me, and heated my brain, that I am ready to reject all belief and reasoning, and can look upon no opinion even as more probable or likely than another. Where am I, or what? From what causes do I derive my existence, and to what condition shall I return? Whose favour shall I court, and whose anger must I dread? What beings surround me? and on whom have I any influence, or who have any influence on me? I am confounded with all these questions, and begin to fancy myself in the most deplorable condition imaginable, invironed with the deepest darkness, and utterly deprived of the use of every member and faculty.

Most fortunately it happens, that since reason is incapable of dispelling these clouds, nature herself suffices to that purpose, and cures me of this philosophical melancholy and delirium, either by relaxing this bent of mind, or by some avocation, and lively impression of my senses, which obliterate all these chimeras. I dine, I play a game of backgammon, I converse, and am merry with my friends; and when after three or four hours’ amusement, I would return to these speculations, they appear so cold, and strained, and ridiculous, that I cannot find in my heart to enter into them any farther.

Here then I find myself absolutely and necessarily determined to live, and talk, and act like other people in the common affairs of life. But notwithstanding that my natural propensity, and the course of my animal spirits and passions reduce me to this indolent belief in the general maxims of the world, I still feel such remains of my former disposition, that I am ready to throw all my books and papers into the fire, and resolve never more to renounce the pleasures of life for the sake of reasoning and philosophy. For those are my sentiments in that splenetic humour, which governs me at present. I may, nay I must yield to the current of nature, in submitting to my senses and understanding; and in this blind submission I shew most perfectly my sceptical disposition and principles. But does it follow, that I must strive against the current of nature, which leads me to indolence and pleasure; that I must seclude myself, in some measure, from the commerce and society of men, which is so agreeable; and that I must torture my brains with subtilities and sophistries, at the very time that I cannot satisfy myself concerning the reasonableness of so painful an application, nor have any tolerable prospect of arriving by its means at truth and certainty. Under what obligation do I lie of making such an abuse of time? And to what end can it serve either for the service of mankind, or for my own private interest? No: If I must be a fool, as all those who reason or believe any thing certainly are, my follies shall at least be natural and agreeable. Where I strive against my inclination, I shall have a good reason for my resistance; and will no more be led a wandering into such dreary solitudes, and rough passages, as I have hitherto met with.

These passages, from the conclusion of the first book of Hume’s A Treatise of Human Nature, arrest me like no other. If he had written nothing else but these words on a scrap of paper, he would still rank as one of the world’s most acute philosophers. They dramatically portray the emotional life of the intellectual mind, as rich in self-awareness as they are ruthless in accuracy.

Anyone who feels compelled to meditate on the questions he asks – “Where am I, or what? From what causes do I derive my existence, and to what condition shall I return?” – and does not rest content with self-serving fantasies will land on Humean shoals. Two conclusions are irresistible: first, no, we do not have any answers; and second, the tools we have to work with – “the wretched condition, weakness, and disorder of the faculties” – should make us despair of ever getting any. In the acidic observation of Portal 2’s GLaDOS, “You’re not just a regular moron; you were designed to be a moron.”

This unfortunate fact matters, does it not? And yet, “very refined reflections have little or no influence upon us.” They should, of course. This discovery, if genuine, should leave us utterly paralysed, and we can find no reason why it shouldn’t. But, luckily (???), nature comes along and rescues us. “I dine, I play a game of backgammon, I converse, and am merry with my friends; and when after three or four hours’ amusement, I would return to these speculations, they appear so cold, and strained, and ridiculous, that I cannot find in my heart to enter into them any farther.” The mood of philosophical angst will pass – just give it an hour or so. You’ll get over it, and find something distractingly fun.

Our knowledge is not such a great thing; and our worry over its mediocrity is not such a great thing either. The lesson to be learned from the Humean diagnosis of the human condition is this: it’s no big deal. If we must be fools, let us be at least natural and agreeable fools.

In the end, Hume goes on to find some good in these “strained and ridiculous” speculations. At least they ward off superstition and delusions of philosophical or religious insight. His weather-beaten vessel puts in at the port of Socratic modesty, taking his own wisdom to be the insight that he really has no wisdom. We’re left with living contentedly among appearances, tempering our actions and opinions with the knowledge that we are fools. But let us be agreeable fools nonetheless.



The 3QD experience

I’ve contributed essays to the aggregator site for two years, and have just decided to bring that relationship to a close. Nothing went wrong – no falling out, no throwing of lamps, no screaming fits of “I just don’t love you anymore!” I just decided that I’d had my run, and it was time to free up the spot for somebody else.

It has been a true learning experience. I wanted to get better at writing for a broader, nonspecialized audience, and I think there’s been some success on that front. The easiest mistake for a stuffy prof like me to make, when he tries to write in a popular vein, is to take whatever arcane thing interests him and dumb it down, stick in silly examples, and earnestly believe others will then find it interesting. That doesn’t work, I tell you. Nonspecialized audiences are not dumb; they are just nonspecialized. If you want to reach them, you have to tap into the things any thinking human is likely to be interested in. It could be a good story, a central concern of contemporary life, or an age-old existential threat. Then try to engage that topic with equal doses of insight and humor, keeping the banter both light and significant. Easier said than done, of course, but if you whack away at it for a time you’ll get a little better. I’m thankful to 3QD for giving me some batting practice.

It’s also been interesting to try to situate writing for 3QD with my academic job. I heard long ago that the average scholarly article is read by 2.1 people – including the author. Averages mislead, but I’d say that most articles are read by one or two handfuls of people, at the most. But publishing such things is the “gold standard” of the academic business, since each piece is vetted by a couple of experts and selected for publication over dozens or scores of others. It’s like winning an intensely competitive contest where only your mom and dad show up for the awards ceremony. Of course, each scholarly article advances the frontiers of knowledge, etc., etc., but – amazingly – each article does so even as it is swallowed up by a deep well of obscurity after being read by maybe five people. And this very silly business is what gets you tenured and promoted.

My 3QD bits are read by – well, it’s hard to say, but loads more people than read my scholarly bits. Hundreds, thousands? (The 3QD editor said my essays are seen by 15k-20k people, but I can’t say whether those are actual readers or just sentient organisms on whose eyeballs there has been a momentary flash of something I’ve done.) The essays are not peer-reviewed, and not competitively selected (though I was competitively selected for the slot in the first place). So, overall, it doesn’t really count, academic-wise. I’ve just been publishing stuff for readers, not slugging it out with experts.

I’ve said this before, but I’ll say again that it seems to me there’s a better way to run this shop. There is certainly a place for experts writing for one another on matters as arcane as they please – that is vitally important for scholarship, I believe. But isn’t there also a place – particularly in the humanities, social sciences, and liberal arts – for engaging with the concerns of non-experts? Well, yes, of course there is. Not everyone should do it, and no one should do only it. But there needs to be more space for it in the graduate curriculum and in the academy, so that more of us more of the time engage the broader culture whose interests we serve.

So I’m very happy with the relation I’ve had with 3QD. I don’t know what happens next. I’ll keep writing stuff for this blog, and for other random venues as they come along. And I’ll keep checking out 3QD – there’s some very enlightening material there, for all of us.


Experiencing the moment

David Hume, that most sly student of human experience, declared he couldn’t find himself anywhere. As he gazed inward, he came across sensations, feelings, passions, and moods, but he had never come across a self in the way one might come across a vivid shade of turquoise or a lampshade or a heartbeat. He could find no “simple and continued” thing underlying his perceptions, as a bed of stone lies beneath an ever-changing stream. And so he haplessly concluded that he was nothing more than a stream, a bundle of impressions, a shifting mass of predicates without a subject. And if someone else has come to a different conclusion – if he stumbles across himself in his own experience –

“I must confess I can no longer reason with him. All I can allow him is, that he may be in the right as well as I, and that we are essentially different in this particular. He may, perhaps, perceive something simple and continu’d, which he calls himself; tho’ I am certain there is no such principle in me.”

It wasn’t long before alarm bells went off. In an appendix to his Treatise, Hume admitted he was in deep trouble. The basis of his entire philosophy was the view that distinct events are, well, distinct: it is only our thinking that combines distinct events into ideas of enduring things, into stable causal regularities, into expectations of uniformity in nature, and so on. Our minds create the universe out of the diverse. But if we ourselves are diverse – if there is no unity even in us – then how would we ever be able to pull off such a trick? Without a simple and continued thing to assemble all the broken fragments into a whole, how is the appearance of a whole ever to come about? “All my hopes vanish,” he wrote, as he entered into this deepest of all labyrinths. He provided no solution, and he never wrote on the subject again.



On knowledge regimes

Yesterday I came across the phrase “early modern knowledge regime,” and it teased my curiosity. What could this term mean? [I already have a short list of books to start reading, but I’ll begin first with what’s in reach and on top of my neck.]

It probably comes from Foucault: “Truth is a thing of this world: it is produced only by virtue of multiple forms of constraint. And it induces regular effects of power. Each society has its regime of truth, its ‘general politics’ of truth: that is, the types of discourse which it accepts and makes function as true; the mechanisms and instances which enable one to distinguish true and false statements, the means by which each is sanctioned; the techniques and procedures accorded value in the acquisition of truth; the status of those who are charged with saying what counts as true” (quoted from Rabinow’s Foucault Reader, 1991).

But the social dimension of knowledge regimes, while pervasive and important, is not the only dimension. Facts of experience have something to say as well. And, yes, while it is true that there is an undeniable social dimension to what counts as a fact, it is also true that the intrinsic character of the facts is something else. What I mean is this. Galileo makes the claim that bodies of unequal mass fall at the same rate. Such a claim, no doubt, is rife with conventional meanings – what counts as a body, what mass (or “heaviness”) is and how it is measured and who says, how one determines if objects fall at the same rate, and so on. [Though I wonder: just how much of one’s culture is reflected in such thin concepts? Couldn’t people from very different cultures get on the same page rather easily with regard to them?] There are further conventions about how such a thing is proven, by whom, to whom, who values it, and how the claim gets propagated to other people and cultures. But beyond all this, there is a fact: do they or don’t they? Galileo could have been wrong, just as (in fact) Aristotle was. The heavier mass could have fallen faster or slower than the smaller one. And this fact – that “facts are stubborn things” (John Adams) – is crucial to knowledge.
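
To make the stubborn fact concrete: in the Newtonian mechanics that vindicated Galileo, the mass of a falling body cancels out of the equations of motion, so that, neglecting air resistance, fall time depends only on drop height and gravitational acceleration. A minimal sketch (the 10-metre drop height is chosen purely for illustration):

```python
# In Newtonian terms: F = m*g and F = m*a, so a = g whatever the mass m.
# The fall time from rest through height h is then t = sqrt(2*h/g) --
# no mass term anywhere, which is Galileo's point in a formula.

def fall_time(height_m, g=9.81):
    """Seconds for a body to fall height_m metres from rest, no air resistance."""
    return (2 * height_m / g) ** 0.5

# The same function answers for the cannonball and the musket ball alike:
t_heavy = fall_time(10.0)  # a 10 m drop, regardless of the body's mass
t_light = fall_time(10.0)
```

That real bodies behave this way is precisely the sort of thing that could have turned out otherwise – which is why the fact, once established, is stubborn.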

I can’t see how anyone could deny this without at the same time affirming it. But I suppose the thinkers who stress the social dimension of knowledge would insist that the packaging of stubborn facts, so to speak, or the give-and-take economy of them, is what they are talking about. People and institutions supply the “mechanisms and instances” that tell us which facts to attend to, how to employ them, and how to make sense of them. Astrologers and alchemists deal in facts as well: but our prevailing knowledge regime tells us that they are making the wrong use of facts, or they are interpreting them incorrectly. The whole affair of the vacuum pump experiments in the mid-17th century, recounted in Shapin and Schaffer’s Leviathan and the Air Pump, closely documents how an institution, the Royal Society, came to determine what facts were being observed and what they meant. This determination was a mix of politics, religion, prejudices, and subjective preferences in kinds of theories.

Still: I tend to think that, even if the Royal Society had ended up making different determinations, and had (say) gone with Hobbes’ anti-vacuum view rather than Boyle’s view, over time the determination would have been reversed, and subsequent thinkers would have ended up with the view that actually prevailed. Why do I think this? Because, so far as I know, there really can be a vacuum, and nature is not a plenum, and there is no aether or subtle fluid pervading all things: those are the facts.

I can imagine two critical replies to this view.

  • First, one could say “Yes, of course; those are the facts. But the point is that, along the way, social politics do have their influence over what people come to regard as facts; and in the alternative history you imagine, the social politics would also be influencing the later reversal away from Hobbes and back to Boyle. One should not ignore the social dimension, even if it is not the only dimension.” I would hasten to agree, and I can’t see why any reflective person wouldn’t. What would the alternative be?
  • Second, one could say, “Maybe; and maybe not. The facts themselves do not force policy decisions about what groups of people should believe. Ranges of different or even incommensurate theories can be built up around the same observations, and theories are always underdetermined. Your confidence that there really is a vacuum does not reflect any stubborn fact. It only expresses your confidence in a theory that has come about in response to the facts, and your confidence only shows the power of the knowledge regime you serve.” Wow. Really? But then is the Foucauldian theory of “regimes of truth” and the social dimension of scientific theories itself somehow better established, more true to the stubborn facts, than contemporary physics? Should I be more confident of one than another?

“Yes,” I can imagine someone replying. “The theories about how people and societies behave when it comes to knowledge are more immediate, and more familiar, than theories about vacua and the aether. We know what we are talking about when we talk about ourselves; we do not know the non-human world nearly so well.” I have to admit, this strikes me as implausible, if only because I tend to think, as a skeptic, that we understand ourselves least of all. The incredible advances in technology serve as some kind of evidence for thinking we understand the non-human world pretty well, and I don’t think our history of understanding ourselves has yielded anything comparable to satellites and particle accelerators. [Or??] Yes, I have read Popper, and I know that accurate predictions and subsequent technological successes do not prove a theory; but, barring ridiculously astonishing luck, they do suggest that the theory has managed to get something right. Maybe not everything, of course; and maybe many features of a prevailing theory are due to social forces; but this leads me back to the first critical reply above.

Perhaps the fundamental question or confusion I’m wrestling with is this: does the influence of social forces on science have anything to do with the issue of scientific truth? One might say that science is of course true (or as good as we’ve gotten so far) and it is also shaped by social forces. I like each lobe of this view, but it is hard for me to put them together: if social forces are powerful, aren’t they likely to skew science in ways independent of what’s true? On the other hand, one might say that this is exactly what happens; or one might disparage this notion of “truth” altogether. But it’s hard for me to believe this, as I check BBC news on my phone, start up my car, and follow my satellite navigator to where I want to go.


What are libraries?

[Currently reading: The Meaning of the Library, Princeton University Press, 2015.]


When I went to college, I had a part-time job reshelving books in the library. I really liked it: I was on my own, rolling a little wooden cart through a quiet place, placing things where they belong. It felt serene and meditative. I also came to know thoroughly the Library of Congress classification scheme (philosophy in B, religion in BS, sexy stuff in HQ), and discovered one day to my surprise that I could effortlessly say the alphabet backwards. I spent a lot of time there studying as well, of course, and it felt to me like the only space on campus where serious learning could take place.

Libraries are under threat every now and then by people who confuse “library” with “where you look up stuff.” Of course, these days you can look up stuff wherever you can find wifi, and you don’t need a building dedicated to the task. Indeed, we always have at our fingertips more information than we know what to do with, and that resource is of inestimable value to us as we play Candy Crush for hours on end. So, to these people, libraries appear to be very expensive wastes of space.

Libraries once were rare and valuable repositories of information. The famous Library of Alexandria was said to have housed the entirety of ancient Greek texts; when it burned down, those texts were gone. Forever. Over the centuries perhaps ten percent of the collection was re-collected, thanks to the efforts of scribes sent by their masters to travel to the great library and make copies of this or that work. Each and every library, from the ancient world to recent times, was Noah’s Ark, preserving specimens over oceans of time; but whenever one burned down, all hands were lost.

But over time libraries became more than structures of preservation. Educated men gathered in them and around them, and they became the places to go to if you wanted to join the collection yourself. Soon they were centers for scholarship, science, literature, and learning. As printing technology developed and it became possible for individuals to have their own personal libraries, this secondary role – center for scholarship – grew in importance. The library did not merely house materials, but became a central nervous system for bodies of knowledge. Or, shifting to a more contemporary metaphor, the library changed from being a hard drive to a central processor. For the problem of the modern world was not to preserve knowledge, but to put it in order. There was (and is) so much information that one needs a center of scholarship to make sense of it all.

We delude ourselves when we think the internet can do this for us. To the extent that it does – when a search engine miraculously returns good, relevant materials – the success is parasitic upon the work of real scholars using real intelligence to sort signal from noise. The internet helps – boy, does it ever! – but it gets us to the stuff good scholars have made worth getting to. Scholars do their work in communities, and libraries, as centers for scholarship, make those communities possible. Under the guidance of trained and dedicated librarians, our libraries provide the physical and organizational structure that houses the on-going conversation that is knowledge.


Brainwashing, the Red Scare, and the Turing Test

I just came across this brilliant lecture, “Imitation Games: Conspiratorial Sciences and Intelligent Machines,” given recently by Simon Schaffer. I’ve noted Schaffer’s work before on early automata. Here he extends his interest in our fascination with automata to post-WWII paranoia.

Schaffer illustrates the intelligence backdrop to Turing’s work, and particularly the paranoia among communists and capitalists alike about one another’s treachery. Each side was convinced the enemy was experimenting with brainwashing – and they were – and each was especially interested in the plot to brainwash a captured enemy to return home and assassinate someone (cue The Manchurian Candidate). Critical here is the notion of “passing” among the enemy as one of their own while being programmed to kill. It was in this circumstance that Turing proposed the Imitation Game, a contest in which a computer tries to pass as a human. Brilliant stuff, and it brings a third dimension to the history of AI.


Star Wars awakens

(No spoilers)

Star Wars came out in 1977, and I was 12 years old, which means it hit me the way a T-16 can bull’s-eye womp rats (at least with the right pilot). I remember Nixon resigning, and I remember when I first heard about 9/11, and I remember when that Imperial Cruiser came rolling over the top of the screen and kept coming and coming, and I stopped breathing. I can’t think of another cultural artifact that has had such a massive, direct impact on my life. I saw the film 14 times – and this, younglings, was long before the days of VHS or DVDs, so that means I saw it 14 times in the theater. My friends and I devoured Starlog magazines and relied heavily on our pre-internet googling abilities (i.e., making stuff up) to try to peer into every nook and cranny of that long-gone, far, far away galaxy.

So, when “Episode I” came out in 1999, I had every right to geek out. And, like just about every 30-something, I was hugely disappointed. I don’t know what George Lucas would have had to have done to do justice to the impossibly high degree of awe and respect we had for the Star Wars universe, but Jar-Jar wasn’t it, not even close, and we felt betrayed. To make matters worse, George started messing with the original films – our films! – to make them connect more smoothly with these latter-day prequels. It’s like he felt he owned those movies! – which he didn’t, or not entirely anyway, as they had been interwoven with millions of other lives – like mine.

But time and age bring perspective, or at least larger blood levels of dontgiveashit. So this week, as “Episode VII” was released, I felt I could take a wiser perspective of the franchise. I realize now that no one, not even God, or George Lucas, could make a movie that does for thirty- or forty- or fifty-year olds what a movie can do for a 12-year old. And I realize that making the set of Star Wars films has been devilishly more complex, as an enterprise, than, say, the Harry Potter movies, or the Peter Tolkien LoTR/Hobbit movies, since these later endeavors were conceived in their entirety from the get-go. Lucas made Star Wars and suddenly the dim vision of an epic saga of which it was a part needed to be fleshed out in two more sequels, followed twenty years later by three prequels, with all kinds of other elements also rattling around the Star Wars “extended universe” in the intervening years. No wonder the plot had all the unity of a Jawa yard-sale. (And furthermore, remember that George Lucas is the American Graffiti guy, not J. K. Rowling or J. R. R. Jackson. There’s a lot of American Graffiti in the Millennium Falcon, when you look for it.)

So the family and I settled in to watch all six episodes over the last few days in preparation for The Force Awakens. We watched them in episodic order (1, 2, …) – as opposed to the order of release (4, 5, 6, 1, …), or the “machete” order, or the order of the Mynocks, or what have you. We kept scientific measurements of our overall ratings. From these data, we observed that, with the exception of the utterly abysmal Attack of the Clones, our enjoyment generally correlated positively with the overall cost of each movie. Indeed, of the six, The Phantom Menace – Episode I – emerged as my favorite by a long shot – and this is absolute heresy among Star Wars fans. The Empire Strikes Back was far better than I thought, back in 1980, though Return of the Jedi was still, in my opinion, awful – stinking awful, breath-of-Jabba-the-Hutt awful.
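
The living-room data themselves stayed on the scratch pad, but the kind of computation behind a claim like “enjoyment correlated with cost” is easy to sketch. In the toy version below, the ratings and budget figures are invented placeholders, not our actual numbers:

```python
# A toy version of the family movie-ratings exercise. All numbers below are
# hypothetical stand-ins, not the actual household ratings or real budgets.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Episodes I-VI, in episodic order: made-up ratings (0-10) and
# rough, illustrative production costs (millions of dollars).
ratings = [8, 3, 6, 5, 7, 4]
budgets = [115, 115, 113, 11, 18, 32]

r_all = pearson_r(ratings, budgets)

# Dropping Episode II (the "utterly abysmal" outlier) strengthens the trend.
r_no_aotc = pearson_r(ratings[:1] + ratings[2:], budgets[:1] + budgets[2:])
```

Even the made-up numbers show the shape of the observation: a weak positive correlation overall, and a noticeably stronger one once Attack of the Clones is set aside.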

(What made Episode I so good, in my lonely view? Easy. Liam Neeson – the greatest Jedi in the history of the galaxy. And Darth Maul steeply ramped up the lightsaber fights. And Jar Jar really is kind of funny – if you allow yourself to temporarily regain your twelve-year-old mind.)

What can one say of the original, Episode IV, A New Hope? Well, it was pathbreaking, no doubt. It was imaginative, fun, silly, and exciting – in 1977. In 2015 it is – how to say it? – kinda lame. Look, I know: it’s lame only because of the accelerated evolution of the sci-fi film genre that developed in its wake. It’s lame in the way Edison’s first lightbulb is lame, by today’s standards. Still – for all that – the fact is that it’s painful to watch now, even as I remember with perfect clarity exactly how each scene caused my 12-year-old self to cheer and laugh.

Watching the films together reveals some surprising unifying themes in the Lucasverse. There are always strange, funny aliens who are often partial to music that would be rediscovered eons later in mid-20th century America. Every world presents a complex economic ecosystem, with various species exploiting every opportunity for making a living. Engineers across the galaxy are given free rein to invent vehicles appealing to every sensibility. And there are enough funny droids running about to afford Keystone Studios endless plot opportunities.

The Force Awakens proudly carries forward these traditions, and breaks new ground with a filmmaking mastery that is entirely new to the series. These are real actors, real scriptwriters, and a real director. Assembling such talents to make a fantastic sci-fi film would not have been possible prior to 1977. (Indeed, nowadays it is evidently impossible to make any film in Hollywood that doesn’t involve aliens and explosions; though it’s also true that, as tiresome as these hyper-barbaric blast-fests can become, you can count on them showing a uniformly high level of technical expertise. We tend to take this for granted.) And, like A New Hope, the film has a lot of heart – which, come to think of it, is another central feature of the Star Wars galaxy – and it is relentless in its demand: get into that 12-year-old self and strap yourself in; we are going to have some fun. It knows some of those old Jedi mind tricks that can make 38 years disappear in an instant.
