A World Without Trust

Imagine two worlds. In one, everyone keeps their promises, honors not only the letter but the spirit of agreements, and is broadly reliable and trustworthy. In the other, promises are just empty words, and people are as opportunistic and as spiteful as they impulsively desire to be in a given moment. Which people, from which world, do you think are more capable of accomplishing anything?

In the very first episode of the popular Netflix series House of Cards, the main character, Frank Underwood, has a promise made to him broken. Underwood marvels at the betrayal because he didn’t think those who made the promise capable of it—he admires it, in a way, as though breaking a promise were something hard and keeping it something easy. This kind of cheap imitation of Nietzschean cynicism is all about having a will to power that allows one to overcome conventional morality.

But the real accomplishment is not overcoming your own trustworthiness, but the fact that such a thing has enough weight that even cynics feel they must “overcome” it. There are many parts of the world where trust and trustworthiness are not the default outside of the close circle of family or clan. A society in which relative outsiders and strangers are able to make promises to each other and trust they will be kept is a tremendous accomplishment.

Whose accomplishment is it? Hard to say. But it certainly isn’t the accomplishment of any cynical would-be despot. If anyone deserves the credit, it is the countless millions of ordinary people, across many generations, who have strived to live decently and treat each other fairly.

The utter hell of a trustless world cannot really exist for long in this one. But we should thank those decent people who came before us, for putting as much distance between us and it as we’ve got.

Speaking With Certainty

A while back, after my propertarian piece (which isn’t much at all about property), someone challenged me to write a follow-up piece on ancient religious views on property, making sure to account for slavery. I immediately agreed to the challenge, but I was paralyzed.

The Judeo-Christian writing called Leviticus, which is a part of the traditional text known as the Pentateuch, or the Five Scrolls of Moses, or just “Moses,” speaks at length concerning property distribution, property rights, and compensation for irregularities and violations. It is a writing which depicts a vigorous society in motion, a book which Jesus summarizes with the well-known apothegm, “Love your neighbor as yourself.” The very word “neighbor” evokes property and other notions of personal sovereignty.

Nevertheless, it is practically impossible for me to write a general piece touching on Leviticus or Levitical principles because, with respect to its provenance, I am neither a minimalist nor a maximalist, nor am I some sort of milquetoast via media advocate. I happen to take a scholarly, evidence-based approach to the provenance of this book, an approach that ought to be standard but runs contrary to what is taught in universities both secular and religious or parochial.

In secular universities, and in those religious universities whose worldview is formed by Nineteenth Century Continental philosophy, the minimalist Documentary Hypothesis is still taught as de rigueur, a hypothesis which posits that the books of Moses, especially the Levitical material, were fabricated by a power-mongering priestly caste during the Judahite exile in Babylon in the Sixth Century BCE. I am under the impression that this hypothesis is presented as ironclad secular scholarship, i.e., the truth, when it is essentially the telos of the Sacramentarian movement which came to dominate Enlightenment Era religiosity.

Religious fundamentalism, deeply offended by this radical minimalism, developed a response which became reflexively maximalist, in defiance of all evidence (even internal evidence) to the contrary: namely, that Moses wrote every jot and tittle of his five scrolls somewhere between 1550 BCE and 1440 BCE, and never shall a true Christian vary from that view lest he deny the efficacy of the Word of God.

In public discourse, there is no middle ground. One can write classroom papers and discuss privately a more nuanced view, one based on the evidence. OK, let’s be fair: I would say that, now wouldn’t I? Here, then: a nuanced view which assembles the evidence guided by a particular view of history, scholarship, science, and philosophy. So I begin again: there is no middle ground in public discourse.

I presented a paper at a regional meeting of the Society of Biblical Literature once upon a time, a radical deconstructive view of methodology with respect to the academic discipline known as “biblical studies.” In it I noted that the disciplines of the pure sciences, linguistics, philosophy, and history had all evolved drastically over the past two hundred years, but biblical studies still labored under the precepts of the Eighteenth and Nineteenth Centuries, and I actually stated that, if this were any other discipline aside from the contentious religious discipline it is, our colleagues in every other department in all universities across the world would remove us with extreme prejudice.

A professor from Harvard was in attendance, so, naturally, I was intimidated. I mean, I still had aspirations to one day maybe hopefully if-miracles-come-to-pass apply for a position at Harvard or one of the other Ivy League schools, or even the University of Michigan, so I really wanted to come off as bright and snappy. He asked, “What do you do with history?”

Without thinking, I blurted out, “I just ignore it,” which is true in one sense, because of my deep respect for the science of linguistics, but not quite right in another, out of a deep respect for post-modern philosophical currents. What I was getting at was the primary importance of community in interpretation, but I didn’t say as much, so the entire room burst into laughter. I tried salvaging my point, but you know how these things go.

Then I heard a fellow in the front row mutter, “Why do we have to bring ourselves into resonance with other academic disciplines [a phrase from my paper] when we already know what history is?” [his emphasis].

Well, that was the entire point of my paper, which had failed to convince the Harvard University types: we don’t really know history. I did not go so far as to radicalize my view by saying that we construct history wholesale, but it is certainly true that we arrange data within a certain framework until we are pleased with the outcome.

A healthy skepticism of the self is thereby necessary. What am I up to? Can I identify my biases? What are my external influences? Why is this emotionally significant to me? Moreover, when it comes to historical realities (for lack of a better term), an academic humility is very helpful, namely that we don’t know very much at all, and we know that we don’t know very much at all because we don’t have much physical evidence, and we are, most assuredly, arranging evidence as we have been taught, not as it is obvious. We make a convincing case, that is all.

So when we speak of Levitical principles, it is nigh impossible to speak on the same ground. If these principles have their origins in a nomadic group of people who had recently escaped from Thirteenth Century BCE Egypt, drawing heavily on Hittite suzerain-vassal arrangements and the attendant societal characteristics, then our vocabulary will be significantly different from what it would be if they have their origins in a cynical repristination drawn from a vaguely Babylonian and/or Syrian religion-society.

That last paragraph should be the lead paragraph when I write my propertarian follow-up.

The Audience to and Author of Your Life

Instead of speaking of nature and nurture, determinism and free will, let’s think about the extent to which you are an audience to and the author of your own life.

We are all undeniably audience to our own life. We don’t choose to be born at all, nor who our parents are (or whether they raise us, or who does) or what nation we grow up in—or what part of history!

Moreover, we cannot rewrite the life we have lived up until now. We are an audience to our own past, disclosed to us through memories and through the stories about ourselves that others tell us.


Incomplete Virtue

In his essential book on virtue ethics, Daniel Russell advanced two arguments that I found highly novel and provocative.

The first is that the virtues are what he calls vague satis concepts, something I explore in depth here. The short version is that they have a threshold beyond which “virtuous enough” just is “virtuous in fact.” And this threshold is vague, in the sense that there are boundary cases that cannot be resolved simply by increasing your level of precision. One example of this is the threshold beyond which one goes from having thin or receding hair to being bald. More significantly, the concept of personhood is a vague satis concept, with boundary cases including long-term coma patients, the severely brain damaged, and embryos.

In such cases, Russell argues, we need a model. This model is not simply an averaging of the most representative cases. As he puts it:

When we try to say what personhood really is, we construct a theoretical model of what we take to be the essential features of personhood, in some kind of reflective equilibrium, and realized to the fullest degree, since the model must illuminate the central cases, not just join their ranks. This model, we should note, is an ideal, and therefore not merely a central case: you or I could stand as a central case of personhood, but not as a model of personhood, since particular persons always have shortcomings in some dimension or other of personhood, a shortcoming that the model is to reveal as a shortcoming.

The second argument of interest is that virtue ethicists need a limiting principle on the number of virtues there are. The Stoics and Aquinas resorted to a very limited set of cardinal virtues of which all others were but aspects. Aristotle, however, offered no limitations at all, and most modern virtue ethicists follow him in this. Russell finds this unacceptable. This argument flows from the first one—we need a model of the virtuous person. If the number of virtues approaches infinity, then how could we ever hope to model such a person?

It is this second argument I wish to disagree with. Russell thinks virtues need a limiting principle because the model of the virtuous person that he has in mind is a formally specifiable model. But this is precisely what Aristotle’s notion of phronesis, with its radical particularity, precludes.

What Russell seeks is explanation, rather than understanding, when the latter is more appropriate.

Let us say that virtue is like the infinite, fractal coastline of a finite island. How could we model such a thing?
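To make the metaphor concrete (this is my illustration, not Russell’s), consider the Koch snowflake: start with an equilateral triangle and, at every step, replace the middle third of each edge with two sides of a smaller triangle. The perimeter grows without bound while the enclosed area stays finite, which is roughly what an infinite coastline around a finite island amounts to:

```latex
% Koch snowflake, initial side length 1 (illustrative sketch, not from Russell):
% each iteration multiplies the number of edges by 4 and divides their length by 3,
% so the perimeter diverges while the total enclosed area converges.
P_n = 3\left(\tfrac{4}{3}\right)^n \longrightarrow \infty,
\qquad
A_\infty = \frac{8}{5}\cdot\frac{\sqrt{3}}{4} = \frac{2\sqrt{3}}{5} \approx 0.69
```

No finite list of measurements exhausts such a boundary; every refinement of scale discloses more coastline to measure.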

Simply demanding the subject matter be finite will not help. Pointing out that there is more context than we can take in does not mean that the quest for more context is a bad thing—Russell himself makes a similar argument about all-things-considered rationality:

But committing to making all-things-considered judgments is not the same as committing to the (rather queer) life-project of becoming the best maker of all-things-considered judgments there can be. That project, like every other, consumes resources and opportunities, and can no more be assumed to be a rational one than any other project can. That is a fact about practical rationality: when it comes to making all-things-considered judgments, at some point it is reasonable to stop considering, choose, and hope that the choice is one we can live with, or perhaps grow into. Indeed, trying to become persons who do consider all things before acting is something that we have all-things-considered reasons not to do.

My argument is that even the construction of the ideal itself follows a similar rationale.

Consider my recent exposition of the hermeneutics of novels:

After finishing a given chapter of a novel, we no doubt have certain expectations about what the book as a whole will be like, based not only on the chapter itself but on our understanding of the genre conventions the novel is operating within, maybe even on our familiarity with the author herself or on what other people have insinuated about the book. Once we have completed the novel, however, our understanding will have changed—not only of the novel as a whole, but even of a given chapter and its significance. Rereading the novel, we may find the chapter discloses things to us that it didn’t the first time—and these new disclosures, in turn, inform our understanding of the whole novel. In this way, even after we have read the whole book, we can learn from parts of it.

Even something as seemingly finite as a novel we can only understand incompletely. Summarizing Derrida, Jonathan Culler adds to this picture of incompleteness by arguing that meaning is determined by context, and context is boundless. We can always revisit the context and find some new aspect which sheds light on a different meaning.

But Gadamer’s take on this incompleteness is much more optimistic than Derrida’s. It is also ultimately more optimistic than Russell’s, for the latter is forced to ask for models and limiting principles we do not have, implying that we haven’t had much of an idea about how to live virtuously until now.

For Gadamer, it is less about models than about stories. One such story would be the story of the good life. The same story, told differently, is the story of the virtuous person. People have been contributing to this story for thousands of years. Contra Russell, most people already understand virtue and the good life; their understanding is simply and necessarily incomplete. This understanding can be improved, and we should strive to be lifelong learners in this matter, rather than settling on a particular understanding and then clinging to it out of a desire for a false certainty. A courageous virtue ethics is one that asks us to accept our inability to complete it, and the necessary day-to-day role that faith must play in filling in the gaps.

The Singular of Data

My friend Jeff told me a story in response to a comment I made. I had just mentioned the travails of kid sports, especially since I enrolled my kids in a hockey program which includes one third more ice time than last year’s program. I sighed, “All-consuming, you know.”

Jeff leaned back and intoned a story about his brother-in-law, whose boys were raised on the road to become hockey stars, but they came only so close to making it into the professional ranks, and now the anxiety is upon them, as young men in their late teens and early twenties, to acquire a meaningful vocation.

I said, “Can you imagine investing that much money and that much effort (giving away so much of the family life, in effect) toward a goal which has such a small chance of realization?”

Jeff shifted in his chair and recounted the tale of a dear friend of his who was a bona fide rock star, in his own mind. He did nothing but play his guitar and practice with his band of fellow travelers, living out the hedonistic ideal, touring Europe and Japan every year. “If you buy him a sandwich, he’ll take it home to his mom’s basement, where he lives, and save half of it for dinner the next day.”

Jeff rarely answers any question with a propositional statement; he’s all stories, all the time. His experience is wide and varied, so I suppose he can get away with it. What makes him especially delightful is that he doesn’t tell stories to fill empty space in a conversation; he’s answering a question. One story gives the answer, and then he’s done, with no stringing together of endless tangential episodes ad infinitum.

I saw somewhere recently (and, forgive me, I can’t remember the context) someone mention that the Affordable Care Act might be screwing over huge numbers of people, but a) those numbers are still marginal, and b) the fundamentals of the ACA are forged in good policy. I take that to mean, in other words, that as long as the proper number of people are served by this public policy, those who are hurt (ground to dust, more like it) by the same public policy are data. I’m under the impression that that number doesn’t even need to rise to a majority; it just needs to meet some data-triggered threshold which satisfies its designers. All others should be able to conform, no? If not, then selection has taken its course, alas.

It’s not that I’m against science, God forbid; it’s that I’m against its magisterial application in all aspects of the human experience. Public policy, public morality, public religiosity (for lack of a better word), public everything falls under the hegemony of science, as though science were some sort of impersonal absolute extracted by innumerable university studies from an easily accessible material world. Science, in this manifestation, never serves; it is always master.

Okay, I yield the point: “ground to dust” is too much; there are worse things on this earth than the ACA. Nevertheless, I will not yield the larger outcry, namely that this sentiment is a resistance to the notion that in our story-less data-gathering, individuals are being sorted, in a grand perversity, by science-wielding masters, so that they lose their individuality, and thus their ability to serve one another. We don’t learn to serve each other by means of data; we learn by means of experience, which is carried forward through civilization in stories, wherein myriad strands of data are tied together.

Queen Elizabeth was finally convinced that the monopolies, though they appeared to buy consolidation of her throne, were costing her far more than an open market would. The data had always been there, but the stories hadn’t trickled up to the throne.

Get Thee To a Nunnery

On the descent into madness

The contest for the greatest play in the English language comes down to one of two Shakespeare plays: Hamlet and Macbeth. Both of these plays delve deeply into the psyche of ordinary men and women who enter the realm of madness. The plays themselves and the characters therein resonate deeply, crossing boundaries temporal and cultural. In our contemporary culture, the descent into madness is the theme of two of the most popular record albums ever recorded, Pink Floyd’s The Dark Side of the Moon and The Wall. The former used to rival Michael Jackson’s Thriller for worldwide sales, and remains among the best-selling records of all time. I’m sure that as soon as either David Gilmour or Roger Waters dies (Rick Wright, RIP), many new fans will restore the rivalry at the top of the all-time charts.

Shakespeare draws a picture for us: Hamlet, young Hamlet, possessed by the ghost of his father to avenge his death, has been veritably banished by his uncle to England, whereupon he will be murdered, as everybody knows. By some twist of fate and the adventuring spirit of young Hamlet, he escapes, making his way back to Elsinore. Upon his arrival at the outskirts, he stumbles across an open grave. Holding up the skull of Yorick, his father’s jester, and a favorite person from his childhood, he says, “I knew him.”

Linear perspective insists that all parallel lines converge upon the horizon. Well, here at the open grave, the horizon has been brought dramatically forward, and Hamlet experiences the confrontation which is a response to his melodramatic soliloquy: what dreams may come when we have shuffled off this mortal coil must give us pause.

Hamlet 1948: Laurence Olivier

Not for long, for the grave is not passive; it is active, yawning, galloping, devouring. In a brilliant interpretation of the subtlest kind, Laurence Olivier’s Hamlet, when he hears the approaching funeral procession, tosses the skull of Yorick back into the grave, just in time for old Yorick to receive the recently deceased and politically important Ophelia.

All the powers of the earth are here converging, with love, politics, royalty, vengeance, and that always-pressing anxiety intersecting over a grave. War is ever on the horizon, hemming everyone within easy reach of the same.

And you run and you run to catch up with the sun

But it’s sinking

Racing around to come up behind you again

Banquo’s ghost won’t rest, either, charging up from the grave to confront, wordlessly, the ambitious Macbeth.

Here’s a curious aside: Richard Burton, an actor of some note, refused to play Macbeth because, as he said, he could not be dominated by a woman in that way. The irony is captivating once you come to understand that he drove himself to drink over his treacherous divorce in order to win for himself the great prize, Elizabeth Taylor, who dominated him.

The descent into madness, and its appeal to popular and literary culture, is not limited to obsessive thoughts concerning the grave. “This is the end. There is an end to me. Life has no purpose, no meaning.” No, that’s maudlin stuff. Pap. Child’s play. The descent into madness is the lonely individual coping with the active, ongoing confrontation of the grave, that all our evil deeds and the evil deeds of many others manage to wriggle free from death’s strong bonds in an effort to possess us ahead of time.

Ordinary people have a fascination with exploring the descent of ordinary people into madness. A playwright or musician will set the scene in extraordinary circumstances, by my reckoning, to sell tickets on the entertainment value. The literary value, i.e., its meaningfulness to the paying ordinary public, lies in its commonality, the themes which grasp a deep-seated anxiety, an anxiety which many people would declare possesses us all. Some of us, for various reasons, cope better with that anxiety than others.

The meaning of life, in other words, is a question of how to maintain meaningful behavior even while under possession of the grave.

Richard Burton’s Hamlet gestures toward Ophelia’s womb, saying, “Get thee to a nunnery.”

Subhumans, From the Cradle to the Grave