The struggle between enlightenment and spirit, detached reason and emotionally embedded life, is one of the characteristic conflicts of modernity. It has played out in arguments and in art, in politics and policy. Charles Taylor has traced the contours of this and related conflicts with remarkable skill and subtlety.
The 1954 science fiction story “The Cold Equations” is a useful example. A middling midcentury parable that is long on exposition and short on plot, it sets up a stark scenario in which fellow feeling and detached reason are at odds, but the latter must necessarily triumph.
Author Tom Godwin describes a future in which resource constraints on the frontier of space allow almost no margin of error. Ships are given just enough fuel for the exact amount of mass they carry. So any stowaway must be tossed out into the void immediately, or there won’t be enough fuel to decelerate, and everyone aboard will die.
Or did that American mistake it for something like the performing arts he already knew? A play, or an opera, or even a dance. Did he miss what made it idiosyncratic?
What the American already knows, what he is capable of understanding things as, constitutes what Martin Heidegger calls his hermeneutic situation. It is not knowledge in the sense that we know arithmetic, but something we have that is prior to understanding and provides the necessary conditions for intelligibility.
Imagine that, in time, this American began to see what sets kabuki apart from other performing arts, what is particular to it. He did not just add one more type of performing art to a mental list; his understanding of the performing arts he already knew was changed by his having understood kabuki. In seeing how they differ from kabuki, he can see their particularity more clearly, and seeing what they have in common is similarly transformative.
This process is what Hans-Georg Gadamer called a fusion of horizons, which in reality constitutes a transformation of both. It is akin to the moment when an English speaker learning Spanish stops trying to mentally translate English sentences word by word.
Once you can formulate what you’re trying to say in Spanish from the start, you’ve broadened your horizons in a meaningful way. Your hermeneutic situation has been transformed; you have not merely added Spanish to English, because your understanding of the latter has changed. Things you took for granted about how language is constructed, you can now see as one possibility among others.
Entrepreneurs are not as free or powerful as we like to imagine, nor can they be entirely subsumed into the larger forces of history.
In Knowledge and Coordination, Dan Klein points to “The Verger,” a short story by W. Somerset Maugham, as a great telling of entrepreneurship in practice. The titular character goes in search of a cigarette and realizes that there is nowhere he can buy one in the vicinity.
“That’s strange,” said Albert Edward.
To make sure he walked right up the street again. No, there was no doubt about it. He stopped and looked reflectively up and down.
“I can’t be the only man as walks along this street and wants a fag,” he said. “I shouldn’t wonder but what a fellow might do very well with a little shop here. Tobacco and sweets, you know.”
He gave a sudden start.
“That’s an idea,” he said. “Strange ’ow things come to you when you least expect it.”
Klein uses this as an example of entrepreneurship on the model of his mentor, Israel Kirzner. Kirzner is an economist in the Austrian school, who emphasizes the role of alertness and frameworks in entrepreneurship. The Austrian school more broadly defends entrepreneurship as a creative act.
For neoclassical economics, however, this sort of arbitrage is more a feature of the situation than of the individual. The shop that the protagonist sets up is like the corner stores that are so common in New York City. A lower-density neighborhood may have relatively few of them, but after an influx of people it will suddenly gain a few more. On this view, the situation creates the arbitrage opportunity, not the entrepreneur.
How many of us have known an addict who has broken our hearts too many times with their promise to get clean, for good this time?
Many of us have struggled with addictions of our own. The rest of us have surely been in a position where we told ourselves or others that we would change in some way, and even believed that we could and would, but did not.
Aristotle spoke of akrasia: of knowing what we should do but failing to do it, or knowing what we should not do and doing it anyway. But my question is: do we know we will fail?
If you’re addicted to heroin, or cocaine, cigarettes or alcohol—how can you know if you will quit, when you say that you will? Can you know it, except in retrospect? And how long must you wait before you can say it in retrospect? People can fall off the wagon after decades. Can we know that we will get clean for good, or should we call no man clean until he is dead?
The Greek term skeptikos means, not a negative doubter, but an investigator, someone given to skeptesthai, or enquiry. As the late ancient sceptic Sextus Empiricus puts it, there are dogmatic philosophers, who think that they have found the truth; negative dogmatists, who feel entitled to the position that truth cannot be found; and the sceptics, who are unlike both groups in that they are not committed either way. They are still investigating things.
In his autobiography, Charles Darwin lamented that he used to love poetry, but could no longer “endure to read a line” of it. He complains:
My mind seems to have become a kind of machine for grinding general laws out of large collections of facts, but why this should have caused the atrophy of that part of the brain alone, on which the higher tastes depend, I cannot conceive.
I think economics trained me to think this way. When I began a fresh foray into philosophy a couple of years ago now, I approached it from this stance. Every book went into the grinder, to mash up and join with others in the cage of general laws. I steamrolled my way through book after book; when I couldn’t follow them I just pressed on so I could get to the next one. There was no thought of reading for pleasure or respecting the book before me like I might respect a partner in conversation. What is more rude than completely dominating a conversation without consideration for the other person?
But I launched into reading as if quantity equaled quality, as if I could become an expert simply by reading a lot.
I did, indeed, learn a great deal. But over the last year or so, I have felt that I stumbled on authors who helped me grow in an important way: they helped me to see more clearly a wide and yawning ignorance in myself, including an ignorance of how far that ignorance itself extends.
Increasingly, I wonder: isn’t this what philosophy is supposed to teach? For all the flaws of the historical and fictional Socrates, don’t we still admire him for saying that he only knew that he knew nothing?
The subject-object schema is not destiny. It is handed down to us from the time of Descartes and Bacon, quite late in the history of philosophy. After Kant, subjectivity became a prison from which we are never free to directly perceive or interact with objects as things-in-themselves.
In the 20th century, Hans-Georg Gadamer and Ludwig Wittgenstein—starting from very different interests, training, and standpoints—looked to play and games as a way of moving beyond the Kantian trap.
How can something as seemingly trivial as play provide an answer to a serious philosophical problem? When we say “do you think this is a game?” are we not implying that the matter at hand is more important than such a thing?
Francis: What do you think the point of a story is?
Paco: The point?
Francis: You know, their function. Their purpose. Why do we tell them?
Paco: There are many reasons, I imagine.
Francis: I think the most important one is illustrated by “The Zebra Storyteller”. We tell stories as a supplement to experience, so that we can be prepared for things that haven’t happened to us personally but that we can imagine happening.
The answer used to seem obvious to me. I was of one mind with Wittgenstein:
For we can avoid unfairness or vacuity in our assertions only by presenting the model as what it is, as an object of comparison—as a sort of yardstick; not as a preconception to which reality must correspond. (The dogmatism into which we fall so easily in doing philosophy.)
But to Aristotelians and Platonists, the model appears to belong to reality rather than being some separate thing we construct as a yardstick. And to Heidegger and Gadamer, preconceptions are front and center in establishing the conditions of understanding.
I wandered through conceptual murkiness as I attempted to understand these various lines of thought. When I encountered the Wittgenstein quote above, a particular conception of the model came sharply into focus.
In what follows, I will argue that Wittgenstein is right but, as he would no doubt have happily conceded, incomplete in his treatment of models. I will integrate his account into Heidegger’s notion of the fore-structure of understanding, which makes up our hermeneutic situation. I will try to avoid being overly technical; you can think of the hermeneutic situation as your standpoint, including your prejudices as well as the traditions of thought and practice in which you are embedded, and specifically how those things pre-form your interpretations.
Imagine that a group of friends sits down to play a tabletop RPG.
They have picked a Dungeon Master ahead of time, to plan out the adventure and generally act as the arbiter of what occurs and what’s allowed.
The remaining friends put together their characters, choosing types (such as warrior or wizard), stats (such as how intelligent their character is as opposed to how strong or nimble), names, species, and so on.
There are rules to these games, but they are fairly flexible, to allow for creativity on the part of the Dungeon Master as well as the players.
Suppose that after playing a few times, some of the players get tired of it, and want to switch to a different RPG. A space adventure, say. Neither the DM, nor the rest of the players, want to give up on what they’ve done so far, though. So they strike a compromise—their characters in their current game will play an in-game version of the space RPG, and accrue experience points based on how well they do.
At first this takes up about a fifth of their gameplay. But gradually, they spend more and more time on the subgame. What’s more, they create more subgames, of many different genres. Some are so completely unlike the one they’re playing as to be hardly comparable—focusing on boring domestic scenarios, for instance. Or working together to solve puzzle games.
At what point can they be said to be playing the original game at all? What if 80 percent of their gameplay takes place in subgames? And what if much of the remaining 20 percent is spent deciding which subgame to play, or creating new ones? At what point does the original game vanish entirely, as an entity?
The original game formally sits higher in the hierarchy than the subgames. The DM could decide to have a dragon attack while the characters’ attention is caught up playing house in a subgame. But to the extent that it’s hard to get a group of friends together who will regularly commit their time to a common game like this, the DM can’t just do whatever he wants. If people think he’s being unfair, or aren’t having any fun, they can walk away.
If enough people do this, the game will simply be dead.
In short, the DM is constrained insofar as he wants to avoid killing the game entirely.
I ask again: at what point is it absurd to refer to the original game at all?
This was a little out of character. Loved ones wondered whether this was some sort of roundabout suicide attempt.
Nevertheless, with the encouragement of my good friend Alex, I went to the first practice of the GMU Rugby Club for the fall 2005 semester. This was the year before GMU’s basketball team went to the Final Four, so the administration was near the end of a long period of neglecting GMU’s sports in general. Rugby being less popular than most sports, the field the team had access to was more hard ground and mud than grass.
(Incidentally, the semester after the Final Four run, we returned to discover that our field had become a beautiful green jewel, lovingly tended and invested in with fresh cash from an administration suddenly enthusiastic about sports.)
When we arrived at that first practice, everyone was engaged in a drill called crossover running. You form four lines, facing each other diagonally. The runner at the head of each line runs and pops the ball to the person at the head of the line across from them, who then runs and does the same, and so on.
To my unathletic, inexperienced, timid, and highly awkward self, it was a terrifying sight. You had to make sure that you caught the ball, didn’t crash into someone running from the perpendicular line, and then actually got the ball into the hands of the person across from you. And it all happened so fast! I was certain to make a complete fool of myself.
And in practice, as well as on the field, I did make a fool of myself, many times. But it was a kind, forgiving group, who encouraged persistence in the face of continual failure and went out of their way to point it out whenever I did something right. In short, I stuck with it for those last two years of undergrad.