A Rebuttal to Materialist Science

Yesterday, I came across an article (“Are you sleepwalking now?”) on the digital magazine Aeon that I could not help but respond to, because it seems to be such a patent example of someone misusing science to “prove” things well beyond science’s ken.

The article is here: https://aeon.co/essays/are-you-sleepwalking-now-what-we-know-about-mind-wandering. It is well written and certainly thought-provoking, so it’s potentially worth reading on its own. More to the point, it is required reading for this post.

To practice what I preach, here’s my fair disclosure at the beginning, in case this is the first of my posts that you’re reading: I’m a faithful progressive Christian who believes in both science and God. As an existentialist theologian and somewhat of an epistemological pessimist (I’d say “healthy skeptic”), I believe that personal consciousness and experience are the foundational starting place for examining metaphysical questions. That may be why I take the article so personally, though I think that my arguments stand on their own, and I’m explicitly trying to go out of my way (unlike Dr. Metzinger, I think) to admit what I believe that cannot be proved and what does or does not actually meet the standards of scientific inquiry.

The article was posted on the 22nd of this year by Dr. Thomas Metzinger, a professor at the University of Mainz, where he teaches theoretical philosophy with a focus on the philosophy of mind (the subject of his article). He has written numerous books, given a TED talk and is undoubtedly a highly intelligent person well-versed in the subject matter.

Nevertheless, I have to take issue with the assertions he makes in his article.

The article begins with what I can only describe as a masterful metaphor for the movement of “thoughts and ideas” from un- or subconscious to conscious, one that equates them to the motion of dolphins traveling at speed, occasionally breaking the surface of the water but moving mostly beneath it.

From there, Metzinger poses the questions he believes he can answer. He writes, “Philosophers of mind often fall into the trap of assuming that goal-directed, rational thought is the paradigmatic case of conscious cognition. But if we are only ever partly aware of what is happening in our own minds, surely we can’t be in absolute command of our thoughts, let alone causing them? Is it ever possible to distinguish between mental actions, which we can direct and select, from the more general category of mental events, which simply happen to us? In what sense are we ever genuinely mental agents, capable of acting freely, as opposed to being buffeted by forces beyond our control?” (emphasis Metzinger’s).

This question is perhaps the most fundamental philosophical question when it comes to thinking about the mind. Experientially, I think we can agree that we have some thoughts we would assert we have consciously and willfully called to mind and formed, and other thoughts that seem to be generated spontaneously and inexplicably—in other words, the conscious and the subconscious.

The only complaint I have with Metzinger’s formulation of these questions is the rhetoric that subtly slips in to begin his argument from the very inception of the question. On the other hand, this is easily forgivable as something most, if not all, of us are likely to do, even unintentionally.

The next paragraph begins Dr. Metzinger’s tenuous assertions. Relying on the “empirical findings” of neuroscience and experimental psychology on mind-wandering, he asserts that “Much of the time we like to describe some foundation ‘self’ as the initiator or cause of our actions, but this is a pervasive myth” (again, emphasis is Dr. Metzinger’s).

Here’s my first complaint: there is no description of these “empirical findings.” Dr. Metzinger does not explain what experiments have been conducted, whether they are peer reviewed, whether they have been replicated, what the specific results are—or, really, anything other than that they exist and we should allow him to interpret them for us. This is not evidence; this is the basic rhetorical technique of asking the audience to rely on your authority as evidence enough.

The first sentence of the following paragraph gets to the heart of the matter: “Mind-wandering research suggests that we need to get rid of naïve, black-and-white distinctions such as ‘free-will’ versus ‘determinism’, ‘conscious’ versus ‘unconscious’, and what philosophers call ‘personal’ versus ‘subpersonal’ processes (roughly, accounts of cognition that look at the whole person’s reasons and beliefs, versus those based on biological or physiological functions).” What!?! How did we go from “empirical findings” suggesting that there are a lot of subconscious activities going on to positing that we should look to a solely biological basis for consciousness? This is a logical non-sequitur in the extreme.

Nevertheless, the statement is revealing: it’s a 21st Century version of the “bag of chemicals” argument made in the early 20th Century (i.e., that all of our thoughts and actions are really the result of chemical reactions in body and brain without any real volition or self) so readily rebutted by G.K. Chesterton in Orthodoxy.

Rather than solely referring to Mr. Chesterton (whose arguments should most definitely be read), I’ll point out a few of the specific problems: (1) the lack of any evidence provided for this; (2) the lack of consideration of the broader findings of neurological research (which I’ll refer to in more detail in a moment); (3) the solipsism and circularity of the argument (how is it that Dr. Metzinger is so special as to realize the falsity of the illusion and then to explain it to others by the random chance of his own mental events?); and (4) the complete and willful ignorance of the human experience. We might phrase the last objection in terms of Occam’s Razor: which is more likely, that when we feel we are exercising our will we actually are, or that there are multiple, subtler and (so far) inexplicable mental processes going on that cause this illusion?

In the case of neurological research that seems to point to something other than a solely materialistic explanation for cognition, I’d point you to Dr. Mario Beauregard’s The Spiritual Brain: A Neuroscientist’s Case for the Existence of the Soul as a counterargument. In that book, Dr. Beauregard (a neuroscientist rather than a philosopher) explains how certain experiments regarding addiction relief have shown that the active cognition of the mind can actually alter the material function of the brain over time by creating new neural pathways. The whole topic of “neuroplasticity,” which is showing us that our brains remain more subject to change in adulthood than we previously thought, seems to cut against Dr. Metzinger’s argument.

As a caveat, when Dr. Metzinger says we ought to get rid of “black-and-white distinctions,” I think he’s right that we need more complex and nuanced ways to think about free will as some interaction between personal volition and external (or perhaps deterministic) influences. But this is nothing new in the philosophy of mind (or theology, for that matter), and I have myself argued for such a position in previous posts. When Dr. Metzinger’s seemingly-suggested resolution is to ignore one half of the equation entirely, though, we’re stepping backward instead of forward.

The logic further falters as Dr. Metzinger continues, writing: “As the dolphin story hints, human beings are not Cartesian egos capable of complete self-determination.” I would remind you that the dolphin story is a metaphor; by itself it cannot logically hint at anything except to the extent that the metaphor can be shown to validly represent the things it is trying to explain (and this article contains no such showing).

There’s a glimmer of reason after this, though, where Dr. Metzinger says, “Nor are we primitive, robotic automata. Instead, our conscious inner life seems to be about the management of spontaneously emerging mental behavior. Most of what populates our awareness unfolds automatically, just like a heartbeat or autoimmune response, but it can still be guided to a greater or lesser degree.”

I’d like to point out that, in the above, Dr. Metzinger wisely uses the words “seems to be” to indicate that he is speculating. The problem, though, is that despite these subtle hints that the actual logical foundation of his argument is very slight, he presents most of his ideas as authoritative through the rest of the article’s language.

For the sake of time and space, I’m going to skip a few paragraphs where Dr. Metzinger discusses the positive and negative effects of daydreaming. He continues, “My view is that the mind-wandering and the DMN [what he calls the default-mode network of the active parts of the brain during rest periods] basically serve to keep our sense of self stable and in good shape. Like an automatic maintenance program, they constantly generate new stories, weaving back and forth between different time-horizons, each micro-narrative contributing to the illusion that we are actually the same person over time” (this time, emphasis is mine).

Again, Dr. Metzinger begins with words of speculation (“My view is…”) but then makes assertions as if they were fact. He’s put the cart before the horse here by assuming that the idea of the self is an illusion rather than a reality. And he’s done that without any evidence whatsoever. It seems here, as I think has become fashionable for some intellectuals investigating the relative terra incognita that is the mind, that he has assumed a Buddhist sort of worldview and then forced the science to fit that mold. But the Buddhist idea that the self is an illusion is a religious and philosophical idea, not a scientific one. There is no defensible logic to starting with that assumption and working backwards. That’s simply not how science works.

The truth will out, as they say, and it certainly does in the next paragraph. Dr. Metzinger writes, “I should come clean at this point and confess that I don’t believe in any such entity or thing as ‘the self’” (emphasis mine). It’s a little late in the game to make that confession—honest scholarship starts with a confession of biases that are known to the writer and probably unknown to the reader, so that the reader can read critically. I think this drives home Metzinger’s disingenuousness in burying the language of speculation beneath such extensive assertions of truth.

But it’s the assertion itself that is so ironic—who is making the confession if there is no self? The sentence, under Metzinger’s argument, is itself nonsense. And therein lies perhaps the biggest problem with the materialist approach to the mind—even the people who maintain that position cannot (and do not seem to try to) live as if it were true. The only way it is possible to interact with the world is through an understanding of self. That understanding may see itself as more or less connected to everything around it, but no one acts or thinks without reference to an “I.” If that “I” is an illusion, then there’s really no “I” to make the discovery that it is an illusion in the first place. Hence the circularity of this kind of logic.

To drive the weakness of his philosophy home, Dr. Metzinger then refers to “evolutionary psychology,” that perennial favorite of materialist thinkers like Richard Dawkins and Steven Pinker. Evolutionary psychology is the field of making unfalsifiable assumptions about the development of the brain (and therefore the mind) according to subjectively selected “societal needs” and then presenting those assumptions as fact. Dr. Metzinger joins in by arguing about the societal role of the “fiction” of the self, how “[h]umans have evolved to be a bit like method actors,” and asserting that “The self-as-agent is just a useful fiction, a neurocomputational artefact of our evolved self-model.”

This statement is unfalsifiable by the scientific method because consciousness and self are, by their very nature, subjective. And yet Metzinger presents his assumptions as the inevitable conclusions of science, despite the fact that neither true scientific method nor basic philosophical logic would touch such a conclusion with a ten-foot pole. Further, Metzinger delicately (and probably quite deliberately) avoids issues like the “hard problem of consciousness” by simply denying that there is one.

In a further bout of spontaneous honesty, Metzinger writes: “But just as there is no ‘real’ character, there’s also no such thing as ‘a self’, and probably nothing like an immortal soul either.”

Metzinger is, for such an esteemed scholar, remarkably willing to conflate belief with fact and then to work backward from there.

I think it is sufficient to stop the detailed rebuttal of Metzinger’s argument there, as the rest of the (lengthy) article simply repeats the same logical errors, the same rhetorical sleight-of-hand, and the same presentation of materialism (a religious belief, in that it is the given from which all other inquiry begins) as science.

On the one hand, perhaps it is the arguments of the religious that have generated this kind of reactionary response. When we deny the usefulness of science because of religion (which, as I’ve often argued, we oughtn’t), it seems a natural (though not logical) response to use science to deny religion. And that’s really what these kinds of arguments are ultimately about (otherwise, why explicitly deny the existence of an immortal soul when the very argument makes such a distinction meaningless?).

Frankly, I’m tired of it, on both sides. I’m tired of atheist materialists trying to claim philosophical and metaphysical truth through science and I’m tired of fundamentalist Christians denying evolution because the Bible doesn’t mention it.

To be clear, I have no problem with atheists saying that science leads them to believe in a solely materialist explanation for existence—they’re well within their rights to draw that conclusion, even if I think it is the wrong one, just as some are led to faith because of their interpretation of metaphysical likelihoods based on science. Reasonable people may disagree, as we lawyers like to say. It’s when they claim that science proves their belief that I become offended, as a person of deep faith who is nevertheless willing to make careful distinctions between what science shows us (and who often defers to science to inform theology) and what must be left to faith and belief.

At the same time, I’m upset both by the closed-mindedness and bad theology of those who question science based on Scripture (which in no way asserts that that’s a proper, or even valid, way to analyze the world) and by the fact that, knowing I’m a Christian, many people with whom I’d like to have a real (and respectful) conversation about these kinds of topics will not listen logically because they somehow assume I’m that kind of Christian.

As I’ve said many times in the past, science is simply not equipped to answer metaphysical questions, which unfortunately must be relegated to the realm of belief, conviction, uncertainty and doubt. Let’s use science to examine and explore the material world, to learn what we can about all that we can. But let’s also admit when science is of no use and properly categorize those beliefs about the metaphysical as matters of faith, no matter who they come from, believer or not.

Sacrifice and Eternity

I don’t know what the world to come will be like. I have no special insight into what happens to us and where we go when we die, or what will occupy us in eternity. Like most of us, I suppose, I have my hopes and comforting beliefs about what the life eternal will be; my stubbornnesses about which I say, “If there’s not X, I’m not going,” like I have some control over the situation; those feelings you feel every rare once in a while that make you think, “This. This feeling lasting forever; that must be heaven.”

I try not to cling too tightly to any preconceived notion of what awaits us, instead trying to trust in God that it will be far greater than I could imagine anyway. Frankly, I’m not too concerned about whether heaven is some entirely spiritual dimension of existence or embodied life in a world restored by God to perfection on the last day.

But there is one thing that I do believe strongly: even in the life to come, there will be sacrifice.

I don’t mean sacrifice on the cosmic scale, no dying that others may live, no giving up all that I have so that others may have something, not the sort of things that make us look at others who can make those kinds of sacrifices with such awe and respect. I mean the more mundane, everyday sacrifices we’re already called to make. The lesser sacrifices necessary to mutual relationship: I’d rather do X in our free time, but since you want to do Y, let’s do that instead and we can do X another time. That sort of thing.

Perhaps this sounds like a strange thing to fixate on, but I think it’s a necessary expectation based on the few things that the scriptures and traditions of our faith do seem to tell us about the world to come. Christ promises us “eternal life.” The words used to create that phrase in the New Testament carry the connotation of not just surviving but thriving–active, vigorous, fulfilled, unceasing, experiential life.

That means a few things. First, we will be ourselves–perfected perhaps, the dross burned away–but recognizable as ourselves. And why wouldn’t we be, if we believe that we are purposefully, “fearfully and wonderfully made”? Second, we will not be idle. I don’t know what kind of activity is planned for us–though I’m fairly certain it will not be sitting on clouds strumming harps, singing ceaseless hymns and trying not to think of toilet-paper commercials. Third, we will be together. There’s no good and dependable answer to the question of who “we” is (I believe, based on my limited knowledge of God, that all will be there eventually), so let’s sidestep that conundrum for now. Fourth, we will be in relationship with God and others forever. Again, why wouldn’t we be? The overarching narrative of both the Bible as a whole and the Gospels in particular is that ours is a God who values relationship.

Let’s look at those things together. We’ll be ourselves. That means that, just like we do now, we will have our personalities, our preferences, our likes and our dislikes (we can go down some dizzying rabbit trails trying to think about the scope and limitations of what preferences and personality traits qualify as being righteously permissible, just as we could with what kinds of activities will be permissible, but let’s not today). We will do things. As a person with preferences, there are some things I like to do better than others. And we will do them with others.

When people come together, it is natural for personalities to clash at times, for preferences to butt up against one another. Unhealthy relationships can be ruined by this, but healthy ones engage in minor sacrifice to work out such petty disputes. Those relationships become stronger for it–when you know that I’m willing to give in for your sake sometimes and you’re willing to give in for mine at others, we know that we mean something to one another.

It is fashionable these days to talk about this phenomenon as a quid pro quo, “deposits and withdrawals from the love bank” if we’d like to resort to some especially cloying purple prose. Human nature being what it is–and some people being who they are–I suppose that sometimes that’s a fitting description. But in the best relationships, you don’t make those compromises simply to get future compromises in return; you make them because you care about the other person.

There’s every reason to think that the relationships we enjoy in the life to come will be the best of relationships. For that to be the case, we will eternally need to make some minor sacrifices for each other at various times.

So best to start practicing now, because some aspects of the life to come are not about the where, the when and the what; they’re about the how, how you see others and your relationships with them. That’s one small reason that we say that the Kingdom of Heaven is both a present reality and a future promise: there are aspects of perfected existence that we may participate in here on Earth if we’re willing to. Christ’s incarnation accomplished more than cosmic salvation–it gave us an example of how to think and act so that we do not have to wait to participate in the fullness of the life to come.

As Milton says in Paradise Lost, “The mind is its own place, and in itself can make a heaven of hell, a hell of heaven.” Jesus showed us how to do the former. Those happy little sacrifices we make for the ones we love are an essential aspect of creating that double joy of relationships that both affirm each person involved and become something greater than either. No reason to think that will change in the world to come.

History and Historicity

I wrote in a recent post about some of the difficulties I’ve had with issues of history and historicity in the Old Testament while preparing for my impending journey to Israel. Having had some time to clarify my thoughts, I thought I’d share them.

First, I want to focus on an exemplum of my thoughts and then I’ll speak more generally. Let’s begin with the Babylonian Captivity. Or, rather, a little bit before that.

In 1 Kings 18, the prophet Elijah confronts Ahab, the monarch of the Kingdom of Israel, on Mount Carmel in a rather memorable set of contests. Really, Elijah is confronting the worship of Baal in the Kingdom of Israel here, but Ahab is culpable for allowing the Israelites to stray from the worship of Yahweh alone.

The four hundred and fifty prophets of Baal are asked to pick between two bulls brought to the mountain, to cut their bull to pieces and to lay it on a prepared (but unlit) fire; Elijah–as Yahweh’s sole remaining prophet–will do the same with the other. Then they will each call upon their respective gods and see who “shows up.” As the Baalite priests beseech their god, they get no response. With memorable taunts (Maybe your god is sleeping and needs to be awakened? Maybe he’s traveling? Maybe he’s busy defecating?), Elijah insults Baal’s prophets until it comes time for him to beseech Yahweh. When he does, the Israelite God sends his “fire” down to earth to light the prepared wood, burning up the bull carcass along with the stones, soil and water prepared around the altar. After this, the priests of Baal are slaughtered by the gathered people.

I’m not actually interested in the historicity of this particular story but in what it tells us about the culture of the time (Ahab’s existence is attested outside of the Bible, and he was probably king of Israel around the middle of the 9th Century BCE). As we find in the cultures surrounding Israel-Palestine at that time, gods were viewed as local; they were the gods of particular cities or nations. We see this made explicit in other places even in the Bible, where the Israelite God states that “he” is the God of Israel (hence the epithet “Israelite God,” I suppose).

What’s happening between the lines in this passage in Kings is a divine turf war. Baal (which is a title that means “lord” and which is borne by several distinct deity figures and used generally to mean “a god”) is a god of the Phoenicians in the city of Tyre. If you look on a map of Biblical Israel, you’ll see that Tyre is on the coast of the Mediterranean Sea (on an island, actually) just a short journey north from Mount Carmel. The question being answered by Elijah’s story is, roughly put, “If Baal is the god of Tyre, and Yahweh is the god of Israel, and they’re both geographically close to one another, which has dominion in the middle ground?” Clearly the answer is Yahweh.

I mention the above passage because it sets us up for the real point about history and historicity in the Old Testament that I want to make in this post. When, in the (very early) 6th Century BCE, the Babylonians under Nebuchadnezzar sacked Jerusalem and deported the Israelites to Babylon, a crisis of faith occurred. Again, as a brief aside, this event is attested in the historical record outside of the Bible. If the Israelites were to worship the God of (the nation/land) Israel, how could they do that when they’d been transported to Babylon, the land of the Babylonian gods?

And here comes the prophet Ezekiel. In the verses that open the book that bears his name, Ezekiel tells us that he has a vision of God while among the exiled Israelites in Babylon “on the banks of the Kebar River.”

In this vision, as the Biblical historian Cynthia R. Chapman says, “God gets wheels.” Literally: Ezekiel sees God enthroned upon what I can’t help thinking of as a super-high-tech, four-likeness-of-living-creature-powered motorized wheelchair. That strange image aside, the point of the vision is that the God of Israel is mobile, that God is personally and actually present with the Israelites even in their exile. As a side note, my NIV says that Ezekiel is taken back to the “Kebar River near Tel Aviv”–this should be read as Tel Abib (in modern-day Iraq) by the Chebar River.

Hearing about the underlying spiritual-cultural concerns with regard to these (and other) Old Testament passages did much to “resolve” my problem of “historicity” in the OT (for purposes of this post, I have left aside all of the issues of the construction of the Old Testament text–whether the three hypotheses of its construction or the timing of its creation).

What I find here is something that makes much more sense to me than either extreme of the historicity debate–humans writing stories of their evolving understanding of and relationship with God. These stories are neither entirely myth nor entirely history; they are stories that draw upon historical experience (and the religious issues raised by that experience), mythological content that may or may not be based in fact (I’m not worried about the answer to that), revelation of the nature of God from God (there’s that spirit-breathed bit), and human reactions and struggles in response to that revelation.

I see this especially as the Israelite understanding of the nature of God breaks free from social precedent and evolves from polytheism to henotheism to true monotheism.

In some ways, what we have in the Old Testament is the macrocosm of Jacob’s struggle with God at Penuel–a back and forth between God and man that may defy explanation but results in relationship.

Does that make interpreting the Bible difficult? Absolutely; I don’t have an answer for you on how we best sort God’s intent from the voice of the writers from the historical record from the cultural context, etc. But I’m certainly willing to say that it’s not supposed to be easy. I can’t imagine that God would decide not to appear directly before all people in an unmistakable way (which, to be clear, God hasn’t) and yet make Biblical interpretation something as simple as reading the words verbatim.

In the near future, I’m going to return to the Babylonian captivity and the Book of Job to talk a bit about theodicy in Christianity.


This Year in Jerusalem

In less than two months, I’m going to Israel. This will be my favorite kind of trip–a study trip with accompanying professors, homework beforehand, and the goal of coming to understand the cultures, history and geography of Israel and its surrounds to better understand scripture.

I am fortunate enough to have this opportunity because of K’s ministry; this is a trip for young clergy and their spouses. Some of my favorite people are going and we could be going to spend two weeks on the Jersey Shore (place or TV show) and I wouldn’t mind. But we’re going to Israel.

We’ve been given books to read and maps to study and mark up before we go. For me, it is a fascinating and tedious process–I see mentions of peoples or places or cultures from the ancient past and go on hours-long rabbit trails that conclude with me writing prodigious notes to myself on the backs of maps, notes that have little to do with the main point of study. Did you know that there’s a good chance that the near-mythical “Hanging Gardens of Babylon” were actually constructed by Sennacherib in Nineveh? The same Sennacherib mentioned in the Book of Kings as carrying off people from the Kingdom of Israel and nearly destroying the Kingdom of Judah?

None of that time is wasted, though I imagine it’s going to annoy the hell out of K as I offer (unsolicited) trivia throughout our trip. This course of study has made me feel like I’m in grad school again and–being the perpetual student–I’m loving it. Even more so, it’s directed my thoughts in some ways that I believe will profoundly affect both my theological work and my fiction writing. Many of the things below will likely see their own posts, and soon, but here’s some of what I’ve been brought to ponder lately:

Studying maps and geography has been eye-opening in terms of Biblical (and historical) understanding. There are so many things that become clearer about both the broader historical context and the context of passages in the Bible when you understand (however abstractly at the current juncture) the “lay of the land.” The locations and movements of the Babylonians, the Assyrians, the Egyptians, the Arameans, the Edomites, the Moabites and all the other ancient cultures of the Levant shed light on the context of the Israelite people as they journey from polytheism to henotheism to monotheism (and back-and-forth quite a bit). The strategic importance of the Levant on the world stage and the many times it has changed hands over the millennia is just fascinating. There is much to be said about all this (much of it already written by experts in books, so who am I to spend lots of words on it?), but suffice it to say that, though I have studied history extensively, I am only now starting to see lots of things come into clarity as I locate events and peoples on the map relative to one another. Even some ideas of motive become clear when you put things together. All in all, I’m actually a little disappointed that, in all of my historical studies, none of my classes spent much time (if any at all) using the geography of the time and place to put events in context.

Of course, all of this study of geography makes me think of Avar Narn. This, coupled with the fact that I’ve recently spent some time looking through Tolkien’s old maps and drawings and their evolution, has me going back to the layout and maps of my own fantasy world. I have posted some previous maps of parts of Avar Narn on this site, but I am afraid to say (but not really) that they will soon be obsolete and superseded as I replace them with a more thorough effort at consistently mapping the world (or at least the continent where most of my stories will be set) in line with its history, cultures and languages. I am beginning very much to envy Martin’s choice to name everything in Westeros in English. In breaks from the map work for my impending journey, I’ve re-sketched the continent, placed it longitudinally and latitudinally, worked out air and sea currents (as best I know how) and, correspondingly, the various climate zones of the space.

Because of some changes in the history of Avar Narn which have occurred both in direct rewriting of that history and from ideas better explored (or created) in the first Avar Narn novel (still very much in progress, of course), there will be further changes to the locations of some of the nations and peoples as shown on the previous maps. Alongside that, I’ll be doing some renaming of locations to tie them in better with the linguistic work I’ve done for the setting. There will definitely be more posts about all of this in the future.

Back to the theology side, where I’ve also been thinking a lot about historicity. This has been something of a journey and a struggle for me. Intellectually–as I’ve communicated through the blog before–I don’t believe that concern with historicity is the most fruitful or meaningful avenue for understanding our faith. I believe that the text of scripture is divinely inspired (but also filled with human agency and not to be taken literally), and that it’s just not that important to know whether the Exodus actually happened as written (there’s no corroborating evidence that it did). What is important is what the story of the Exodus tells us about our God, and the evolving Israelite understanding of God as a macrocosm of the way in which the individual gradually struggles with an understanding of the divine.

At the same time, I cannot deny that the historical context of the Biblical writings is illuminating in their interpretation. In addition, I find that archaeological and historical research can help us understand the human motives in the Bible and perhaps filter some of that from the divinely inspired aspects. For instance, there’s very little evidence that the conquest of Canaan as told in Joshua occurred; the gradual occupation of the land by the Israelites as told in Judges seems far more likely.

I’m fine with this; it doesn’t bother me that the Bible may not be historically accurate on all fronts–it’s not meant to be a history (well, maybe Joshua and Judges are somewhat) and has a different purpose for us that should not be confused with absolute historicity. In the same way, Genesis is not intended to be a science manual for the creation of all things.

What I’m struggling with is how emotionally bound to questions of historicity I find myself (in spite of myself). There’s a pang of upset in my stomach when I find that an event as told in the Bible is probably not actually how something happened.

I believe in the historicity of Jesus, but I’d be a Christian even if I didn’t–I believe that the story of Jesus tells us Truth about the nature of our Creator, Sustainer and Savior that is unfettered by our existential reference points. So why do I find myself caring so much about the answers to historicity? Especially when it’s clear that–in broad strokes at least–the Old Testament narrative about the Jews’ evolving relationship with God is borne out by the historical record. I think this is a matter of wanting things to be simple, clear and absolute. And this from someone who finds great wonder in the complexity, ambiguity and mystery of existence! Perhaps that’s what it comes down to–a minor crisis of identity. And that makes me wonder whether all obsessions with the historicity of Biblical events spawn from that very human concern.

In general, I think that the Christian testimony–as an unfairly broad generalization–would be perceived as more reasonable (which I believe it inherently is) if we were able to communicate our faith in a way that incorporates questions of historicity without being dominated by them. Many a man has gone in search of Noah’s Ark (with many claims to have found it), but even an indisputable confirmation of its existence would not tell us much about who God is that the Bible doesn’t tell us already.

I’m sure you’ll hear more about that as I post (to the extent that I’m able) from Israel, where the question will, in some part, be a constant concern of mine (if not all of us traveling together). Stay tuned for more on all fronts.

See my journal of the trip here.

Punctuation

This is going to seem like a relatively random posting but, as I’ve been working on my novel, reviewing a friend’s novel and having some discussions about Biblical interpretation, I’ve been thinking a lot about punctuation lately. Here are some of my musings:

Punctuation is critical in all forms of writing; understanding and properly using punctuation lends authority to anything you write. In my experience, most people do not use proper punctuation. I don’t mean that they make occasional mistakes in their punctuation–everyone does that. I mean that they flagrantly ignore the rules of punctuation and how to use (and intentionally misuse) those rules to greatest effect.

The most egregious culprit is the semi-colon. When I was a graduate student and teaching assistant in English, I would (usually frustratedly and spontaneously after grading the first round of tests) spend a class period reviewing grammar and punctuation with my classes, with a particular focus on the semi-colon. Much to my dismay, what I typically found was that, after this session, many students would liberally disperse semi-colons throughout their writing in an effort to seem more capable writers. “What’s the problem with that?” you ask. Nothing, if done correctly. But my students seemed to sprinkle semi-colons over their papers like literary glitter without regard for whether their sentences required glitter. Have I mentioned that I hate glitter? It’s craft herpes–once you’ve contracted it, you’ll be finding it on you forever.

So my students committed a cardinal sin of writing–using something (whether punctuation, a word, a stylistic device, etc.) you don’t understand in an attempt to come across as more talented than you are. Like most good writing techniques, punctuation is most effective when subtle, when it influences the reader without their perceiving that it is doing so. Like much social subtlety, this is crass when recognized and only acceptable in polite society when carefully concealed. The improper use of punctuation breaks the illusion, making this manipulation painfully and embarrassingly clear. A misused piece of punctuation–a comma splice or an unneeded semi-colon, for instance–thrusts itself into the mind of the reader like an unwanted and socially awkward guest who cannot read the room. It breeds mistrust of the writer and should thus be avoided at all costs.

I think many of us, myself included, are embarrassed to look up rules of punctuation when we don’t know a proper usage. These are things we’re taught in elementary school, so we assume that they are so basic that a person of reasonable intelligence would not forget them. Nothing is further from the truth. We start learning punctuation and grammar early because these things are difficult and require much practice. Writing is like a muscle, not like riding a bicycle–it atrophies if unused. Because of that, there is nothing wrong with having to refresh your memory about “basic” grammatical concepts. If it’s that big a deal, clear your browser history afterward. But, for the love of God, look up the rule in the first place if there’s any question.

The opposite of the above is, thankfully, also true. The proper use of punctuation is an extremely effective aspect of writing style. To be clear, the word “proper” as used here relies heavily on context. In (most) professional writing, the rules of grammar and punctuation should be kept religiously. In fiction writing, or in circumstances where the perspective and mind of the author are part of the writing itself, the rules should be liberally–but carefully and thoughtfully–broken.

I came across an excellent example of this (and probably the impetus for this post) while starting to read a friend’s young adult novel. The novel (at least as far as I’ve gotten) is told in the first-person point of view of a sixteen-year-old young woman. The style of the writing is clipped, using short sentences, sentence fragments and well-placed punctuation to convey the fleeting, sometimes confused and quite excited thoughts of this character as she attends a sort-of debutante party that she knows represents a crucial fork in the road of her life. The character comes to life not just in her words, but in the way that the punctuation groups her thoughts into clusters, abruptly changes subjects and gives us a feel not just for what she thinks but how she thinks. That is great writing, the kind we should all strive for. I’d love to include some examples here, but it’s not my writing to share.

And in that effort, we should bear in mind that there are a number of approaches to punctuation in any writing, but in fiction in particular. I would–admittedly making this up as I go–call the above example a heuristic approach to punctuation. But maybe I ought to be less pretentious and call it a “character-based” approach. Alternatives, if you like, might be to call this a “stream-of-consciousness” approach or even a “Joycean” approach. The punctuation defines the character, not the author or the style of the writing itself necessarily.

We might alternatively use a dramatic or theatrical approach. In dramatic works, actors are trained to use the punctuation as keys to the pacing, pauses and breaths in speech. Here, the punctuation serves as a code to help the written word mimic normal speech patterns. I find that most people naturally follow this approach when reading aloud, whether or not the piece is dramatic. So, using this method, good punctuation should be used to assist the flow of the text for the reader and to enhance both comprehension and enjoyment of the text. Does this sometimes overlap with the first-described approach? Probably, but not necessarily. Some people don’t think or speak in ways that are easy for others to understand, and not all points of view in narrative are going to be able to characterize and define those involved in the action described.

A more formal adherence to the “rules-as-written” of punctuation would likely serve the same function as the theatrical approach, though perhaps with a different feel. The ease of communication of content is paramount here, but should not be sacrificed for other cognitive effects that might be created in the mind of the reader through creative and effective punctuation.

I don’t think that it’s necessary, nor probably even helpful, to spend a lot of time trying to categorize your punctuational approach by the groups given above (or any others, for that matter). What is important is to be intentional about your punctuation. This takes us back to Professor Brooks Landon’s comment that writing is “brain hacking.” Punctuation is an integral part of how your text creates, divides and sequences images and thoughts in the mind of the reader. Your punctuation should always be calculated to bolster the substance of the text to your desired effect. Is that easy? Hell no. But it’s certainly worth the effort.